
Bluefish444 adds edit-while-record, REST API to IngeSTore

Bluefish444, maker of uncompressed UltraHD SDI, ASI, video-over-IP and HDMI I/O cards and mini converters, has released IngeSTore version 1.1.2. This free update of IngeSTore adds support for new codecs, edit-while-record workflows and a REST API.

Bluefish444 developed IngeSTore software as a complementary multi-channel ingest tool enabling Bluefish444 hardware to capture multiple independent format SDI sources simultaneously.

In IngeSTore 1.1.2, Bluefish444 has expanded codec support to include the popular formats OP1A MPEG-2 and DNxHD within the BlueCodecPack license. Edit-while-record workflows are supported through both industry standard growing files and through Bluefish444’s BlueRT plug-in for Adobe Premiere Pro and Avid Media Composer. BlueRT allows Adobe and Avid NLEs to access media files as they are still being recorded by IngeSTore multi-channel capture software, increasing production efficiency via immediate access to recorded media during live workflows.
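The REST API opens IngeSTore to automation from scheduling and asset management systems. As a loose illustration only (the endpoint paths, port and JSON fields below are hypothetical; Bluefish444’s documentation defines the actual API), driving a capture channel over REST might look something like this:

```python
# Hypothetical sketch of automating a capture channel over a REST API.
# None of these endpoints come from Bluefish444's documentation; they
# only illustrate the general shape of such an integration.
import requests

BASE = "http://ingest-host:8080/api"  # hypothetical host and port

# Start recording on channel 1 with a clip name (hypothetical payload).
requests.post(f"{BASE}/channels/1/record", json={"clipName": "show_ep101"})

# Poll the channel; in an edit-while-record workflow, the growing file
# reported here could be opened in the NLE while capture continues.
status = requests.get(f"{BASE}/channels/1/status").json()
print(status.get("state"), status.get("currentFile"))
```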

Editing Roundtable

By Randi Altman

The world of the editor has changed over the years as a result of new technology, the types of projects they are being asked to cut (looking at you, social media) and the various deliverables they must create. Are deadlines still getting tighter and are budgets still getting smaller? The answer is yes, but some editors are adapting to the trends, and companies that make products for editors are helping by making the tools more flexible and efficient so pros can get to where they need to be.

We posed questions to various editors working in TV, short form and indies, who do a variety of jobs, as well as to those making the tools they use on a daily basis. Enjoy.

Cut+Run Editor/Partner Pete Koob

What trends do you see in commercial editing? Good or bad?
I remember 10 years ago a “colleague,” who was an interactive producer at the time, told me rather haughtily that I’d be out of work in a few years when all advertising became interactive and lived online. Nothing could have been further from the truth, of course, and I think editors everywhere have found that the viewer migration from TV to online has yielded an even greater need for content.

The 30-second spot still exists, both online and on TV, but the opportunities for brands to tell more in-depth stories across a wide range of media platforms mean that there’s a much more diverse breadth of work for editors, both in terms of format and style.

For better or worse, we’ve also seen every human being with a phone become their own personal brand manager with a highly cultivated and highly saturated digital presence. I think this development has had a big impact on the types of stories we’re telling in advertising and how we’re telling them. The genre of “docu-style” editing is evolving in a very exciting way as more and more companies are looking to find real people whose personal journeys embody their brands. Some of the most impressive editorial work I see these days is a fusion of styles — music video, fashion, documentary — all being brought to bear on telling these real stories, but doing it in a way that elevates them above the noise of the daily social media feed.

Selecting the subjects in a way that feels authentic — and not just like a brand co-opting someone’s personal struggle — is essential, but when done well, there are some incredibly inspirational and emotional stories to be told. And as a father of a young girl, it’s been great to show my daughter all the empowering stories of women being told right now, especially when they’re done with such a fresh and exciting visual language.

What is it about commercial editing that attracted you and keeps attracting you?
Probably the thing that keeps me most engaged with commercial editing is the variety and volume of projects throughout the year. Cutting commercials means you’re on to the next one before you’ve really finished the last.

The work feels fresh when I’m constantly collaborating with different people every few weeks on a diverse range of projects. Even if I’m cutting with the same directors, agencies or clients, the cast of characters always rotates to some degree, and that keeps me on my toes. Every project has its own unique challenges, and that compels me to constantly find new ways to tell stories. It’s hard for me to get bored with my work when the work is always changing.

Conoco’s Picnic spot

Can you talk about challenges specific to short-form editing?
I think the most obvious challenge for the commercial editor is time. Being able to tell a story efficiently and poignantly in a 60-, 30-, 15- or even six-second window reveals the spot editor’s unique talent. Sometimes that time limit can be a blessing, but more often than not, the idea on the page warrants a bigger canvas than the few seconds allotted.

It’s always satisfying to feel as if I’ve found an elegant editorial solution to telling the story in a concise manner, even if that means re-imagining the concept slightly. It’s a true testament to the power of editing and one that is specific to editing commercials.

How have social media campaigns changed the way you edit, if at all?
Social media hasn’t changed the way I edit, but it has certainly changed my involvement in the campaign as a whole. At its worst, the social media component is an afterthought, where editors are asked to just slap together a quick six-second cutdown or reformat a spot to fit into a square framing for Instagram. At its best, the editor is brought into the brainstorming process and has a hand in determining how the footage can be used inventively to disperse the creative into different media slots. One of the biggest assets of an editor on any project is his or her knowledge of the material, and being able to leverage that knowledge to shape the campaign across all platforms is incredibly rewarding.

Phillips 76 “Jean and Gene”

What system do you edit on, and what else other than editing are you asked to supply?
We edit primarily on Avid Media Composer. I still believe that nothing else can compete when it comes to project sharing, and as a company it allows for the smoothest means of collaboration between offices around the world. That being said, clients continue to expect more and more polish from the offline process, and we are always pushing our capabilities in motion graphics and visual effects in After Effects and color finessing in Blackmagic DaVinci Resolve.

What projects have you worked on recently?
I’ve been working on some bigger campaigns that consist of a larger number of spots. Two campaigns that come to mind are a seven-spot TV campaign for Phillips 76 gas stations and 13 short online films for Subaru. It’s fun to step back and look at how they all fit together, and sometimes you make different decisions about an individual spot based on how it sits in the larger group.

The “Jean and Gene” spots for 76 were particularly fun because it’s the same two characters who you follow across several stories, and it almost feels like a mini TV series exploring their life.

Earlier in the year I worked on a Conoco campaign, featuring the spots Picnic, First Contact and River, via Carmichael Lynch.

Red Digital Cinema Post and Workflow Specialist Dan Duran

How do you see the line between production and post blurring?
Both post and on-set production are evolving together. There has always been a fine line between them, but as tech grows and becomes more affordable, you’re seeing tools that previously would have been used only in post bleed onto the set.

One of my favorite trends is seeing color-managed workflows on location. With full color control pipelines being used with calibrated SDR and HDR monitors, you get a more accurate representation of what the final image will look like. I’ve also seen growth in virtual productions, where you’re able to see realtime CGI and environments on set directly through the camera while shooting.

What are the biggest trends you’ve been facing in product development?
Everyone is always looking for the highest image quality at the best price point. As sensor technology advances, we’re seeing users ask for more and more out of the camera. Higher sensitivity, faster frame rates, more dynamic range and a digital RAW that allows them to effortlessly shape the images into a very specific creative look that they’re trying to achieve for their show. 8K provides a huge canvas to work with, offering flexibility in what they are trying to capture.

Smaller cameras are able to easily adapt to a whole new array of support accessories to achieve shots in ways that weren’t always possible. Along with the camera/sensor revolution, Red has seen a lot of new cinema lenses emerge, each adding its own character to the image as it hits the photosites.

What trends do you see from editors these days? What enables their success?
I’ve seen post production really take advantage of modern tech to improve existing workflows and innovate new ones. Being able to view higher resolutions, process footage faster and play back off a laptop shows how far hardware has come.

We have been working more with partners to help give pros the post tools they need to be more efficient. As an example, Red recently teamed up with Nvidia to not only get realtime full resolution 8K playback on laptops, but also allow for accelerated renders and transcode times much faster than before. Companies collaborating to take advantage of new tech will enable creative success.

AlphaDogs Owner/Editor Terence Curren

What trends do you see in editing? Good or bad?
There is a lot of content being created across a wide range of outlets and formats, from theatrical blockbusters and high-end TV shows all the way down to one-minute videos for Instagram. That’s positive for people desiring to use their editing skills to do a lot of storytelling. The flip side is that with so much content being created, the dollars to pay editors get stretched much thinner. Outside of high-end content creation, the overall pay rates for editors have been going down.

The cost of content capture is a tiny fraction of what it was back in the film days. The good part of that is there is a greater likelihood that the shot you need was actually captured. The downside is that without the extreme expense of shooting associated with film, we’ve lost the disciplines of rehearsing scenes thoroughly, only shooting while the scene is being performed, only printing circled takes, etc. That, combined with reduced post schedules, means for the most part editors just don’t have the time to screen all the footage captured.

The commoditization of the toolsets (some editing systems are actually free), combined with the plethora of training materials readily available on the internet and in most schools, means that video storytelling is now a skill available to everyone. This means that the next great editors won’t be faced with the barriers to entry that past generations experienced, but it also means that there’s a much larger field of editors to choose from. The rules of supply and demand tell us that an increased supply of a service reduces its cost. Traditionally, many editors have been able to make upper-middle-class livings in our industry, and I don’t see as much of that going forward.

To sum it up, it’s a great time to become an editor, as there’s plenty of work and therefore lots of opportunity. But along with that, the days of making a higher-end living as an editor are waning.

What is it about editing that attracted you and keeps attracting you?
I am a storyteller at heart. The editor, in my opinion, shares responsibility with the director and writer for the structural part of telling the story. The writer has to invent the actual story out of whole cloth. The director has to play traffic cop with a cornucopia of moving pieces under a very tight schedule while trying to maintain the vision of the pieces of the story necessary to deliver the final product. The editor takes all those pieces and gives the story its final rewrite for the audience to, hopefully, enjoy.

Night Walk

As with writing, there are plenty of rules to guide an editor through the process. Those rules, combined with experience, make the basic job almost mechanical much of the time. But there is a magic thing that happens when the muse strikes and I am inspired to piece shots together in some way that just perfectly speaks to the audience. Being such an important part of the storytelling process is uniquely rewarding for a storyteller like me.

Can you talk about challenges specific to short-form editing versus long-form?
Long-form editing is a test of your ability to maintain a fresh perspective of your story to keep the pacing correct. If you’ve been editing a project for weeks or months at a time, you know the story and all the pieces inside out. That can make it difficult to realize you might be giving too much information or not enough to the audience. Probably the most important skill for long form is the ability to watch a cut you’ve been working on for a long time and see it as a first-time viewer. I don’t know how others handle it, but for me there is a mental process that just blanks out the past when I want to take a critical fresh viewing.

Short form brings the challenge of being ruthless. You need to eliminate every frame of unnecessary material without sacrificing the message. While the editors don’t need to keep their focus for weeks or months, they have the challenge of getting as much information into that short time as possible without overwhelming the audience. It’s a lot like sprinting versus running a marathon. It exercises a different creative muscle that also enjoys an immediate reward.

Lafayette Escadrille

I can’t say I prefer either one over the other, but I would be bored if I didn’t get to do both over time, as they bring different disciplines and rewards.

How have social media campaigns changed the way you edit, if at all? Can you talk about the variety of deliverables and how that affects things?
Well, there is the horrible vertical framing trend, but that appears to be waning, thankfully. Seriously, though, the Instagram “one minute” limit forces us all to become commercial editors. Trying to tell the story in as short a timeframe as possible, knowing it will probably be viewed on a phone in a bright and noisy environment is a new challenge for seasoned editors.

There is a big difference between having a captive audience in a theater or at home in front of the TV and having a scattered audience whose attention you are trying to hold exclusively amid all the distractions. This seems to require more overt attention-grabbing tricks, and it’s unfortunate that storytelling has come to this point.

As for deliverables, they are constantly evolving, which means each project can bring all new requirements. We really have to work backward from the deliverables now. In other words, one of our first questions now is, “Where is this going?” That way we can plan the appropriate workflows from the start.

What system do you edit on and what else other than editing are you asked to supply?
I primarily edit on Media Composer, as it’s the industry standard in my world. As an editor, I can learn to use any tool. I have cut with Premiere and FCP. Knowing where to make the edit is far more important than knowing how to make it.

When I started editing in the film days, we just cut picture and dialogue. There were other editors for sound beyond the basic location-recorded sound. There were labs from which you ordered something as simple as a dissolve or a fade to black. There were color timers at the film lab who handled the look of the film. There were negative cutters that conformed the final master. There were VFX houses that handled anything that wasn’t actually shot.

Now, every editor has all the tools at hand to do all those tasks themselves. While this is helpful in keeping costs down and not slowing the process, it requires editors to be a jack-of-all-trades. However, what typically follows that term is “and master of none.”

Night Walk

One of the main advantages of separate people handling different parts of the process is that they could become really good at their particular art. Experience is the best teacher, and you learn more doing the same thing every day than doing it occasionally. I’ve met a few editors over the years who truly are masters of multiple skills, but they are few and far between.

Using myself as an example, if the client wants some creatively designed show open, I am not the best person for that. Can I create something? Yes. Can I use After Effects? Yes, to a minor degree. Am I the best person for that job? No. It is not what I have trained myself to do over my career. There is a different skill set involved in deciding where to make a cut versus how to create a heavily layered, graphically designed show open. If that is what I had dedicated my career to doing, then I would probably be really good at it, but I wouldn’t be as good at knowing where to make the edit.

What projects have gone through the studio recently?
We work on a lot of projects at AlphaDogs. The bulk of our work is on modest-budget features, documentaries and unscripted TV shows. Recent examples include a documentary on World War I fighter pilots called The Lafayette Escadrille and an action-thriller starring Eric Roberts and Mickey Rourke called Night Walk.

Unfortunately for me I have become so focused on running the company that I haven’t been personally working on the creative side as much as I would like. While keeping a post house running in the current business climate is its own challenge, I don’t particularly find it as rewarding as “being in the chair.”

That feeling is offset by looking back at all the careers I have helped launch through our internship program and by offering entry-level employment. I’ve also tried hard to help editors over the years through venues like online user groups and, of course, our own Editors’ Lounge events and videos. So I guess that even running a post house can be rewarding in its own way.

Luma Touch Co-Founder/Lead Designer Terri Morgan

Have there been any talks among NLE providers about an open timeline, being able to go between Avid, Resolve or Adobe with one file like an AAF or XML?
Because every edit system uses its own editing paradigms (think Premiere versus FCP X), creating an open exchange is challenging. However, there is an interesting effort by Pixar (https://github.com/PixarAnimationStudios/OpenTimelineIO) that includes adapters bridging the structural differences between editing systems. There are also efforts toward standards in effects and color correction. The core editing functionality in LumaFusion is built to allow easy conversion in and out of different formats, so adapting to new standards will not be challenging in most cases.
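To make the OpenTimelineIO idea concrete, here is a minimal sketch of a format round trip using its Python API. It assumes `pip install opentimelineio` and that the EDL and FCP XML adapters are available (they have shipped with OTIO, though adapter packaging has changed between releases):

```python
import opentimelineio as otio

# Read a CMX 3600 EDL into OTIO's application-neutral timeline model.
timeline = otio.adapters.read_from_file("cut_v3.edl")

# The object model is the same no matter which format it came from.
# (Newer OTIO releases also offer timeline.find_clips().)
for clip in timeline.each_clip():
    print(clip.name, clip.source_range)

# Write it back out as Final Cut Pro XML; the adapter is inferred
# from the file extension.
otio.adapters.write_to_file(timeline, "cut_v3.xml")
```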

With AI becoming a popular idea and term, at what point does it stop? Is there a line where AI won’t go?
Looking at AI strictly as it relates to video editing, we can see that its power is incrementally increasing, and automatically generated movies are getting better. But while a neural network might be able to put together a coherent story, and even mimic a series of edits to match a professional style, it will still be cookie-cutter in nature, rather than being an artistic individual endeavor.

What we understand from our customers — and from our own experience — is that people get profound joy from being the storyteller or the moviemaker. And we understand that automatic editing does not provide the creative/ownership satisfaction that you get from crafting your own movie. You only have to make one automatic movie to learn this fact.

It is also clear that movie viewers feel a lack of connection or even annoyance when watching an automatically generated movie. You get the same feeling when you pay for parking at an automated machine, and the machine says, “Thank you, have a nice day.”

Here is a question from one of our readers: There are many advancements in technology coming in NLEs. Are those updates coming too fast and at an undesirable cost?
It is a constant challenge to maintain quality while improving a product. We use software practices like Agile, engage in usability tests and employ testing as robust as possible to minimize the effects of any changes in LumaFusion.

In the case of LumaFusion, we are consistently adding new features that support more powerful mobile video editing and features that support the growing and changing world around us. In fact, if we stopped developing so rapidly, the app would simply stop working with the latest operating system or wouldn’t be able to deliver solutions for the latest trends and workflows.

To put it all in perspective, I like to remind myself of the amount of effort it took to edit video 20 years ago compared to how much more efficient and fun it is to edit a video now. It gives me reason to forgive the constant changes in technology and software, and reason to embrace new workflows and methodologies.

Will we ever be at a point where an offline/online workflow will be completely gone?
Years ago, the difference in image quality provided a clear separation between offline and online. But today, online is differentiated by the ability to edit with dozens of tracks, specialized workflows, specific codecs, high-end effects and color. Even more importantly, online editing typically uses the specialized skills that a professional editor brings to a project.

Since you can now edit a complex timeline with six tracks of 4K video with audio and another six tracks of audio, basic color correction and multilayered titles straight from an iPad, for many projects you might find it unnecessary to move to an online situation. But there will always be times that you need more advanced features or the skills of a professional editor. Since not everybody wants to understand the complex world of post production, it is our challenge at Luma Touch to make more of these high-end features available without greatly limiting who can successfully use the product.

What are the trends you’re seeing in customer base from high-end post facility vs. independent editor/contractor?
High-end post facilities tend to have stationary workstations that employ skilled editor/operators. The professionals that find LumaFusion to be a valuable tool in their bag are often those who are responsible for the entire production and post production, including independent producers, journalists and high-end professionals who want the flexibility of starting to edit while on location or while traveling.

What are the biggest trends you’ve been seeing in product development?
In general, moving away from lengthy periods of development without user feedback. Moving toward getting feedback from users early and often is an Agile-based practice that really makes a difference in product development and greatly increases the joy that our team gets from developing LumaFusion. There’s nothing more satisfying than talking to real users and responding to their needs.

New development tools, languages and technologies are always welcome. At WWDC this year, Apple announced it would make it easier for third-party developers to port their iOS apps over to the desktop with Project Catalyst. This will likely be a viable option for LumaFusion.

You come from a high-end editing background, with deep experience editing at the workstation level. When you decided to branch off and do something on your own, why did you choose mobile?
Mobile offered a solution to some of the longest running wishes in professional video editing: to be liberated from the confines of an edit suite, to be able to start editing on location, to have a closer relationship to the production of the story in order to avoid the “fix it in post” mentality, and to take your editing suite with you anywhere.

It was only after starting to develop for mobile that we fully understood one of the most appealing benefits. Editing on an iPad or iPhone encourages experimentation, not only because you have your system with you when you have a good idea, but also because you experience a more direct relationship to your media when using the touch interface; it feels more natural and immersive. And experimentation equals creativity. From my own experience I know that the more you edit, the better you get at it. These are benefits that everyone can enjoy whether they are a professional or a novice.

Hecho Studios Editor Grant Lewis

What trends do you see in commercial editing? Good or bad?
Commercials are trending away from traditional, large-budget cinematic pieces to smaller, faster, budget-conscious ones. You’re starting to see it now more and more as big brands shy away from big commercial spectacles and pivot toward a more direct reflection of the culture itself.

Last year’s #CODNation work for the latest installment of the Call of Duty franchise exemplifies this by forgoing a traditional live-action cinematic trailer in favor of a larger number of game-capture, meme-like films. This pivot away from more dialogue-driven narrative structures is changing what we think of as a commercial. For better or worse, I see commercial editing leaning more into the fast-paced, campy nature of meme culture.

What is it about commercial editing that attracted you and keeps attracting you?
What excites me most about commercial editing is that it runs the gamut of the editorial genre. Sometimes commercials are a music video; sometimes they are dramatic anthems; other times they are simple comedy sketches. Commercials have the flexibility to exist as a multitude of narrative genres, and that’s what keeps me attracted to commercial editing.

Can you talk about challenges specific to short form versus long form?
The most challenging thing about short-form editing is finding time for breath. In a 30-second piece, where do you find a moment of pause? There’s always so much information being packed into smaller timeframes; the real challenge is editing at a sprint, but still having it feel dynamic and articulate.

How have social media campaigns changed the way you edit, if at all? Can you talk about the variety of deliverables and how that affects things?
All campaigns will either live on social media or have specific social components now. I think the biggest thing that has changed is being tasked with telling a compelling narrative in 10 or even five or six seconds. Now, the 60-second and 90-second anthem film has to be able to work in six seconds as well. It is challenging to boil concepts down to just a few seconds and still maintain a sense of story.

#CODNation

All the deliverable aspect ratios editors are asked to produce now are also a growing challenge. Unless a campaign is strictly shot for social, the DP probably shot for a traditional 16×9 framing. That means the editor is tasked with reframing all social content to work in all the different deliverable formats. This makes the editor act almost as the DP for social in the post process. Shorter deliverables and a multitude of aspect ratios have become another layer of editing and demand a whole new editorial lens through which to view and process the project.
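The reframing arithmetic itself is simple; the craft is in choosing where each crop sits, shot by shot. A quick sketch (illustrative math only, not tied to any NLE’s API):

```python
# Center-crop rectangles for common social deliverables from a 16x9
# UHD master. Returns (x, y, width, height) in source pixels.
SRC_W, SRC_H = 3840, 2160

def center_crop(ratio_w, ratio_h):
    """Largest ratio_w:ratio_h rectangle that fits the source, centered."""
    scale = min(SRC_W / ratio_w, SRC_H / ratio_h)
    w, h = int(ratio_w * scale), int(ratio_h * scale)
    return (SRC_W - w) // 2, (SRC_H - h) // 2, w, h

print("1:1 square    ->", center_crop(1, 1))   # (840, 0, 2160, 2160)
print("9:16 vertical ->", center_crop(9, 16))  # (1312, 0, 1215, 2160)
```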

What system do you edit on and what else other than editing are you asked to supply?
I currently cut in Adobe Premiere Pro. I’m often asked to supply graphics and motion graphics elements for offline cuts as well. That means being comfortable with the whole Adobe suite of tools, including Photoshop and After Effects. From typesetting to motion tracking, editors are now asked to be well-versed in all tangential aspects of editorial.

What projects have you worked on recently?
I cut the launch film for Razer’s new Respawn energy drink. I also cut Toms Shoes’ most recent campaign, “Stand For Tomorrow.”

EditShare Head of Marketing Lee Griffin

What are the biggest trends you’ve been seeing in product development?
We see the need to produce more video content — and produce it faster than ever before — for social media channels. This means producing video in non-broadcast standards/formats and, more specifically, producing square video. To accommodate, editing tools need to offer user-defined options for manipulating size and aspect ratio.

What changes have you seen in terms of the way editors work and use your tools?
There are two distinct changes: One, productions are working with editors regardless of their location. Two, there is a wider level of participation in the content creation process.

In the past, the editor was physically located at the facility and was responsible for assembling, editing and finishing projects. However, with the growing demand for content production, directors and producers need options to tap into a much larger pool of talent, regardless of their location.

EditShare AirFlow and Flow Story enable editors to work remotely from any location. So today, we frequently see editors who use our Flow editorial tools working in different states and even on different continents.

With AI becoming a popular idea and term, at what point does it stop?
I think AI is quite exciting for the industry, and we do see its potential to significantly advance productions. However, AI is still in its infancy with regards to the content creation market. So from our point of view, the road to AI and its limits are yet to be defined. But we do have our own roadmap strategy for AI and will showcase some offerings integrated within our collaborative solutions at IBC 2019.

Will we ever be at a point where an offline/online workflow will be completely gone?
It depends on the production. Offline/online workflows are here to stay in the higher-end production environment. However, for fast turnaround productions, such as news, sports and programs (for example, soap operas and reality TV), there is no need for offline/online workflows.

What are the trends you’re seeing in customer base from high-end post facility vs. independent editor, and how is that informing your decisions on products and pricing?
With the increase in the number of productions thanks to OTTs, high-end post facilities are tapping into independent editors more and more to manage the workload. Often the independent editor is remote, requiring the facility to have a media management foundation that can facilitate collaboration beyond the facility walls.

So we are seeing a fundamental shift in how facilities are structuring their media operations to support remote collaborations. The ability to expand and contract — with the same level of security they have within the facility — is paramount in architecting their “next-generation” infrastructure.

What do you see as untapped potential customer bases that didn’t exist 10 to 20 years ago, and how do you plan on attracting and nurturing them? What new markets are you seeing?
We are seeing major growth beyond the borders of the media and entertainment industry in many markets. From banks to real estate agencies to insurance companies, video has become one of the main ways for them to communicate to their media-savvy clientele.

While EditShare solutions were initially designed to support traditional broadcast deliverables, we have evolved them to accommodate these new customers. And today, these customers want simplicity coupled with speed. Our development methodology puts this at the forefront of our core products.

Puget Systems Senior Labs Technician Matt Bach

Have there been any talks between NLE providers about an open timeline, essentially being able to go between Avid, Resolve or Adobe with one file like an AAF or XML?
I have not heard anything on this topic from any developers, so keep in mind that this is pure conjecture, but the pessimistic side of me doesn’t see an “open timeline” being something that will happen anytime soon.

If you look at what many of the NLE developers are doing, they are moving more and more toward a pipeline that is completely contained within their ecosystem. Adobe has been pushing Dynamic Link in recent years in order to make it easier to move between Premiere Pro and After Effects. Blackmagic is going even a step further by integrating editing, color, VFX and audio all within DaVinci Resolve.

These examples are both great advancements that can really improve your workflow efficiency, but they are being done in order to keep the user within their specific ecosystem. As great as an open timeline would be, it seems to be counter to what Adobe, Blackmagic, and others are actively pursuing. We can still hold out hope, however!

With AI becoming a popular idea and term, at what point does it stop?
There are definitely limitations to what AI is capable of, but that line is moving year by year. For the foreseeable future, AI is going to take on a lot of the tedious tasks like tagging of footage, content-aware fill, shot matching, image enhancement and other similar tasks. These are all perfect use cases for artificial intelligence, and many (like content-aware fill) are already being implemented in the software we have available right now.

The creative side is where AI is going to take the longest time to become useful. I’m not sure if there is a point where AI will stop from a technical standpoint, but I personally believe that even if AI was perfect, there is value in the fact that an actual person made something. That may mean that the masses of videos that get published will be made by AI (or perhaps simply AI-assisted), but just like furniture, food, or even workstations, there will always be a market for high-quality items crafted by human hands.

I think the main thing to keep in mind with AI is that it is just a tool. Moving from black and white to color, or from film to digital, was something that, at the time, people thought was going to destroy the industry. In reality, however, they ended up being a huge boon. Yes, AI will change how some jobs are approached — and may even eliminate some job roles entirely — but in the end, a computer is never going to be as creative and inventive as a real person.

There are many advancements in technology coming to NLEs seemingly daily. Are those updates coming too fast and at an undesirable cost?
I agree that this is a problem right now, but it isn’t limited to just NLEs. We see the same thing all the time in other industries, and it even occurs on the hardware side, where a new product will be launched simply because it could be, not because there is an actual need for it.

The best thing you can do as an end-user is to provide feedback to the companies about what you actually want. Don’t just sit on those bugs, report them! Want a feature? Most companies have a feature request forum that you can post on.

In the end, these companies are doing what they believe will bring them the most users. If they think a flashy new feature will do it, that is what they will spend money on. But if they see a demand for less flashy, but more useful, improvements, they will make that a priority.

Will we ever be at a point where an offline/online workflow will be completely gone?
Unless we hit some point where camera technology stops advancing, I don’t think offline editing is ever going to fully go away. It is amazing what modern workstations can handle from a pure processing standpoint, but even if the systems themselves could handle online editing, you also need to have the storage infrastructure that can keep up. With the move from HD to 4K, and now to 8K, that is a lot of moving parts that need to come together in order to eliminate offline editing entirely.
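To put rough numbers on the storage point, here is the raw arithmetic for uncompressed 10-bit 4:2:2 video; real codecs compress far below these rates, but the scaling trend is the same:

```python
# Uncompressed 10-bit 4:2:2: an average of 2 samples per pixel
# (1 luma + 0.5 Cb + 0.5 Cr), 10 bits per sample.
def gbits_per_sec(width, height, fps, bits=10, samples_per_px=2):
    return width * height * fps * bits * samples_per_px / 1e9

for name, w, h in [("HD", 1920, 1080), ("UHD 4K", 3840, 2160), ("8K", 7680, 4320)]:
    print(f"{name}: {gbits_per_sec(w, h, 60):.1f} Gb/s per stream at 60p")
# HD ~2.5, UHD ~10, 8K ~40 Gb/s: each resolution step quadruples what
# the storage infrastructure must sustain for every concurrent stream.
```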

With that said, I do feel like offline editing is going to be used less and less. We are starting to hit the point where people feel their footage is higher quality than they need without having to be on the bleeding edge. We can edit 4K ProRes or even Red RAW footage pretty easily with the technology that is currently available, and for most people that is more than enough for what they are going to need for the foreseeable future.

What are the trends you’re seeing in customer base from high-end post facility vs. independent editor, and how is that informing your decisions on products and pricing?
From a workstation side, there really is not too much of a difference beyond the fact that high-end post facilities tend to have larger budgets that allow them to get higher-end machines. Technology is becoming so accessible that even hobbyist YouTubers often end up getting workstations from us that are very similar to what high-end professionals use.

The biggest differences typically revolve not around the pure power or performance of the system itself, but rather around how it interfaces with the other tools the editor is using. Things like whether the system has 10Gb (or fiber) networking, or whether it needs a video monitoring card to connect to a color-calibrated display, are often what set them apart.

What are the biggest trends you’ve been seeing in product development?
In general, the two big things that have come up over and over in recent years are GPU acceleration and artificial intelligence. GPU acceleration is a pretty straightforward advancement that lets software developers get a lot more performance out of a system for things like color correction, noise reduction and other tasks that are very well suited to running on a GPU.
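A toy example of why grading work suits GPUs: a lift/gain/gamma adjustment is computed independently for every pixel, so the same elementwise math can be spread across thousands of GPU cores. Sketched here with NumPy on the CPU; this is an illustration of the data-parallel shape of the problem, not any vendor’s implementation:

```python
import numpy as np

# One UHD frame as float RGB in [0, 1].
frame = np.random.rand(2160, 3840, 3).astype(np.float32)

def grade(img, lift=0.02, gain=1.1, gamma=1.0 / 1.2):
    # Per-pixel, order-independent math: ideal for data-parallel hardware.
    return np.clip((img * gain + lift) ** gamma, 0.0, 1.0)

graded = grade(frame)  # a single vectorized pass over ~8.3M pixels
```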

Artificial intelligence is a completely different beast. We do quite a bit of work with people who are at the forefront of AI and machine learning, and it is going to have a large impact on post production in the near future. It has been a topic at conferences like NAB for several years, but with platforms like Adobe Sensei starting to take off, it is going to become even more important.

However, I do feel that AI is going to be more of an enabling technology than one that replaces jobs. Yes, people are using AI to do crazy things like cut trailers without any human input, but I don’t think that is going to be its primary use anytime in the near future. Its true usefulness is going to be in things like assisting with shot matching, tagging footage, noise reduction and image enhancement.

What do you see as untapped potential customer bases that didn’t exist 10-20 years ago, and how do you plan on attracting and nurturing them? What new markets are you seeing?
I don’t know if there are any customer bases that are completely untapped, but I do believe that there is going to be more overlap between industries in the next few years. One example is how much realtime raytracing has improved recently, which is spurring the use of video game engines in film. This has been done for previsualization for quite a while, but the quality is getting so good that there are some films already out that include footage straight from the game engine.

For us on the workstation side, we regularly work with customers doing post and customers who are game developers, so we already have the skills and technical knowledge to make this work. The biggest challenge is really on the communication side. Both groups have their own set of jargon and general language, so we often find ourselves having to be the “translator” when a post house is looking at integrating realtime visualization in their workflow.

This exact scenario is likely to happen with VR/AR as well.

Lucky Post Editor Marc Stone

What trends do you see in commercial editing?
I’m seeing an increase in client awareness of the mobility of editing. It’s freeing knowing you can take the craft with you as needed, and for clients, it can save the ever-precious commodity of time. Mobility means we can be an even greater resource to our clients with a flexible approach.

I love editing at Lucky Post, but I’m happy to edit anywhere I am needed — be it on set or on location. I especially welcome it if it means you can have face-to-face interaction with the agency team or the project’s director.

What is it about commercial editing that attracted you and keeps attracting you?
The fact that I can work on many projects throughout the year, with a variety of genres, is really appealing. Cars, comedy, emotional PSAs — each has a unique creative challenge, and I welcome the opportunity to experience different styles and creative teams. I also love putting visuals together with music, and that’s a big part of what I do in a 30- or 60-second… or even a two-minute branded piece. That just wouldn’t be possible, to the same extent, in features or television.

Can you talk about challenges specific to short-form editing?
The biggest challenge is telling a story in 30 seconds. To communicate emotion and a sense of character and get people to care, all within a very short period of time. People outside of our industry are often surprised to hear that editors take hours and hours of footage and hone it down to a minute or less. The key is to make each moment count and to help make the piece something special.

Ram’s The Promise spot

How have social media campaigns changed the way you edit, if at all?
It hasn’t changed the way I edit, but it does allow some flexibility. Length isn’t constrained in the same way as broadcast, and you can conceive of things in a different way in part because of the engagement approach and goals. Social campaigns allow agencies to be more experimental with ideas, which can lead to some bold and exciting projects.

What system do you edit on, and what else other than editing are you asked to supply?
For years I worked on Avid Media Composer, and at Lucky Post I work in Adobe Premiere. As part of my editing process, I often weave sound design and music into the offline so I can feel if the edit is truly working. What I also like to do, when the opportunity presents, is to be able to meet with the agency creatives before the shoot to discuss style and mood ahead of time.

What projects have you worked on recently?
Over the last six months, I have worked on projects for Tazo, Ram and GameStop, and I am about to start a PSA for the Salvation Army. It gets back to the variety I spoke about earlier and the opportunity to work on interesting projects with great people.

Billboard Video Post Supervisor/Editor Zack Wolder

What trends do you see in editing? Good or bad?
I’m noticing a lot of glitch transitions and RGB splits being used. Much flashier edits, probably for social content to quickly grab the viewer’s attention.

Can you talk about challenges specific to short-form editing versus long-form?
With short-form editing, the main goal is to squeeze the most amount of useful information into a short period of time while not overloading the viewer. How do you fit an hour-long conversation into a three-minute clip while hitting all the important talking points and not overloading the viewer? With long-form editing, the goal is to keep viewers’ attention over a long period of time while always surprising them with new and exciting info.

What is it about editing that attracted you and keeps attracting you?
I loved the fact that I could manipulate time. That hooked me right away. The fact that I could take a moment that lasts only a few seconds and drag it out for a few minutes was incredible.

Can you talk about the variety of deliverables for social media and how that affects things?
Social media formats have made me think differently about framing a shot or designing logos. Almost all the videos I create start in the standard 16×9 framing but will eventually be delivered as a vertical. All graphics and transitions I build need to easily work in a vertical frame. Working in a 4K space and shooting in 4K helps tremendously.

Rainn Wilson and Billie Eilish

What system do you edit on, and what else other than editing are you asked to supply?
I edit in Adobe Premiere Pro. I’m constantly asked to supply design ideas and mockups for logos and branding and then to animate those ideas.

What projects have you worked on recently?
Recently, I edited a video that featured Rainn Wilson — who played Dwight Schrute on The Office — quizzing singer Billie Eilish, who is a big-time fan of the show.

Main Image: AlphaDogs editor Herrianne Catolos


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Review: Avid Media Composer Symphony 2018 v.12

By Brady Betzel

In February of 2018, we saw a seismic shift in the leadership at Avid. Chief executive officer Louis Hernandez Jr. was removed and subsequently replaced by Jeff Rosica. Once Rosica was installed, I think everyone who was worried Avid was about to be liquidated to the highest bidder breathed a sigh of temporary relief. Still unsure whether new leadership was going to right a tilting ship, I immediately wanted to see a new action plan from Avid, specifically on where Media Composer and Symphony were going.

Media Composer with Symphony

Not long afterward, I was happily reading how Avid was taking lessons from its past transgressions and listening to its clients. I heard Avid was taking tours around the industry and listening to what customers and artists needed from it. Personally, I was asking myself if Media Composer with Symphony would ever be the finishing tool that Avid DS was. I’m happy to say, it’s starting to look that way.

It appears from the outside that Rosica is indeed the breath of fresh air Avid needed. At NAB 2019, Avid teased the next iteration of Media Composer, version 2019, with an overhauled interface and improvements such as a 32-bit float color pipeline complete with ACES color management and a way to deliver IMF packages; a new engine with a distributed processing add-on; and a whole new product called Media Composer|Enterprise, all of which will really help sell this new Media Composer. But the 2019 update is coming soon, and until then I took a deep dive into Media Composer 2018 v12, which has many features editors, assistants and even colorists have been asking for: a new Avid Titler, shape-based color correction (with the Symphony option), new multicam features and more.

Titling
As an online editor who uses Avid Media Composer with Symphony option about 60% of the time, titling is always a tricky subject. Avid has gone through some rough seas when dealing with how to fix the leaky hole known as the Avid Title Tool. The classic Avid Title Tool was basic but worked. However, if you aligned something in the Title Tool interface to Title Safe zones, it might jump around once you close the Title Tool interface. Fonts wouldn’t always stay the same when working across PC and MacOS platforms. The list goes on, and it is excruciatingly annoying.

Titler

Let’s take a look at some Avid history: In 2002, Avid tried to appease creators and introduced what was, at the time, a Windows-only titler: Avid Marquee. While Marquee was well-intentioned, it was extremely difficult to understand if you weren’t interested in 3D lighting, alignment and all sorts of motion graphics stuff that not all editors want to spend time learning. So most people didn’t use it, and if they did, it took a little while for anyone taking over the project to figure out what was done.

In December of 2014, Avid leaned on the NewBlue Titler, which would work in projects higher than 1920×1080 resolution. Unfortunately, many editors ran into very long renders at the end, and a lot bailed on it. Most decided to go out of house and create titles in Adobe Photoshop and Adobe After Effects. While this all relates to my experience, I assume others feel the same.

In Avid Media Composer 2018, the company has introduced the Avid Titler, which in the Tools menu is labeled Avid Titler+. It works like an effect rather than as a rendered piece of media like the traditional Avid Title Tool, which produced alpha and fill layers. This method is similar to how NewBlue or Marquee functioned. However, Avid Titler works by typing directly on the Record monitor; adding a title is as easy as marking an in and out point and clicking on the T+ button in the timeline.

You can specify things like kerning, shadow, outlines, underlines, boxes, backgrounds and more. One thing I found peculiar was that under Face, the rotation settings rotate individual letters and not the entire word by default. I reached out to Avid and they are looking into making the entire word rotation option the default in the mini toolbar of Avid Titler. So stay tuned.

Also, you can map your fast forward and rewind buttons to “Go To Next/Previous Event.” This allows you to jump to your next edits in the timeline but also to the next/previous keyframes when in the Effect Editor. Typically, you click on the scrub line in the record window and then you can use those shortcuts to jump to the next keyframe. In the Avid Titler, it would just start typing in the text box. Furthermore, when I wanted to jump out of Effect Editor mode and back into Edit Mode, I usually hit “y,” but that did not get me out of Effects Mode (Avid did mention they are working on updates to the Avid Titler that would solve this issue). The new Avid Titler definitely has some bugs and/or improvements that are needed, and they are being addressed, but it’s a decent start toward a modern title editor.

Shape-based color correction

Color
If you want advanced color correction built into Media Composer, then you are going to want the Symphony option. Media Composer with the Symphony option allows for more detailed color correction using secondary color corrections as well as some of the newer updates, including shape-based color correction. Before Resolve and Baselight became more affordable, Symphony was the gold standard for color correction on a budget (and even not on a budget since it works so well in the same timeline the editors use). But what we are really here for is the 2018 v.12 update of Shapes.

With the Symphony option, you can now draw specific regions on the footage for your color correction to affect. It essentially works similarly to a layer-based system like Adobe Photoshop. You can draw shapes with the same familiar tools you are used to drawing with in the Paint or AniMatte tools and then just apply your brightness, saturation or hue swings in those areas only. On the color correction page you can access all of these tools on the right-hand side, including the softening, alpha view, serial mode and more.

When using the new shape-based tools you must point the drop-down menu to “CC Effect.” From there you can add a bunch of shapes on top of each other and they will play in realtime. If you want to lay a base correction down, you can specify it in the shape-based sidebar, then click shape and you can dial in the specific areas to your or your client’s taste. You can check off the “Serial Mode” box to have all corrections interact with one another or uncheck the box to allow for each color correction to be a little more isolated — a really great option to keep in mind when correcting. Unfortunately, tracking a shape can only be done in the Effect Editor, so you need to kind of jump out of color correction mode, track, and then go back. It’s not the end of the world, but it would be infinitely better if you could track efficiently inside of the color correction window. Avid could even take it further by allowing planar tracking by an app like Mocha Pro.

Shape-based color correction

The new shape-based corrector also has an alpha view mode identified by the infinity symbol. I love this! I often find myself making mattes in the Paint tool, but it can now be done right in the color correction tool. The Symphony option is an amazing addition to Media Composer if you need to go further than simple color correction but not dive into a full color correction app like Baselight or Resolve. In fact, for many projects you won’t need much more than what Symphony can do. Maybe a +10 on the contrast, +5 on the brightness and +120 on the saturation and BAM a finished masterpiece. Kind of kidding, but wait until you see it work.

Multicam
The final update I want to cover is multicam editing and improvements to editing group clips. I cannot emphasize enough how much time this would have saved me as an assistant editor back in the prehistoric Media Composer days… I mean, we had dongles, and I even dabbled in the Meridien box. Literally days of grouping and regrouping could have been avoided with the Edit Group feature. But I did make a living fixing groups that were created incorrectly, so I guess this update is a Catch-22. Anyway, you can now edit groups in Media Composer by creating a group, right-clicking on that group and selecting Edit Group. The group will open in the Record Monitor as a sequence, where you can move, nudge and even add cameras to a previously created group. Once you are finished, you can update the group and refresh any sequences that used it. One issue: with mixed frame rate groups, Avid says committing to that sequence might produce undesirable effects.

Editing workspace

Cost of Entry
How much does Media Composer cost these days? While you can still buy it outright, it seems a bit more practical to go monthly since you will automatically get updates, but it can still be a little tricky. Do you need PhraseFind and/or ScriptSync? Do you need the Symphony option? Do you need to access shared storage? There are multiple options depending on your needs. If you want everything, then Media Composer Ultimate for $49 per month is what you want. If you want Media Composer and just one add-on, like Symphony, it will cost $19 per month plus $199 per year for the Symphony option. If you want to test the water before jumping in, you can always try Media Composer First.
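For reference, here is the annual math on those two routes, using only the figures quoted above (Avid’s current pricing may differ):

```python
# Annual cost comparison using the prices quoted in this review.
ultimate = 49 * 12                # Media Composer | Ultimate, monthly plan
mc_plus_symphony = 19 * 12 + 199  # Media Composer monthly + Symphony yearly
print(ultimate, mc_plus_symphony) # $588 vs. $427 per year
```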

For a good breakdown of the Media Composer pricing structure, check out the page from KeyCode Media (a certified reseller). Additionally, www.freddylinks.com is a great resource chock-full of everything else Avid, written by Avid technical support specialist Fredrik Liljeblad out of Sweden.

Group editing

Summing Up
In the end, I have used Media Composer with Symphony for over 15 years, and it is the most reliable nonlinear editor I have used for supporting multiple editors in a shared network environment. While Adobe Premiere Pro, Apple Final Cut Pro X and Blackmagic Resolve are offering fancy new features and collaboration modes, Avid always seems to hold stable when I need it the most. These new improvements, a UI overhaul (set to debut in May), new leadership from Rosica, and the confidence of his faithful employees all seem to be paying off and getting Avid back on the track it should have always been on.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Avid offers rebuilt engine and embraces cloud, ACES, AI, more

By Daniel Restuccio

During its Avid Connect conference just prior to NAB, Avid announced a Media Composer upgrade, support for ACES color standard and additional upgrades to a number of its toolsets, apps and services, including Avid Nexis.

The chief news from Avid is that Media Composer, its flagship video editing system, has been significantly retooled: sporting a new user interface, rebuilt engine, and additional built-in audio, visual effects, color grading and delivery features.

In a pre-interview with postPerspective, Avid president/CEO Jeff Rosica said, “We’re really trying to leapfrog and jump ahead to where the creative tools need to go.”

Avid asked itself what it needed to do “to help production and post production really innovate.” Rosica pointed to TV shows and films, and how complex they’re getting. “That means they’re dealing with more media, more elements, and with so many more decisions just in the program itself. Let alone the fact that the (TV or film) project may have to have 20 different variants just to go out the door.”

Jeff Rosica

The new paneled user interface simplifies the workspace, has redesigned bins to find media faster, as well as task-based workspaces showing only what the user wants and needs to see.

Dave Colantuoni, VP of product management at Avid, said they spent the most time studying the way that editors manage and organize bins and content within Media Composer. “Some of our editors use 20, 30, 40 bins at a time. We’ve really spent a lot of time so that we can provide an advantage to you in how you approach organizing your media.”

Avid is also offering more efficient workflow solutions. Without leaving Media Composer, users can work in 8K, 16K or HDR thanks to the newly built-in 32-bit full-float color pipeline. Additionally, Avid continues to work with OTT content providers to help establish future industry standards.

“We’re trying to give as much creative power to the creative people as we can, and bring them new ways to deal with things,” said Rosica. “We’re also trying to help the workflow side. We’re trying to help make sure production doesn’t have to do more with less, or sometimes more with the same budget. Cloud (computing) allows us to bring a lot of new capabilities to the products, and we’re going to be cloud powering a lot of our products… more than you’ve seen before.”

The new Media Composer engine is now native OP1A, can handle more video and audio streams, offers Live Timeline and background rendering, and a distributed processing add-on option to shorten turnaround times and speed up post production.

“This is something our competitors do pretty well,” explained Colantuoni. “And we have different instances of OP1A working among the different Avid workflows. Until now, we’ve never had it working natively inside of Media Composer. That’s super-important because a lot of capabilities started in OP1A, and we can now keep it pristine through the pipeline.”

Said Rosica, “We are also bringing the ability to do distributed rendering. An editor no longer has to render or transcode on their machine. They can perform those tasks in a distributed or centralized render farm environment. That allows this work to get done behind the scenes. This is an Avid-supplied solution, so it will be very powerful and reliable. Users will be able to do background rendering, as well as distributed rendering, and move things off the machine to other centralized machines. That’s going to be very helpful for a lot of post workflows.”
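
Avid hasn’t published the internals of its distributed processing option, but the pattern Rosica describes is the classic render-farm one: the editor’s machine submits jobs to a shared queue and keeps working while centralized workers drain it in the background. Here is a minimal conceptual sketch in Python — entirely hypothetical, not Avid’s code, with made-up clip names:

    # Conceptual sketch of a render farm: the "editor" enqueues jobs and
    # carries on; pooled workers render them in the background.
    import queue
    import threading
    import time

    jobs = queue.Queue()

    def farm_worker(worker_id: int) -> None:
        # Each worker pulls jobs off the shared queue until told to stop.
        while True:
            job = jobs.get()
            if job is None:          # sentinel: shut this worker down
                jobs.task_done()
                return
            time.sleep(0.1)          # stand-in for the actual render
            print(f"worker {worker_id} rendered {job}")
            jobs.task_done()

    workers = [threading.Thread(target=farm_worker, args=(i,)) for i in range(4)]
    for w in workers:
        w.start()

    # The editor's machine submits render jobs and immediately moves on.
    for clip in ["act1_scene3.mxf", "act2_scene1.mxf", "act2_scene7.mxf"]:
        jobs.put(clip)

    jobs.join()                      # wait for the farm to finish the real jobs
    for _ in workers:
        jobs.put(None)               # then dismiss the workers
    for w in workers:
        w.join()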

Avid had previously offered three main flavors of Media Composer: Media Composer First, the free version; Media Composer; and Media Composer Ultimate. Now they are also offering a new Enterprise version.

For the first time, large production teams can customize the interface for any role in the organization, whether the user is a craft editor, assistant, logger or journalist. It also offers unparalleled security to lock down content, reducing the chances of unauthorized leaks of sensitive media. Enterprise also integrates with Editorial Management 2019.

“The new fourth tier at the top is what we are calling the Enterprise Edition or Enterprise. That word doesn’t necessarily mean broadcast,” said Rosica. “It means for business deployment. This is for post houses and production companies, broadcast, and even studios. This lets the business, or the enterprise, or production, or post house literally customize interfaces and customize workspaces to the job role or to the user.”

Nexis Cloudspaces
Avid also announced Avid Nexis|Cloudspaces. Instead of resorting to NAS or external drives for media storage, Avid Nexis|Cloudspaces allows editorial to offload projects and assets not currently in production. Cloudspaces extends Avid Nexis storage directly to Microsoft Azure.

“Avid Nexis|Cloudspaces brings the power of the cloud to Avid Nexis, giving organizations a cost-effective and more efficient way to extend Avid Nexis storage to the cloud for reliable backup and media parking,” said Dana Ruzicka, chief product officer/senior VP at Avid. “Working with Microsoft, we are offering all Avid Nexis users a limited-time free offer of 2TB of Microsoft Azure storage that is auto-provisioned for easy setup and as much capacity as you need, when you need it.”
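
Cloudspaces is auto-provisioned, so users never touch Azure directly, but for readers curious what “media parking” looks like at the lowest level, here is a generic sketch using Microsoft’s azure-storage-blob Python SDK. It is an illustration of the idea only — not how Nexis|Cloudspaces is implemented — and the container, file names and connection string are made up:

    # Generic "media parking" in Azure Blob Storage -- illustrative only,
    # not the Nexis|Cloudspaces implementation. Names are hypothetical.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<your-connection-string>")
    blob = service.get_blob_client(container="parked-media",
                                   blob="old_project/clip001.mxf")

    # Park a finished project's media in cheap cloud storage...
    with open("clip001.mxf", "rb") as f:
        blob.upload_blob(f, overwrite=True)

    # ...and pull it back down if the project ever goes active again.
    with open("clip001_restored.mxf", "wb") as f:
        f.write(blob.download_blob().readall())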

ACES
The Academy Color Encoding System (ACES) team also announced that Avid is now part of the ACES Logo Program, as the first Product Partner in the new Editorial Finishing product category. ACES is a free, open, device-independent color management and image interchange system, and it is the global standard for color management, digital image interchange and archiving. Avid will implement ACES in conformance with the logo program specifications, delivering a consistent, high-quality ACES color-managed video creation workflow.

“We’re pleased to welcome Avid to the ACES logo program,” said Andy Maltz, managing director of the ACES Council. “Avid’s participation not only benefits editors that need their editing systems to accurately manage color, but also the broader ACES end-user community through expanded adoption of ACES standards and best practices.”

What’s Next?
“We’ve already talked about how you can deploy Media Composer or other tools in a virtualized environment, or how you can use these kind of cloud environments to extend or advance production,” said Rosica. “We also see that these things are going to allow us to impact workloads. You’ll see us continue to power our MediaCentral platform, editorial management of MediaCentral, and even things like Media Composer with AI to help them get to the job faster. We can help automate functions, automate environments and use cloud technologies to allow people to collaborate better, to share better, to just power their workloads. You’re going to see a lot from us over time.”

Review: Boris FX’s Continuum and Mocha Pro 2019

By Brady Betzel

I realize I might sound like a broken record, but if you are looking for the best plugin to help with object removals or masking, you should seriously consider the Mocha Pro plugin. And if you work inside of Avid Media Composer, you should also seriously consider Boris Continuum and/or Sapphire, which can use the power of Mocha.

As an online editor, I consistently use Continuum along with Mocha for tight blur and mask tracking. If you use After Effects, there is even a whittled-down version of Mocha built in for free. For those pros who don’t want to deal with Mocha inside of an app, it also comes as a standalone software solution where you can copy and paste tracking data between apps or even export the masks, object removals or insertions as self-contained files.

The latest releases of Continuum and Mocha Pro 2019 continue the evolution of Boris FX’s role in post production image restoration, keying and general VFX plugins, at least inside of NLEs like Media Composer and Adobe Premiere.

Mocha Pro

As an online editor I am always calling on Continuum for its great Chroma Key Studio, Flicker Fixer and blurring. Because Mocha is built into Continuum, I am able to quickly track (backwards and forwards) difficult shapes and even erase objects in ways the built-in Media Composer tools simply can’t. But if you are lucky enough to own Mocha Pro, you also get access to some amazing tools that go beyond planar tracking — such as automated object removal, object insertion, stabilizing and much more.

Boris FX’s latest updates to Boris Continuum and Mocha Pro go even further than what I’ve already mentioned and have resulted in a new version naming scheme — this round we are at 2019 (think of it as Version 12). They have also created the new Application Manager, which makes it a little easier to find the latest downloads. You can find them here. This really helps when you’re jumping between machines and need to quickly activate and deactivate licenses.

Boris Continuum 2019
I often get offline edits with effects from a variety of plugins — lens flares, random edits, light flashes, whip transitions and many more — so I need Continuum to be compatible with offline clients. I also need to use it for image repair and compositing.

In this latest version of Continuum, Boris FX has not only kept plugins like Primatte Studio, but has also brought back Particle Illusion and updated Mocha and Title Studio. Overall, Continuum and Mocha Pro 2019 feel a lot snappier when applying and rendering effects, probably because of the overall GPU-acceleration improvements.

Particle Illusion has been brought back from the brink of death in Continuum 2019 as a 64-bit, keyframe-able particle emitter system that can even be tracked and masked with Mocha. The revamp includes an updated interface, realtime GPU-based particle generation, an expanded and improved emitter library (complete with motion-blur-enabled particle systems) and even a standalone app that can design systems to be used in the host app — although you cannot render systems inside of the standalone app.

While Particle Illusion is part of the entire Continuum toolset that works with OFX apps like Blackmagic’s DaVinci Resolve, Media Composer, After Effects and Premiere, it seems to work best in applications like After Effects, which can handle composites simply and naturally. Inside the Particle Illusion interface you can find all of the pre-built emitters. If you only have a handful, make sure you download additional emitters, which you can find in the Boris FX App Manager.

Particle Illusion: Before and After

I had a hard time seeing my footage in a Media Composer timeline inside of Particle Illusion, but I could still pick my emitter, change specs like life and opacity, exit out and apply it to my footage. I used Mocha to track some fire from Particle Illusion to a dumpster I had filmed. Once I dialed in the emitter, I launched Mocha and tracked the dumpster.

The first time I went into Mocha I didn’t see the preset tracks for the emitter or the world in which the emitter lives. The second time I launched Mocha, I saw track points. From there you can track the area where you want your emitter placed. Once you are done and happy with your track, jump back to your timeline, where it should be reflected. In Media Composer I noticed that I had to go into the Mocha options and change the setting from Mocha Shape to no shape. Essentially, the Mocha shape acts like a matte and cuts off anything outside the matte.

If you are inside of After Effects, most parameters can now be keyframed and parented (aka pick-whipped) natively in the timeline. The Particle Illusion plugin is a quick, easy and good-looking tool to add sparks, Milky Way-like star trails or even fireworks to any scene. Check out @SurfacedStudio’s tutorial on Particle Illusion to get a good sense of how it works in Adobe Premiere Pro.

Continuum Title Studio
When inside of Media Composer (prior to the latest release 2018.12), there were very few ways to create titles that were higher resolution than HD (1920×1080) — the New Blue Titler was the only other option if you wanted to stay within Media Composer.

Title Studio within Media Composer

At first, the Continuum Title Studio interface appeared to be a mildly updated Boris Red interface — and I am allergic to the Boris Red interface. Some of the icons for keyframing, and the way properties are adjusted, look similar and threw me off. I tried really hard to jump into Title Studio and love it, but I never really got comfortable with it.

On the flip side, there are hundreds of presets that could help build quick titles that render a lot faster than New Blue Titler did. In some of the presets I noticed the text was placed outside of 16×9 Title Safety, which is odd since that is a long-standing rule in television. In the author’s defense, the titles are within Action Safety, but still.

If you need a quick way to make 4K titles, Title Studio might be what you want. The updated Title Studio includes realtime playback using the GPU instead of the CPU, new materials, new shaders and external monitoring support using Blackmagic hardware (AJA support will be coming at some point). There are some great presets, including pre-built slates, lower thirds, kinetic text and even progress bars.

If you don’t have Mocha Pro, Continuum can still access and use Mocha to track shapes and masks. Almost every plugin can access Mocha and can track objects quickly and easily.

That brings me to the newly updated Mocha, which has some new features that are extremely helpful, including a Magnetic Spline tool, prebuilt geometric shapes and more.

Mocha Pro 2019
If you loved the previous version of Mocha, you are really going to love Mocha Pro 2019. Not only do you get the Magnetic Spline tool, pre-built geometric shapes, the Essentials interface and high-resolution display support, but Boris FX has rewritten the Remove Module code to use GPU video hardware. This increases render speeds about four to five times. In addition, there is no longer a separate Mocha VR software suite. All of the VR tools are included inside of Mocha Pro 2019.

If you are unfamiliar with what Mocha is, then I have a treat for you. Mocha is a standalone planar tracking app as well as a native plugin that works with Media Composer, Premiere and After Effects, or through OFX in Blackmagic’s Fusion, Foundry’s Nuke, Vegas Pro and Hitfilm.

Mocha tracking

In addition (and unofficially) it will work with Blackmagic DaVinci Resolve by way of importing the Mocha masks through Fusion. While I prefer to use After Effects for my work, importing Mocha masks is relatively painless. You can watch colorist Dan Harvey run through the process of importing Mocha masks to Resolve through Fusion, here.

But really, Mocha is a planar tracker, which means it tracks many points across a defined area; it works best on flat surfaces, or at least segmented surfaces — like the side of a face, where the ear, nose, mouth and forehead are tracked separately instead of all at once. From blurs to mattes, Mocha sticks to objects like glue and can be a great asset for an online editor or colorist.
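
Mocha’s tracker itself is proprietary, but the core idea of planar tracking is easy to sketch: find features on the (roughly flat) surface, follow them into the next frame, then fit a single homography that describes how the whole plane moved. Here is a rough illustration using OpenCV — the frame files and region of interest are hypothetical:

    # Planar-tracking concept sketch (not Mocha's algorithm) using OpenCV.
    import cv2
    import numpy as np

    prev = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
    curr = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)

    # Find trackable features inside the (assumed flat) region of interest.
    x, y, w, h = 400, 200, 300, 150                # e.g. a license plate
    roi = np.zeros_like(prev)
    roi[y:y + h, x:x + w] = 255
    pts_prev = cv2.goodFeaturesToTrack(prev, 200, 0.01, 5, mask=roi)

    # Follow those features into the next frame with optical flow.
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, pts_prev, None)
    good_prev = pts_prev[status.ravel() == 1]
    good_curr = pts_curr[status.ravel() == 1]

    # One homography describes how the whole plane moved between frames --
    # this is what lets a planar tracker stick "like glue."
    H, _ = cv2.findHomography(good_prev, good_curr, cv2.RANSAC, 5.0)

    # Warp the original quad to find the surface's position in the new frame.
    quad = np.float32([[x, y], [x + w, y], [x + w, y + h], [x, y + h]])
    print(cv2.perspectiveTransform(quad.reshape(-1, 1, 2), H).reshape(-1, 2))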

If you have read any of my plugin reviews you probably are sick of me spouting off about Mocha, saying how it is probably the best plugin ever made. But really, it is amazing — especially when incorporated with plugins like Continuum and Sapphire. Also, thanks to the latest Media Composer with Symphony option you can incorporate the new Color Correction shapes with Mocha Pro to increase the effectiveness of your secondary color corrections.

Mocha Pro Remove module

So how fast is Mocha Pro 2019’s Remove Module these days? Well, it used to be a very slow process, taking lots of time to calculate an object’s removal. With the latest Mocha Pro 2019 release, including improved GPU support, the render time has been cut down tremendously. Conservatively, I would estimate renders are four to five times faster — and sometimes much more than that: removal jobs that take under 30 seconds in Mocha Pro 2019 would have taken four to five minutes in previous versions. It’s quite a big improvement in render times.

There are a few changes in the new Mocha Pro, including interface changes and some amazing tool additions. There is a new drop-down tab that offers different workflow views once you are inside of Mocha: Essentials, Classic, Big Picture and Roto. I really wish the Essentials view had existed when I first started using Mocha, because it gives you the basic tools you need to get a roto job done and nothing more.

For instance, just giving access to the track motion objects (Translation, Scale, Rotate, Skew and Perspective) with big shiny buttons helps to eliminate my need to watch YouTube videos on how to navigate the Mocha interface. However, if like me you are more than just a beginner, the Classic interface is still available and is the one I reach for most often — it’s literally the old interface. Big Picture hides the tools and gives you the most screen real estate for your roto work. My favorite after Classic is Roto. The Roto interface shows just the project window and the classic top toolbar. It’s the best of both worlds.

Mocha Pro 2019 Essentials Interface

Beyond the interface changes are some additional tools that will speed up any roto work. This has been one of the longest-running user requests — I imagine the most requested feature Boris FX gets for Mocha is the addition of basic shapes, such as rectangles and circles. In my work, I am often drawing rectangles around license plates or circles around faces with X-splines, so why not eliminate a few clicks and have that done already? Answering my need, Mocha now has elliptical and rectangular shapes ready to go in both X-splines and B-splines with one click.

I use Continuum and Mocha hand in hand. Inside of Media Composer I will use tools like Gaussian Blur or Remover, which typically need tracking and roto shapes created. Once I apply the Continuum effect, I launch Mocha from the Effect Editor and bam, I am inside Mocha. From here I track the objects I want to affect, as well as any objects I don’t want to affect (think of it like an erase track).

Summing Up
I can save tons of time and also improve the effectiveness of my work exponentially when working in Continuum 2019 and Mocha Pro 2019. It’s amazing how much more intuitive Mocha is to track with than the built-in Media Composer and Symphony trackers.

In the end, I can’t say enough great things about Continuum and especially Mocha Pro. Mocha saves me tons of time in my VFX and image restoration work. From removing camera people behind the main cast in the wilderness to blurring faces and license plates, using Mocha in tandem with Continuum is a match made in post production heaven.

Rendering in Continuum and Mocha Pro 2019 is a lot faster than in previous versions, really giving me a leg up on efficiency. Time is money, right?! On top of that, using Mocha Pro’s magical object removal and other modules takes my image restoration work to the next level, separating me from other online editors who use standard paint and tracking tools.

In Continuum, Primatte Studio gives me a leg up on greenscreen keys with its exceptional ability to auto-analyze a scene and perform 80% of the keying work before I dial in the details. Whenever anyone asks me what tools I couldn’t live without, I without a doubt always say Mocha.

If you want a real Mocha Pro education, you need to watch all of Mary Poplin’s tutorials. You can find them on YouTube. Check out this one on how to track and replace a logo using Mocha Pro 2019 in Adobe After Effects. You can also find great videos at Borisfx.com.

Mocha point parameter tracking

I always feel like there are tons of tools inside of the Mocha Pro toolset that go unused simply because I don’t know about them. One I recently learned about in a Surfaced Studio tutorial was the Quick Stabilize function. It essentially stabilizes the video around the object you are tracking allowing you to more easily rotoscope your object with it sitting still instead of moving all over the screen. It’s an amazing feature that I just didn’t know about.

As I was finishing up this review I saw that Boris FX came out with a training series, which I will be checking out. One thing I always wanted was a top-down set of tutorials like the ones on Mocha’s YouTube page but organized and sent along with practical footage to practice with.

You can check out Curious Turtle’s “More Than The Essentials: Mocha in After Effects” on their website where I found more Mocha training. There is even a great search parameter called Getting Started on BorisFX.com. Definitely check them out. You can never learn enough Mocha!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Editor Wyatt Smith talks Mary Poppins Returns, Marvel Universe

By Amy Leland

Wyatt Smith’s career as an editor is the kind that makes for a great story. His unintended path began with an unusual opportunity to work with Mariah Carey and a chance meeting with director Rob Marshall. He has since collaborated on big musicals and action films with Marshall, which opened the door to superhero movies. His latest project — in which he was reunited with Marshall — saw him editing a big musical with a title character who is, in her own Disney way, also a superhero.

Smith’s resume is impressive: Doctor Strange, Into the Woods, 300: Rise of an Empire, Thor: The Dark World, Pirates of the Caribbean: On Stranger Tides. When I had a chance to talk with him about Mary Poppins Returns, I first had to ask him how his fascinating journey began.

Wyatt Smith at the Mary Poppins Returns premiere.

Can you talk about what led you to editing?
Some things just happen unexpectedly. Opportunities arise and you just have to hear the knock and not be afraid to open the door. When they were building the now-closed Sony Music Studios in New York City, I knew a lot about computers. Avid was first coming in, and there were all these video engineers who weren’t as savvy with Macs and things like that because they were used to linear, old-school tape editing. I worked in the maintenance department at the studio, servicing online editing suites, as well as setting up their first Avid Media Composer and giving people some tutorials on how to use that.

Then a very odd circumstance came up — they were working on a Mariah Carey concert video and needed an additional editor to work at her house at night (she was working during the day with another editor). My father is in the music business and had ties to Mariah — we had met before — so they thought it would be a comfortable situation. It came out of nowhere, and while I certainly knew, technically, how to edit, creatively I had no idea.

That was my first opportunity to edit, and I never went back to anything else. That was the day. That was it. I started to edit music videos and concerts and little music documentaries. Years and years later that led me to work with Rob Marshall on a music project.

The Tony Bennett American Classic special?
Exactly. I had known the Bennett family and worked with them since Tony Bennett’s “Unplugged.” When Rob was brought on to direct an NBC special celebrating Tony’s career, he wanted to bring his whole film team with him, but the TV network and the Bennett family wanted somebody who knew the music world, and that style of deadline, which is quite different from film.

I was brought in to interview with Rob, and we had a wonderful experience making that show. When it was done, he said, “Next time I make a film, I want you to come along.” To be completely honest, I didn’t believe him. I thought it was very kind of him, and he is a very nice man, but I was like, yeah, sure. In 2008, I think it was the Friday before they started shooting Nine, he called and said, “You gotta get to London.” I immediately quit my job and got on a plane.

I’m guessing the music world was a heavy influence on you, but were you drawn toward movies as well?
I have always been a movie junkie. At an early age, I saw a lot of the big epics, including David Lean’s films — Lawrence of Arabia, A Passage to India — which just transported me to another place and another culture. I loved that.

That was back in the early VHS days, and I had just about every Bond film that had been released. I watched them obsessively. In high school, my closest friend worked in a video rental store, so we constantly had movies. It was always a huge thing for me, but never in my life did I dream of pursuing it. The language of film was never anything I studied or thought about until I was kind of thrust into it.

What was it like coming into this film with Rob Marshall, after so many years of working with him? Do your collaborations now feel different from when you first started working together?
The most important part is trust. When I first met Rob, aside from just not having any confidence, I didn’t remotely know what I was doing. We all know that when you have your actors and your sets, if something’s not quite right, that’s the time to bring it up. But 12 years ago, the thought of me going to Rob and saying, “I don’t know if that really works, maybe you should grab a shot like…” I’d never, ever. But over the years we’ve developed that trust. I’m still very cautious with things like that, but I now know I can talk to him. And if he has a question, he’ll call me to set and say, “Quickly put this together,” or, “Stay here and watch this with me,” and he’ll explain to me exactly what he’s going for.

Then, once we reach post, unquestionably that relationship changes. We used to cut everything from scratch and start re-watching all the material and rebuilding the film again. Now we can work through existing cuts because I kind of know his intentions. It’s easier for me to see in the scene work what he’s going for, and that only comes from collaborating. Now I’m able to get the movie that’s in his head on screen a lot faster.

Mary Poppins Returns

You were working with complex animations and effects, and also combining those with elaborate choreography and live action. Was there more preplanning for this than you might normally have done?
I wasn’t really involved in the preplanning. I came in about a month before shooting, mostly to catch up with the schedules of the second unit, because I’m always going to work closely with them. I also went through all the storyboards and worked with visual effects and caught up on their look development. We did have a previz team, but we only really needed to previz two of the sequences in the film — the underwater bath time and the balloon sequence.

While previz gives you methodology, shot count, rough lenses and things, it’s missing the real emotion of the story because it looks like a video game and is often cut like a music video. This is no disrespect to previz editors — they’re very good — but I always want to come in and do a pass before we start shooting because I find the timings are very different.

Doctor Strange

Take a film like Marvel’s Doctor Strange. So much of it had been prevized to figure out how to do it. When I came into the Doctor Strange previz cuts early on, they were exciting, psychedelic, wild and really imaginative, but I was losing actors. I found that something that was running at four minutes wasn’t representing any of the dialogue or the emotional content of the actors. So I asked them to give me stills of close-ups to cut them in. After putting in the dialogue, that four-minute sequence becomes seven minutes and you realize it’s too long. Before we go shoot it, how do we make it something that’s more manageable for the ultimate film?

Were you on set during most of the filming?
There were days where Rob would pull me onto set, and then days or weeks where I wouldn’t even see him. I did the traditional assembly process. Even on the film I’m cutting right now, which has a very short schedule, I had a cut of the film four days after they were done shooting. It’s the only way for me to know that it’s working. It’s not a great cut, but I know that the movie’s all there. And, most importantly, I need to know, barring the last day of shooting, that I’ve seen every single frame of every take before they wrap. I need the confidence of knowing where it’s all going. I don’t want to discover any of that with a director in post.

On a project this complex, I imagine you must work with multiple assistants?
When I worked on the second Thor movie, The Dark World, I had a friend who was my first assistant, Meagan Costello. She has worked on many Marvel films. When Doctor Strange came up — I think it was almost a year before shooting that I got the call from the director saying I was in — within five seconds, I called Meagan because of her experience, her personality and her incredible skill set. Toward the end of Doctor Strange, when the schedule for Poppins was starting to lock in, she said, “I’ve always wanted to live in New York, and I’ve always wanted to work in a music hall.” I said, “We can make that happen.”

Thor: The Dark World

She is great at running the cutting room, taking care of all of my little, and many, prima donna bugaboos — how things are set up and working technically, cutting in surround, having the right types of monitors, etc. What’s also important is having someone spiritually and emotionally connected to the film… someone I can talk to and trust.

We had second assistant editors on Mary Poppins once we were in post, on both sides of the Atlantic. It’s always interesting when you have two different teams. I try to keep as much consistency as I can, so we had Meagan all the way through London and New York. For second assistants in London, we had Gemma Bourne, Ben Renton and Tom Lane. Here in the States we had Alexander Johnson and Christa Haley. Christa is my first assistant on the film I’m currently doing for Focus Features, called Harriet.

On huge films like these, so much of the assistant editor’s time is spent dealing with the vast deliveries for the studio, the needs of a huge sound and music team, as well as a lot of visual effects. In the end, we had about 1,300 visual effects shots. That means a lot of turnovers, screenings and quality control so that nothing is ever coming in or going out without being meticulously watched and listened to.

The first assistant runs the cutting room and the stuff I shouldn’t be thinking about. It’s not stuff I would do well either. I want to be solely focusing on the edit, and when I’m lost in the movie, that’s the greatest thing. Having a strong editorial team allows me to be in a place where I’m not thinking about anything but the cut.

Mary Poppins Returns

That’s always good to hear. Most editors I talk to also care about making sure their assistants are getting opportunities.
When I started out, I had assistants in the room with me. It was very much film-style — the assistant was in the room helping me out with the director and the producers every day. If I had to run out of the room, the assistant could step in.

Unfortunately, the way the world has evolved, with digital post, the assistant editor and editor positions have diverged massively. The skill sets are very different. I don’t think I could do a first assistant editor’s job, but I know they could do my job. Also, the extra level of material keeps them very busy, so they’re not with me in the room. That makes for a much harder path, and that bothers me. I don’t quite know how to fix that yet, but I want to.

This industry started with apprentices, and it was very guild-like. Assistants were very hands on with the editor, so it was very natural to become an editor. Right now, that jump is a little tricky, and I wish I knew how to fix it.

Even if the assistants cut something together for you, it doesn’t necessarily evolve into them getting to work with a director or producer. With Poppins, there’s certainly a scene or two in the film that I asked Meagan to put together for that purpose. Rob works very closely in the cutting room each day, along with John DeLuca, our producer and choreographer. I was wondering if there would be that moment when maybe they’d split off, like, “Oh, go with Meagan and work on this, while I work on this with Rob.” But those opportunities never really arose. It’s hard to figure out how to get that door open.

Do you have any advice for editors who are just starting out?
I love the material I’m working on, and that’s the most important part. Even if something’s not for you, your job is not to make it what you want it to be. The job is to figure out who the audience is and how you make it great for them. There’s an audience for everything, you just have to tap into who that audience is.


Amy Leland is a film director and editor. Her short film, “Echoes”, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on social media on Twitter at @amy-leland and Instagram at @la_directora.

Review: Picture Instruments’ plugin and app, Color Cone 2

By Brady Betzel

There are a lot of different ways to color correct an image. Typically, colorists will start by adjusting contrast and saturation followed by adjusting the lift, gamma and gain (a.k.a. shadows, midtones and highlights). For video, waveforms and vectorscopes are great ways of measuring color values and are about the only way to get the most accurate scientific facts on the colors you are manipulating.

Whether you are in Blackmagic Resolve, Avid Media Composer, Adobe Premiere Pro, Apple FCP X or any other nonlinear editor or color correction app, you usually have similar color correction tools across apps — whether you color based on curves, wheels, sliders or even interactively on screen. So when I heard about the way that Picture Instruments Color Cone 2 color corrects — via a Cone (or really a bicone) — I was immediately intrigued.

Color Cone 2 is a standalone app but also, more importantly, a plugin for Adobe After Effects, Adobe Premiere Pro and FCP X. In this review I am focusing on the Premiere Pro plugin, but keep in mind that the standalone version works on still images and allows you to export 3dl or cube LUTs — a great way for a client to quickly see what type of result you can get from just a still image.

Color Cone 2 is strictly a color corrector when used as a plugin for Adobe Premiere. There are no contrast and saturation adjustments, just the ability to select a color and transform it. For instance, you can select a blue sky and adjust the hue, chrominance (saturation) and/or luminance of the resulting color inside of the Color Cone plugin.

To get started you apply the Color Cone 2 plugin to your clip — the plugin is located under Picture Instruments in the Effects tab. Then you click the little square icon in the effect editor panel to open up the Color Cone 2 interface. The interface contains the bicone image representation of the color correction, presets to set up a split-tone color map or a three-point color correct, and the radius slider to adjust the effect your correction has on surrounding color.

Once you are set on a look you can jump out of the Color Cone interface and back into the effect editor inside of Premiere. There you can keyframe all of the parameters you adjusted in the Color Cone interface. This allows for a nice and easy way to transition from no color correction to color correction.

The Cone
The Cone itself is the most interesting part of this plugin. Think of the bicone as the 3D side view of a vectorscope. In other words, if the vectorscope view from a traditional scope is the top view, the bicone in Color Cone would be a side view. Moving your target color from the top cone to the bottom cone adjusts your luminance from light to dark. At the intersection of the cones sits the saturation (or chrominance), and moving from the center outwards increases saturation. When a color is selected using the eye dropper you will see a square, which represents the source color selection; a circle, which represents the target color; and an “x” with a line for reference on the middle section.

Additionally, there is a black circle on the saturation section in the middle that shows the boundaries of how far you can push your chrominance. There is a light circle that represents the radius of how surrounding colors are affected. Each video clip can have effects layered on it, and one instance of the plugin can handle five colors. If you need more than five, you can add another instance of the plugin to the same clip.
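
The geometry behind a bicone like this is easy to sketch. Below is a small Python illustration — my own sketch, not Picture Instruments’ math — that maps an RGB color to a position in a hue/lightness/chroma bicone. Note how chroma collapses to zero at pure black and pure white, which is exactly what gives the solid its double-cone shape:

    # Illustration of bicone geometry, not Picture Instruments' code.
    # Height along the axis is lightness; distance from the axis is chroma.
    import colorsys

    def bicone_position(r: float, g: float, b: float):
        """Map an RGB color (0-1 floats) to (hue degrees, lightness, chroma)."""
        h, l, _ = colorsys.rgb_to_hls(r, g, b)    # colorsys returns HLS order
        chroma = max(r, g, b) - min(r, g, b)      # radius from the central axis
        return h * 360.0, l, chroma

    print(bicone_position(0.2, 0.4, 0.8))  # a blue: mid lightness, some chroma
    print(bicone_position(1.0, 1.0, 1.0))  # white: top tip, chroma 0
    print(bicone_position(0.0, 0.0, 0.0))  # black: bottom tip, chroma 0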

If you are looking to export 3dl and cube LUTs of your work, you will need to use the standalone Color Cone 2 app. The one caveat to using the standalone app is that you can only apply color to still images. Once you do that, you can export the LUT to be used in any modern NLE/color correction app.
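
For anyone who hasn’t peeked inside one, a cube LUT is just a text file. Here is a minimal sketch that writes an identity 3D LUT in the widely used .cube format — illustrative only, not Picture Instruments’ exporter:

    # Write an identity 3D LUT in the .cube text format. A real grade would
    # transform each RGB triplet instead of echoing it back. By convention,
    # red varies fastest in the data block.
    SIZE = 17  # small cube; 33 or 65 are typical for finishing work

    with open("identity.cube", "w") as f:
        f.write('TITLE "identity"\n')
        f.write(f"LUT_3D_SIZE {SIZE}\n")
        for b in range(SIZE):
            for g in range(SIZE):
                for r in range(SIZE):
                    rr, gg, bb = (c / (SIZE - 1) for c in (r, g, b))
                    # Apply your color transform to (rr, gg, bb) here.
                    f.write(f"{rr:.6f} {gg:.6f} {bb:.6f}\n")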

Summing Up
To be honest, working in Color Cone 2 was a little weird for me. It’s not your usual color correction workflow, so I would need to sit with the plugin for a while to get used to its setup. That being said, it has some interesting components that I wish other color correction apps would use, such as the Cone view. The bicone is a phenomenal way to visualize color correction in realtime.

In my opinion, if Picture Instruments would sell just the Cone as a color measurement tool to work in conjunction with Lumetri, they would have another solid product. Color Cone 2 has a unique and interesting way to color correct in Premiere, acting as an advanced secondary color correction tool alongside the Lumetri color tools.

The Color Cone 2 standalone app and plugin cost $139 when purchased together, or $88 individually. In my opinion, video people should probably just stick to the plugin version. Check out Picture Instruments’ website for more info on Color Cone 2 as well as their other products. And check them out on Twitter @Pic_instruments.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Crazy Rich Asians editor Myron Kerstein

By Amy Leland

When the buzz started in anticipation of the premiere of Crazy Rich Asians, there was a lot of speculation about whether audiences would fill the theaters for the first American film with an all-Asian cast since 1993’s The Joy Luck Club — or whether audiences even wanted to see a romantic comedy, a format that seemed to be falling out of favor.

The answer to both questions was a resounding, “Yes!” The film grossed $35 million during its opening weekend, against a $30 million budget. It continued going strong in its second weekend, making another $28 million — the highest Labor Day weekend box office in more than a decade. It was the biggest opening weekend for a rom-com in three years, and it is the most successful studio rom-com in nine. All of this great success can be explained pretty simply — it’s a fun movie with a well-told story.

Not long ago, I had the great fun of sitting down with one of its storytellers, editor Myron Kerstein, to discuss this Jon M. Chu-directed film as well as Kerstein’s career as an editor.

How did you get started as an editor?
I was a fine arts major in college and stumbled upon photography, filmmaking, painting and printmaking. I really just wanted to make art of any kind. Once I started doing more short films in college, I found a knack for editing.

When I first moved to New York, I needed to make a living, so I became a PA, and I worked on a series called TV Nation, one of Michael Moore’s first shows. It was political satire. There was a production period, and then slowly the editors needed help in the post department. I gravitated toward these alchemists, these amazing people who were making things out of nothing. I really started to move toward post through that experience.

I also hustled quite a bit with all of those editors, and they started to hire me after that job. Slowly but surely I had a network of people who wanted to hire me again. That’s how I really started, and I really began to love it. I thought, what an amazing process to read these stories and look at how much power and influence an editor has in the filmmaking process.

I was not an assistant for too long, because I got to cut a film called Black & White. Then I quickly began doing edits for other indies, one being a film called Raising Victor Vargas, and another film called Garden State. That was my big hit in the indie world, and slowly that led to more studio films, and then to Crazy Rich Asians.

Myron Kerstein and Crazy Rich Asians actor Henry Golding.

Your first break was on a television show that was nothing like feature films. How did you ultimately move toward cutting feature films?
I had a real attraction to documentary filmmaking, but my heart wanted to make narrative features. I think once you put that out in the universe, then those jobs start coming to you. I then stumbled upon my mentor, Jim Lyons, who cut all of Todd Haynes’s movies for years. When I worked on Velvet Goldmine as an assistant editor, I knew this was where I really needed to be. This was a film with music that was trying to say something, and was also very subversive. Jim and Todd were these amazing filmmakers that were just shining examples of the things I wanted to make in the future.

Any other filmmakers or editors whose work influenced you as you were starting out?
In addition to Todd Haynes, directors like Gus Van Sant and John Hughes. When I was first watching films, I didn’t really understand what editors did, so at the same time I was influenced by Spielberg, or somebody like George Romero. Later I realized there were editors who made these things. Ang Lee and his editor Tim Squyres were like gods to me. I really wanted to work on one of Ang’s crews very badly, but everyone wanted to work with him. I was working at the same facilities where Ang was cutting, and I was literally sneaking into his edit rooms. I would be working on another film, and I would just kind of peek my head in and see what they were doing and that kind of thing.

How did Crazy Rich Asians come about for you?
Brad Simpson, who was a post supervisor on Velvet Goldmine back in the ‘90s when I was the assistant editor, is a producer on this film. Flash forward 20 years and I stumbled upon this script through agents. I read it and I was like, “I really want to be a part of this, and Brad’s the producer on this thing? Let me reach out to him.” He said, “I think you might be the right fit for this.” It was pretty nerve-wracking because I’d never worked with Jon before. Jon was a pretty experienced filmmaker, and he’d worked with a lot of editors. I just knew that if I could be part of the process, we could make something special.

My first interview with Jon was a Skype interview. He was in Malaysia already prepping for the film. In those interviews it’s very difficult not to look or sound weird. I just spoke from the heart and said this is what I think makes me special. These are the ways I can try to influence a film and be part of the process. Luckily, between that interview and Brad’s recommendation, I got the job.

Myron Kerstein and director Jon Chu.

When did you begin your work on the film?
I basically started the first week of filming and joined them in Malaysia and Singapore for the whole shoot. It was a pretty amazing experience being out there in two Muslim countries — two Westernized Muslim countries that were filled with some of the friendliest people I’ve ever met. It was an almost entirely local crew, a couple of assistant editors, and me. Sometimes I feel like it might not be the best thing for an editor to be around set too much, but in this case it was good for me to see the setting they were trying to portray… and feel the humidity, the steaminess and the romance of Singapore, which is both alien and beautiful at the same time.

What was your collaboration like with Jon Chu?
It was just an organic process, where my DNA started to become infused with Jon’s. The good thing about my going to Malaysia and Singapore was we got to work together early. One thing that doesn’t happen often anymore is a director who actually screens dailies in a theater. Jon would do that every weekend. We would watch dailies, and he would say what he liked and didn’t like, or more just general impressions of his footage. That allowed me to get into his head a bit.

At the same time I was also cutting scenes. At the end of every day’s screening, we would sit down together. He gave me a lot of freedom, but at the same time was there to give me his first impressions of what I was doing. I think we were able to build some trust really early.

Because of the film’s overwhelming success, this has opened doors for other Asian-led projects.
Isn’t that the most satisfying thing in the world? You hope to define your career by moments like this, but rarely get that chance. I watched this film, right when it was released, which was on my birthday. I ended up sitting next to this young Asian boy and his mom. This kid was just giggling and weeping throughout the movie. To have an interaction with a kid like that, who may have never seen someone like himself represented on the screen was pretty outstanding.

Music was such an important part of this film. The soundtrack is so crucial to moments in the film that it almost felt like a musical. Were you editing scenes with specific songs in mind, or did you edit and then come back and add music?
Jon gave me a playlist very early on of music he was interested in. A lot of the songs sounded like they were from the 1920s — almost big band tunes. Right then I knew the film could have more of a classy Asian-Gatsby quality to it. Then as we were working on the film together, we started trying out these more modern tunes. I think the producers might have thought we were crazy at one point. You’re asking the audience to go down these different roads with you, and that can sometimes work really well, or sometimes can be a train wreck.

But as much as I love working with music, when I assemble I don’t cut with any music in mind. I try not to use it as a crutch. Oftentimes you cut something with music, either with a song in your head, or often editors will cut with a song as a music bed. But, if you can’t tell a story visually without a song to help drive it, then I think you’re fooling yourself.

I really find that my joy of putting in music happens after I assemble, and then I enjoy experimenting. That Coldplay song at the end of the film, for example… We were really struggling with how to end our movie. We had a bunch of different dialogue scenes that were strung together, but we didn’t feel like it was building up to some kind of climax. I figured out the structure and then cut it like any other scene without any music. Then Jon pitched a couple songs. Ironically enough I had an experience with Coldplay from the opening of Garden State. I liked the idea of this full circle in my own career with Coldplay at the end of a romantic comedy that starred an all-Asian cast. And it really felt like it was the right fit.

The graphic design was fascinating, especially in the early scene with Rachel and Nick on their date that kicks off all of the text messages. Is that something that was storyboarded early, or was that something you all figured out in the edit and in post?
Jon did have a very loose six-page storyboard of how we would get from the beginning of this to the end. The storyboard was nothing compared to what we ended up doing. When I first assembled my footage, I stitched together a two-minute sequence of just split screens of people reacting to other people. Some of that footage is in the movie, but it was just a loose sketch. Jon liked it, but it didn’t represent what he imagined this sequence to be. To some extent he had wondered whether we even needed the sequence.

Jon and I discussed it and said, “Let’s give this a shot. Let’s find the best graphics company out there.” We ended up landing with this company called Aspect, led by John Berkowitz. He and his team of artists worked with us to slowly craft this sequence over months. Beginning with, “How do we get the first text bubble to the second person? What do those text bubbles look like? How do they travel?” Then they gave us 20 different options to see how those two elements would work together. Then we asked, “How do we start expanding outward? What information are we conveying? What is the text bubble saying?” It was like this slowly choreographed dance that we ended up putting together over the course of months.

They would make these little Disney-esque pops. We really loved that. That kind of made it feel like we were back in old Hollywood for a second. At the same time we had these modern devices with text bubbles. So far as the tone was concerned, we tried percussion, just drumming, and other old scores. Then we landed on a score from John Williams from 1941, and that gave us the idea that maybe some old-school big band jazz might go really well in this. Our composer Brian Tyler saw it, and said, “I think I can make this even zanier and crazier.”

How do you work with your assistants?
Assistants are crucial as far as getting through the whole process. I actually had two sets of assistants; John To and David Zimmerman were on the first half in Malaysia and Singapore. I found John through my buddy Tom Cross, who edits for Damien Chazelle. I wanted somebody who could help me with the challenges of getting through places like Malaysia and Singapore, because if you’re looking for help for your Avid, or trying to get dailies from Malaysia to America, you’re kind of on your own. Warner Bros. was great and supportive, and they gave us all the technical help. But it’s not like they can fly somebody out if something goes wrong in an hour.

On the post side I ended up using Melissa Remenarich-Aperlo, and she was outstanding. In the post process I needed somebody to hold down the fort and keep me organized, and also somebody for me to bounce ideas off of. I’m a big proponent of using my assistants creatively. Melissa ended up cutting the big fashion montage. I really struggled with that sequence because I felt strongly like this might be a trope that this film didn’t need. That was the debate with a lot of them. Which romantic comedy tropes should we have in this movie? Jon was like, “It’s wish fulfillment. We really need this. I know we’ve seen it a thousand times, but we need this scene.”

I said let’s try something different. Let’s try inter-cutting the wedding arrival with the montage, and let’s try to make it one big story to get us from us not knowing what she’s going to show up in to her arrival. Both of those sequences were fine on their own, but it didn’t feel like either one of them was doing anything interesting. It just felt like we were eating up time, and we needed to get to the wedding, and we had a lot of story to tell. Once we inter-cut them we knew this was the right choice. As Jon said, you need these moments in the film where you can just sit back and take a breath, smile for a minute and get ready for the drama that starts. Melissa did a great job on that sequence.

Do you have any advice for somebody who’s just starting out and really wants to edit feature films?
I would tell them to start cutting. Cut anything they can. If they don’t have the software, they can cut on iMovie on their iPhone. Then they should reach out to people like me and create a network. And keep doing that until people say yes. Don’t be afraid to reach out to people.

Also don’t be afraid to be an assistant editor. As much as they want to cut, as they should, they also need to learn the process of editing from others. Be willing to stick with it, even if that means years of doing it. I think you’d be surprised how much you learn over the course of time with good editors. I feel like it’s a long bridge. I’ve been doing this for 20 years, and it took a long time to get here, but perseverance goes a long way in this field. You just have to really know you want to do it and keep doing it.


Amy Leland is a film director and editor. Her short film, “Echoes”, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on social media on Twitter at @amy-leland and Instagram at @la_directora.

Presenting at IBC vs. NAB

By Mike Nuget

I have been lucky enough to attend NAB a few times over the years, both as an onlooker and as a presenter. In 2004, I went to NAB for the first time as an assistant online editor, mainly just tagging along with my boss. It was awesome! It was very overwhelming and, for the most part, completely over my head. I loved seeing things demonstrated live by industry leaders. I felt I was finally a part of this crazy industry that I was new to. It was sort of a rite of passage.

Twelve years later, Avid asked me to present on the main stage. Knowing that I would be one of the demo artists that other people would sit down and watch — as I had done just 12 years earlier — was beyond anything I thought I would do back when I first started. The demo showed the Avid and FilmLight collaboration between the Media Composer and the Baselight color system. Two of my favorite systems to work on. (Watch Mike’s presentation here.)

Thanks to my friend and now former co-worker Matt Schneider, who also presented alongside me, I had developed a very good relationship with the Avid developers and some of the people who run the Avid booth at NAB. At the same time, the FilmLight team was quickly being put on my speed dial, and that relationship strengthened as well.

This past NAB, Avid once again asked me to come back and present on the main stage about Avid Symphony Color and FilmLight’s Baselight Editions plug-in for Avid, but this time I would get to represent myself and my new freelance career change — I had just left my job at Technicolor-Postworks in New York a few weeks prior. I thought that since I was now a full-time freelancer this might be the last time I would ever do this kind of thing. That was until this past July, when I got an email from the FilmLight team asking me to present at IBC in Amsterdam. I was ecstatic.

Preparing for IBC was similar enough as far as my demo went, but I was definitely more nervous than I was at NAB. I think there were two reasons. First, I was presenting in front of many different people in an international setting. Even though I am from the melting pot of NYC, it is a different and interesting feeling being surrounded by so many different nationalities all day long, and pretty much being the minority. On a personal note, I loved it. My wife and I love traveling, and to us this was an exciting chance to be around people from other cultures. On a business level, I guess I was a little afraid that my fast-talking New Yorker side would lose some people, and I didn’t want that to happen.

The second thing was that this was the first time that I was presenting strictly for FilmLight and not Avid. I have been an Avid guy for over 15 years. It’s my home, it’s my most comfortable system, and I feel like I know it inside and out. I discovered Baselight in 2012, so to be presenting in front of FilmLight people, who might have been using their systems for much longer, was a little intimidating.

When I walked into the room, they had set up a full-on production, with spotlights, three cameras, a projector… and the nerves rushed in once again. The demo was standing room only. Sometimes when you are doing presentations, time seems to fly by, so I am not sure I remember every minute of the 50-minute presentation. But I do remember that at one point, within the first few minutes, my voice actually trembled, which internally I thought was funny because I do not tend to get nervous. So instead of fighting it, I just said out loud, “Sorry guys, I’m a little nervous here,” then took a deep breath, gathered myself and fell right into my routine.

I spent the rest of the day watching the other FilmLight demos and running around the convention again saying hello to some new vendors and goodbye to those I had already seen, as Sunday was my last day at the show.

That night I got to hang out with the entire FilmLight staff for dinner and some drinks. These guys are hilarious — what a great tight-knit family vibe they have. At one point they even started to label each other: the uncle, the crazy brother, the funny cousin. I can’t thank them enough for being so kind and welcoming. I kind of felt like part of the family for a few days, and it was tremendously enjoyable and appreciated.

Overall, IBC felt similar enough to NAB, but with a nice international twist. I definitely got lost more since the layout is much more confusing than NAB’s. There are 14 halls!

I will say that the “relaxing areas” at IBC are much better than NAB’s! There is a sandy beach to sit on, a beautiful canal to sit by while having a Heineken (of course) and the food trucks were much, much better.

I do hope I get to come back one day!


Mike Nuget (known to most as just “Nuget”) is a NYC-based colorist and finishing editor. He recently decided to branch out on his own and become a freelancer after 13 years with Technicolor-Postworks. He has honed a skill set across multiple platforms, including FilmLight’s Baselight, Blackmagic’s Resolve, Avid and more. 

Editor Paul Zucker on cutting Hotel Artemis

By Zack Wolder

The Drew Pearce-directed Hotel Artemis is a dark action-thriller set in a riot-torn Los Angeles in the not-too-distant future. What is the Hotel Artemis? It’s a secret members-only hospital for criminals run by Jodie Foster with the help of Dave Bautista. The film boasts an impressive cast that also includes Sterling K. Brown, Jeff Goldblum, Charlie Day, Sofia Boutella and Jenny Slate.

Hotel Artemis editor Paul Zucker, ACE, has varied credits that toggle between TV and film, including Trainwreck, This is 40, Eternal Sunshine of the Spotless Mind, Girls, Silicon Valley and many others.

We recently reached out to Zucker, who worked alongside picture editor Gardner Gould, to talk about his process on the film.

Paul Zucker and adorable baby.

How did you get involved in this film?
This was kind of a blind date set-up. I wasn’t really familiar with Drew, and it was a project that came to me pretty late. I think I joined about a week, maybe two, before production began. I was told that they were in a hurry to find an editor. I read the script, I interviewed with Drew, and that was it.

How long did it take to complete the editing?
About seven months.

How involved were you throughout the whole phase of production? Were you on set at all?
I wasn’t involved in pre-production, so I wasn’t able to participate in development of the script or anything like that, but as soon as the camera started rolling I was cutting. Most of the film was shot on stages in downtown LA, so I would go to set a few times, but most of the time there was enough work to do that I was sequestered in the edit room and trying to keep up with camera.

I’m an editor who doesn’t love to go to set. I prefer to be uninfluenced by whatever tensions, or lack of tensions, are happening on set. If a director has something he needs me for, if it’s some contribution he feels I can make, I’m happy, able and willing to participate in shot listing, blocking and things like that, but on this movie I was more valuable putting together the edit.

Did you have any specific deadlines you had to meet?
On this particular movie there was a higher-than-average number of requests from director Drew Pearce. Since it was mostly shot on stages, he was able to re-shoot things more easily than you could on location. So it became important for him to see the movie sooner rather than later.

A bunch of movies ago, I adopted a workflow of sending the director whatever I had each Friday. I think it’s healthy for them to see what they’re working on. There’s always the chance that it will influence the work they’re doing, whether it’s performance of the actors or the story or the script or really anything.

As I understand it from the directors I’ve worked for, seeing the editor’s cut can be the worst day of the process for them. Not because of the quality of the editing, but because it’s hard in that first viewing to look past all the things that they didn’t get on set. It’s tough to not just see the mistakes, which is totally understandable. So I started this strategy of easing them into it. I just send scenes; I don’t send them in sequence. By the time they get to the editor’s cut, they’ve seen most of the scenes, so the shock is lessened and hopefully that screening is more productive.

Do you ever get that sense that you may be distracting them or overwhelming them with something?
Yes, sometimes. A couple of pictures ago, I did my normal thing — sending what I had on a Friday — and the director told me he didn’t want to watch them. For him, issues of post were a distraction while he was in production. So to each his own.

Drew Pearce certainly benefited. Drew was the type of director who, if I sent a cut at 9pm, would be watching it at 9:05pm and giving me notes at 10:05pm.

Are you doing temp color and things like that?
Absolutely. I do as much as the footage I’m given requires. On this particular movie, the cinematographer, the DIT and the lab were so dialed in that these were the most perfect-looking dailies I think I’ve ever gotten. So I had to do next to nothing. I credit DP Chung-Hoon Chung for that. Generally, if I’m getting dailies that are mismatched in color tone, I’m going to do whatever it takes to smooth it out. Nothing goes in front of the director until it’s had a hardcore sound and color pass. I am always trying to leave as little to the imagination as possible. I try to present something that is as close to the experience that the audience will have when they watch the movie. That means great color, great sound, music, all of that.

Do you ever provide VFX work?
Editorial is almost always doing simple VFX work like split-screens, muzzle-flashes for guns, etc. Those are all things that we’re really comfortable doing.

On this movie, there’s a large VFX component, so the temp work was more intense. We had close to 500 VFX shots, and some of them are very involved. For example, a helicopter crashes into a building after getting blasted out of the sky with a rocket launcher. There are multiple scenes where characters get operated on by robotic arms. There’s a 3D printer that prints organs and guns. So we had to come up with a large number of temp shots in editorial.

Editor Gardner Gould and assistant editors Michael Costello and Lillian Dawson Bain were instrumental in coming up with these shots.

What about editing before the VFX shots are delivered?
From the very beginning, we are game-planning — what are the priorities for the movie vis-a-vis VFX? Which shots do we need early for story reasons? Which shots are the most time consuming for the VFX department? All of these things are considered as the entire post production department collaborates to come up with a priorities list.

If I need temp versions of shots to help me edit the scene, the assistants help me make them. If we can do them, we’ll do them. These aid in determining final VFX shot length, tempo, action, anything. As the process goes on, they get replaced by shots we get from the VFX department.

One thing I’m always keeping in mind is that shots can often be created out of thin air. If I have a story problem, sometimes a shot can be created that will help solve it. Sometimes the entire meaning of a scene can change.

What do you expect from your assistant editors?
The first assistant had to have experience with visual effects. The management of workflow for 500 shots is a lot, and on this job, we did not have a dedicated VFX editor. That fell upon my co-editor, Gardner Gould.

I generally kick a lot of sound to the assistant, as I’m kind of rapidly moving through cutting picture. But I’m also looking for someone who’s got that storytelling bone that great editors have. Not everybody has it, not every great assistant has it.

There is so much minutiae on the technical side of being an assistant editor that you run the risk of forgetting that you’re working on a movie for an audience. And, indeed, some assistants just do the assistant work. They never cut scenes, they never do creative work, they’re not interested or they just don’t. So I’m always encouraging them to think like an editor at every point.

I ask them for their opinions. I invite them into the process, I don’t want them to be afraid to tell me what they think. You have to express yourself artistically in every decision you make. I encourage them to think critically and analytically about the movie that we’re working on.

I came up as an assistant and I had a few people who really believed in me. They invited me into the room with the director and they gave me that early exposure that really helped me learn my trade. I’m kind of looking to pay back that favor to my assistants.

Why did you choose to edit this film on Avid? Are you proficient in any other NLEs?
Oh, I’d say strictly Avid. To me, a tool, a technology, should be as transparent as possible. I want to have the minimum of time in between thought and expression. Which means that if I think of an edit, I want to automatically, almost without thinking, be able to do a keystroke and have that decision appear on the monitor. I’m so comfortable with Avid that I’m at that point.

How is your creative process different when editing a film versus a TV show?
Well first, a TV show is going to have a pre-determined length. A movie does not have a pre-determined length. So in television you’re always wrangling with the runtime. The second difference is that in television, schedules are tighter and turnaround times are shorter. You’re constantly in pre-production, production and post at the same time.

Also, television is for a small screen. Film, generally speaking, is for the big screen. The venue matters for a lot of reasons, but it matters for pacing. You’re sitting in a movie theater and maybe you can hold shots a little bit longer because the canvas is so wide and there’s so much to look at. Whereas with the small screen, you’re sitting closer to the television, the screen itself is smaller, maybe the shots are typically not as wide or you cut a little quicker.

You’re a very experienced comedic editor. Was it difficult to be considered for a different type of film?
I guess the answer is yes. The more famous work I’ve done in the last couple of years has been for people like Lena Dunham and Judd Apatow. So people say, “Well, he’s a comedy editor.” But if you look at my resume dating back to the very first thing I did in 2001, I edited my first movie — a pretty radical film for Gus Van Sant called Gerry, and it was not a comedy. Eternal Sunshine was not a comedy. Before Girls, I couldn’t get hired on comedies.

Then I got pulled on by Judd to work on some of his movies, and he’s such a brand name that people see that on your resume and they say, “Well, you must be a comedy editor.” So, yes, it does become harder to break out of that box, but that’s the box that other people put you in; I don’t put myself in it. My favorite filmmakers work across all genres.

Where do you find inspiration? Music? Other editors? Directors?
Good question. I mean… inspiration is everywhere. I’m a movie fan, I always have been, that’s the only thing I’ve ever wanted to do. I’m always going to the movies. I watch lots of trailers. I like to keep up with what people are doing. I go back and re-watch the things that I love. Listening to other editors or reading other editors speak about their process is inspiring to me. Listening and speaking with people who love what they do is inspiring.

For Hotel Artemis, I went back and watched some movies that were an influence on this one to get in the tone-zone. I would listen to a lot of the soundtracks to those movies. As far as watching movies, I watched Assault on Precinct 13, for instance. That’s a siege movie, and Hotel Artemis is kind of a siege movie. Some editors say they don’t watch movies while they’re making a movie; they don’t want to be influenced. It doesn’t bother me. It’s all in the soup.


Zack Wolder is a video editor based in NYC. He is currently the senior video editor at Billboard Magazine.  Follow him on Instagram at @thezackwolder.

Avid adds to Nexis product line with Nexis|E5

The Nexis|E5 NL nearline storage solution from Avid is now available. The addition of this high-density on-premises solution to the Avid Nexis family allows Avid users to manage media across all their online, nearline and archive storage resources.

Avid Nexis|E5 NL includes a new web-based Nexis management console for managing, controlling and monitoring Nexis installations. Nexis|E5 NL can be easily accessed through MediaCentral | Cloud UX or Media Composer and also integrates with MediaCentral|Production Management, MediaCentral|Asset Management and MediaCentral|Editorial Management to facilitate collaboration, with advanced features such as project and bin sharing. Extending the Nexis|FS (file system) to a secondary storage tier makes it easy to search for, find and import media, enabling users to locate content distributed throughout their operations more quickly.

Avid reports that Nexis|E5 NL, built for project parking, staging workflows and proxy archive, streamlines the workflow between active and non-active assets, allowing media organizations to park assets as well as completed projects on high-density nearline storage, and keep them within easy reach for rediscovery and reuse.

Up to eight Nexis|E5 NL engines can be integrated as one virtualizable pool of storage, making content and associated projects and bins more accessible. In addition, other Avid Nexis Enterprise engines can be integrated into a single storage system that is partitioned for better archival organization.

Additional Nexis|E5 NL features include:
• It’s scalable from 480TB of storage to more than 7PB by connecting multiple Nexis|E5 NL engines together as a single nearline system, creating a lower-cost secondary tier of storage.
• It offers flexible storage infrastructure that can be provisioned with required capacity and fault-tolerance characteristics.
• Users can configure, control and monitor Nexis using the updated management console that looks and feels like a MediaCentral|Cloud UX application. Its dashboard provides an overview of the system’s performance, bandwidth and status, as well as access to quickly configure and manage workspaces, storage groups, user access, notifications and other functions. It offers the flexibility and security of HTML5 along with an interface design that enables mobile device support.

Pacific Post adds third LA location servicing editorial

Full-service editorial equipment rental and services provider Pacific Post has expanded its footprint with the opening of a new 10,000 square-foot facility in Sherman Oaks, California. This brings the total locations in the LA area to three, including North Hollywood and Hollywood.

The new location offers 25 Avid suites with 24/7 technical support, alongside a writers’ room and several production offices. Pacific Post has retrofitted the entire site, which is supported by Avid Nexis shared storage and 1Gb dedicated fiber internet connectivity.

“We recently provided equipment and services to the editorial team on Game Over, Man! for Netflix in Sherman Oaks, and continued to receive inquiries from other productions in the area,” says Pacific Post VP Kristin Kumamoto. “The explosion we’ve seen in scripted production, especially for streaming platforms, prompted our decision to add this building to our offerings.”

Kumamoto says a screening room is also close to completion. It features a 150-inch screen and JVC 4K projector for VFX reviews and an enhanced, in-house viewing experience. Additional amenities at Pacific Post Sherman Oaks include MPAA-rated security, reserved parking, a full kitchen and lounge, VoIP phone systems and a substantial electrical infrastructure.

We reached out to Kumamoto to find out more.

Why the investment in Avid over some of the other NLE choices out there currently?
It really stems from the editorial community — from scripted and non-scripted shows that really want to work in shared project environments. They trust the media management with Avid’s shared storage, making it a clear choice when working on projects with the tightest deadlines.

How do you typically work with companies coming in looking for editing space? What is your process?
It usually starts with producers looking for a location that meets the needs of the editors in terms of commute or the proximity to studios for executives. After that, it really comes down to having a secure and flexible layout along with a host of other requirements.

With cutting rooms in North Hollywood/Universal City and in Hollywood, we feel Sherman Oaks is the perfect location to complement the other facilities and really give more choices to producers looking to set up cutting rooms in the San Fernando Valley area of LA.

A Sneak Peek: Avid shows its next-gen Media Composer

By Jonathan Moser

On the weekend of NAB and during Avid Connect, I found myself sitting in a large meeting room with some of the most well-known editors and creatives in the business. To my left was Larry Jordan, Steve Audette was across from me, Chris Bové and Norman Hollyn were to my right, and many other luminaries of the post world filled the room. Motion picture, documentary, boutique, commercial and public broadcasting masters were all represented here… as well as sound designers and producers. It was quite humbling for me.

We’d all been asked to an invite-only meeting with the leading product designers and engineers from Avid Technology to see the future of Media Composer… and to do the second thing we editors do best: bitch. We were asked to be as tough, critical and vocal as we could about what we were about to see. We were asked to give them a thumbs up or thumbs down on their vision and execution of the next generation of Media Composer as they showed us long-needed overhauls and redesigns.

Editors Chris Bové and Avid’s Randy Martens getting ready for the unveil.

What we were shown is the future of the Media Composer, and based on what I saw, its future is bright. You think you’ve heard that before? Maybe, but this time is different. This is not vaporware, smoke and mirrors or empty promises… I assure you, this is the future.

The Avid team, including new Avid CEO Jeff Rosica, was noticeably open and attentive to the assembled audience of seasoned professionals invited to Avid Connect… a far cry from the halcyon days of the ‘90s and 2000s, when Media Composer ruled the roost and sat complacently on its haunches. Until recently, the Avid corporate culture was viewed by many in the post community as arrogant and tone-deaf to its users’ criticisms and requests. This meeting was nothing like that.

What we were shown was a redefined, reenergized and proactive attitude from Avid. Big corporations aren’t ordinarily so open about such big changes, but this one directly addressed decades of users’ concerns and suggestions.

By the way, this presentation was separate from the new NAB announcements of tiered pricing, new feature rollouts and enhanced interoperability for Media Composer. Avid invited us here not for approval, but for appraisal… for our expertise and feedback and to help steer them in the right direction.

As a life-long Avid user who has often questioned the direction of where the company was headed, I need to say this once more: this time is different.

These are real operational changes that we got to see in an open, informed — and often questioned and critiqued — environment. We editors are a tough crowd, but team Avid was ready, listening, considering and feeding back new ideas. It was an amazingly open and frank give and take from a company that once was shut off from such possibilities.

In her preliminary introduction, Kate Ketcham, manager of Media Composer product management, gave the assembled audience a pretty brutal and honest assessment of Media Composer’s past (and oft-repeated) failings and weaknesses — a task usually reserved for us editors to tell Avid, but this time it was Avid telling us what we already knew and they had come to realize. Pretty amazing.

The scope of her critique showed us that, despite popular opinion, Avid HAS been listening to us all along: they got it. They acknowledged the problems, warts and all, and based on the two-hour presentation shown through screenshots and demos, they’re intent on correcting their mistakes and are actively doing so.

Addressing User Concerns
Before the main innovations were shared, there was an initial concern from the editors that Avid be careful not to “throw out the baby with the bathwater” in its reinvention. Media Composer’s primary strength — as well as one of its most recognized weaknesses among newer editors — has been its consistency of look and feel, as well as its logical operational methodology and dependable media file structural organization. Much was made of one competitor’s historical failure to keep consistency and integrity of the basic and established editing paradigms (such as two-screen layout, track-based editing, reasonably established file structure, etc.) in a new release.

We older editors depend on a certain consistency. Don’t abandon the tried and true, but still “get us into this century” was the refrain from the assembled. The Avid team addressed these concerns clearly and reassuringly — the best, familiar and most trusted elements of Media Composer would stay, but there will now be so much more under the hood. Enough to dynamically attract and compel newer users and adoptees.

The company has spent almost a year doing research, redesign and implementation; this is a true commitment, and they are pledging to do this right. Avid’s difficult and challenging task in reimagining Media Composer was to make the new iteration steadfast, efficient and dependable (something existing users expect), yet innovative, attractive, flexible, workflow-fluid and intuitive enough for the newer users who are used to more contemporary editing and software. It’s a slippery and problematic slope, but one the Avid team seemed to navigate with an understanding of the issues.

As this is still in the development stage, I can’t reveal particulars (I really wish I could because there were a ton), but I can give an overview of the types of implementation they’ve been developing. Also, this initial presentation deals only with one stage of the redesign of Media Composer — the user interface changes — with much more to come within the spectrum of change.

Rebuilding the Engine
I was assured by the Avid design team that most of the decades-old Media Composer code has been completely rewritten, updated and redesigned with current innovations and implementations similar to those of the competition. This is a fully realized redesign.

Flexibility and customization are integrated throughout. There are many UI innovations, tabbed bins, new views and more efficient access to enhanced tools. Media Composer has entirely new windowing and organizational options that go way beyond mere surface looks and feels, yet are much different from the competition’s implementations. You can now customize the UI to incredible lengths. There are new ways of viewing and organizing media, source and clip information, and new and intuitive (and graphical) ways of creating workspaces that get much more usable information to the editor than before.

The Avid team examined the weaknesses of the existing Media Composer environment and workflow: clutter and too many choices onscreen at once; screens that resize mysteriously, throwing concentration and creative flow off-base; oft-repeated actions and redundant keystrokes that could be minimized or eliminated altogether; and the way Media Composer handles screen real estate, so the editor sees only what they need to see when they need it.

Gone are the windows covering other windows and other things that might slow users down. Avid showed us how attention was paid to making Media Composer more intuitive to new editors by shrinking the learning curve. The ability for more contextual help (without getting in the way of editing) has been addressed.

There are new uses of dynamic thumbnails, color for immediate recognition of active operations and window activation, and different ways of changing modalities — literally changing how we look at timelines and how we find media. You want tabbed bins? You want hover scrubbing? You want customization of workspaces done quickly and efficiently? Avid looked at what we need to see and what we don’t. All of these things have been addressed and integrated. They have addressed the difficulties of handling effect layering, effect creation, visualization and effect management with sleek but understandable solutions. Copying complex multilayered effects will now be a breeze.

Everything we were shown answered long-tolerated problems we’ve had to accept. There were no gimmicks, no glitz, just honesty. There was method to the madness for every new feature, implementation and execution, but after feedback from us, many things were reconsidered or jettisoned. Interruptions from this critical audience were fast and furious: “Why did you do that?” “What about my workflow?” “Those palette choices don’t work for me.” “Why are those tools buried?” This was a synergy and free-flow of information between company and end-users unlike anything I’ve ever seen.

There was no defensiveness from Avid; they listened to each and every critique. I could see they were actively learning from us and that they understood the problems we were pointing out. They were taking notes, asking more questions and adding to their change lists. Editors made suggestions, and those suggestions were noted and actively considered. They didn’t want blind acceptance. We were informing them, and it was really amazing to see.

Again, I wish I could be more specific about details and new implementations — all I can say is that they really have listened to the complaints and are addressing them. And there is much more in the works, from media ingest and compatibility to look and feel and overall user experience.

When Jeff Rosica stopped in to observe, talk and listen to the crowd, he explained that while Avid Technology has many irons in the fire, he believes that Media Composer (and Pro Tools) represent the heart of what the company is all about. In fact, since his tenure began, he has redeployed tremendous resources and financial investment to support and nurture this rebirth of Media Composer.

Rosica promised to make sure Avid would not repeat the mistakes made by others several years ago. He vowed to continue to listen to us and to keep what makes Media Composer the dependable powerhouse that it has been.

As the presentation wound down, a commitment was made by the Avid group to continue to elicit our feedback and keep us in the loop throughout all phases of the redevelopment.

In the end, this tough audience came away optimistic. Yeah, some were still skeptical, but others were elated, expectant and heartened. I know I was.

And I don’t drink Kool-Aid. I hate it in fact.

There is much more in development for MC at Avid in terms of AI integration, facial recognition, media ingest, export functionality and much more. This was just a taste of many more things to come, so stand by.

(Special thanks for access to Marianna Montague, David Colantuoni, Tim Claman, Randy Fayan, and Randy Martens of Avid Technology. If I’ve missed anyone, thank you and apologies.)


Jonathan Moser is a six-time Emmy winning freelance editor/producer based in New York. You can email him at flashcutter@yahoo.com.

A Conversation: Veteran editor Lawrence Jordan, ACE

By Randi Altman

Lawrence Jordan’s fate was essentially sealed upon birth. His father and his grandfather made a living working in post and film editing in New York City.

He grew up around it; it encircled him. His path became pretty clear at a very young age. “I was very fortunate to be born into a film editing family. The running joke is that a trim bin was my first playpen,” he laughs.

Even with his rich family history, Jordan wasn’t handed a job. He started the way many did, as a runner. “I learned all the things that someone in that job learns about the cutting room — while trying to hone editing skills in my spare time. I then got into the union and became very focused on feature film editing.”

Some of those feature films include Jack Frost, Deuce Bigalow: Male Gigolo, Riding in Cars With Boys, Fallen and Are We There Yet? He also embraced dramatic television series such as NYPD Blue and CSI: Miami. He most recently cut a feature for Netflix, called Naked.

Naked

Not long ago, we threw some questions at Jordan about his love for editing, how he evolved with the technology of the industry and his online class, Master the Workflow.

What was your path to editing?
My father, Morton Fallick, was a film editor who started Cinemetric, one of the first integrated commercial post companies in New York in the 1960s. He followed in the footsteps of my grandfather, a projection and sound engineer, who helped organize the unions in New York. He worked for CBS News for many years. Because of this history and my love of movies, I knew I wanted to work in film from a very young age.

Many of the film editors who I ended up really admiring came out of my father’s shop. They were young guys who wanted to get into film, and his commercial house was one way to learn the craft. People like Richard Marks, Barry Malkin, Craig McKay and Evan Lottman — they went on to become some of the most respected feature film editors of the ‘70s, ‘80s and beyond.

My first job was as an apprentice in the Warner Bros. film library. Soon after that I got a job as an apprentice sound editor working on a picture the legendary Dede Allen was cutting: Mike’s Murder, directed by James Bridges. I worked directly for supervising sound editor Norval Crutcher.

How has editing evolved since you started in the industry?
I started back in the days of 35mm film. It was a completely different industry. The editing community was incredibly small back then. I think there were only about 1,000 or 1,500 people in the entire guild, and we all edited on Moviolas or flatbed machines like the KEM or Steenbeck. Back then, editing was a much slower and more deliberate process. Things were done by hand and ideas were executed at a different pace.

I saw videotape becoming a popular means of editing. Videotape annoyed me because it seemed to have a lot to do with timecode and punching numbers into a keyboard. Kind of ironic, isn’t it? I wasn’t particularly fond of that way of approaching editing. I liked the visceral and physical feeling of handling the actual film. And with the exception of experiments by Francis Coppola, back then nobody else was cutting features on videotape, so I focused on working in 35mm.

But as time went by, I couldn’t really avoid the technological change. New systems were being developed that used multiple videotapes to approximate the nonlinear nature of editing on film. Then there were systems that worked off of laserdisc, but I was building a career as an assistant in features and none of these new systems really seemed like they were “there” yet.

Then, in 1991, while I was working as additional editor on Jodie Foster’s directorial debut, Little Man Tate, I got a call from my dad who said, “They’re editing off of hard drives now!” He went on to tell me about the Avid Media Composer and how it was being used in commercials. This was very exciting to me because I had started to get into computers in my personal life, and in those days we were all awed by the power of even the most rudimentary computer systems.

I went down to the Avid offices in Burbank and got a demo of Media Composer. I think there were maybe four or five of us in the room, and when I saw the demo, I was floored by the power and simplicity of digital editing. I knew this was what my future was going to be if I was to continue to pursue a career as a film editor.

I spent a year learning everything I could about the Avid system and digital video — the hardware, software and compression algorithms. At the same time, an editor friend of mine, Steve Cohen, who was also into nonlinear editing, asked if I’d be interested in doing a show on the Montage Picture Processor. It was a hybrid/digital version of their multi-deck Betacam system, and just not up to handling the demands of a feature-length project. About a week into dailies we decided to make the switch and cut on the Avid. That project was Teamster Boss: The Jackie Presser Story.

How did that change the way you worked as an editor?
With the speed and flexibility of digital, editors were soon expected to do many of the tasks that traditionally were given to other departments. More complex sound editing was first. On films, temp dubs were prepared by the sound department, but this became something you could do pretty well on the Avid. As digital editing evolved and CPU speeds accelerated, it became more common for the film editor to rough out visual effects. The way it is now, the spectacular VFX that are being done with CGI and the like still have to be subbed out to the VFX team. But you can do an awful lot, especially for temp in the offline.

Today, directors, producers and studios all expect these tasks to be accomplished in the offline. Although you can execute ideas much faster, there’s a ton more work. Additionally, with digital cinematography, editors are getting more footage than ever before. Whereas an average-budget feature might have had 200,000 or 300,000 feet of film on 35mm, now that same project — not even one of the large tentpole films — could easily have a million feet of dailies. Think about it. By comparison, it took Francis Coppola three years to shoot a million feet of dailies on Apocalypse Now!
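
To put those footage counts in perspective, here is a quick back-of-the-envelope conversion to screen time (a minimal sketch, assuming standard 4-perf 35mm, which runs 16 frames per foot at 24fps):

```python
# Rough conversion of 35mm film footage to hours of dailies.
# Assumes standard 4-perf 35mm: 16 frames per foot, shot at 24fps.
FRAMES_PER_FOOT = 16
FPS = 24

def feet_to_hours(feet: float) -> float:
    seconds = feet * FRAMES_PER_FOOT / FPS
    return seconds / 3600

for feet in (200_000, 300_000, 1_000_000):
    print(f"{feet:>9,} ft ~= {feet_to_hours(feet):5.1f} hours of dailies")
# 200,000 ft   ~=  37.0 hours
# 1,000,000 ft ~= 185.2 hours
```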

Do you have a particular editing philosophy?
If I did, it would be that I let the dailies speak to me. I say this because, of course, we’ve all read the script and talked to the director about his or her vision, but once you actually get the dailies — for any number of reasons — you could be looking at something totally different from what you expected.

This could be affected by whatever the conditions were on the day of production. Or whatever discussions might have gone on between the actors and the director in terms of how they approached a scene or interpreted the script.

So I let the material in front of me dictate how I’m going to make my initial cut on a particular scene. Then it’s a process of looking at the film as a whole and going back to the script and finding the best way to tell the story with the material you have.

You have worked on TV and film. Do you wear a different hat depending on what you are working on?
In television you’re dealing with much tighter schedules. The workflow is highly structured, and although you don’t get as much film every day, you really need to bang scenes out quickly. TV is also a writer/producer’s medium. You only get to work with the director of each episode for a few days and then the producers come in and give you their notes. All of this is usually done in a few weeks’ time.

On feature films, it’s completely different because you’re the head of the department. And even if you’re working with an additional editor, you are communicating directly with the director on a regular basis. A feature film can often go in many more directions than a television show. In the case of comedy, there can be all kinds of improvisation and you are dealing with different situations each day.

When cutting a feature, you’re much more intimately involved in the DNA of the film because you’re living with it for a much longer period of time.

Then, of course, you get into the director’s cut period, which usually lasts around 10 weeks. During this time, you’re typically developing tone, and not only with the story, but in terms of sound effects, music and visual effects. Depending on the situation, the editor is often much more involved in the final mix, color correction and delivery. That level of involvement just doesn’t happen for editors in television.

Do you have a preference in how you work? On-set, near-set?
I guess cutting on-set is happening more often these days, but if I had my preference I’d be in a cutting room near the set. As an editor it’s always nice to have the luxury to be in a quiet space where you can really take in and sort through the material. We want to give it as much thought as possible and have the maximum amount of uninterrupted time to solve whatever problems may come up. I do know that more editors are being asked to edit on-set in real-time. And I guess that’s a necessity for certain films.

During my initial cut, I try to keep it as simple as possible. I’m focusing on two things: story and performance. I try to fill out my cut with as much sound and music as possible, and as many temp visual effects as necessary. In regard to music, most films nowadays have music supervisors who can be of great help pulling material. Because source cues can be expensive, often they’ve had discussions with the director even before the editor comes on board.

What system do you work on? Are there any plugins that you use regularly?
I work on the Avid Media Composer. As I said, I was involved with its introduction into feature filmmaking and television in Hollywood, and it’s still the primary tool for 99 percent of all feature films and television shows for studios and networks today.

I know that there are other pieces of software out there, and I’ve had some experience with them, but the longer you work on a tool, the more ingrained it becomes in your muscle memory. With the Avid, the speed at which I can execute ideas is much faster using software that I’ve been working with going on 25 years now.

As far as peripheral software and additional tools, I do like to use Adobe After Effects to work with temp visual effects. It’s a very powerful program. It does have its limitations in terms of getting metadata in and out of the system, but I can create temp comps and the like relatively quickly with it. Of course, there’s Photoshop. I’ve also used Boris FX pretty extensively, and their Mocha tracking tools are pretty amazing.

What are you working on now?
I just finished a feature for Netflix called Naked, starring Marlon Wayans. It’s a comedy that has a tremendous amount of improv. I worked with a great director named Mike Tiddes, with whom I had worked previously on another feature called Fifty Shades of Black.

We had a lot of fun. It was crazy, because for an editor, improv comedy is always challenging —sometimes you’re literally creating stuff that wasn’t shot! It was also exciting because it was for Netflix. Although it didn’t have a theatrical distribution, it was an original film for them and was distributed in 180 countries on the same day.

The power and possibility with the new streaming networks just amazes me. These production companies have tremendous resources and are really giving the film and television production world a shot in the arm — it’s a real boost for employment opportunities for editors and assistants. I think it holds tremendous promise for our industry in general.

How do you work with your assistant editor? Do you give them a chance to cut?
Because I spent 10 years as an assistant, I really have a lot of respect for what they do. Assistants are essentially the glue that holds the editorial process together. Without an assistant who is at the top of their game — focused, organized and generally passionate about what their role is in the process — an editor can really find himself/herself in a pickle.

Today, much of the assistant’s job is metadata management. There are so many different types of media. It’s the same media that we used to have, but it is delivered digitally and in so many different formats.

I always try to give my assistants a shot at cutting at least a scene, if not a couple of scenes, on every project I do. There really is no other way to learn the editing craft, besides having it handed down to you by an editor. To me, this was something that existed when I was coming up and was essentially at the core of the apprenticeship nature of our craft from the time it started. This was how we learned to do our job.

It’s pretty much still the same way, but it’s the proverbial Catch-22. You can’t learn the actual nuts-and-bolts of the job in a cutting room, unless you have a job in a cutting room. You can’t learn this in theory while in film school. They don’t really teach the sort of inner workings of the feature film workflow, or even television workflow in film school. It’s much more of a macro approach — an overview to how the work is done. I’m not aware of any film programs that teach the job of the assistant editor.

NYPD Blue

Now, of course, there are certification courses and specialized schools, but unless you’re working on the front lines on a feature film or television show you’re really not going to get an understanding of the full spectrum of what the job entails.

So, yes, I do try to give my assistants a chance to cut. I also solicit their opinions on scenes that I have cut. I ask for their ideas. I ask for their feedback. I ask whether they remember anything in the dailies that I might have missed. That’s the nature of our work. It’s a collaborative process, and it helps me do my best work.

I hear you are doing something called Master the Workflow. Can you explain what that is?
Yes, Master the Workflow is something my assistant Richard Sanchez and I came up with on our last film, Naked. Richard had developed a comprehensive database in FileMaker that tracks all of the media and metadata created on a feature film. It made me realize how much the job of the assistant editor has changed from when I was an assistant. With the explosion of digital production and post, I thought that it would be of tremendous benefit to detail the critical role that the assistant editor plays in the editorial process.
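
To give a flavor of what that kind of tracker covers, here is a purely illustrative sketch of a per-clip record (the field names here are assumptions for illustration only, not Sanchez’s actual FileMaker schema):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ClipRecord:
    """Hypothetical per-clip record an assistant editor might track.

    Illustrative only: the fields are assumptions, not the actual
    FileMaker schema described above.
    """
    clip_name: str                # e.g. "A012_C003"
    camera_roll: str              # source camera roll/card
    source_tc_in: str             # source timecode in, "HH:MM:SS:FF"
    source_tc_out: str            # source timecode out
    codec: str                    # e.g. "ARRIRAW", "DNxHD 36"
    synced_audio: bool            # has production sound been synced?
    vfx_id: Optional[str] = None  # VFX shot ID, if part of a turnover

# One row in the tracker:
clip = ClipRecord("A012_C003", "A012", "11:02:14:08", "11:03:40:16",
                  "DNxHD 36", True)
print(clip)
```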

We decided to create an online education course and named it Feature Film Assistant Editor Immersion 1.0. It takes a potential assistant editor from their initial meeting with their editor through final delivery of a finished film. I felt strongly about creating something like this, primarily because we wanted to show a way for people to learn what goes on in a cutting room in the way it used to be learned.

As I mentioned earlier, there has been an apprenticeship model in post and film editorial throughout its history, but because of digital technology, the editor and the assistants have become somewhat siloed. An assistant doesn’t get to sit in the room with the editor as they are creating the cut as much anymore. So the craft is not being handed down as it was traditionally.

The course is a detailed view of what takes place in the editing environment. For example, we discuss how you deal with the director, how an assistant deals with his editor, how to navigate the sometimes touchy political nature of dealing with producers and studios. Things as simple as when to express your opinion, and when not to.

We wanted to impart all of these things to a new generation of filmmakers and make it available online so that those who might not otherwise have the opportunity to get inside a cutting room and learn how the job is done could learn those skills. We’ve already had our first session with 50 students. They’ve been very, very positive with their feedback and we’re excited to see where it goes.

Stitch cuts down 200+ hours of footage for TalkTalk Xmas spot

Can you feel it? The holidays are here, and seasonal ads have begun. One UK company, TalkTalk — which provides pay television, telecommunications, Internet and mobile services — is featuring genuine footage of a family Christmas. Documenting a real family during last year’s holiday, this totally unscripted, fly-on-the-wall commercial sees the return of the Merwick Street family and their dog, Elvis, in This is Christmas.

Directed by Park Pictures’ Tom Tagholm and cut by Stitch’s Tim Hardy, the team used the same multi-camera techniques that were used on their 2016 This Stuff Matters campaign.

Seventeen cameras — a combination of Blackmagic Micro Studio 4K, a remote Panasonic AW-UE70WP and GoPros — were used over the four-day festive period, located across eight rooms and including a remote-controlled car. The cameras rolled from 6:50am on Christmas Eve and typically kept rolling until midnight, accumulating over 200 hours of rushes that were edited down into this 60-second spot.
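
Those figures imply a remarkably steep shooting ratio; a quick sketch of the arithmetic:

```python
# Shooting ratio implied by the TalkTalk shoot:
# 200+ hours of rushes cut down to one 60-second spot.
rushes_seconds = 200 * 3600   # at least 200 hours of material
spot_seconds = 60
print(f"Shooting ratio: {rushes_seconds / spot_seconds:,.0f}:1")
# Shooting ratio: 12,000:1 -- and that's a floor, given "over 200 hours"
```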

Taking lessons from last year’s shoot, which ran continuously, this time video loggers were in place to identify moments when the rooms were empty.

“I think we had pretty much perfected our system for organizing and managing the rushes in TalkTalk’s summer campaign, so we were in a good position to start off with,” explains editor Hardy, who cut the piece on an Avid Media Composer. “The big difference this time around was that the whole family were in the house at the same time, meaning that quite often there were conversations going on between two or three different rooms at once. Although it did get a little confusing, it was often very funny as they are not the quietest of families!”

Director Tagholm decided to add a few extra cameras, such as the toy remote-controlled car that crashes into the Christmas tree. “This extra layer of complexity added a certain feel to the Christmas film that we didn’t have in the previous ones,” says Hardy.

AJA and Avid intro Avid Artist | DNxIP hardware interface

AJA has collaborated with Avid to develop Avid Artist | DNxIP, a new hardware interface option for Avid Media Composer users that supports high frame rate, deep color and HDR IP workflows. It is a Thunderbolt 3-equipped I/O device that enables the transfer of SMPTE standard HD video over 10 GigE IP networks, with high-quality local monitoring over 3G-SDI and HDMI 2.0.

Based on the new AJA Io IP, Avid Artist | DNxIP is custom engineered to Avid’s specifications and includes an XLR audio input on the front of the device for microphone or line-level sources. Avid Artist | DNxIP uses Thunderbolt 3 to enable simple, fast HD/SD video and audio ingest/output from/to IP networks. It features dual Thunderbolt 3 ports for daisy chaining and two SFP+ cages for video and audio routing over 10 GigE IP networks. The portable, aluminum encased device also supports SMPTE 2022-6 uncompressed video, audio and VANC data over IP, as well as SMPTE 2022-7 for redundancy protection.
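
As a rough sanity check on those 10 GigE numbers (a sketch, assuming uncompressed HD-SDI at the SMPTE 292M rate of 1.485 Gb/s; the encapsulation overhead figure is an assumption):

```python
# How many uncompressed HD streams fit on one 10 GigE link?
# Assumes SMPTE 292M HD-SDI at 1.485 Gb/s per stream; SMPTE 2022-6
# wraps the full SDI signal in IP, adding some packet overhead.
LINK_GBPS = 10.0
HD_SDI_GBPS = 1.485
OVERHEAD = 1.05   # assumed ~5% IP encapsulation overhead

streams = int(LINK_GBPS / (HD_SDI_GBPS * OVERHEAD))
print(f"~{streams} simultaneous HD streams per 10 GigE link")  # ~6
```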

“The increased agility and efficiency of IP workflows is a must-have for content creators and broadcasters in today’s competitive climate,” says Alan Hoff, VP of market solutions for Avid. “We’ve collaborated with AJA on the newest addition to our Avid Artist product line, Avid Artist DNxIP, which offers broadcasters and post production facilities a portable, yet powerful, video interface for IP workflows.”

Avid Artist | DNxIP feature highlights include:
– Laptop or desktop HD/SD capture and playback over IP across Thunderbolt 3
– Audio input for analog microphone to record single-channel 16-bit A/D analog audio, 48 kHz sample rate, balanced, using industry standard XLR
– Backwards compatibility with existing Thunderbolt hosts
– SMPTE 2022-6 and 2022-7 I/O
– Dual 10 GigE connectivity via two SFP+ cages compatible with 10 GigE SFP transceiver modules from leading third-party providers
– Two Thunderbolt 3 ports for daisy chaining of up to six Thunderbolt devices
– 3G-SDI and HDMI 2.0 video monitoring
– Audio I/O: 16-channel embedded SDI; 8-channel embedded HDMI; 4-channel analog audio In and 4-channel audio out via XLR breakout
– Small, rugged design suited for a variety of production environments
– Downstream keyer
– Standard 12V 4-pin XLR for AC or battery power

Detroit editors William Goldenberg, ACE, and Harry Yoon

By Chris Visser

Kathryn Bigelow’s Detroit is not an easy film to watch. It deals with some very ugly moments in our nation’s history — specifically, Detroit’s 1967 12th Street Riot — and the challenge of adapting that history into a narrative feature film was no easy task. What do you show? Which perspectives do you give space to, and which do you avoid?

I sat down to talk to William “Billy” Goldenberg, ACE, and Harry Yoon, the editors of the film Detroit, to tackle these and other questions related to the film and their careers.

Billy Goldenberg

First, here are some details about the edit: Detroit was cut on Avid Media Composer 8.5.3 using an ISIS 5000 shared storage solution. The film was shot on the Alexa Mini in ARRIRAW. Dailies were delivered at DNX36 and then swapped for identical DNX115 media at the end of each production week.
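
That two-step dailies delivery makes sense once you look at the storage footprint (a quick sketch, assuming the published Avid DNxHD data rates of 36 Mb/s and 115 Mb/s for 1080p24):

```python
# Approximate storage per hour of 1080p24 dailies at the two DNxHD
# flavors mentioned above (video essence only; audio and container
# overhead excluded).
def gb_per_hour(mbps: float) -> float:
    return mbps * 1e6 * 3600 / 8 / 1e9   # bits/s -> gigabytes/hour

for name, mbps in (("DNxHD 36", 36), ("DNxHD 115", 115)):
    print(f"{name:10s} ~ {gb_per_hour(mbps):5.1f} GB/hour")
# DNxHD 36   ~  16.2 GB/hour
# DNxHD 115  ~  51.8 GB/hour
```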

In addition to Goldenberg and Yoon, other members of the Detroit editorial team were additional editor Brett Reed, VFX editor Justin Yates, first assistant editor Peter Dudgeon and apprentice editor Jun Kim. The film will be available digitally on November 28 and on DVD/Blu-ray December 12.

Ok, let’s dig in…

How did this project come about?
Billy: Kathryn called to meet several months before the project started shooting. She sent me the script, but it soon became clear that I wouldn’t be able to start the film because I was still finishing Ben Affleck’s Live by Night. Kathryn said, “Look, let’s bring another editor on until you’re available, and then both of you can finish together.”

I thought of Harry because he had done some great work on The Newsroom, he knew Kathryn, and I knew he was a smart and talented guy. I ran it by Kathryn and she thought it was a great idea. So Harry was able to start the film.

At the beginning, I was splitting my time; the editing room for Live by Night was down the hall from the editing room for Detroit, and I would sort of run back and forth throughout the day, cutting and doing the stuff for both films. We did that for two months, and then I came onto Detroit full time. We did the rest of it together up until the end of the director’s cut. I finished the film from there.

Harry Yoon

How did you guys approach tackling the project together?
Harry: We had our key assistant Peter Dudgeon — who had worked in Billy’s cutting room on Live by Night and on a couple of other projects — there to prep dailies for Billy. It was fortunate because the way Billy likes to organize his bins and media is very, very akin to what I like as well.

In the mornings we would get dailies, and Billy and I would talk about who would take different scenes. Sometimes Billy wanted to really work on a particular sequence. Sometimes, I did. We would split scenes in the morning, and then go off and do our work. In the evening we’d come back together. This was my favorite part of the day because we would watch each other’s scenes. I would learn so much by seeing what he’d done and how he would approach the material. Getting critique and feedback from someone like Billy was like a master class for me.

I was also impressed that as Billy was working, he would ask the opinion of not just me, but the assistant as well. To have somebody be so transparent in his process was not only incredibly instructive personally, but really helped us to have a consistent style and approach to the material as we were working day by day. That consistent approach is apparent, especially during the entire Algiers Motel sequence in the film. It was one of the most visceral and emotionally draining things I’ve ever seen.

When I saw the film for a second time, I timed that sequence at 42 minutes. Seeing it the first time, I remember thinking that the sequence felt like it was playing out in real time. It felt like you were living through 42 minutes of these people’s lives. How did you approach something of that magnitude?
Billy: They shot that in sequence order, for the most part, for about three weeks. But they did shoot sections at a time that ultimately had to be mixed together. We got everything cut individually and then sat down together and decided how to work all the simultaneous action. We used the benefit of having two heads as opposed to one and talked about where things should be. What we would see and what we wouldn’t see. How to make this all feel simultaneous, but at a certain point, it’s just a feel thing.

Harry: One of the interesting challenges of this segment was that because Kathryn was shooting in realtime, and because the annex building was an actual building — it wasn’t a stage — camera people would be positioned in areas of overlapping action because Kathryn really wanted to make sure that the actors were in the moment every step of the way.

We would often finish a scene but then get new material for that scene that we could mine for better moments. Or, it might make sense to use the new coverage instead of the coverage from the day before to better show which character was where at what time. It was like having a puzzle and you would keep getting new pieces for the puzzle every day. It was definitely difficult, especially as the scene started to take shape. It was impossible not to feel a kind of resonance with everyday events that we were seeing on the news or on YouTube. I think it was tough to grapple with, but at the same time incredibly motivating for Billy and Kathryn and me — really everybody involved with the project — to say, “We have to get this right.” But, also, you’re adapting history. This is historical fiction; it’s not a documentary.

At the end of the film it says, “No one knows fully what happened. This has been pieced together through testimonials and interviews.”
Billy: I don’t know that I’m objective about what happened, obviously, but I did feel like I was just trying to portray the events as they occurred. And, Kathryn and Mark [Boal, Detroit’s screenwriter] did extensive research. They had police reports and ballistic reports, and this is what happened to the best of anybody’s recollection.

I tried to tell it as it happened and not bring my own feelings to it. We wanted people to experience that hallway sequence and the film, as a whole, in a way so they could draw their own conclusions. Let the audience experience it and understand this is why attention needs to be paid to this kind of violence.

Harry: Our conversations with Kathryn were critical throughout that process. She and Mark did extensive interviews with eyewitnesses. So, I think she was relying upon them for some of the factual elements, or at least what they remembered. But, I think any time where there was some ambiguity we tried to be true to that to a certain extent. We checked in with her about what to show and what not to show through that process.
As Billy said, what we didn’t want was to be manipulative for cinematic effect. The nature of the events was so tragic and so brutal that it was still a very difficult thing to go through. Even though we tried to be as measured as possible while we were putting it together, it was a tough balancing act.

What kind of prep work was involved in this for you?
Billy: A lot of movies in my career are based on true events and true stories. With the first couple, I did a tremendous amount of research, and it seemed to get me into a little bit of trouble. I would start to think, “Well, it really happened like this, or it really went like that. How come we’re not using this part of the book or that part of the book?” It took my mind away from the story we were telling. So, I’ve learned over the years to do just enough research to where I feel like I have an understanding of the subject at that time in history.

But with the specific events of the Algiers, because they’re disputed somewhat, I tried to learn as much as I could about that time in history in 1967. What was happening in the country, and how we got there. In terms of the specific events, I tried not to learn too much. I relied on Kathryn and Mark’s research to have gotten it as close as they could get it. It’s not a documentary, I was still trying to tell the story. It’s a little bit of a balancing act in these types of movies.

Harry: I agree with Billy, it’s best to not focus on research for the particular story at hand, but to understand the context. One thing that impacted our editorial process was we received several reels of stock footage from the Michigan Film Archive. It was a lot of news footage from that time — aerials of fires, street-level shots of the National Guard stopping people, store fronts and things like that. That was really inspiring to see because it just felt so real in the feel of things and felt very of the moment. This led us into an additional hunt for material that took us through YouTube and a lot of period films, including documentaries that were done either during or right after the rebellion that focused on questions of, “How did this happen?”

It was a really wonderful way to sort of deep dive into that moment. We actually ended up using some footage from those documentaries throughout the film. Not just adding original film from the archives, but using it as source material as well. It was a great way for us to sort of hear the voices and see the footage of the time versus through the distance of history.

Let’s pivot away from the film a little bit. Let’s talk about mentorship. What does it mean to you? How has both being a mentor and a mentee been beneficial for your careers?
Billy: I assisted Michael Kahn, Steven Spielberg’s editor, for four years. To say that he was my mentor is sort of short-changing it. He was like my graduate professor. Being his first assistant taught me almost everything that I needed to know about editing and how to be an editor. Obviously, he couldn’t give me talent, but he made me realize I had talent. At least he thought I did. He taught me how to handle myself politically, how to take criticism and how to approach scenes. If it wasn’t for his mentorship, I certainly wouldn’t be where I am right now.

He saw something in me that I didn’t see in myself. I’ve in turn tried to help others. Brett Reed has been with me for 17 years. He started out as my PA and has been my first assistant for about 11 years. He just got his first job as a film editor, so I’m losing him. I hope that I’ve done for him what Michael did for me.

At the end of my assistant career with Michael, he called up Phil Gersh of the Gersh Agency and said, “You know, you should sign this guy. He’s going to be a really talented editor.” He signed me when I was still an assistant. I was able to do the same thing for Brett at ICM. They signed him without him ever having cut a film. It makes me so happy that I was able to do something for somebody that worked so hard and deserved it. Brett made my editing better. He’s smart and he was able to be a bit more objective sometimes since he wasn’t the one working with the footage all day long.

The people I have working for me are really good at running the room and prepping the dailies, but I also picked them because they have a lot of creative talent and they help me. Harry touched on it earlier, about me having the generosity to have other people in the room. Well, it’s a little generosity, but it’s also a lot that I value their opinions, and it makes my editing better to hear other smart, talented people’s opinions. It really is a give-and-take relationship. I don’t think that there’s ever been a more important relationship in my professional life than the editor/assistant one.

Harry: After a couple of years working here in LA, I was lucky enough to be part of a mentorship program called Project Involve at Film Independent. I was paired up with Stephen Mirrione. To be able to speak to someone of his level and with his dedication to the craft — and his understanding of not just the hard skills of editing but also the people skills — was an amazing introduction. It gave me a very vivid picture of the kind of things that I needed to learn in order to get to that place. And consistently through my career, I’ve been given timely, incredible advice from people that I’ve sought out to be my mentors, including Troy Takaki and Lisa Lassek and, most recently, Billy. We worked as colleagues, but he modeled the craft every day.

So much of what you don’t know is the soft skills. You can be a good editor in front of your Avid, or whatever system, but so much of what determines success is how you are in a room… your people skills, your work ethic. Understanding when to speak and when not to. When is it appropriate for you to give a note? How do you read the dynamic going on in a particular room? These are things that are probably as critical as, or more critical than, whether you can make a good cut.

I could listen to you guys talk all day, but I want to be respectful of your time. Anything you want to leave our audience with?
Billy: I know this sounds cheesy, but I think it’s how lucky I feel getting to work with someone like Kathryn on Detroit. Or to work with some of the directors I’ve gotten to work with, and I put Kathryn at the top of that list. I can’t believe how fortunate I have been to have the career that I have.

Harry: What that speaks to in relation to Detroit is what I’ve seen consistently in the people that I’ve been mentored by, and whose careers I’ve most admired — how important it is to continue to love the craft. I find it inspiring and endlessly fascinating. What I see in people is they’re motivated by this sense that there’s always more to learn. The sequence could always be better. The scene can always be better. That’s something that I definitely saw in Billy through this process.


Chris Visser is a Wisconsin kid who works and lives in LA. He’s currently an assistant editor in scripted TV, as well as the VP of BCPCWest, the Los Angeles-based chapter of the Blue Collar Post Collective. You can find him on Twitter (@chrisvisser).

Red Giant Universe 2.2 gets 11 new transitions, supports Media Composer

Red Giant is now offering Universe 2.2, which features 11 new transition tools — 76 transitions and effects in total — for editors and motion graphics artists. In addition to brand new transitions, Red Giant has made updates to two existing plugins and added support for Avid Media Composer. The Universe toolset, and more, can be seen in action in the brand new short film Hewlogram, written and directed by Red Giant’s Aharon Rabinowitz, and starring David Hewlett from the Stargate: Atlantis series.

The latest update to Red Giant’s collection of GPU-accelerated plugins, Universe 2.2 offers transitions ranging from Retrograde, which creates an authentic film strip transition using real scans from 16mm and 8mm film, to Channel Surf, which creates the effect of changing channels on an old CRT TV.

This release brings the complete set of Universe tools to Avid Media Composer, which means that all 76 Red Giant Universe effects and transitions now run in eight host applications, including Adobe Premiere Pro CC, After Effects CC, Apple Final Cut Pro X, Blackmagic DaVinci Resolve and more.

Retrograde

Brand-new transition effects in Red Giant Universe 2.2 include:
• VHS Transition: A transition that mimics the effect that occurs when a VCR has been used to record over pre-existing footage.
• Retrograde Transition: A transition that uses real scans of 16mm and 8mm film to create an authentic film strip transition.
• Carousel Transition: A transition that mimics advancing to the next slide in an old slide projector.
• Flicker Cut: A transition that rapidly cuts between two clips or a solid color, and which can invert the clips or add fades.
• Camera Shake Transition: A transition that mimics camera shake while it transitions between clips.
• Channel Surf: A transition that mimics the distortion you’d get by changing the channel on a cathode ray tube TV.
• Channel Blur: A transition that blurs each of the RGB channels separately for a unique chromatic effect.
• Linear Wipe: A classic linear wipe with the addition of wipe mirroring, as well as an inner/outer stroke with glow on the wipe border. (For the basic math behind a wipe like this, see the sketch after this list.)
• Shape Wipe: A transition that uses an ellipse, rectangle or star shape to move between two pieces of footage. Includes control over points, size, stroke and fill.
• Color Mosaic: A transition that overlays a variety of colors in a mosaic pattern as it transitions between two clips.
• Clock Wipe: A classic radial wipe transition with feathering and the option for a dual clock wipe.
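
At its core, a wipe is just a per-pixel blend driven by a moving threshold. The following Python/NumPy sketch shows the idea for a straight, feathered wipe edge; it is an illustration under assumed parameters (the function name, feather amount and angle handling are mine), not Universe’s actual implementation.

    import numpy as np

    def linear_wipe(clip_a, clip_b, progress, angle_deg=0.0, feather=0.05):
        """Blend clip_a into clip_b along a straight, feathered wipe edge.

        clip_a, clip_b: float arrays of shape (height, width, channels), 0..1.
        progress: 0.0 (all clip_a) to 1.0 (all clip_b).
        """
        h, w = clip_a.shape[:2]
        ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
        theta = np.radians(angle_deg)
        # Position of each pixel along the wipe direction, normalized to 0..1.
        proj = xs * np.cos(theta) + ys * np.sin(theta)
        proj = (proj - proj.min()) / (proj.max() - proj.min() + 1e-9)
        # Pixels the moving edge has passed show clip_b; feather softens the border.
        mix = np.clip((progress - proj) / max(feather, 1e-6) + 0.5, 0.0, 1.0)
        return clip_a * (1.0 - mix[..., None]) + clip_b * mix[..., None]

Mirroring, strokes and glows (the extras Universe advertises) would be additional passes layered on top of the same mix mask.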

Updates to existing effects in Universe 2.2 include:
• VHS: This update includes new VHS noise samples, VHS style text, timecode and function icons (like play, fast-forward, rewind), updated presets, and updated defaults for better results upon application.
• Retrograde: This update includes a small but valuable addition that allows Retrograde to use the original aspect ratio of your footage for the effect.

Existing Universe customers can download the new tools directly by launching Red Giant Link. Universe is available as an annual subscription ($99/year) or as a monthly subscription ($20/month). Red Giant Universe is available in Red Giant’s Volume Program, the flexible and affordable solution for customers who need five or more floating licenses.

Raising money and awareness for childhood cancer via doc short

Pablove One Another is a documentary short film produced by Riverstreet and directed by the company’s co-founders Tracy Pion and Michael Blum. The film explores Pablove’s Shutterbug program for children undergoing cancer treatment and its connection to the cancer research work that Pablove funds.

Blum and Pion spoke with us about the project, including the release of its title track “Spark” and the importance of giving back.

How did you become involved in the project?
Pion: We have known Pablove’s founders, Jo Ann Thrailkill and Jeff Castelaz, for almost 11 years. Our sons were dear friends and classmates in preschool. When Jeff and Jo Ann lost their son Pablo to cancer eight years ago, they set out to start a foundation named Pablove in his honor. We’ve been committed to helping Pablove whenever we can along the way by doing PSAs and other short films and TV spots in order to help raise awareness for the organization’s mission, including the Shutterbugs program and research funding.

Michael Blum, Mady and Tracy Pion.

What was the initial goal of the documentary?
Blum: The goal was always about awareness and fundraising. It first debuted at the annual Pablove Foundation gala fundraiser and helped raise over $500,000 in an hour. It continues to live online and hopefully it inspires people to connect with Pablove and support its amazing programs.

Beyond the amazing cause, why was this project a good fit for Riverstreet?
Pion: At the core of what we do — campaigns, commercials, interstitials, network specials — is emotionally-driven storytelling. We do development, scripting, design, animation, live-action production, editorial and completion for a variety of brands and networks and when possible we try to apply this advertising and production expertise to philanthropic causes. Our collaboration with Pablove came out of a deeply personal connection, but above and beyond that, we think that our industry has an obligation to use our resources to help raise awareness. Why not use our power of persuasion for the betterment of others?

How did you decide on the approach and the interweaving of stories?
Blum: The film tells the Pablove story from three experiences: a young girl who is being treated for cancer who is part of Pablove’s Shutterbug photography program; an instructor with Shutterbugs who is a cancer survivor; and a researcher whose innovation is supported in part by Pablove’s grants. We thought it was important to tell the human impact of the work of the Pablove Foundation through different vantage points to reflect the scope of what they do. We worked with a fundraising expert (Benevon) who advised Pablove and Riverstreet on how to design the film from a high-impact standpoint.

What were some unexpected or unique moments in the production of the film?
Pion: Well, for us it was a couple of things. Firstly, the power of the kids’ photos really caught us, especially those by Mady, who we were featuring. When she pulled out her “Light the End of the Tunnel” image we were doubly struck by the simple power of the image and its obvious meaning for her, and, as filmmakers, we knew we had our ending. We were also grateful for how sensitive our crew was with Mady and Miles. Everyone was working for hardly any money and yet they didn’t want to be anywhere else. It was a moment of gratitude for the amazing crews that we have gathered together over the years.

What were some of the editing challenges to the above?
Pion: We had several hours of footage, and some very emotional interviews with our subjects, so it was a real but familiar challenge: how to pick the most salient footage, weave the threads together and capture the emotion.

What was the documentary edited on?
Pion: We use Avid Media Composer on an ISIS server.

How did the song come to be?
Blum: While working on the film, we were looking for a music track that would effectively unite these interweaving stories. We heard a girl singing on our daughter’s phone — a classmate — and thought, wouldn’t it be great to have a young teenager’s voice on a spot that is for and about children? The Bird & The Bee’s “Spark,” paired with the luminous voice of Gracie Abrams, perfectly carries through the message of the Foundation’s impact on the lives of children through creativity and research funding. The song was written by Inara George and Greg Kurstin, with music production handled by composer/producer Rob Cairns, who has worked with Riverstreet on numerous projects.

Pion: At the fundraiser, people were buzzing about the song, trying to Shazam it. We loved the song, and thought it was amazing for the film, but this reaction made us stop and consider, “Is there something more we can do with it to help Pablove?” Fortunately, everyone who worked on it felt the same way, and agreed to release the track with proceeds going to Pablove Foundation.

Cabin Editing Company opens in Santa Monica focusing on editing, VFX

Cabin Editing Company has opened in Santa Monica, started by four industry veterans: managing partner Carr Schilling and award-winning editors Chan Hatcher, Graham Turner and Isaac Chen.

“We are a company of film editors with a passion for storytelling who are committed to mentoring talent and establishing lasting relationships with directors and agencies,” says Schilling, who formerly worked alongside Hatcher, Turner and Chen at NO6.

L-R: Isaac Chen, Carr Schilling, Graham Turner and Chan Hatcher.

Cabin, which also features creative director/Flame artist Verdi Sevenhuysen and editor Lucas Spaulding, will offer creative editorial, visual effects, finishing, graphics and color. The boutique’s work spans mediums across broadcast, branded content, web, film and more.

Why was now the right time to open a studio? “Everything aligned to make it possible, and at Cabin we have a collective of top creative talent where each of us brings an individual style to our projects to create great work with our clients,” says Schilling.

The boutique studio has already been busy working with agencies such as 215 McCann, BBDO, CP+B, Deutsch, GSD&M, Mekanism and Saatchi & Saatchi.

In terms of tools, Cabin uses Avid Media Composer and Autodesk Flame Premium, all centralized on a Facilis TerraBlock shared storage system via Fibre Channel.

Michael Kammes’ 5 Things – Video editing software

By Randi Altman

Technologist Michael Kammes is back with a new episode of 5 Things, which focuses on simplifying film, TV and media technology. The web series answers, according to Kammes, the “five burning tech questions” people might have about technologies and workflows in the media creation space. This episode tackles professional video editing software being used (or not used) in Hollywood.

Why is now the time to address this segment of the industry? “The market for NLEs is now more crowded than it has been in over 20 years,” explains Kammes. “Not since the dawn of modern NLEs have there been this many questions over what tools should be used. In addition, the massive price drop of NLEs, coupled with the pricing shift (monthly/yearly, as opposed to outright) has created more confusion in the market.”

In his video, Kammes focuses on Avid Media Composer, Adobe Premiere, Apple Final Cut Pro, Lightworks, Blackmagic Resolve and others.

Considering its history and use on some major motion pictures (such as The Wolf of Wall Street), why hasn’t Lightworks made more strides in the Hollywood community? “I think Lightworks has had massive product development and marketing issues,” shares Kammes. “I rarely see the product pushed online, at user groups or in forums. EditShare, the parent company of Lightworks, also deals heavily in storage, so one can only assume the marketing dollars are being spent on larger ticket items like professional and enterprise storage over a desktop application.”

What about Resolve, considering its updated NLE tools and the acquisition of audio company Fairlight? Should we expect to see more Resolve being used as a traditional NLE? “I think in Hollywood, adoption will be very, very slow for creative editorial, and unless something drastic happens to Avid and Adobe, Resolve will remain in the minority. For dailies, transcodes or grading, I can see it only getting bigger, but I don’t see larger facilities adopting Resolve for creative editorial. Outside of Hollywood, I see it gaining more traction. Those outlets have more flexibility to pivot and try different tools without the locked-in TV and feature film machine in Hollywood.”

Check it out:

Jimmy Helm upped to editor at The Colonie

The Colonie, the Chicago-based editorial, visual effects and motion graphics shop, has promoted Jimmy Helm to editor. Helm has honed his craft over the past seven years, working with The Colonie’s senior editors on a wide range of projects. Most recently, he has been managing ongoing social media work with Facebook and conceptualizing and editing short format ads. Some clients he has collaborated with include Lyft, Dos Equis, Capital One, Heineken and Microsoft. He works on both Avid Media Composer and Adobe Premiere.

A filmmaking major at Columbia College Chicago, Helm applied for an internship at The Colonie in 2010. Six months later he was offered a full-time position as an assistant editor, working alongside veteran cutter Tom Pastorelle on commercials for McDonald’s, Kellogg’s, Quaker and Wrangler. During this time, Helm edited numerous projects on his own, including broadcast commercials for Centrum and Kay Jewelers.

“Tom is incredible to work with,” says Helm. “Not only is he a great editor but a great person. He shared his editorial methods and taught me the importance of bringing your instinctual creativity to the process. I feel fortunate to have had him as a mentor.”

In 2014, Helm was promoted to senior assistant editor and continued to hone his editing skills while taking on a leadership role.

“My passion for visual storytelling began when I was young,” says Helm. “Growing up in Memphis, I spent a great deal of time watching classic films by great directors. I realize now that I was doing more than watching — I was studying their techniques and, particularly, their editing styles. When you’re editing a scene, there’s something addictive about the rhythm you create and the drama you build. I love that I get to do it every day.”

Helm joins The Colonie’s editorial team, which comprises Joe Clear, Keith Kristinat, Pastorelle and Brian Salazar, along with editors and partners Bob Ackerman and Brian Sepanik.

Baby Driver editors — Syncing cuts to music

By Mel Lambert

Writer/director Edgar Wright’s latest outing is a major departure from his normal offering of dark comedies. Unlike his Three Flavours Cornetto film trilogy — Shaun of the Dead, Hot Fuzz and The World’s End — and Scott Pilgrim vs. the World, TriStar Pictures’ Baby Driver has been best described as a romantic musical disguised as a car-chase thriller.

Wright’s regular pair of London-based picture editors, Paul Machliss, ACE, and Jonathan Amos, ACE, also brought a special brand of magic to the production. Machliss, who had worked with Wright on Scott Pilgrim, The World’s End and his TV series Spaced for Channel 4, recalls that, “very early on, Edgar decided that I should come along on the shoot in Atlanta to ensure that we had the material he’d already storyboarded in a series of complex animatics for the film [using animator Steve Markowski and editor Evan Schiff]. Jon Amos joined us when we returned to London for sound and picture post production, primarily handling the action sequences, at which he excels.”

Developed by Wright over the past two decades, Baby Driver tells the story of its eponymous getaway driver (Ansel Elgort), who uses earphones to drown out the “hum-in-the-drum” of tinnitus — the result of a childhood car accident — and to orchestrate his life to carefully chosen music. But now indebted to a sinister kingpin named Doc (Kevin Spacey), Baby becomes part of a seriously focused gang of bank robbers, including Buddy and Darling (Jon Hamm and Eiza González), Bats (Jamie Foxx) and Griff (Jon Bernthal). Debora, Baby’s love interest (Lily James), dreams of heading west “in a car I can’t afford, with a plan I don’t have.” Imagine, in a sense, Jim McBride’s Breathless rubbing metaphorical shoulders with Tony Scott’s True Romance.

The film is also indebted to Wright’s 2003 music video for Mint Royale’s Blue Song, during which UK comedian/actor Noel Fielding danced in a stationary getaway car. In that same vein, Baby Driver comprises a sequence of linked songs that tightly choreograph the action and underpin the dramatic arcs being played out, often keying off the songs’ lyrics.

The film’s opener, for example, features Elgort partly lipsyncing to “Bellbottoms,” by the Jon Spencer Blues Explosion, as the villains commit their first robbery. In subsequent scenes, our hero’s movements follow the opening bass riffs of The Damned’s “Neat Neat Neat,” then later to Golden Earring’s “Radar Love” before Queen’s “Brighton Rock” adds complex guitar cacophony to a key encounter scene.

Even the film’s opening titles are accompanied by Baby performing a casual coffee run in a continuous three-minute take to Bob & Earl’s “Harlem Shuffle” — a scene that reportedly took 28 takes on the first day of principal photography in Atlanta. And the percussion and horns of “Tequila” provide syncopation for a protracted gunfight. Fold in “Egyptian Reggae,” “Unsquare Dance,” and “Easy,” followed by “Debora,” and it’s easy to appreciate that Wright is using music as a key and underpinning component of this film. The director also brought in music video choreographer Ryan Heffington to achieve the timing precision he needed.

The swift action is reflected in a fast style of editing, including whip pans and crash zooms, with cuts that are tightly synchronized to the music. “Whereas the majority of Edgar’s previous TV series and films have been parodies, for Baby Driver he had a very different idea,” explains Machliss. Wright had accumulated a playlist of over 30 songs that would inspire various scenes in his script. “It’s something that’s very much a part of my previous films,” says director Wright, “and I thought of this idea of how to take that a stage further by having a character who listens to music the entire time.”

“Edgar had organized a table read of his script in the spring of 2012 in Los Angeles, at which he recorded all of the dialog,” says Machliss. “Taking that recording, some sound effects and the music tracks, I put together a 100-minute ‘radio play’ that was effectively the whole film in audio-only form that Edgar could then use as a selling tool to convince the studios that he had a viable idea. Remember, Baby Driver was a very different format for him and not what he is traditionally known for.”

Australia native Machliss was on set to ensure that the gunshots, lighting effects, actors and camera movements, plus car hits, all happened to the beat of the accompanying music. “We were working with music that we could not alter or speed up or slow down,” he says. “We were challenged to make sure that each sequence fit in the time frame of the song, as well as following the cadence of the music.”

Almost 95% of the music included in the first draft of Wright’s script made it into the final movie, according to Machliss. “I laid up the relevant animatic as a video layer in my Avid Media Composer and then confirmed how each take worked against the choreographed timeline. This way I always had a reference to it as we were filming. It was a very useful guide to see if we were staying on track.”

Editing On Location
During the Atlanta shoot, Machliss used Apple ProRes digital files captured by an In2Core QTake video assist that was recording taps from the production’s 35mm cameras. “I connected my Mac via Ethernet so I could create a network to the video assist’s storage. I had access to the QuickTime files the instant the operator stopped recording. I could use Avid’s AMA function to place the clip in the timeline without the need for transcoding. This allowed almost instantaneous feedback to Edgar as the sequence was built up.”
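
The mechanics of that setup are easy to sketch: once the video assist’s storage is mounted as a network share, a simple watcher can flag each new QuickTime file the moment it lands, ready to be AMA-linked by hand. The Python sketch below illustrates only that watch-folder pattern; the mount path and naming are assumptions, and it says nothing about how AMA itself works internally.

    import time
    from pathlib import Path

    # Hypothetical mount point for the video assist's shared storage;
    # the actual volume name in Machliss' setup isn't documented here.
    WATCH_DIR = Path("/Volumes/qtake_media")

    def watch_for_new_takes(poll_seconds=2.0):
        """Report QuickTime files as they appear on the mounted share."""
        seen = {p.name for p in WATCH_DIR.glob("*.mov")}
        while True:
            for p in sorted(WATCH_DIR.glob("*.mov")):
                if p.name not in seen:
                    seen.add(p.name)
                    print(f"New take ready to link: {p}")
            time.sleep(poll_seconds)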

Paul Machliss on set.

While on location, Machliss used a 15-inch MacBook Pro, Avid Mojo DX and a JVC video monitor “which could double as a second screen for the Media Composer or show full-screen video output via the Mojo DX.” He also had a Wacom tablet, an 8TB Thunderbolt drive, a LaCie 500GB rugged drive — “which would shuttle my media between set and editorial” — and a UPS “so that I wouldn’t lose power if the supply was shut down by the sparks!”

LA’s Fotokem handled film processing, with negative scanning by Efilm. DNX files were sent to Company 3 in Atlanta for picture editorial, “where we would also review rushes in 2K sent down the line from Efilm,” says Machliss. “All DI on-lining and grading took place at Molinare in London.” Bill Pope, ASC, was the film’s director of photography.

Picture and Sound Editorial in London
Instead of hiring out editorial suites at a commercial facility in London, Wright and his post teams opted for a different approach. Like an increasing number of London-based productions, they elected to rent an entire floor in an office building.

They found a suitable space on Berners Street, north of the Soho-based film community. As Machliss recalls: “That allowed us to have the picture editorial team in the same space as the sound crew,” which was headed up by Wright’s long-time collaborator Julian Slater, who served as sound designer, supervising sound editor and re-recording mixer on Baby Driver. “Having ready access to Julian and his team meant that we could collaborate very closely — as we had on Edgar’s other films — and share ideas on a regular basis,” as the 10-week Director’s Cut progressed.

British-born Slater then moved across Soho to Goldcrest Films for sound effects pre-dubs, while his co-mixer, Tim Cavagin, worked on dialog and Foley pre-mixes at Twickenham Studios. Print mastering of the Dolby Atmos soundtrack occurred in February 2017 at Goldcrest, with Slater handling music and SFX, while Cavagin oversaw dialog and Foley. “Following Edgar’s concept of threading together the highly choreographed songs with linking scenes, Jon and I began the cut in London against the pre-assembled material from Atlanta,” says Machliss.

To assist Machliss during his picture cut, the film’s sound designer had provided a series of audio stems for his Avid. “Julian [Slater] had been working on his sound effects and dialog elements since principal photography ended in Atlanta. He had prepared separate, color-coded left-center-right stems of the music, dialog and SFX elements he was working on. I laid these [high-quality tracks] into Media Composer so I could better appreciate the intricacies of Julian’s evolving soundtrack. It worked a lot better than a normal rough mix of production dialog, rough sound effects and guide music.”

“From its inception, this was a movie for which music and sound design worked together as a whole piece,” Slater recalls. “There is a large amount of syncopation of the diegetic sounds [implied by the film’s action] to the music track Baby is listening to. Sometimes it’s obvious because the action was filmed with that purpose in mind. For example, walking in tempo to the music track or guns being fired in tempo. But many times it’s more subtle, including police sirens or distant trains that have been pitched and timed to the music,” and hence blend into the overall musical journey. “We strived to always do this to support the story, and to never distract from it.”

Because of the lead character’s tinnitus, Slater worked with pitch changes to interweave elements of the film’s soundtrack. “Whenever Baby is not listening to music, his tinnitus is present to some degree. But it became apparent very soon in our design process that strident, high-pitched ‘whistle tones’ would not work for a sustained period of time. Working closely with composer Steven Price, we developed a varied set of methods to convey the tinnitus — it’s rarely the same sound twice. Much of the time, the tinnitus is pitched according to either the outgoing or incoming music track. This then enabled us to use more of it, yet at the same time be quite subtle.”

Meticulous Planning for Set Pieces and Car Chases
Picture editor Amos joined the project at the start of the Director’s Cut to handle the film’s set pieces. He says, “These set pieces were conceptually very different from the vast majority of action scenes in that they were literally built up around the music and then visualized. Meticulous development and planning went into these sequences before the shoot even began, which was decisive in making the action become musical. For example, the ‘Tequila’ gunfight started as a piece of music by Button Down Brass. It was then laced with gunfire and SFX pitched to the music, and in time with the drum hits — this was done at the script stage by Mark Nicholson (aka Osymyso, a UK musician/DJ), who specializes in mashup/bastard pop and breakbeat.”

Storyboards then grew around this scripted sound collage, which became a precise shot list for the filmed sequences. “Guns were rigged to go off in time with the music; it was all a very deliberate thing,” adds Amos. “Clearly, there was a lot of editing still to be done, but this approach illustrates that there’s a huge difference between something that is shot and edited to music, and something that is built around the music.”

“All the car chases for Baby Driver were meticulously planned, and either prevised or storyboarded,” Amos explains. “This ensured that the action would always fit into the time slot permitted within the music. The first car chase [against the song ‘Bellbottoms’] is divided into 13 sections, to align to different progressions in the music. One of the challenges resulted from the decision to never edit the music, which meant that none of these could overrun. Stunts were tested and filmed by second unit director Darrin Prescott, and the footage passed back to editorial to test against the timing allowed in the animatic. If a stunt couldn’t be achieved in the time allowed, it was revised and tweaked until it worked. This detailed planning gave the perfect backbone to the sequences.”

Amos worked on the sequences sequentially, “using the animatic and Paul’s on-set assembly as reference,” and began to break down all the footage into rolls that aligned to specific passages of the music. “There was a vast amount of footage for all the set pieces, and things are not always shot in order. So generally I spent a lot of time breaking the material down very methodically. I then began to make selects and started to build the sequences from scratch, section by section. Once I completed a pass, I spent some time building up my sound layers. I find this helps evolve the cut, generating another level of picture ideas that further tighten the syncopation of sound and picture.”

Amos’ biggest challenge, despite all the planning, was finding ways to condense the material into its pre-determined time slot. “The real world never moves quite like animatics and boards. We had very specific points in every track where certain actions had to take place; we called these anchor points. When working on a section, we would often work backwards from the anchor point knowing, for instance, that we only had 20 seconds to tell a particular part of the story. Initially, it can seem quite restrictive, but the edits become so precise.

Jonathan Amos

“The time restriction led to a level of kineticism and syncopation that became a defining feature of the movie. While the music may be the driving force of the action scenes, editorial choices were always rooted in the story and the characters. If you lose sight of the characters, the audience will disengage with the sequence, and you’ll lose all the tension you’ve worked so hard to create. Every shot choice was therefore very considered, and we worked incredibly hard to ensure we never wasted a frame, telling the story in the most compelling, rhythmic and entertaining way we could.”

“Once we had our cut,” Machliss summarizes, “we could return the tracks to Julian for re-conforming,” to accommodate edit changes. “It was an excellent way of working, with full-sounding edit mixes.”

Summing up his experience on Baby Driver, Machliss considers the film to be “the hardest job I’ve ever done, but the most fun I’ve ever had. Ultimately, our task was to create a film that on one level could be purely enjoyed as an exciting/dramatic piece of cinema, but, on repeated viewing, would reveal all the little elements ‘under the surface’ that interlock together — which makes the film unique. It’s a testament to Edgar’s singular vision and, in that regard, he is a tremendously exciting director to work with.”


Mel Lambert has been involved with production industries on both sides of the Atlantic for more years than he cares to remember. He is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. He is also a long-time member of the UK’s National Union of Journalists.

WWE adds iPads, iPhones to production workflow

By Nick Mattingly

Creating TV-style productions is a big operation: lots of equipment, lots of people and lots of time. World Wrestling Entertainment (WWE) is an entertainment company and the largest professional wrestling organization in the world. Since its inception, it has amassed a global audience of over 36 million.

Each year, WWE televises over 100 events via its SmackDown, WWE Raw and Pay-Per-View events. That doesn’t include the hundreds of arena shows that the organization books in venues around the world.

“Putting this show on in one day is no small feat. Our shows typically begin load-in around 4:00am, and everything must be up and ready for production by 2:00pm,” explained Nick Smith, WWE’s director of remote IT and broadcast engineering. “We travel everything from the lighting, PA, screens, backstage sets, television production facilities, generators and satellite transmission facilities, down to catering. Everyone [on our team] knows precisely what to do and how to get it done.”

Now the WWE is experimenting with a new format for the some 300 events it hosts that are currently not captured on video. The goal? To see if using Switcher Studio with a few iPhones and iPads can achieve TV-style results. A key part of testing has been defining a workflow using mobile devices while meeting WWE’s high standard of quality. One of the first requirements was moving beyond the four-camera setup. As a result, the Switcher Studio team produced a special version of Switcher that allows unlimited sources. The only limitation is network bandwidth.

Adding more cameras was an untested challenge. To help prevent bottlenecks over the local network, we lowered the resolution and bitrate on preview video feeds. We also hardwired the primary iPad used for switching using Apple dongles. Using the “Director Mode” function in Switcher Studio, WWE then triggered a recording on all devices.
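
To put rough numbers on that bottleneck: preview traffic scales linearly with camera count, so cutting the per-feed bitrate directly buys headroom for more sources. The bitrates in this Python sketch are illustrative guesses, not Switcher Studio’s actual figures.

    # Illustrative per-feed preview bitrates (kbit/s); assumed values,
    # not Switcher Studio's real numbers, chosen only to show the scaling.
    FULL_KBPS = 4000      # a higher-quality preview feed
    REDUCED_KBPS = 1000   # after lowering preview resolution/bitrate

    def aggregate_mbps(cameras: int, per_feed_kbps: int) -> float:
        """Total preview traffic arriving at the switching iPad, in Mbit/s."""
        return cameras * per_feed_kbps / 1000.0

    for cams in (4, 6, 8):
        print(f"{cams} cams: {aggregate_mbps(cams, FULL_KBPS):.0f} Mbps full, "
              f"{aggregate_mbps(cams, REDUCED_KBPS):.0f} Mbps reduced")

On a venue WiFi network that might sustain only 20-30 Mbit/s of usable throughput, the difference between 32 Mbit/s and 8 Mbit/s of preview traffic is the difference between dropped frames and a stable switch.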

For the first test using Switcher Studio, the WWE had a director and operator at the main iPad. The video from the iPad was output to an external TV monitor using Apple’s AirPlay. This workflow allowed the director to see a live video feed from all sources. They were also able to talk with the camera crew and “direct” the operator when to cut to each camera.

The WWE crew had three camera operators from their TV productions to run iPhones in and around the ring. To ensure the devices had enough power to make it through the four-hour-long event, iPhones were attached to batteries. Meanwhile, two camera operators captured wide shots of the ring. Another camera operator captured performer entrances and crowd reaction shots.

WWE set up a local WiFi network for the event to wirelessly sync cameras. The operator made edits in realtime to generate a line cut. After the event, the line cut and an ISO from each angle were sent to the WWE post team in the United Kingdom.

Moving forward, we plan to make further improvements to the post workflow. This will be especially helpful for editors using tools like Adobe Premiere or Avid Media Composer.

If future tests prove successful, WWE could use this new mobile setup to provide more content to their fans, building new revenue streams along the way.


Nick Mattingly is the CEO/co-founder of Switcher Studio. He has over 10 years of experience in video streaming, online monetization and new technologies. 

Bluefish444 releases IngeSTore 1.1, adds edit-while-record capability

Bluefish444 was at NAB with Version 1.1 of its IngeSTore multichannel capture software, which is now available free from the Bluefish444 website. Compatible with all Bluefish444 video cards, IngeSTore captures multiple simultaneous channels of 3G/HD/SD-SDI to popular media files for archive, edit, encoding or analysis. IngeSTore improves efficiency in the digitization workflow by enabling multiple simultaneous recordings from VTRs, cameras and any other SDI source.

The new version of IngeSTore software also adds “Edit-While-Record” functionality and additional support for shared storage, including Avid. Bluefish444 has partnered with Drastic Technologies to bring additional codec options to IngeSTore v1.1, including XDCAM, DNxHD, JPEG 2000, AVCi and more. Uncompressed, DV, DVCPro and DVCPro HD codecs will be made available free to Bluefish444 customers in the IngeSTore update.

The Edit-While-Record functionality allows editors to access captured files while they are still being recorded to disk. Content creation tools such as Avid Media Composer, Adobe Premiere Pro CC and Assimilate Scratch can output SDI and HDMI with Bluefish444 video cards while IngeSTore is recording and the files are growing in size and length.
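
Conceptually, the growing-file pattern is straightforward: a reader keeps polling the file’s size and consumes whatever bytes the recorder has appended since its last pass. The Python sketch below shows that raw pattern only; real NLEs read growing media through the container’s index rather than tailing bytes, so treat this as an illustration, not how IngeSTore or the listed NLEs actually do it.

    import os
    import time

    def tail_growing_file(path, chunk=1 << 20, poll=0.5, idle_limit=10.0):
        """Yield bytes appended to a file that is still being recorded.

        Stops once the file has not grown for idle_limit seconds,
        i.e., once the recording has presumably ended.
        """
        offset = 0
        idle = 0.0
        with open(path, "rb") as f:
            while idle < idle_limit:
                size = os.path.getsize(path)
                if size > offset:
                    f.seek(offset)
                    data = f.read(min(chunk, size - offset))
                    offset += len(data)
                    idle = 0.0
                    yield data
                else:
                    time.sleep(poll)
                    idle += poll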

Frame.io 2.0 offers 100 new features, improvements for collaboration

Frame.io, developers of the video review and collaboration platform for content creators, has unveiled Frame.io 2.0, an upgrade offering over 100 new features and improvements. This new version features new client Review Pages, which expand content review and sharing. In addition, the new release offers deeper workflow integration with Final Cut Pro X and Avid Media Composer, plus a completely re-engineered player.

“Frame.io 2 is based on everything we’ve learned from our customers over the past two years and includes our most-requested features,” says Emery Wells, CEO of Frame.io.

Just as internal teams can collaborate using Frame.io’s comprehensive annotation and feedback tools, clients can now provide detailed feedback on projects with Review Pages, which is designed to make the sharing experience simple, with no log-in required.

Review Pages give clients the same commenting ability as collaborators, without exposing them to the full Frame.io interface. Settings are highly configurable to meet specific customer needs, including workflow controls (approvals), security (password protection, setting expiration date) and communication (including a personalized message for the client).

The Review Pages workflow simplifies the exchange of ideas, consolidating feedback in a succinct manner. For those using Adobe Premiere or After Effects, those thoughts flow directly into the timeline, where you can immediately take action and upload a new version. Client Review Pages are also now available in the Frame.io iOS app, allowing collaboration via iPhones and iPads.

Exporting and importing comments and annotations into Final Cut Pro X and Media Composer has gotten easier with the upgraded, free desktop companion app, which allows users to open downloaded comment files and bring them into the editor as markers. There is now no need to toggle between Frame.io and the NLE.

Users can also now copy and paste comments from one version to another. The information is exportable in a variety of formats, whether that’s a PDF containing a thumbnail, timecode, comment, annotation and completion status that can be shared and reviewed with the team or as a .csv or .xml file containing tons of additional data for further processing.
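
As a sketch of what downstream processing of such an export might look like, this Python snippet parses a comment .csv into marker-ready records. The column names here are assumptions for illustration, not Frame.io’s documented schema; check the headers of a real export before relying on them.

    import csv
    from dataclasses import dataclass

    @dataclass
    class Comment:
        timecode: str   # e.g., "01:02:03:04"
        author: str
        text: str
        completed: bool

    def load_comments(csv_path):
        """Parse an exported comment .csv into marker-ready records."""
        with open(csv_path, newline="", encoding="utf-8") as f:
            return [
                Comment(
                    timecode=row["Timecode"],      # assumed column names
                    author=row["Commenter"],
                    text=row["Comment"],
                    completed=row.get("Completed", "").strip().lower() == "true",
                )
                for row in csv.DictReader(f)
            ]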

Also new to Frame.io 2.0 is a SMPTE-compliant source timecode display that works with both non-drop and drop-frame timecode. Users can now download proxies straight from Frame.io.
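
Drop-frame is the fiddlier of the two timecode flavors: at 29.97fps, frame labels ;00 and ;01 are skipped at the start of every minute except each tenth minute, so the displayed clock stays in step with wall-clock time. The conversion from a frame count to a drop-frame label is standard SMPTE arithmetic, sketched below in Python (independent of Frame.io’s own implementation).

    def frames_to_dropframe(frame: int) -> str:
        """Convert a 29.97fps frame count to a drop-frame label (hh:mm:ss;ff)."""
        fps = 30                           # nominal frame labels per second
        per_min = 60 * fps - 2             # 1798 frames in a "dropping" minute
        per_10min = 10 * 60 * fps - 9 * 2  # 17982 frames per ten-minute block
        blocks, rem = divmod(frame, per_10min)
        frame += 18 * blocks               # 2 dropped labels x 9 minutes per block
        if rem > 1:
            frame += 2 * ((rem - 2) // per_min)
        ff = frame % fps
        ss = (frame // fps) % 60
        mm = (frame // (fps * 60)) % 60
        hh = frame // (fps * 3600)
        return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

    # Sanity checks against well-known boundaries:
    assert frames_to_dropframe(0) == "00:00:00;00"
    assert frames_to_dropframe(1800) == "00:01:00;02"   # ;00 and ;01 dropped
    assert frames_to_dropframe(17982) == "00:10:00;00"  # tenth minute keeps all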

The Frame.io 2.0 player page now offers better navigation, efficiency and accountability. New “comment heads” allow artists to see who left a comment and where, so they can quickly find and prioritize feedback on any given project. Users can also preview the next comment, saving them time when one comment affects another.

The new looping feature, targeting motion and VFX artists, lets users watch the same short clip on loop. You can even select a range within a clip to really dive in deep. Frame.io 2.0’s asset slider makes it easy to navigate between assets from the player page.

The new Frame.io 2.0 dashboard has been redesigned for speed and simplicity. Users can manage collaborators for any given project from the new collaborator panel, where adding an entire team to a project takes one click. A simple search in the project search bar makes it easy to bring up a project. The breadcrumb navigation bar tracks every move deeper into a sub-sub-subfolder, helping artists stay oriented when getting lost in their work. The new list view option with mini-scrub gives users a bird’s-eye view of everything happening in Frame.io 2.0.

Copying and moving assets between projects takes up no additional storage, even when users make thousands of copies of a clip or project. Frame.io 2.0 also now offers the ability to publish direct to Vimeo, with full control over publishing options, so pros can create the description and set privacy permissions, right then and there.

A conversation with editor Hughes Winborne, ACE

This Oscar-winning editor talks about his path, his process, Fences and Guardians of the Galaxy.

By Chris Visser

In the world of feature film editing, Hughes Winborne, ACE, has done it all. From cutting indie features (1996’s Sling Blade) to CG-heavy action blockbusters (2014’s Guardians of the Galaxy) to winning an Oscar (2005’s Crash), Winborne has run the proverbial gamut of impactful storytelling through editing.

His most recent film, the multiple-Oscar-nominated Fences, was an adaptation of the seminal August Wilson play. Denzel Washington, who starred alongside Viola Davis (who won an Oscar for her role), directed the film.

Winborne and I chatted recently about his work on Fences, his career and his brief foray into house painting before he caught the filmmaking bug. He edits on Avid Media Composer. Let’s find out more.

What led you to the path you are on now?
I grew up in Raleigh, North Carolina, and I went to college at the University of North Carolina at Chapel Hill. I graduated with a degree in history without a clue as to what I was going to do. I come from a family of attorneys, so because of an extreme lack of imagination, I thought I should do that. I became a paralegal and worked at North Carolina Legal Services for a bit. It didn’t take me long to realize that that wasn’t what I was meant to do, and I became a house painter.

A house painter?
I had my own house painting business for about three years with a couple of friends. The preamble to that is, I had always been a big movie fan. I went to the movies all the time in high school, but after college I started seeing between five and 10 a week. I didn’t even imagine working in the film business, because in Raleigh, that wasn’t really something that crossed my radar.

Then I saw an ad in the New York Times magazine for a six-week summer workshop at NYU. I took the course, moved to New York and set out to become a film editor. In the beginning, I did a lot of PA work for commercials and documentaries. Then I got an assistant editor job on a film called Girl From India.

What came next?
My father told me about a guy on the coast of North Carolina, A.B. Cooper, Jr., who wanted to make his own slasher film. I made him an offer: “If I get you an editor, can I be the assistant?” He said yes! About one-third of the way through the film, he fired the editor, and I took over that role. It was only my second film credit. I was never an assistant again, which is to the benefit of every editor that ever worked — I was terrible at it!

Were you able to make a living editing at that point?
Not as a picture editor, but I really started getting paid full-time for my editing when I started cutting industrials at AT&T. From there, I worked my way to 48 Hours. While I was there, they were kind enough to let me take on independent film projects for very little money, and they would hire me back after I did the job.

After a while, I moved to LA and started doing whatever I could get my hands on. I started with TV movies and gradually indie films, which really started for me with Sling Blade. Then, I worked my way into the studios after Crash. I’ve been kind of going back and forth ever since.

You mention your love of movies. What are the stories that inspire you? The ones that you get really excited to tell?
The movie that made me want to work in the film business was Barry Lyndon. Though it was not, by far, the film that got me started. I grew up on Truffaut. All his movies were just, for me, wonderful. It was a bit of a religion for me in those days; it gave me sustenance. I grew up on The Graduate. I grew up on Midnight Cowboy and Blow-Up.

I didn’t have a specific story I was interested in telling. I just knew that editing would be good for me. I like solitary jobs. I could never work on the set. It’s too crazy and social for me. I like being able to fiddle in the editing room and try things. The bottom line is, it’s fun. It can be a grind, and there can be a bit of pressure, but the best experiences I’ve had have been when everybody on the show was having fun and working together. Films are made better when that collaboration is exploited to the limit.

Speaking of collaboration, how did that work on a film like Fences? What about working with actor/director Denzel Washington?
I’d worked with Denzel before [on The Great Debaters], so I kind of knew what he liked. They shot in Pittsburgh, but I didn’t go on location. There was no real collaboration the first six weeks, but because I had worked with him before, I had a sense of what he wanted.

I didn’t have to talk to him in order to put the film together because I could watch dailies — I could watch and listen to direction on camera and see how he liked to play the scenes. I put together the first cut on my own, which is typical, but in this case it was without almost any input. And my cut was really close. When Denzel came back, we concentrated in a few places on getting the performances the way he really wanted them, but I was probably 85 percent there. That’s not because I’m so great either, by the way, it’s because the actors were so great. Their performances were amazing, so I had a lot to choose from.

Can you talk about editing a film that was adapted from a play?
It was a Pulitzer Prize-winning play, so I wasn’t going to be taking anything out of it or moving anything around. All I had to do was concentrate on putting it together with strong performances — that’s a lot harder than it sounds. I’m working within these constraints where I can’t do anything, really. Not that I really wanted to. Have you seen the movie?

Yes, I loved it. It’s a movie I’ve been coming back to every day since I’ve seen it. I’ve been thinking about it a lot.
Then you’ll remember that the first 45 minutes to an hour is like a machine gun. That’s intentional. That’s me, intentionally, not slowing it down. I could have, but the idea is — and this is what was tricky — the film is about rhythm. Editing is about rhythm anyway, but this film is like rhythm to the 50th degree.

There’s very little music in the film, and we didn’t temp with much music either. I remember when Marc Evans [president, Motion Picture Group, Paramount Pictures] saw this film, he said, “The language is the music.” That’s exactly right.

To me, the dialogue feels like a score. There’s a musicality to it, a certain beat and timbre where it’s leading the audience through the scene, pulling them into the emotion without even hearing what they’re saying. Like when Denzel’s talking machine gun fast and it’s all jovial, then Lyons comes in and everything slows down and becomes very tense, then the scene busts back open and it’s all happy and fun again.
Yeah. You can just quote yourself on that one. [Laughs] That’s a perfect summation of it.

Partially, that’s going to come from set, that’s the acting and the direction, but on some level you’re going to have to construct that. How conscious of that were you the entire time?
I was very conscious of it. Where it becomes a little bit dicey at times is, unlike a play, you can cut. In a play, you’re sitting in the audience and watching everybody on stage at the same time. In a film, you’re not. When you start cutting, now you’ve got a new rhythm that’s different from the stage. In so doing, you’ve got to maintain that rhythm. You can’t just be on Denzel the entire time or Viola. You need to move around, and you need to move around in a way that rhythmically stays in time with the language. That was hard. That’s what we worked on most of the time after Denzel came back. We spent a lot of time just trying to make the rhythms right.

I think that’s one of the most difficult jobs an editor has, is choosing when to show someone saying something and when to show someone’s reaction to the thing being said. One example is when Troy is telling the story of his father, and you stay on him the entire time.
Right.

The other side of that coin is when Troy reveals his secret to Rose and the reveal is on her. You see that emotion hit her and wash over her. When I was watching the movie, I thought, “That is the moment Viola Davis won an Oscar.”
Yeah, yeah, yeah. I agree.

I think that’s one of the most difficult jobs as an editor, knowing when to do what. Can you speak to that?
When I put this film together initially, I over-cut it, and then I tried to figure out where I wanted to be. It gets over-cut because I’m trying the best I can to find out what the core of the scene is. But I’m also trying to do that with what I consider to be the best performances. My process is, I start with that, and then I start weeding through it, getting it down and focusing; trying to make it as interesting as I can, and not predictable.

In the scenes that you’re talking about, it was all about Viola’s reaction anyway. Her reaction was going to be almost more interesting than whatever he says. I watched it a few times with audiences, and I know from talking to Denzel that when he did it on stage, there’s like a gasp.

When I saw it, everybody in the theatre was like, “What?” It was great.
I know, I know. It was so great. On the stage, people would talk to him, yell at him [Denzel]. “Shame on you, Denzel!” [laughs]. Then, she went into the backyard and did the scene, and that was the end of it. I’d never seen anything like it before. Honestly. It blew me away.

I was cutting that scene at my little home office. My wife was working behind me on her own stuff, and I was crying all the time. Finally, she turned around and asked, “What is wrong with you?” I showed it to her, and she had the same response. It took eight takes to get there, but when she got it, it was amazing. I don’t think too many actresses can do what Viola did. She’s so exposed. It’s just remarkable to watch.

There were three editors on Guardians of the Galaxy — you, Fred Raskin and Craig Wood. How did that work?
Marvel films are, generally speaking, 12 months from shoot to finish. I was on the film for eight months. Craig came in and took over for me. Having said that, it’s hard with two editors or just multiple editors in general. You have to divvy up scenes. Stuff would come in and we would decide together who was going to do it. I got the job because of Fred. I’d known Fred for 25 years. Fred was my intern on Drunks.

Fred had a prior relationship with James Gunn [director of Guardians]. In most cases, I deferred to Fred’s judgment as to how he wanted to divvy up the scenes, because I didn’t have much of a relationship with James when we started. I’d never done a big CG film. For me, it was a revelation. It was fun, trying to cut a dialogue scene between two sticks. One was tall, and one was short — the green marking was going to be Groot, and the other one was going to be Rocket Raccoon.

Can you talk about the importance of the assistant editor in the editorial process? How many assistants did you have on Fences?
On Fences, I had a first and a second. I started out cutting on film, and the assistant editor was a physical job. Touch it, slice it, catalog it, etc. What they have to do now is so complicated and technical that I don’t even know how to do it. Over my career, I’ve pretty much worked with a couple of assistants the whole time. John Breinholt and Heather Mullen worked with me on Fences. I’ve known Heather for 30 years.

What do you look for in an assistant?
Somebody who is going to be able to organize my life when I’m editing; I’m terrible at that. I need them to make sure that things are getting done. I don’t want to think about everything that’s going on behind the scenes, especially when I’m cutting, because it takes a lot of concentration for me just to sit there for 10 hours a day, or even longer, and concentrate on trying to put the movie together.

I like to have somebody that can look at my stuff and tell me what’s working and what isn’t. You get a different perspective from different assistants, and it’s really important to have that relationship.

You talked about working on Guardians for eight months, and I read that you cut Fences in six. What do you do to decompress and take care of your own mental health during those time periods?
Good question. It’s hard. When I was working on Fences, I was on the Paramount lot. They have a gym there, so I tried to go to the gym every day. It made my day longer, because I’d get there really early, but I’d go to the gym and get on the treadmill or something for 45 minutes, and that always helped.

Finally, for those who are young or aspiring editors, do you have any words of wisdom?
I think the one piece of advice is to keep going. It helps if you know what you want to do. So many people in this business don’t survive. There can be a lot of lean years, and there certainly were for me in the beginning — I had at least 10. You just have to stay in the game. Even if you’re not working at what you want to do, it’s important to keep working. If you want to be an editor, or a director, you have to practice.

Also, have fun. It’s a movie. Try and have a good time when you’re doing it. You’ll do your best work when you’re relaxed.


Chris Visser is a Wisconsin kid who works and lives in LA. He is currently an assistant editor working in scripted TV. You can find him on Facebook and Twitter.

DP John Kelleran shoots Hotel Impossible

Director of photography John Kelleran shot season eight of the Travel Channel’s Hotel Impossible, a reality show in which struggling hotels receive an extensive makeover by veteran hotel operator and hospitality expert Anthony Melchiorri and team.

Kelleran, who has more than two decades of experience shooting reality/documentary projects, called on Panasonic VariCam LT 4K cinema camcorders for this series.

Working for New York production company Atlas Media, Kelleran shot a dozen hour-long Hotel Impossible episodes in locations that include Palm Springs, Fire Island, Cape May, Cape Hatteras, Sandusky, Ohio, and Albany, New York. The production, which began last April and wrapped in December 2016, spent five days in each location.

Kelleran liked the VariCam LT’s dual native ISOs of 800/5000. “I tested ISO5000 by shooting in my own basement at night, and had my son illuminated only by a lighter and whatever light was coming through the small basement window, one foot candle at best. The footage showed spectacular light on the boy.”

Kelleran regularly deployed ISO5000 on each episode. “The crux of the show is chasing out problems in dark corners and corridors, which we were able to do like never before. The LT’s extreme low light handling allowed us to work in dark rooms with only motivated light sources like lamps and windows, and absolutely keep the honesty of the narrative.”

Atlas Media is handling the edit, using Avid Media Composer. “We gave post such a solid image that they had to spend very little time or money on color correction, but could rather devote resources to graphics, sound design and more,” concludes Kelleran.

Rick Pearson on cutting Kong: Skull Island

By Randi Altman

Who doesn’t love a good King Kong movie? And who says a good King Kong movie has to have the hairy giant climbing the Empire State Building, lady in hand?

The Jordan Vogt-Roberts-directed Kong: Skull Island, which had an incredible opening weekend at the box office — and is still going strong — tells the story of a 1973 military expedition to map out an island where in 1944 two downed pilots happened upon a huge monster. What could possibly go wrong?

Editor Rick Pearson, who was originally set to come on board for 10 weeks during the Director’s Cut process to help with digital effects turnovers, ended up seeing the project through to the end. Pearson came on during the last third of production, as the crew was heading off to Vietnam.

A process was already in place in which rough cuts were shared on the PIX system for the director’s review. That seemed to work well, he says.

To find out more about the process, I recently touched base with Pearson, who at the time of our interview was in Budapest editing a film about the origin of Robin Hood. He kindly took time out of his busy schedule to talk about his work and workflow on Kong: Skull Island, which in addition to Vietnam shot in Hawaii and Australia.

Would director Vogt-Roberts get you notes? Did he give you any direction in terms of the cut?
Yes, he would give very specific notes via PIX. He would drop the equivalent of locators or markers on sequences that I would send him and say, “Could you maybe try a close-up here?” Or “Could you try this or that?” They were very concise, so that was helpful. Eventually, though, you get to a point where you really need to be in a room together to explore options.

There are a lot of visual effects in the film. Can you talk about how that affected your edit and workflow?
Some of the sequences were quite evolved in terms of previsualization that had been done a year or more prior. Others had a combination of previs and storyboards; one in particular had a loose set of storyboards and some previs, but the set piece was still evolving as we were working.

The production was headed to Vietnam and there was a lot of communication between myself, Jordan and the producers about trying to nail down the structure of this set piece so they would know what to shoot in terms of plates, because it was a battle that largely took place between Kong and one of the creatures of the island — it was a lot of plate work.

Would you say that that was the most difficult sequence to work on, or is there another more challenging sequence that you could point to?
I think they were all challenging. For me, that last sequence, which we called the “Final Battle,” was challenging in that there was not a lot nailed down. There were some beats we knew we wanted to try to play, but it sort of kept evolving. I enjoy working on these kinds of films with those types of sequences because they’re so malleable. It’s a fun sandbox to play in because, to an extent, you’re limited only by your imagination.

Still, you’re committing a lot of money, time and resources, so you need to look down field as far as you can to say, “This is the right direction and we’re all on the same page.” It’s a big, slow-moving, giant cargo ship that takes a long time to course-correct. You want to make sure that you’re heading in the right direction, or at least as close as you can be, when you start going down those roads.

Any other shots that stand out?
There was one thing that was kind of a novelty on this picture — and I know that it’s not the first time it’s been done, but it was the first time for me. We had some pretty extensive re-shoots, but our cast was kind of spread all over the globe. In one of the re-shoots, we needed a conversation to happen in a bar between three of the characters: Tom Hiddleston, John Goodman and Corey Hawkins. None of them were available at the same time or in the same city.

The scene was going to be the three of them sitting at a table having a conversation in which John Goodman’s character offers Tom Hiddleston’s character a job as their guide to Skull Island. I think it was Goodman’s side that was shot first: we shot Goodman’s side of the table in New York, with that side of the bar behind him and an empty chair beside him. Then we shot Hawkins’ character by himself in front of a greenscreen, sitting in a chair, reacting to Goodman and delivering his dialogue. Lastly, we shot Hiddleston in LA with that side of the bar and overs with doubles. It all came together, and I thought, “I don’t think anybody would have a clue that none of these people were in the same room at the same time.” It was kind of a Rubik’s Cube… an editorial bit of sleight of hand that worked in the end.


You worked with other editors on the film, correct?
Yes, editor Bob Murawski helped me tremendously; he ended up taking over my original role during the Director’s Cut. Bob came on to help split up the really demanding visual effects sequence turnovers that happened every two weeks. We had to keep on it to make the release date.

Murawski was a huge help, but so was the addition of Josh Schaeffer, who had worked with Jordan in the past. He was one of the additional editors on Jordan’s The Kings of Summer (2013). Jordan had shot a lot of material — it wasn’t necessarily montage-based, but we weren’t entirely sure how it was going to work in the picture. We knew that he had a long-standing relationship with Josh and was comfortable with him. Bob said, “While we’re in the middle of a Director’s Cut and you and I are trying to feed this giant visual effects beast, there’s also this whole other aspect that Jordan and Josh could really focus on.” Josh was a really big help in getting us through the process. Both Bob and Josh were very big assets to me.

How do you work with your assistant editor?
I’ve had the same first assistant, Sean Thompson, for about 12 years. Unfortunately, he’s not with me here in Budapest. I took this film after the original editor dropped out for health reasons. Sean has a young family, and 15 weeks in Budapest and then another 12 weeks in London just wasn’t possible for him.

How did you work with Sean on Skull Island?
He’s a terrific manager of the cutting room in terms of discerning the needs of other departments, be it digital effects, music or sound. I lean on him to let me know what I absolutely need to know, and he takes care of the rest. That’s one of the roles he serves, and he’s bulletproof.

I also rely on him creatively. He’s tremendous with his sound work and very good at looking at cuts with me and giving his feedback. I throw him scenes to cut as much as I can, but sometimes on films like this the demands on him as a manager are just too great.

You use Avid Media Composer. Any special keyboard mappings, or other ways you customize your setup?
As a feature film editor my main objective is to make sure that the story and the characters are firing on all cylinders. I’m not particularly interested in how far I can push the box technically.

I’ve mapped the keyboard to what I’m comfortable with, but I don’t think it’s anything that’s particularly sophisticated. I try to do as much as I can on the keyboard so that I keep the pointing and clicking to a minimum.

You edit a lot of action films. Is that just because people say, “He does action,” or is that your favorite kind of film to cut?
It’s interesting you should say that… the first Hollywood feature I cut was Bowfinger, which is a comedy. I hadn’t cut any comedy before that and suddenly I was the comedy editor. I found it ironic because everything I had done prior was action-based television, music videos and commercials. I’ve always loved cutting action and juxtaposing images in a way that tells a story that’s not necessarily being told verbally. It’s not just like, “Wow, look at how much stuff is blowing up and that’s amazing how many cars are involved.” It’s actually character-based and story-driven.

I also really enjoy comedy. There is quite a lot of comedy in Kong, so it’s nice to flex that muscle too. I’ve tried very hard to not get pigeonholed.

So you are knee-deep in this Robin Hood film?
I sure am! I wasn’t planning on getting back on to another film quite so quickly, but I was very intrigued by both the director and script. As I mentioned earlier, they had an editor slated for the picture but unfortunately she fell ill just weeks prior to the start of production. So suddenly, here I am.

The added bonus is you get to play in Europe for a bit.
Yes, actually, I’m sitting here in my apartment. I have a laptop and an additional monitor and I’ve been cutting scenes. I have a lovely view of the parliament building, which is on the Danube. It’s a beautiful domed building that’s lit up every night until midnight. It’s really kind of cool.

Resident Evil: The Final Chapter editor Doobie White

By Brady Betzel

Editor Doobie White straddles two worlds. As co-founder of West Los Angeles-based Therapy Studios, he regularly works on commercials and music videos, but he also gets to step out of that role to edit movies. In fact, his most recent, Resident Evil: The Final Chapter for director Paul WS Anderson, is his ninth feature film.

Recently, we reached out to White to ask him about his workflow on this film, his editing techniques, his background and why regularly cutting more than one type of project makes him a better editor. OK, let’s dig in.

Doobie White

What was it like coming onto a film that was an established franchise, and the last film in that franchise? Did that add any pressure?
The pressure was definitely on. The Final Chapter needed to be bigger, scarier and more exciting than the previous films. It’s also when the story comes full circle. We find out who Alice really is and what she has been fighting against throughout the franchise. There was a considerable amount of time and effort put into the edit to make it the best possible film it could be. That is what we aim for.

How early did you come on board? Were you on set? Near set? Keeping up with camera?
I was brought on a month or so before principal photography began. The film was shot in South Africa. I was there for four months cutting away like a madman. I was keeping up with camera so they could pick up any extra shots that would help tell the story. This was a lifesaver in the end. Scenes got better and we solved problems early in the process. By the time we left South Africa we had a full rough cut. No re-shoots were necessary because they got it all before we left.

This is an action film with a lot of VFX. How did that affect your edit? Did they do previs and postvis on this one? Does that help you?
There was previs for some of the more complicated VFX in the movie, but that was mainly for production, to get a better understanding of what Paul was looking for and to make sure they captured every shot that was needed. I do think it really helps to get everyone on the same page. The scene usually evolves, but it’s a great way to start. I basically do the same thing but with the real footage when there is a lot of VFX involved.

When I’m working on a heavy VFX sequence, I really put everything into the scene that I possibly can to make sure it is working. There is a scene with a big flying creature at the beginning of the film that we called Dragon vs. Hummer. It’s basically exactly what it sounds like. I took still cut-outs of a temp creature and placed them into the shots, making the creature chase Alice around a destroyed Washington, DC. My goal was to make the edit look and sound like the finished film — albeit, with a silly cut-out of a scary monster. If I can create excitement with a still, I know the finished scene is going to be great.

Did the director shoot a lot of footage?
Paul does shoot a lot. He covers everything really well. I’m hardly ever painted into a corner. He always gives me a way out. Having tons of footage does make it more difficult when putting scenes together, but I love having the options to play.

What direction were you given from him in terms of the cut, if any?
We kinda had a motto for the film. Probably not a motto, but it’s something that Paul would say after I showed him a cut, and I would always keep it in mind: “There is a lot of great stuff in there… all you need to do now is move it all closer together.” Paul really wanted this film to be non-stop, for the story to always be propelled forward. I took that as a mission statement: to always make the audience feel like Alice, caught in this crazy post-apocalyptic world — with violence, chaos and monsters!

How did you work with Anderson? How often were you showing him cuts?
Paul is great to work with. We had an absolute blast cutting this film together. In the early stages I was just trying to tame the beast, so we would get together a couple of times a week to review. By the end, Paul was in every day, pushing me to take the edit into new territory. What’s incredible about Paul is that he never runs out of ideas. Anytime there was a problem he would always have a creative solution. It truly is a joy to work on a film like this with a director who isn’t afraid to push visual storytelling.

What system did you edit on?
Avid Media Composer 8.3.1. It was the most stable version at the time. Avid is still the best at having multiple people work on the same material at the same time. I might consider other software if it could match the sharing functionality Avid has offered for years. I also frequently use Adobe Photoshop and After Effects.

Do you have any special shortcuts/tricks you use often?
This really isn’t a shortcut or a trick, but it relates. If I find a performance that I like but there is something wrong with the image, I will usually figure out a way to fix it. I tend to do this on every job to some degree. For instance, if an actor’s eyes are shut when they are delivering a line, I will replace their eyes from a different take. Sometimes I’m replacing heads to have characters looking in the right direction. I will comp two different takes together. I use every tool I’ve got to get the best performance possible.

How do you organize your projects? Any special bins/folders of commonly used stuff like speed ramps, transitions, etc.?
I have a lot of bins that migrate from job to job. I place just as much importance on sound as I do on picture. Everything I do involves sound in a very specific way. So I have around 120 sound effects bins that I move over to every job that I do. Everything from footsteps to gunshots. I’m adding to this all the time, but it saves multiple days of work to keep a master set of sounds and then add specific sounds for each job.

On this one, we recorded a bunch of people in our office for zombie sounds, pitching their voices and adding effects to make them sound truly disturbing. I also have 60 bins or so of music that I keep on hand. I’m adding to this all the time as well.

What do you expect from an assistant editor, and how much knowledge should they already have? Are they essentially technical editors or do you mentor them?
I expect a lot from my assistants. They need to be technically savvy, but they also need to know how to edit. I do so much VFX work and a full sound design pass on every scene, so my assistants have to be able to contribute on all fronts. One day they will be organizing. The next they will be adding sounds and lasers (temp VFX) to a scene. I have worked with the same assistant for a bunch of years. Her name is Amy K. Bostrom, and she is amazing. She handles all the technical side, but she is a great editor in her own right. I have no doubt that she will have a great career.

How did you approach this project and was it any different than commercials/music videos?
It’s definitely different, but I start in a very similar way. I like to get a scene/commercial/music video cut together as fast as possible. I don’t watch a lot of the footage on the first pass. When I have a rough cut I go back to the dailies and watch everything. At that point I know what I’m looking for and my selects have a purpose.

If you could edit any genre and project what would you do?
That’s a tough question. I don’t think I really have a preference. I want the challenge and to be pushed creatively. Every project that I work on I’m really just trying to make myself feel something. I search for footage and sound that evokes emotion, and I cut it in a way that produces some sort of feeling in myself. Whether that be happiness, pain, excitement, fear, pleasure — if I can feel something when I’m working, then others will as well. I want to work on projects that connect with people in some way. The genre is secondary.

Are you ever satisfied with an edit, or does the edit just stop because of deadlines? Could you tinker forever or do know when something is at the right spot?
I think it is a little bit of both. You work really hard to get a project into a good place. Fix all the problems, fine-tune everything, but eventually you run out of time. A movie could be worked on forever. So it is like George Lucas said, “Movies are abandoned.” I believe a film can always be better. I go for that until I can’t.

Do you have any life/work balance tips or processes you do?
Unfortunately, no. I wish I did. I just have a lot of passion for what I do. I can’t really focus on anything else when I’m on a project. I try to disconnect from it, but I’m always thinking about it in some way. How can it be better? What is this scene/movie/commercial really about? How can I fix something that is not working? I’m half present when I’m on a project and I’m not in the cutting room. It takes over my life. It’s probably not the healthiest way to go, but it’s the only way I know. Honestly, I love it. I’m fine with getting a little obsessive. I’m going to work on meditating!

It must be fun to run an editorial house, but also step into the world of features films from time to time. Keeps things fresh for you?
Yes, it is nice to be able to jump between different types of projects. I love commercials and I love movies, but they are quite different and use different muscles. By the time I’m done with a movie I am so ready to cut commercials for a while, and vice versa. Films are extremely rewarding, but they’re an endurance race. Commercials are instant gratification. You cut for a week or two, and it’s on air the following week. It’s great! After a few months of commercials I’m ready for a new challenge.

Where/when did you get the first itch to work in video/film?
I had no plans of working in the film industry. I loved movies, music videos and commercials, but I was so far removed from that world that I never saw a path. I was a ski bum studying art in Lake Tahoe, and one of the classes offered was digital media. That was the first time I realized you could edit clips together on a computer. It changed everything I was focused on. I started making silly short films and cutting them together. It wasn’t a film school and no one else was doing this, so I had to do everything, from the writing and shooting to the music.

What I enjoyed the most was editing these little masterpieces. I decided I was going to figure out how to get someone to pay me to be an editor. I moved to LA and pretty much got laughed at. I couldn’t find a job, and I was sleeping on couches. It was a bit desperate. The only opportunity I eventually landed was an internship at a post house. After many coffee runs and taking out the trash, an editor asked me to work on a music video over the weekend. I jumped at the opportunity and didn’t go home until he came back on Monday. After he saw the cut, I was hired the next day. That post house is where I met three of my best friends, who would eventually become my partners at Therapy Studios.

Was your family supportive of you going into a creative job like editing?
To a degree, yes. It took a long time to find a path as an editor, and I think it was a bit confusing for them when I started working as an intern, especially since I had zero cash and they were in no place to help. What I think is hard to understand for a lot of people who are not in the industry is that it’s very difficult to get a job in the film business. No matter what career you want, there are a thousand other kids trying to do the same thing. Perseverance is key. If you can outlast others you will probably find a way… ha!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Apple’s new MacBook Pro

By Brady Betzel

What do you need to know about the latest pro laptop from Apple? Well, the MacBook Pro is fast and light; the new Touch Bar is handy and sharp but not fully realized; the updated keys on the keyboard are surprisingly great; and working with ProRes QuickTime files in resolutions higher than 1920×1080 inside of FCPX, or any NLE for that matter, is blazing fast.

When I was tasked with reviewing the new MacBook Pro, I came into it with an open mind. After all, I did read a few other reviews that weren’t exactly glowing, but I love speed and innovation among professional workstation computers, so I was eager to test it myself.

I am pretty open-minded when it comes to operating systems and hardware. I love Apple products and I love Windows-based PCs. I think both have their place in our industry, and to be quite honest it’s really a bonus for me that I don’t rely heavily on one OS or get too tripped up by the Command key vs. the Windows/Alt key.

Let’s start with the call I had with the Apple folks as they gave me the lowdown on the new MacBook Pro. The Apple reps were nice, energetic, knowledgeable and extremely helpful. While I love Apple products, including this laptop, it’s not the be-all-end-all.

The Touch Bar is nice, but not a revolution. It feels like the first step in an evolution, a version 1 of an innovation that I am excited to see more of in later iterations. When I talked with the Apple folks they briefed me on what Tim Cook showed off in the reveal: emoji buttons, wide gamut display, new speakers and USB-C/Thunderbolt 3 connectivity.

NLEs
They had an FCPX expert on the call, which was nice considering I planned on reviewing the MacBook Pro with a focus on nonlinear editing apps, such as Adobe Premiere Pro, Avid Media Composer and Blackmagic’s Resolve. Don’t get me wrong, FCPX is growing on me — it’s snappy jumping around the timeline with ProRes 5K footage; assigning roles is something I wish every other app would pick up on; and the timeline is more of a breeze to use with the latest update.

The other side to this is that in my 13 years of working in television post, I have never worked on a show that primarily used FCP or FCPX to edit or finish. This doesn’t mean I don’t like the NLE; it simply means I haven’t relied on it in a professional working environment. Like I said, I really like the road it’s heading down, and if it works its way into mainstream broadcast or streaming platforms a little more, I am sure I will see it more frequently.

Furthermore, with the ever-growing reduction in reliance on large groups of editors and finishing artists, apps like FCPX are poised to shine with their innovation. After all that blabbering, in this review I will touch on FCPX, but I really wanted to see how the MacBook Pro performed with the pro NLEs I encounter the most.

Specs
Let’s jump into the specs. I was sent a top-of-the-line 15-inch MacBook Pro with Touch Bar, which costs $3,499 as configured online. It comes with a quad-core Intel Core i7 2.9GHz processor (up to 3.8GHz using Turbo Boost), 16GB of 2133MHz memory, a 1TB PCIe SSD and a Radeon Pro 460 with 4GB of memory. It’s loaded. I think the only upgrade beyond this configuration would be a 2TB hard drive, which would add another $800 to the price tag.

Physically, the MacBook Pro is awesome — very sturdy, very thin and very light. It feels great when holding it and carrying it around. Apple even sent along a Thunderbolt 3 (USB-C) to Thunderbolt 2 adapter, which costs an extra $29, and a USB-C to Lightning cable that costs another $29.

So yes, it feels great. Apple has made a great new MacBook Pro. Is it worth upgrading if you have a new-ish MacBook Pro at home already? Probably not, unless the Touch Bar really gets you going. The speed is not too far off from the previous version. However, if you have a lot of Thunderbolt 3/USB-C-connected peripherals, or plan on moving to them, then it is a good upgrade.

Testing
I ran some processor/graphics card intensive tests while I had the new MacBook Pro and came to the conclusion that FCPX is not that much faster than Adobe Premiere Pro CC 2017 when working with non-ProRes-based media. Yes, FCPX tears through ProRes QuickTimes if you already have your media in that format. But what if you shoot on a camera like the Red and don’t want to transcode to a more edit-friendly codec? Well, that is another story. To test out my NLEs, I grabbed a sample Red 6K 6144×3160 23.98fps clip from the Red sample footage page, strung out a 10-minute-long sequence in each NLE and exported both a color-graded version and a non-color-graded version as ProRes HQ QuickTime files matching the source file’s specs.

In order to work with Red media in some of the NLEs, you must download a few extras: for FCPX you must install the Red Apple Workflow Installer, and for Media Composer you must install the Red AMA plug-in. Premiere doesn’t need anything extra.
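For a rough point of reference outside any NLE, here’s a minimal Python sketch of a comparable ProRes HQ export using ffmpeg’s prores_ks encoder, timed the same way I timed the NLEs. The file paths are hypothetical, and note that ffmpeg has no native R3D decoder, so this stands in for the general ProRes HQ transcode step rather than the exact Red workflow above.

```python
import subprocess
import time

# Hypothetical paths; substitute your own source and output files.
# ffmpeg cannot decode R3D natively, so the source here is assumed to be
# something it can read (ProRes, DNxHD, H.264, etc.).
SRC = "source_clip.mov"
DST = "out_prores_hq.mov"

cmd = [
    "ffmpeg", "-y", "-i", SRC,
    "-c:v", "prores_ks",          # ffmpeg's ProRes encoder
    "-profile:v", "3",            # profile 3 = ProRes 422 HQ
    "-pix_fmt", "yuv422p10le",    # ProRes HQ is 10-bit 4:2:2
    "-c:a", "copy",               # pass audio through untouched
    DST,
]

start = time.time()
subprocess.run(cmd, check=True)
print(f"Export took {(time.time() - start) / 60.0:.1f} minutes")
```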

Test 1: Red 6K 6144×3160 23.98fps R3D — 10-minute sequence (no color grade or FX) exported as ProRes HQ matching the source file’s specs. Premiere > Media Encoder = one hour, 55 minutes. FCPX = one hour, 57 minutes. Media Composer = two hours, 42 minutes. (Good news: Media Composer’s interface and fonts display correctly on the new screen.)

You’ll notice that Resolve is missing from this list. That’s because I installed Resolve 12.5.4 Studio but then realized my USB dongle wouldn’t fit into the USB-C port, and I wasn’t buying an adapter for a laptop I don’t get to keep. So, unfortunately, I didn’t test a true 6K ProRes HQ export from Resolve, but in the last test you will see some Resolve results.

Overall, there was not much difference in speeds. In fact, I felt that Premiere Pro CC 2017 played the Red file a little more smoothly and at a higher frames-per-second count. FCPX struggled a little. Granted, a 6K Red file with the debayer quality not knocked down is one of the harder files for a CPU to process, but Apple touts this as a semi-replacement for the Mac Pro for the time being, and I am holding them to their word.

Test 2: Red 6K 6144×3160 23.98fps R3D — 10-minute color-graded sequence exported as ProRes HQ matching the source file’s specs. Premiere > Media Encoder = one hour, 55 minutes. FCPX = one hour, 58 minutes. Media Composer = two hours, 34 minutes.

It’s important to note that the GPU definitely helped out in both Adobe Premiere and FCPX. Little to no extra time was added to the ProRes HQ export. I was really excited to see this, as sometimes without a good GPU, resizing and GPU-accelerated effects like color correction will slow your system to a snail’s pace, if they don’t crash it outright. Media Composer surprisingly sped up its export when I added the color grade as a new color layer in the timeline. By adding the color correction on another layer, Avid might have forced the Radeon to kick in and help push the file out. Not really sure what that’s about, to be honest.

Test 3: Red 6K 6144×3160 23.98fps R3D — 10-minute color-graded sequence resized to 1920×1080 on export as ProRes HQ. Premiere > Media Encoder = one hour, 16 minutes. FCPX = one hour, 14 minutes. Media Composer = one hour, 48 minutes. Resolve = one hour, 16 minutes.
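To put those Test 3 numbers side by side, here’s a quick sketch that normalizes each NLE’s export time against the fastest of the group (the times are the ones quoted above, converted to minutes):

```python
# Test 3 export times from above, in minutes.
times_min = {
    "Premiere > Media Encoder": 76,
    "FCPX": 74,
    "Media Composer": 108,
    "Resolve": 76,
}

fastest = min(times_min.values())
for nle, t in sorted(times_min.items(), key=lambda kv: kv[1]):
    print(f"{nle:26s} {t:3d} min  ({t / fastest:.2f}x the fastest)")
# Media Composer comes out roughly 1.46x slower than FCPX here.
```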

So after these tests, it seems that exporting and transcoding speeds are all about the same. It doesn’t come as too big of a surprise that all the NLEs, except for Media Composer, processed the Red file in about the same amount of time. Regardless of the NLE, you would need to knock the debayering down to half or lower to start playing these clips in realtime in a timeline. If you have the time to transcode to ProRes, you will get much better playback and rendering speeds. Obviously, transcoding all of your files to a codec like ProRes or Avid DNx takes more time up front, but it could be worth it if you’re crunched for time on the back end.

In addition to Red 6K files, I also tested ProRes HQ 4K files inside of Premiere and FCPX, and both played them extremely smoothly without hiccups, which is pretty amazing. Just a few years ago I was having trouble playing back 10:1 compressed files in Media Composer, and now I can play back superb-quality 4K files without a problem, a tremendous tip of the hat to technology and, specifically, to Apple for putting so much power in a thin and light package.

While I was in the mood to test speeds, I hooked up a Thunderbolt 2 SSD RAID (OWC Thunderbay 4 mini) configured in RAID-0 to see what kind of read/write bandwidth I would get running through the Apple Thunderbolt 3 to Thunderbolt 2 adapter. I used both the AJA System Test and the Blackmagic Disk Speed Test. The AJA test reported a write speed of 929MB/sec. and a read speed of 1120MB/sec. The Blackmagic test reported write speeds of 683.1MB/sec. and 704.7MB/sec. across different runs, and a read speed of 1023.3MB/sec. I set the test file size for both at 4GB. These speeds are faster than what I have previously measured when testing this same Thunderbolt 2 SSD RAID on other systems.

For comparison, the AJA test reported a write speed of 1921MB/sec. and read speed of 2134MB/sec. when running on the system drive. The Blackmagic test doesn’t allow for testing on the system drive.
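If you’re curious how such tools arrive at those numbers, or want a rough cross-check without installing either one, a naive sequential read/write timer gets you in the ballpark. This is only a sketch: the test path is hypothetical, and unlike the AJA and Blackmagic utilities it does not bypass the OS cache, so the read figure in particular will come back optimistic.

```python
import os
import time

CHUNK = 64 * 1024 * 1024               # 64MB blocks
TOTAL = 4 * 1024 * 1024 * 1024         # 4GB test file, matching the tools above
PATH = "/Volumes/RAID/speedtest.bin"   # hypothetical mount point of the drive under test

block = os.urandom(CHUNK)

start = time.time()
with open(PATH, "wb") as f:
    for _ in range(TOTAL // CHUNK):
        f.write(block)
    f.flush()
    os.fsync(f.fileno())               # force data to disk before stopping the clock
write_mb_s = (TOTAL / (1024 * 1024)) / (time.time() - start)

start = time.time()
with open(PATH, "rb") as f:
    while f.read(CHUNK):
        pass
read_mb_s = (TOTAL / (1024 * 1024)) / (time.time() - start)

os.remove(PATH)
print(f"write: {write_mb_s:.0f} MB/sec, read: {read_mb_s:.0f} MB/sec")
```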

What Else You Need to Know
So what about the other upgrades and improvements? When exporting the R3D files, I noticed the fan kicked on when resizing or color grading the files. It seems the GPU kicked in and heated up, which is to be expected. The fan is not the loudest, but it is noticeable.

The battery life on the new MacBook Pro is great when just playing music, surfing the web or writing product reviews. I found that the battery lasted about two days without having to plug in the power adapter. However, when exporting QuickTimes from either Premiere or FCPX the battery life dropped — a lot. I was getting a battery life of one hour and six minutes, which is not good when your export will take two hours. Obviously, you need to plug in when doing heavy work; you don’t really have an option.

This leads me to the new USB-C/Thunderbolt 3 ports — and, yes, you still have a headphone jack (thank goodness they didn’t talk with the iPhone developers). First off, I thought the MagSafe power adapter should have won a Nobel Peace Prize. I love it. It must be responsible for saving millions of dollars in equipment when people trip over a power cord — gracefully disconnecting without breaking or pulling your laptop off the table. However, I am disappointed Apple didn’t create a new type of MagSafe cable with the USB-C port. I will miss it a lot. The good news is you can now plug in your power adapter to either side of the MacBook Pro.

Adapters and dongles will have to be purchased if you pick up a new MacBook Pro. Each time I went to use an external peripheral, whether an SD card, the Tangent Ripple color correction panel or an external hard drive, I was disappointed that I couldn’t plug it in directly. A good Thunderbolt 3 dock is a necessity, in my opinion. You could survive with dongles, but my OCD starts flaring up when I have to dig around my backpack for adapters. I’m just not a fan. I love that Apple dedicated itself to fast I/O like USB-C/Thunderbolt 3, but I really wish they had given it another year. Just one old-school USB port would have been nice. I might even have gotten over the missing SD card reader.

The Touch Bar
I like it. I would even say that I love it — in the apps that are compatible. Right now there aren’t many. Adobe released an update to Photoshop that added Touch Bar compatibility, and it is really handy, especially when you don’t have your Wacom tablet available (or a USB dongle to attach it). I love how it puts so many levels of functionality within immediate reach.

It has super-fast feedback. When I adjusted the contrast on the Touch Bar I found that the MacBook Pro was responding immediately. This becomes even more evident in FCPX and the latest Resolve 12.5.4 update. It’s clear Apple did their homework and made their apps like Mail and Messages work with the Touch Bar (hence emojis on the Touch Bar). FCPX has a sweet ability to scrub the timeline, zoom in to the timeline, adjust text and more from just the Touch Bar — it’s very handy, and after a while I began missing it when using other computers.
In Blackmagic’s latest DaVinci Resolve release, 12.5.4, they have added Touch Bar compatibility. If you can’t plug in your color correction panels, the Touch Bar does a nice job of easing the pain. You can do anything from contrast work to saturation, even adjust the midtones and printer lights, all from the Touch Bar. If you use external input devices a lot, like Wacom tablets or color correction panels, the Touch Bar will be right up your alley.

One thing I found missing was a simple application launcher on the Touch Bar. If you do pick up the new MacBook Pro with Touch Bar, you might want to download Touch Switcher, a free app I found via 9to5mac.com that allows you to have an app launcher on your Touch Bar. You can hide the dock, allowing you more screen real estate and the efficient use of the Touch Bar to launch apps. I am kind of surprised Apple didn’t make something like this standard.

The Display
From a purely superficial and non-scientific point of view, the newly updated P3-compatible wide-gamut display looks great… really great, actually. The colors are rich and vibrant. I did a little digging under the hood and noticed that it is an 8-bit display (you can find this by locating the pixel depth under System Information > Graphics/Displays), which might limit color gradations when working in a wide color space like P3, as opposed to a 10-bit display showing P3. Simply put, you get a wider range of colors with P3 but a smaller number of shades to fill it with.

The MacBook Pro display is labeled as 32-bit color, meaning the red, green, blue and alpha channels each get 8 bits, for a total of 32. Eight-bit color gives 256 shades per channel, while 10-bit gives 1,024 shades per channel, allowing for much smoother transitions between colors and luminance values (imagine a sky at dusk going smoothly from orange to light blue to dark blue; more shades per channel make for a smoother gradient between lights and darks). A 10-bit display would be labeled 30-bit color, with each color channel getting 10 bits.
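The shade counts are just powers of two, so the difference is easy to work out; here’s a two-line check (the total here ignores the alpha channel, which carries transparency rather than color):

```python
# Shades per channel and total displayable colors at 8-bit vs. 10-bit.
for bits in (8, 10):
    shades = 2 ** bits        # 256 at 8-bit, 1,024 at 10-bit
    total = shades ** 3       # three color channels: R, G and B
    print(f"{bits}-bit: {shades:,} shades per channel, {total:,} total colors")
# 8-bit:  256 shades per channel, 16,777,216 total colors
# 10-bit: 1,024 shades per channel, 1,073,741,824 total colors
```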

I tried to hook up a 10-bit display, but the supplied Thunderbolt 3 to Thunderbolt 2 adapter Apple sent me did not work with Mini DisplayPort. I did a little digging, and it seems people are generally not happy that Apple doesn’t allow this to work, especially since Thunderbolt 2 and Mini DisplayPort share the same connector. Some people have been able to get around this by daisy-chaining the display through something like a Thunderbolt 2 RAID.

While I couldn’t directly test an external display when I had the MacBook Pro, I’ve read that people have been able to push 10-bit color out of the USB-C/Thunderbolt 3 ports to an external monitor. So as long as you are at a desk with a monitor, you can most likely get 10-bit color output from this system.

I reached out to Apple about the types of adapters they recommend for an external display. They suggest a USB-C to DisplayPort adapter made by Aukey, which retails for $9.99, and a USB-C to DisplayPort cable from StarTech, which retails for $39.99. Make sure you read the reviews on Amazon, because people’s experiences with these vary wildly. I was not able to test either, so I cannot give a personal opinion.

Summing Up
In the end, the new MacBook Pro is awesome. If you own a recent MacBook Pro and don’t have $3,500 to spare, I don’t know if this is the update you’re looking for. If you are looking for a way to avoid switching to a Windows-based PC because of the lack of Mac Pro updates, this may ease the pain slightly. Without more than 16GB of memory and an Intel Xeon or two, however, it might actually slow you down.

The battery life is great when doing light work; it’s one of the longest-lasting batteries I’ve used on a laptop. But when doing heavy work, you need to be near an outlet. When plugged into that outlet, be careful no one yanks the USB-C power cable, as it might throw your MacBook Pro to the ground or break off inside the port.

I really do love Apple products. They typically just work. I didn’t even touch on the new Touch ID Sensor that can immediately switch you to a different profile or log you in after waking up the MacBook Pro from sleep. I love that you can turn the new MacBook Pro on and it simply works, and works fast.

The latest iteration of FCPX is awesome as well, and just because I don’t see it being used a lot professionally doesn’t mean it shouldn’t be. It’s a well-built NLE that deserves a fairer shake than it has been given. If you are itching for an update to an old MacBook Pro and don’t mind having a dock or carrying around a bunch of dongles, then the 2016 MacBook Pro with Touch Bar is for you.

The new MacBook Pro chews through ProRes-based media from 1920×1080 up to 4K; 6K and higher will play but might slow down. If you are a Red footage user, this new MacBook Pro works great, but you still might have to knock the debayering down a couple of notches.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Lenovo ThinkStation P410

By Brady Betzel

With the lukewarm reaction of the professional community to the new Apple MacBook Pro, there are many creative professionals who are seriously — for the first time in their careers — considering whether or not to jump into a Windows-based world.

I grew up using an Apple IIGS from 1986 (I was born in 1983, if you’re wondering), but I always worked on both Windows and Apple computers. I guess my father really instilled in me the idea of being independent and not relying on one thing or one way of doing something — he wanted me to rely on my own knowledge and not on others.

Not to get too philosophical, but when he purchased all the parts I needed to build my own Windows system, it was incredibly gratifying. I would have loved to have built my own Apple system, but obviously never could. That is why I am so open to computer systems of any operating system software.

If you are deciding whether or not to upgrade your workstation and have never considered solutions other than HP, Dell or Apple, you will want to read what I have to say about Lenovo‘s latest workstation, the P410.

When I set out on this review, I didn’t have any DisplayPort-compatible monitors, so Lenovo was nice enough to send their beautiful ThinkVision Pro2840m — another great piece of hardware.

Digging In
I want to jump right into the specs of the ThinkStation P410. Under the hood is an Intel Xeon E5-1650 v4, which in plain terms is a six-core 3.6GHz CPU with 15MB of cache that can Turbo Boost up to 4.0GHz when needed. The graphics card is a medium-sized monster: the Nvidia Quadro M4000 with 8GB of GDDR5 memory and 1,664 CUDA cores. It has four DisplayPort 1.2 ports to power the four 30-bit 4096×2160 @60Hz displays you will run when editing and color correcting.

If you need more CUDA power you could step up to the Nvidia Quadro M5000, which has 2,048 CUDA cores, or the M6000, which has 3,072, but that power isn’t cheap (and as of this review they aren’t even an option in Lenovo’s P410 configurator; you will probably have to step up to a higher model number).

There is 16GB of DDR4-2400 ECC memory, a 1TB 2.5-inch SATA 6Gb/s SSD (made by Micron), plus a few extras like a DVD writer, media card reader, keyboard and mouse. At the time I was writing this review, you could configure this system for a grand total of $2,794, but if you purchase it online at shop.lenovo.com it will cost a little under $2,515 with some online discounts. As I priced this system out over a few weeks I noticed the prices changed, so keep in mind it could be higher. I configured a similar HP Z440 workstation for around $3,600 and a Dell Precision Tower 5000 for around $3,780, so Lenovo’s prices are on the low end for major-brand workstations.

For expansion (which Windows-based PCs seem to lead the pack in), you have a total of four DIMM slots for memory (two are taken up already by two 8GB sticks), four PCIe slots and four hard drive bays. Two of the hard drive bays are considered Flex Bays, which can be used for hard drives, hard drive + slim optical drive or something like front USB 3.0 ports.

On the back there are your favorite PS/2 keyboard and mouse ports, two USB 2.0 ports, four USB 3.0 ports, audio in/out/mic and four DisplayPorts.

Testing
I first wanted to test the P410’s encoding speed using Adobe Media Encoder. I took an eight-minute, 30-second 1920×1080 23.98fps ProRes HQ QuickTime that I had filmed using a Blackmagic Pocket Cinema Camera, did a quick color balance in Adobe Premiere Pro CC 2017 using the Lumetri Color tools and exported a single-pass, variable-bit-rate 25Mb/s H.264 from Media Encoder. Typically, when exporting from Premiere Pro or Media Encoder, GPU acceleration kicks in only if you’ve applied GPU-accelerated effects, such as transitions, a resize or color correction with Lumetri (which is GPU-accelerated). Otherwise, if you are just transcoding from one codec to another, the CPU handles the task.

In this test, it took Media Encoder about six minutes to encode the H.264 with Mercury Playback Engine GPU Acceleration (CUDA) enabled. Without GPU acceleration it took 14 minutes. So the GPU cut the export time by more than half (roughly a 57 percent reduction, or a 2.3x speedup), thanks to the power of the Nvidia Quadro M4000 with 8GB of GDDR5 RAM.
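As a quick sanity check on that math:

```python
# Export times from the Media Encoder test above, in minutes.
cpu_only = 14
with_gpu = 6

time_saved = (cpu_only - with_gpu) / cpu_only   # fraction of export time eliminated
speedup = cpu_only / with_gpu                   # how many times faster the GPU run was
print(f"{time_saved:.0%} less export time, a {speedup:.1f}x speedup")
# -> 57% less export time, a 2.3x speedup
```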

For comparison, I ran the same test on the newly released MacBook Pro with Touch Bar (2.9GHz quad-core i7, 16GB of 2133MHz LPDDR3 memory and a Radeon Pro 460 with 4GB of RAM, which uses OpenCL as opposed to CUDA); it took Media Encoder about nine minutes using the GPU.

Another test I love to run uses Maxon’s Cinebench, which runs real-world scenarios like photorealistic rendering and a 3D car chase, taxing your system with almost one million polygons and textures. Basically, it makes your system do a bunch of math, which helps separate immature workstations from professional ones. This system came in around 165 frames per second, and compared with other systems configured similarly to the P410, it placed first or second. So it’s fast.

Lenovo Performance Tuner
While the low price is what really sets the P410 apart from the rest of the pack, Lenovo has also recently released a hardware-tuning program called Lenovo Performance Tuner. Performance Tuner is a free app that helps focus your Lenovo workstation on the app you are using. For instance, I use Adobe CC a lot at home, so when I am working in Premiere I want all of my power focused there, with minimal power going to background apps I may not have turned off; sometimes I let Chrome run in the background, or I want to jump among Premiere, Resolve and Photoshop. You simply launch Performance Tuner and click the app you want to run in Lenovo’s “optimized” state. You can go further by jumping into the Settings tab and customizing things like Power Management Mode to always be on Max Performance. It’s a pretty handy tool when you want to quickly funnel all of your computing resources to one app.

The Think Vision Pro Monitor
Lastly, I wanted to quickly touch on the ThinkVision Pro2840m LED-backlit LCD monitor Lenovo let me borrow for this review. The color fidelity is awesome, and it runs at resolutions up to 3840×2160 (UHD, not full 4K). It will tilt and rotate almost any way you need it to, and it will even go fully vertical at 90 degrees.

When working with the P410, I had some problems with DisplayPort not always kicking in with the monitor, or any monitor for that matter. Sometimes I would have to unplug the DisplayPort cable and plug it back in while the system was on for the monitor to be recognized and turn on. Nonetheless, the monitor is awesome at 28 inches. Keep in mind it has a glossy finish, so it might not be for you if you are near a lot of light or windows; while the color and brightness punch through, there is some glare from other light sources in the room.

Summing Up
In the end, the Lenovo ThinkStation P410 workstation is a workhorse. Even though it’s at the entry level of Lenovo’s workstations, it has a lot of power and a great price. When I priced out a similar system using PCPartPicker, it ran about $2,600 — you can check out the DIY build I put together: https://pcpartpicker.com/list/r9H4Ps.

A drawback of DIY custom builds, though, is that they don’t include strong support, a complete warranty from a single company or ISV (independent software vendor) certifications. Simply put, ISV certification means major workstation builders like HP, Dell and Lenovo have tested their workstations against commonly used software like Premiere Pro or Avid Media Composer in workstation-focused industries like editing and motion graphics.

One of the most misunderstood benefits of a workstation is that it’s meant to run day and night. So not only do you get enterprise-level components like Nvidia Quadro graphics cards and Intel Xeon CPUs, the components are made for durability as well as performance. This way there is little downtime, especially in mission-critical environments. I didn’t get to run this system for months constantly, but I didn’t see any sign of problems in my testing.

When you buy a Lenovo workstation it comes with a three-year on-site warranty, which covers anything that goes wrong with the hardware itself, including faulty workmanship. But it won’t cover things like spills, drops or electrical surges.

I liked the Lenovo ThinkStation P410. It’s fast, does the job and has quality components. I just felt it lacked a few of today’s necessary I/O ports, like USB-C/Thunderbolt 3.

The biggest pro for this workstation is the overwhelmingly low price point for a major brand workstation like the ThinkStation P410. Check out the Lenovo website for the P410 and maybe even wander into the P910 aisle, which showcases some of the most powerful workstations they make.

Check out this video I made that gives you a closer look at (and inside) the workstation.

Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Team Player: Rules Don’t Apply editor Brian Scofield

By Randi Altman

In the scheme of things, we work in a very small industry where relationships, work ethic and talent matter. Brian Scofield is living proof of that. He is one of a team of editors who worked on Warren Beatty’s recent Rules Don’t Apply.

That team included lead editor Billy Weber, Leslie Jones and Robin Gonsalves. It was the veteran Weber (Beatty’s Bulworth, 1998) who brought Scofield on board as a second editor.

Weber was Scofield’s mentor while he was in the MFA program at USC. “Not long after I completed graduate school, Billy helped me reconnect with the Malick camp, whom I met while working on the camera crew of The Tree of Life,” he explains. “I then became an apprentice on To the Wonder, and then an editor on Knight of Cups. When Billy came in as an advisor at the end of Knight of Cups, we reconnected in LA. He had just begun working on Rules Don’t Apply with Warren, and when I finished my work on Knight of Cups, he brought me aboard.”

Scofield recognizes that relationships open doors, but says you have to walk through them and prove you belong in the room all by yourself. “I think people often make the mistake of thinking that networking trumps talent and work ethic, or the other way around, and that just isn’t true. All three are required to have a career as a film editor — the ability to form lasting relationships, the diligence to work really hard and the natural instincts you’re always striving to improve upon.”

Scofield says he will always be grateful to Weber and the example he’s set. “I’m only one of over a dozen people whose careers Billy has helped launch over the years. It’s in large part his generosity and mentorship that inspires me to pay it forward any chance I get.”

Let’s find out more from Scofield about his editing process, what he’s learned over the years, and the importance of collaboration.

You have worked with two Hollywood icons in Terrence Malick and Warren Beatty. I’m assuming you’re not easily intimidated.
It’s been a transformative experience in every way. These two guys, who have been making films for over 40 years, are constantly challenging themselves to try new things… to experiment, to learn. They’re always re-evaluating pretty much everything, from the story to the style, and yet these are two guys with such distinct voices that really shine through their work. You know a Malick or Beatty film when you see it. The inexhaustibility of the cinematic art form, I guess, is what I really took away from both of them.

They are both very different kinds of filmmakers.
You would never think that working on a Terrence Malick film would prepare you to work on a Warren Beatty film. Knight of Cups is a stream-of-consciousness, meditative tome about the meaning of life. Warren’s film is a romantic comedy with a historical drama slant. Aesthetically, they’re very different films, but the process of constantly finding ways to break open the movie all over again, and the mindset that requires, is very similar.

Both Terry and Warren are uncompromising and passionate about making movies the way they want and not bending to conventions, yet at the same time looking for ways to reach people on a very deep level. In this case, both films were also deeply personal for the director. When you work on something like that, it adds another layer of pressure because you want to honor how much of themselves they’re willing to put into their work. But that’s also where I believe the most exciting films come from. That pressure just becomes inspiration.

How early did you get involved on Rules Don’t Apply?
Right after production wrapped. I was finishing up with Terry on the mix stage for Knight of Cups when Billy called. They had an assembly of the film when I joined — everything was in there — and that version was probably about four hours long. Interestingly, some things have changed dramatically since that version and some are remarkably similar.

I was on Rules Don’t Apply for just over a year, but I’ve been back several times since officially finishing. I took a good amount of time off and went back, and since then I’ve popped in and out whenever Warren has needed me. Robin became a true caretaker of the film, staying with Warren through that additional time leading up to the release.

Is that typically how you’ve worked? Coming in after there’s an assembly?
I’ve come in as an additional set of eyes on some, and I’ve been on films during production, sending cuts to the director while they’re in the middle of shooting. That includes giving feedback on pick-ups they need to grab or things to be wary of performance-wise, those types of things.

Both are thrilling experiences. It’s fun to come in when there has been one specific approach and they’re open to new ideas. You kind of get to shake people out of the one way they’ve been going about the film. When I’m the editor that’s been working on the film since the beginning, that initial discovery period when you see the film take shape for the first time is always thrilling. The relationship you form with both the film and the director is hard to beat. But then, I’m always excited for someone to come in and shake things up, to help me think differently. That’s why you do feedback screenings. That’s why you bring other editors into the room to take a look and to make you think about things from a different angle.

How was it on Rules Don’t Apply?
When I came on, so much of it was working really well from the first assembly, but I did want to strengthen the love story between Frank and Marla and make their attraction more evident early in the film so that it paid off later. I started by going through all of the scenes and looking for little moments where we could build up glances between them or find little raindrops before the storm of that budding relationship.

There were a few storylines going on at the same time as well?
The story takes place over a long period of time — you’ve got Warren Beatty playing Howard Hughes, you’re dealing with a young love story, you’re dealing with an incredible supporting cast, all of whom could be bigger characters or smaller characters. When you come in a little bit later, it’s often your job to help figure out which storylines or themes are going to become the main thrust of the movie.

So there are different definitions of co-editor?
Well, it varies every day. Some days Warren would want to work on a couple of different scenes, so one editor would take one and I would take the other. Sometimes you would have worked on a scene for a long time and somebody else would say, “Let me have a stab at that. I’ve got a different idea.” Sometimes we were all together in one room with one of us driving the Avid and the others offering a different set of eyes — eyes that aren’t staring at the timeline — and they’re looking at it side-by-side with the director, almost as a viewer instead of within the nitty-gritty of making the cut. We would take turns doing that.

You’ve got to check your ego at the door, I suppose? Everybody’s on the same team these days.
There’s no pecking order, and I think Billy Weber is really the one who sets that tone because he’s such a generous and experienced editor and man. There are people out in the industry that might be protective of their work versus letting anybody else touch it, but there’s none of that in any of the editing rooms that I’ve been fortunate enough to work in. Everybody’s respectful of each other.

On this film we had Billy, myself, Leslie Jones and Robin all working at the same time. You’ve got almost three generations of editors in that room, and to be treated as an equal really opens up your mind and your creativity. You feel the freedom to really present big ideas.

How is it collaborating with Warren?
He is such a unique guy. His favorite thing to do is to have a fight — he doesn’t want people who are just going to accept what he says. He wants a fiery debate, which can make people uncomfortable, but I’m okay with it. I actually really enjoyed that, especially when you realize he’s not taking it personally and neither should I. This is about making a movie the best that it can be. He wants people that are going to challenge him and push back.

So it’s part of his creative process?
Yes, it’s all about the discourse. If he has a strong point of view, he wants to argue it to make sure that he really believes it. And if you have a strong point of view, he wants you to be able to tell him why. I would say the fiercest fights led to him being most happy afterwards. At the end of the screaming, he would always say, “That was such a productive conversation. I’m so glad we did that!” He surrounds himself with people he knows he trusts. He knows that’s what he needs to make him as productive and as creative as he can be.

It’s been a long time since Warren directed a film, how did he react to the new technology?
He was thrilled with all of the new capabilities of the technology. This movie was shot on the Alexa, for the most part, and we did a good amount of combining it with archival footage. This is a very modern movie in many ways, but it also has a distinctive throwback vibe. We had to try to marry those things without going overboard.

We resized frames, added a few push-ins, speed ramps, and so on. Ultimately, all of these tools just allowed him to explore the footage even more than he’s used to doing. He really loved taking advantage of new editorial opportunities that couldn’t have been done even 15 years ago, at least not as easily.

How do you organize things within the Avid Media Composer?
Any time I start a new job, I send a Google Doc to the assistant that specifies exactly how I want the project set up. It’s an evolution of things I’ve learned in different editing rooms over time.

For every scene, I have a bin set to frame view. If the bin is the size of my monitor, I should be able to see all the clips in that one view without scrolling. Each set-up is separated from the others, so I can see very quickly, "Oh, there are four takes of that shot, four takes of that one, three takes of that one." I have the assistant prepare three sequences: the first is a pure string-out of all of the clips, so I can scrub through everything that's there in one sequence. The second is a "clean" string-out, with all the slates and the director's talking taken out, so I can be impartial and just look at the footage. The third is just the circle takes the director chose on set. Then I go through and make a select reel based off of everything that I watch. That's the basic bin set-up.

For films that have multiple editors, organization is really important because somebody else has to be able to understand how your work is organized. You have to be able to find things that you did a year ago.

Any special tricks, like speed ramps, sound effects, transitions? I’m imagining that changes per project?
Yeah, it's pretty unique to the project. A lot of editors have specific effects they go back to over and over again in their own bin. I've got a few of those, but I almost always end up tailoring them, and sometimes just starting from scratch. I go on the hunt for the right effect when I need it.

Photo Credit: Francois Duhamel

I've gotten pretty adept at tailoring the built-in effects to my needs as they come up, but the people who use those effects all the time tend to be working on action-heavy or more stylized films, which demand them far more than character-driven content does.

Do you typically work with a template from a colorist, or do you do any temp color corrections yourself?
Most of the films have a look that the DP has already applied, and I do tweaking as needed. If we come up with a creative reason for color correction, I’ll do a sketch. I do a lot of work with sound, but with color, it just depends. If it needs to be changed in order to understand what the idea is or if we’re screening it for somebody that we don’t trust to be able to see what it is without color correction, then of course we’re going to go in and we’re going to tweak it. I’ve worked on a film where all the exteriors were really magenta, so we came up with our kind of default fix to be applied to all of those shots.

Can you elaborate on the sound part?
I cut as much for sound as I do for picture. I think people grossly underestimate the influence that sound has on how you watch a movie. I’m not a sound designer, but I try my best to provide a sketch for when we go into that next phase so the sound designer has a pretty clear idea of what we’re going for. Then, of course, they use their creativity to expand and do their own thing.

How do you work with your assistant editors? Do you encourage them to edit, or are they strictly technical?
It depends on the project and on its timeframe. In the beginning, the priority is on getting everything set up. Then the priority is on helping me build a first sound pass after we've gotten an assembly. They help bring in effects and smooth over things I've sketched out. Sometimes they're just gathering effects for me and sometimes they're cutting them in themselves. Sometimes we're tossing them back and forth: I do a rough pass and ask them to mix it, clean up the levels and add a couple of accents here and there. Once we're through with that, we have at least a ground floor of sound to cut with.

When given the opportunity, I love to let my assistants get creative. I let them take a stab at scenes, or at least have them be present in the room to give feedback. When the director isn’t present, I rely a lot on my assistant just to check in and say, “Hey, is this crazy?” or try to engage them as much as I can in that creative process. It all just depends on the demands of the project and the experience level of the assistant.

Is there anything you would like to add?
Film is a collaborative art form, and in order to help a director do their best work, you need to be their friend, their antagonist, their therapist, their partner. Whatever it takes is what your job is. I was so fortunate to learn an enormous amount from Warren, but also from my fellow editors. I hope everybody has as much fun watching this crazy little movie as we did making it.

Finally, I'd just love to say that working with Warren will undoubtedly be one of the most cherished experiences of my life. Reputations be damned, he's a kind, brilliant and uncompromising artist whom it was endlessly inspiring to spend so much time with. I'll forever be grateful I had the opportunity both to work for him and to call him a friend.

Main Image: Robin Gonsalves, Warren Beatty and Brian Scofield.

Digging deeper with Jackie editor Sebastián Sepúlveda

By Mel Lambert

Cutting Jackie together was a major challenge, according to picture editor Sebastián Sepúlveda. "Cinematographer Stéphane Fontaine's intricate handheld camera work — often secured in a single take — the use of extreme close-ups and the unconventional narrative framework meant that my creative sensibilities were stretched to the maximum. I was won over by the personality of Jackie Kennedy, and saw the film and its component parts as a creative opportunity on several levels. I approached the edit as a series of small emotional moments that, as a whole, offered a peek into her inner life."

Sebastián Sepúlveda

Director Pablo Larraín’s new offering, which opens in the US on December 2, chronicles the tragic events following the assassination of President John F. Kennedy, as the late Jacqueline Bouvier Kennedy fights through grief and trauma to regain her faith, console her children and maintain her husband’s historic legacy, as well as the world of Camelot that they created. Jackie stars Natalie Portman, Peter Sarsgaard, Billy Crudup and Greta Gerwig.

The script, by Noah Oppenheim, is nonlinear. It opens with an interview between Jackie and an unnamed journalist from Life magazine just a few days after the assassination and transitions to earlier events as the narrative unfolds. The 100-minute film was shot in France and Washington, DC, on Kodak Vision3 Super 16mm film with an Arriflex 416 Plus camera. It was finished as a 2K DI in an aspect ratio of 1.66:1, which matches the 4×3 archival footage more convincingly than a widescreen format would.

The film is already getting award attention. Portman (Jackie) was nominated for a Gotham Independent Film Award for best actress, Larraín won the Platform Prize at the 2016 Toronto International Film Festival, and Oppenheim’s screenplay received the Golden Osella at the 2016 Venice Film Festival. The director was also nominated for the Golden Lion for best film at the latter festival.

The Edit
Sepúlveda (who has been nominated for a Spirit Award for his work on Jackie) previously edited Larraín’s Spanish-language film The Club and has collaborated with his friend on previous films. “I shaped Jackie’s unconventional narrative into a seamless story and dove into Larraín’s exploration of her internal reality — the emotional, enigmatic core of the most unknown woman in the world,” he explains. “I found emotional bridges to stitch the piece together in a format that’s bold, innovative and not taught in film school — it is organic to the movie and very much in sync with Larraín’s creative process.”

Sepúlveda identified four key layers to the narrative: ongoing interview sequences at Hyannis Port that provide insight into the lead character's frail emotional state; a reconstruction of the landmark White House tour that the First Lady hosted for CBS Television in 1962; sequences with an Irish Catholic priest (John Hurt) that explore her inevitable crisis of faith; and the assassination and harrowing high-speed exit from Dealey Plaza in Dallas on November 22, 1963.

“I navigated the edit by staying true to Jacqueline Kennedy’s emotional core, which was the primary through-line of the director’s approach to the movie. We had to bring to life a structure that went back and forth across many layers of time,” he says. “The film starts with a more classical interview of Mrs. Kennedy by a magazine journalist just after the tragedy. Then we have the White House tour in flashback, and then the day in Dallas where JFK was murdered. So that was tricky. We also had extreme close-ups of Natalie [Portman] in almost every scene, which we used not only to see the story from her point of view, but also to observe every detail of her expression following the nightmare the former First Lady had to go through.”

Sepúlveda, who works most often in Apple Final Cut Pro, was given an Avid Media Composer for this film. He says his biggest challenge in the editing suite was honoring the four identified layers throughout the complex cut. “It was very hard to balance all the facts, but also to give life to the film. I did a very quick edit in a week by keeping the structure very simple. I then went back and refined the edit while still honoring the basic shape. The use of extreme close-ups and medium shots let me keep [the First Lady] at the center of attention, and to make sure that the editing was not obtrusive to that vision of a sad, melancholic feel.

Photo by William Gray

"And we had the gorgeous, incredible Natalie Portman, who plays with her eyes in a way that you cannot read so easily. It puts the character into a more mysterious perspective. You think in one scene that you understand the character, then comes the next scene and… boom! Natalie shows you another part of this complex character. Finally, you cannot pick which one is Jacqueline Kennedy, since all those different aspects of the character are the First Lady. We had to build the structure — the bridges between the scenes — guided only by this emotional path."

Eye Contact
Both Larraín and Sepúlveda subscribe to the old adage that the eyes are the windows to the soul, and arranged their cut around that conviction. "When we started the edit, after studying the rushes, Pablo and I had a conversation — maybe the most important and interesting part of the process for me — about the eyes," says Sepúlveda. "For us, they built the entire emotional path of the storytelling process, because the viewer is always trying to read what's behind the eyes. You can try to bluff with a facial expression, but our eyes are there to show things that you don't want to say.

“As an audience member you are trying to go deeper into the character,” the editor continues, “but always find the unexpected. You become emotionally involved with this figure while wanting to know more about her. Your imagination is engaged, playing with the film. For me, that’s pure cinema.”

Sepúlveda considers the process a throwback to the New Wave, or La Nouvelle Vague, film period of the late 1950s and 1960s. Although never a formally organized movement, the New Wave filmmakers rejected the literary period pieces that French novelists were writing and elected instead to shoot current social issues on location, chronicling social and political upheaval through radical experiments in editing, visual style and narrative, in a more documentary style built on fragmented, discontinuous cutting and long takes.

Photo by Pablo Larraín

And not all scenes in Jackie involve complex cross editing. An example is the scene in the White House where Jacqueline Kennedy strips off her blood-stained clothing to the ironic accompaniment of the title song from the Broadway musical Camelot, sung by Richard Burton. "It was the first time she had been alone, and we had a number of long shots to emphasize that isolation; she was walking like a ghost, dropping clothes as she went from room to room — almost as if she was changing her skin — with several two- to three-minute takes," describes Sepúlveda. "The music also recedes as if it was coming from an adjacent room, to add to the sense of separation, and the haunting loss of the sense of [JFK's] Camelot — the dream was broken. This was not the same Jacqueline Kennedy known by the public."

Because he has young daughters, editing a film about this powerful, vulnerable, creative First Lady was important to the Chilean-born editor. "Given our current political situation here in the States, which obviously has ripple effects beyond our borders, I think we need a little Jackie love and magic right now," he says.

“As the father of two little girls I know that they don’t have the same opportunities as the boys, and that scares me. To participate in a film in which the main character is a woman who had to make important decisions for her country in a moment of political and personal crisis, is ethically important to me. Because, obviously, it was an extremely traumatic time for Jacqueline Kennedy, the idea was to create a seamless edit that could evoke how human memory works under trauma. In this case, we approached it like small glimpses of that period of the First Lady’s life. For me, it was very important to keep the audience emotionally involved with the main character, to almost participate in her experience and, ultimately, to empathize with her. It’s a portrait of grief but we also appreciate, ultimately, how she persevered and overcame it.”

An Editor’s Background
An experienced cinematographer, writer and director, Sepúlveda has enjoyed an eclectic career whose vocations inform each other and also reflect a sometimes-stressful home life. "My family was exiled from Chile because of Pinochet's coup d'état in 1973. My mother was a university professor and supported the Allende government. We lived in five countries — France, Venezuela, Argentina, Switzerland and Spain — but it was a very beautiful childhood: living in Venezuela, discovering the Amazon rainforest; living in Argentina when democracy returned in the eighties; attending public school in France and getting my dose of republican values. I studied history at a Chilean university, editing in Cuba and scriptwriting in Paris. I really like to work on different aspects of a film."

In 2007, he worked in France as a film editor, and returned to Chile for vacations. “Pablo was editing Tony Manero, and invited me to give them feedback. It was a first cut, but astonishing. I was shocked in a positive way. We had a pleasant conversation about possible ways to build the film. Then I moved back to Chile and Pablo’s brother Juan invited me to work with them. I started as a script doctor for films and TV series they produced, edited some feature films, and also wrote some script treatments for Pablo. His company, Fabula, produced my first feature film as a director, Las Niñas Quispe (2013), which premiered at Venice Critics Week,” he concludes. “It’s been an amazing journey.”

—————
Mel Lambert is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

Review: Tangent Ripple color correction panel

By Brady Betzel

Lately, it feels like a lot of the specializations in post production are becoming generalized and given to the “editor.” One of the hats that the editor now wears is that of color corrector — I’m not saying we are tasked with color grading an entire film, but we are asked to make things warmer or cooler or to add saturation.

With the standard Wacom tablet, keyboard and/or mouse combo, color correcting can get a little tedious — in Adobe Premiere, Blackmagic Resolve or Avid Media Composer/Symphony — without specialized color correction panels like the Baselight Blackboard, Resolve Advanced, Nucoda Precision, Avid Artist Color or even Tangent's Element. In addition, those specialized panels run from around $1,000 apiece to upwards of $30,000, leaving many people to fend for themselves with a mouse.

While color correcting with a mouse isn’t always horrible, once you use a proper color correction panel, you will always feel like you are missing a vital tool. But don’t worry! Tangent has released a new color correction panel that is not only affordable and compatible with many of today’s popular coloring and nonlinear editing apps, but is also extremely portable: the Tangent Ripple.

For this review I am covering how the Tangent Ripple works inside of Premiere Pro CC 2015.3, FilmLight's Baselight plug-in for Media Composer/Symphony and Resolve 12.5.

One thing I always found intimidating about color correction and grading apps like Resolve was the abundance of options for correcting or grading an image. The Tangent Ripple covers the very first, most basic steps in the color correction pipeline: color balancing using lift, gamma and gain (or shadows, midtones and highlights), plus exposure/contrast correction. I am way over-simplifying these first few steps, but they are what the Ripple specializes in.
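
To make those terms concrete, here is a rough Python sketch of what the three controls do to a single pixel value. It follows the common ASC CDL-style formulation (lift as an offset, gain as a scale, gamma as a power curve); the exact math varies from app to app, so treat this as an illustration rather than what Tangent or any host application actually computes.

# Rough illustration of lift/gamma/gain on a normalized (0-1) pixel value,
# using ASC CDL-style math. Not any specific app's formula.
def lift_gamma_gain(value, lift=0.0, gamma=1.0, gain=1.0):
    v = value * gain + lift          # gain scales the signal, lift offsets the shadows
    v = min(max(v, 0.0), 1.0)        # clamp to the legal range
    return v ** (1.0 / gamma)        # gamma bends the midtones

# Example: a gentle lift of a mid-gray value
print(lift_gamma_gain(0.5, lift=0.02, gamma=1.2, gain=1.05))  # ~0.60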

You've probably heard of the Tangent Element panels, which go way beyond the basics — if you start to love grading with the Tangent Ripple or the Element-Vs app, the Element set should be your next step. The set retails for around $3,500 (you can purchase the Element panels individually for less, but the set is worth it). The Tangent Ripple retails for only $350.

Basic Color Correction
If you are an offline editor who wants to add life to your footage quickly, basic color correction is where you will be concentrating, and the Ripple is a tool you need to purchase. Whether you color correct your footage for cuts that go to a network executive, or you are the editor and finisher on a project and want to give your footage the finishing touch, you should check out what a little contrast, saturation and exposure correction can do.

You can find some great basic color correcting tutorials on YouTube, Lynda.com and color correction-focused sites like MixingLight.com. On YouTube, Casey Faris has some quick and succinct color correction tutorials; check him out here. Ripple Training also has quick Resolve-focused tips posted somewhat weekly by Alexis Van Hurkman.

When you open the Tangent Ripple box you get an instruction manual, the Ripple, three trackballs and some carrying pouches to keep it all protected. The Ripple has a five-foot USB cable hardwired into it, but the trackballs sit loose and do not lock into place. If you asked a Ripple user to read you the serial number on the bottom of the panel, they would most likely turn it over and drop all three trackballs. Obviously, this could wreck the trackballs and/or injure someone, so don't do it, but you get my point.

The Ripple itself is very simple in layout: three trackballs, three dials above the trackballs, “A” and “B” buttons and revert buttons next to the dials. That is it! If you are looking for more than that, you should take a look at the Element panels.

After you plug the Ripple into an open USB port, you should download the Tangent Hub software. This also installs the Tangent Mapper, which allows you to customize your buttons in apps like Premiere Pro. Unfortunately, Resolve and the Media Composer Baselight plug-in do not allow for customization, but when you install the software you get a nice HUD that shows what each Ripple button and knob does in the software you are using.

If you are like me and your first introduction to the wonderful world of color correction in an NLE was Avid Symphony, you might have also encountered the Avid Artist Color panel, which is very similar in functionality: three balls and a couple of knobs. Unfortunately, I found that the Artist Color never really worked like it should within Symphony. Here is a bit of interesting news: while you can't use the Ripple in the native Symphony color corrector, you can use external panels in the Baselight Avid plug-in. Finally, a solution! And it is really, really responsive to the Tangent Ripple; the panel works great inside of a Media Composer plug-in.

The Ripple was very responsive, much more so than what I've experienced with the Avid Artist Color panel. As I mentioned earlier, the Ripple covers the basics of color correcting — you can fix color balance issues and adjust exposure. It does a few things well, and that is it. To my surprise, when I added a shape (a mask used in color correction) in Baselight, I was able to adjust the size, points and position of the shape using the Ripple. In the curves dialog I was able to add, move and adjust points. Not only does Baselight change the game for powerful in-Avid color correction, but a tool like the Ripple puts color correction within any editor's grasp. I was really shocked at how well it worked.

When using the Ripple in Resolve you get what Resolve wants to give you. The Ripple is great for basic corrections inside of Resolve, but if you want to dive further into the awesomeness of color correction, you are going to want to invest in the Tangent Element panels.

With the Ripple inside of Resolve, you get the basic lift, gamma and gain controls along with the color wheels, a bypass button and reset buttons for each control. The “A” button doesn’t do anything, which is kind of funny to me. Unlike the Baselight Avid plug-in, you cannot adjust shapes, or do much else with the Ripple panel other than the basics.

Element-Vs
Another option that took me by surprise was Element-Vs, Tangent's app for iOS and Android. I expected this app to really underwhelm me, but I was wrong. Element-Vs acts as an extension of your Ripple, modeled on the Tangent Element panels. But keep in mind, it's still an app, and there is nothing comparable to the tactile feeling and response you get from a panel like the Ripple or the Elements. Nonetheless, I used the Element-Vs app on an iPad Mini and it was surprisingly great.

It is a bit pricey for an app, coming in at around $100, but I got a really great response when cycling through the different Element "panels," which leads me to think the Ripple and Element-Vs combo is a real contender for the prosumer colorist. At a total of $450 ($350 for the Ripple and $100 for the Element-Vs app), you get within shouting distance of what a $3,000-plus set of panels offers.

As I said earlier, the Element panels have a great tactile feel and feedback that, at the moment, no app can quite match, but this combo isn't as shabby as I thought it would be. A welcome surprise was that installation and connection were pretty simple too.

Premiere Pro
The last app I wanted to test was Premiere Pro CC. Adobe recently added external color panel support in version 2015.3. In fact, Premiere has the most functionality and map-ability of all the apps I tested — it was an eye-opening experience for me. When I first started using the Lumetri color correction tools inside of Premiere, I was a little bewildered and lost, as the setup was different from what I was used to in other color correction apps.

I stuck to basic color corrections inside of Premiere, and would export an XML or flat QuickTime file to do more work inside of Resolve. Using the Ripple with Premiere changed how I felt about the Lumetri color correction features. When you open Premiere Pro CC 2015.3 along with the Tangent Mapper, the top row of tabs opens up. You can customize not only the standard functions of the Ripple within each Lumetri panel, like Basic, Creative, Curves, Color Wheels, HSL Secondaries and Vignette, but you can also create an alternate set of functions when you press the “A” button.

In my opinion, the best button press on the Ripple is the "B" button, which cycles you through the Lumetri panels. In the Vignette panel, for example, the Ripple gives you controls like Vignette Amount, Vignette Midpoint, Feather and Vignette Roundness.

As a side note, one complaint I have about the Ripple is that there isn’t a dedicated “bypass” button. I know that each app has different button designations and that Tangent wants to keep the Ripple as simple as possible, but many people constantly toggle the bypass function.

Not all hope is lost, however. Inside of Premiere, if you hold the "A" button for alternate mapping and hit the "B" button, you will toggle bypass off and on. While editing in Premiere, I used the Ripple to make color adjustments even when the Lumetri panel wasn't on screen. I could cycle through the different Lumetri tabs, make adjustments and quickly get back to editing from the keyboard — an awesome feature that both Tangent and Adobe should be promoting more, in my opinion.

It seems Tangent worked very closely with Adobe when creating the Ripple. Maybe it is just a coincidence, but it really feels like this is the Adobe Premiere Pro CC Tangent Ripple. Of course, you can also use the Element-Vs app in conjunction with the Ripple, but in Premiere I would say you don’t need it. The Ripple takes care of almost everything for you.

One drawback I noticed when using the Ripple and Element-Vs inside of Premiere Pro was a small delay when compared to using these inside of Resolve and Baselight’s Media Composer plug-in. Not a huge delay, but a slight hesitation — nothing that would make me not buy the Ripple, but something you should know.

Summing Up
Overall, I really love the Ripple color correction panel from Tangent. At $350, there is nothing better. The Ripple feels like it was created for editors looking to dive deep into Premiere’s Lumetri color controls and allows you to be more creative because of it.

Physically, the Ripple has a lighter, more plastic feel than its Element Tk panel brother, but it still works great. If you need something light and compact, the Ripple is a great addition to your Starbucks-based color correction setup.

I do wish there was a little more space between the trackballs and the rotary dials. When using the dials, I kept nudging the trackballs and sometimes I didn’t even realize what had happened. However, since the Ripple is made to be compact, lightweight, mobile and priced to beat every other panel on the market, I can forgive this.

It feels like Tangent worked really hard to make the Ripple feel like a natural extension of your keyboard. I know I sound like a broken record, but saving time makes me money, and the Tangent Ripple color correction panel saves me time. If you are an editor who has to color correct and grade dailies, an assistant editor looking to up their color correction game or an all-around post production ninja who dabbles in different areas of expertise, the Tangent Ripple is the next tool you need to buy.


Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

The Colonie provides editing, VFX for Toyota Corolla spot

Chicago’s The Colonie has teamed with Burrell Communications to provide editorial, visual effects and design services for I Do, a broadcast spot introducing the 2017 Toyota Corolla.

Creative editor Bob Ackerman edited with the carmaker’s tagline, “Let’s Go Places,” in mind. Fast-paced cuts and editing effects helped create an upbeat message that celebrates the new Corolla, as well as its target audience — what Toyota refers to as today’s “on-the-go” generation of young adults.

Lewis Williams, Burrell’s EVP/CCO, brought The Colonie onboard early in the process to collaborate with the production company The Cavalry, ensuring the seamless integration of a variety of visual effects that were central to the style of this spot.

The commercial integrates three distinct vignettes. The spot opens with a young woman behind the wheel of her Toyota. She arrives at a city park and her friends help her yarn bomb the surroundings — from hand-knitted tree trunk covers to a slipcover for a love seat and a garbage pail cozy in the likeness of whimsical characters.

From the get-go, art director Winston Cheung was very focused on keeping the tone of the spot fresh and young. When selecting footage during the edit session, Ackerman and Cheung made sure to use some of the more playful set-ups from the yarn vignette to provide the bold color palette for the final transfer.

The second scenario finds an enterprising man parking his Corolla and unloading his "Pop-Up Barbershop" in front of a tall wall featuring artful graffiti. A well-placed painting of a young man's face extending over the top of the wall completes the picture. As soon as the barber sets up his chair, his first customer arrives.

The third vignette features a young filmmaker shooting footage of the 2017 Toyota as her crew adds some illuminating effects. Taking her cues from this scene, The Colonie senior designer Jen Moody crafted a series of shots that use a "light painting" technique to create a trail-of-light effect. One of the characters writes the spot's title, I Do, with a light, which Moody layered to create a more tangible quality that really sells the effect. VFX supervisor Tom Dernulc took a classic Toyota Corolla from a previous segment and seamlessly integrated it into the background of the scene.

The Colonie’s team explored several methods for creating the various VFX in the spot before deciding upon a combination of Autodesk Flame Premium and Adobe After Effects. Then it was a matter of picking the right moments. Ackerman grabbed some of their top choices, roughed in the effect on the Avid Media Composer, and presented the client with a nearly finished look right from the very first rough cuts.

"Early on, creative director Lisa McConnell had expressed a desire to explore using a series of flashing stills (à la TV's Scandal) to advance the spot's story," says Ackerman. "We loved the idea. Condensing short sequences of footage into rapid progressions of imagery gave us an innovative way to convey the full scope of these three scenarios in a very limited 30-second time frame — while also adding an interesting visual element to the final spot."

Fred Keller of Chicago’s Filmworkers provided the color grade, CRC’s Ian Scott performed the audio mix and sound design, and composers Mike Dragovic, Michael Yessian and Brian Yessian provided the score.

Review: Avid Media Composer 8.5 and 8.6

By Brady Betzel

It seems that nonlinear editing systems like Adobe Premiere, Apple FCP X, Lightworks, Vegas and Blackmagic Resolve are being updated almost weekly. At first, I was overjoyed with the frequent updates. It got to the point where I would see a new codec released on a Monday and by Friday you could edit with it (maybe a slight exaggeration, but pretty close to the truth). Unfortunately, this didn't always mean the updates would work.

One thing that I have learned over the last decade is that reliable software is worth its weight in gold, and one NLE that has always been reliable in my work is Avid Media Composer. While Media Composer isn’t updated weekly, it has been picking up steam and has really given its competitors a run for their money.

With Avid Media Composer's latest updates, from 8.5 all the way through 8.6.1, we are seeing the true progression of THE gold standard in nonlinear editing software. From changes that editors have been requesting for years, like the ability to add a new track to the timeline by simply dragging a clip, to selecting all clips with the same source clip color in the timeline (an online editor's dream — or maybe just mine), Media Composer is definitely heading in the right direction. Once Avid fixes the remaining weak spots, such as the Title Tool, I am sure many others will be in the same boat I am. Even with Adobe's recent Team Projects news, I think Avid's project sharing will remain on top. But don't get me wrong: I love the competition and believe it's healthy for the industry in general.

Digging In
So how great are the latest updates in Media Composer? I am going to touch on a few that really make our lives as editors easier and more efficient, including the new Source Browser, the Preset Manager for creating custom-sized projects, Audio Channel Grouping and grouping clips by audio waveform, among others.

For simplicity's sake I won't point out which update contained exactly what, so let's just assume we are both talking about 8.6.1. Even though 8.6.2 was released, it was subsequently pulled because of a bad installer and replaced by 8.6.3. Long story short, I did this review right before 8.6.3 was released, so I am sticking to 8.6.1. You can find the read-me file for any 8.6.3-related bug fixes and feature updates, including realtime EQ and AudioSuite effects.

Source Browser
Let's take a look at the new Source Browser first. If you have worked in Premiere Pro before, then you are basically familiar with what the Source Browser does. Simply put, it's a live access point within Media Composer where you can either link to media (think AMA) or import media (the traditional way). The Source Browser is great because you can leave it open if you want, or close it and reopen it whenever you like. One thing I found funny is that there is no default shortcut to open the Source Browser — you have to map it manually.

Nonetheless, it's a fast way to load media into your Source Monitor without bringing the media into a bin. It even has a Favorites tab to keep all of the media you access on a regular basis in the same place — a good spot for transition effects, graphics, sound effects and even music cues that you find yourself using a lot. The Source Browser can be found under the Tools menu. While I've seen some complaints about the menu reorganization and the new Source Browser, I like what Avid has done. The updated layout and optimized menu items seem like a good fit for me; it will just take a little time to get used to.

Up next is my favorite update to Media Composer since I discovered the Select Right and Select Left commands without Filler, and how to properly use the extend edit function: selecting clips in the timeline based on source color. If you've ever had to separate clips onto different video and audio tracks, you will understand the monotony and pain that a lot of assistant editors and conforming editors go through. Let's say you have stock footage mixed in with over two hours of shot footage, and you want to quickly raise all of the stock clips onto their own video layer. Previously, you would have to move each clip individually using the segment tool (or Shift + click a bunch of clips), but now you can select every clip with the same source color at once.

Color Spaces

First, I recommend enabling Source Color in your timeline, although you don't have to for this to work. Second, use either the red or yellow segment tool, or Alt (Option) + click and drag from left to right over a clip with the color you want to select throughout the timeline. Once the clip is selected, right-click on it. Under the Select menu, click Clips with the Same Source Color. Every clip with that same color will be selected, and you can Shift + Ctrl drag the clips to a new track. Make this a shortcut and holy cow — imagine the time you will save!

Immediately, I think of trouble shots that might need a specific color correction or image restoration applied to them, like a dead pixel that appears throughout a sequence. In the bin, color the trouble clips one color, select them all in the timeline and, bam, you are ready to go, quickly and easily. This update is a game changer for me. Under the Select menu you will see a few other options, like Select Offline Clips, Select Clips with No Source Color, Select Clips with Same Local Color and even Reverse Selection.
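
Conceptually, the feature is nothing more than a filter over timeline segments. The toy Python sketch below shows that logic; the Segment class and the timeline list are invented purely for illustration (Media Composer doesn't expose anything like this).

from dataclasses import dataclass

# Toy model of "Select Clips with the Same Source Color": keep every timeline
# segment whose master clip carries the same bin color as the one clicked.
@dataclass
class Segment:
    name: str
    source_color: str  # color assigned to the master clip in the bin

def select_same_source_color(timeline, picked):
    return [s for s in timeline if s.source_color == picked.source_color]

timeline = [Segment("stock_01", "red"), Segment("int_house_wide", "none"),
            Segment("stock_02", "red")]
print(select_same_source_color(timeline, timeline[0]))  # both stock clips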

Audio
Now let's jump into the audio updates. First off is the nesting of audio effects. I mean, come on! How many times have I wanted to apply a Vari-Fi effect at the end of a music cue and add D-Verb on top of it?! Now I can create all sorts of slowed-down promo/sizzle-reel madness that a mixer will hate me for, without locking myself into a decision!

I tried this a few times expecting my Media Composer to crash, but it worked like a champ. Previously, as a workaround, I had to mixdown the Vari-Fi audio (locking me into that audio with no easy way of going back) and apply the D-Verb to the audio mixdown. This isn’t the cleanest workflow but it guaranteed my Vari-Fi would make it into the mix. Now I guess I will have to trust the mixer to not strip my audio effects off of the AAF we send them.

Digging a bit further into the audio updates for Media Composer 8.5 and 8.6, I found the ability to add up to 64 tracks of audio and, more specifically, 64 voices. Those 64 voices can be laid out in combinations such as 64 mono tracks, 32 stereo tracks, ten 5.1 tracks plus four mono tracks (60 + 4 voices) or eight 7.1 tracks (8 x 8 voices).

Nested Audio

Let's be honest — from one editor to another — do we really need all 64 tracks of audio? I urge you to use this sparingly, and only if you have to. No one wants to scroll through 64 tracks of audio. I am hesitant to totally embrace it because, while it is an incredible update to Media Composer, it allows editors to be sloppy, and nobody has time for that. Also, older versions of Media Composer won't be able to open your sequence, since the feature is not backward compatible.

My second favorite update in Media Composer is Audio Groups. I am a pretty organized (a.k.a. obsessive-compulsive) editor, and with my audio I typically lay out voiceover and ADR on tracks 1-2, dialogue on 3-6, sound effects on 7-12 and music on 13-16.

These have to be fluid, and I find that this layout fits my screen real estate well. It keeps my audio edit as tidy as possible, although now, with 64 tracks, I can obviously expand. But one thing that always sucked was having to mute each track individually or all at once. Now, in the Audio Mixer, you can easily create groups of audio tracks that can be enabled and disabled with one click instead of selecting each audio track individually. For instance, I can group all of my music tracks together and toggle them off and on with one checkbox. In the Audio Mixer there is a small arrow on the upper left that you twirl down; select the audio tracks you want to group, such as tracks 13-16 for music, right-click, click Create New Group, name it and there you go — audio track selection glory.

Audio Ducking

Last in the audio updates is Audio Ducking. When I think of Audio Ducking I think of having a track of voiceover or ADR over the top of a music bed. Typically, I would go through and either add audio keyframes where I need to lower the music bed or create add edits, lower the audio in the mixer, apply a dissolve between edits and repeat throughout the segment.

Avid has really stepped up its game with Audio Ducking: now I can specify which of my dialogue tracks I want Avid to listen to when ducking the music bed. You can even twirl down the advanced settings and adjust threshold and hold time for the dialogue tracks, as well as attenuation and ramp time for the music bed tracks. I tried it and it worked. I won't go as far as to say you should use it instead of doing your own music edits, but it is an interesting feature that may help a few people.
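
For anyone curious what a ducker does under the hood, the sketch below is a minimal, generic sidechain gain rider in Python. It mirrors the controls Avid exposes (threshold, hold, attenuation and ramp), but it is an illustration of the general technique, not Avid's implementation, and the sample-based units are assumptions.

# Minimal sidechain ducking sketch: lower the music gain while the dialogue
# envelope is above a threshold, hold it down briefly, then ramp back up.
def duck_music(dialogue_env, music, threshold=0.1, hold=10,
               attenuation=0.3, ramp=5):
    out, gain, hold_left = [], 1.0, 0
    step = (1.0 - attenuation) / ramp            # gain change per sample
    for d, m in zip(dialogue_env, music):
        if d > threshold:
            hold_left = hold                     # dialogue present: refresh the hold
        if hold_left > 0:
            gain = max(attenuation, gain - step) # ramp down toward the duck level
            hold_left -= 1
        else:
            gain = min(1.0, gain + step)         # dialogue gone: ramp back to unity
        out.append(m * gain)
    return out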

Wait, There’s More
There were a few straggling updates I didn't touch on that you will want to check out. Avid has added support for HDR color spaces, such as RGB DCI-P3, RGB 2020 and many more. Once I get my hands on some sweet HDR footage (and the equipment to monitor it), I will dabble in that space.

Also, you can now group footage by audio waveform. While grouping by audio waveform is an awesome addition, especially if you have previously used Red Giant's PluralEyes and feel left out because they discontinued AAF support, it lacks a few adjustments that I find absolutely necessary when working with hours upon hours of footage. For instance, I would love to be able to manually sync clips whose audio isn't loud enough for Avid to discern properly and create a group. Even more, for all grouping I would really love to be able to adjust a group after it has been created. If I could alter a group inside of a sequence and have those changes immediately reflected in the group itself, I — along with about a million other editors and assistant editors — would jump for joy.
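
For the technically curious, waveform-based syncing generally comes down to cross-correlation: slide one clip's audio against another's and pick the offset where they line up best. The NumPy sketch below illustrates that general idea; it is not Avid's (or PluralEyes') actual algorithm.

import numpy as np

# Sketch of waveform sync via cross-correlation: find the offset (in seconds)
# at which `other` best lines up with `ref`.
def find_offset(ref, other, sample_rate=48000):
    corr = np.correlate(other, ref, mode="full")  # slide one clip across the other
    lag = np.argmax(corr) - (len(ref) - 1)        # best-matching lag in samples
    return lag / sample_rate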

Lastly, the Effects tab has been improved with Quick Find searchability. Type in the effect you are looking for and it will pop up. This is another game-changing feature for me.

Summing Up
For a while there I thought Avid was satisfied to stay the course in 1080p land, but luckily that isn't the case. They have added resolution independence and custom project resolutions, and while they have added features to Media Composer — like the Source Browser and the ever-improving FrameFlex — they have kept their project sharing and rock-solid media management at the top level.

Even after all of these updates, there are still some features I would love to see. Those include built-in composite modes in the Effects Palette; editable groups; an improved Title Tool that works in 4K and higher resolutions without going to a third party for support; updated Symphony color correction tools; smart mixdowns that can inherit alpha channels; the ability to disable video layers but still see all the layers above and below; and many more.

If I had to use just one word to describe Media Composer, it would be reliable. I love reliability more than fancy features. And now that you've heard my Media Composer review, you can commence trolling on Twitter @allbetzroff.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Editor Josh Beal on Netflix’s ‘Bloodline’

By Randi Altman

Looks are deceiving, and that familiar saying is at the heart of Netflix’s dramatic series Bloodline. The show focuses on the Rayburns, a respected family that runs a popular and long-standing beachfront hotel in the Florida Keys.

On the surface, they are pillars of the community and a perfect family, but when you dig below the surface they are a mess — long-standing family secrets, a black sheep with a criminal record, drug use, alcoholism. It’s all there.

Josh Beal at his stand-up desk.

So, if you are into intrigue, drug cartels, paranoia, gorgeous scenery and damn-good acting (Kyle Chandler was just nominated for a Lead Actor Emmy for his role as John, and Ben Mendelsohn got a nod for Supporting Actor for his role as Danny), the show’s first two seasons are available now for streaming.

Josh Beal joined the editing team of Bloodline around Episode 7 of Season 1 as the show’s only LA-based editor. The other three were in New York — Aaron Kuhn, Deborah Moran and Naomi Geraghty — working with East Coast-based show producers Daniel Zelman and Todd Kessler. Beal, at the time based at the Sony lot in Culver City, worked closely with producer Glenn Kessler, who was headquartered in Los Angeles.

Coming into Season 2, Glenn Kessler was the one who was going to be driving the episodes through editorial, so they moved the whole thing to LA. While Beal hesitates to be called the show’s main editor, he was the first one on and did cut the majority of episodes for Season 2, which is shot on a Sony F55 camera in 4K by DP Jaime Reynoso.

I reached out to editing veteran Beal, who uses the Avid Media Composer, to find out more about his process and the show’s workflow.

How many episodes would you say you cut for the second season?
I’m credited with four, but Bloodline is a unique show in that all of the editors — Lynne Willingham, Louise Innes, Sue Blainey and Michelle Tesoro — may end up working to some extent on episodes where they don’t get final credit. I’m credited as the primary editor on the second season’s last episode, but the other editors helped me out with it. That’s usually due to issues surrounding scheduling and the workload.

When did you start getting footage?
We received footage as soon as they started shooting. If they were shooting today I would be getting footage tomorrow morning, and it would just go like that through the production.

Camera files were uploaded to Sony’s 24p Dailies Lab from the shoot in Florida. The Avid media was then copied to a portable hard drive and delivered to our assistant editors each morning by our post assistant. Our assistant editors would then load the media onto our Avid ISIS and prep scenes for the editors.

Can you talk us through your workflow?
It followed a pretty typical workflow for episodic television. I would try to stay close to camera with the dailies and cut scenes as they were shot. Then I had a couple of days to put together my cut of the episode, which I would present to the director. The director would then get four days for the director’s cut, which would then be shown to producers. The producers then shared their cut with Sony and Netflix.

Often, in my case, the distinction between director's cut and producer's cut would start to break down, because all but one of the episodes I worked on for Season 2 were directed by one of the producers. The typical division between those phases of the edit ceased to really apply, and it all sort of became one long director's cut, which was actually really great creatively because of that continuity.

The direction and feel of the show was established during season one. Can you talk about that?
One of the defining characteristics of the mood in Bloodline is a voyeuristic feel to the way a scene might be covered, and therefore the way a scene would be put together.

Sometimes behind-the-bushes kinds of shots pop up in the middle of a scene in a way that is a little atypical. It gives the viewer a sense that they are always being watched, which is especially applicable thematically to Season 2. It heightens the sense of paranoia and unease. So editing-wise, it's about finding a way to use that material and maximize that feeling while at the same time keeping the audience in the character's perspective, keeping them in the scene.

As Season 2 progresses, John Rayburn’s world starts falling apart — you could feel his desperation. How did you handle that from an editing perspective?
At that stage, it’s trying to stay rooted in a character’s perspective and leaning into it and finding performance. Any decisions that you are making in terms of shot selection begin and end in large part with performance.

For example, with Kyle Chandler (John), I thought he was doing incredible work. Sometimes it was just about getting out of his way; it’s about watching the performance and trying to make that work first. It’s not so much a visual thing.

You edited the show on Media Composer. Do you use the script tool to help pick different takes?
I did for the first season, but I've never felt 100% comfortable with that tool. I did something else to help me with the sorts of problems that ScriptSync is intended to solve.

Can you talk about that a bit?
One of the challenges is that you'll have exceptionally long dialogue scenes: two people sitting in an interrogation room speaking to one another, for example. That kind of scene offers very few visual cues as to where you are within the text, and that is often what an editor is going off of when looking for a specific line. If you want to audition line readings, especially when you are working with the producer in the room, you are frequently relying on these visual cues, but you don't have many when it's just people at a table.

My solution: every time someone spoke in the script, I'd put a number next to that chunk of dialogue. If a character's name was attached to dialogue within the script, it got a number. Then I would put a locator in my clip at that point — L01, L02, L03 — all the way through the script. Then I would throw all the clips into a KEM roll in the Avid and sort it by marker name in the marker window. I can also make notes as to whether or not I like a given reading over another; I can just see it listed in my description.

In fact, KZK (producers Glenn, Daniel and Todd) do something that I don’t often run into on other shows. They will write alternative lines into the script, and they’ll shoot it both ways. Then the decision gets made later as to which way it’s going to go in editorial.
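
The numbering scheme Beal describes is mechanical enough to script. The Python sketch below is purely hypothetical (the character-cue pattern and the plain-text script format are assumptions), but it shows how each chunk of dialogue could be assigned the same L01, L02, L03 labels he drops into his clips as locators.

import re

# Hypothetical sketch of the line-numbering scheme described above: walk a
# plain-text script and give each chunk of dialogue a sequential label
# (L01, L02, ...) matching the locator names set in the Avid.
CHARACTER_CUE = re.compile(r"^[A-Z][A-Z .']+$")  # e.g. "JOHN" (assumed script format)

def number_dialogue(script_lines):
    labels, count = [], 0
    for line in script_lines:
        if CHARACTER_CUE.match(line.strip()):
            count += 1
            labels.append((f"L{count:02d}", line.strip()))
    return labels

print(number_dialogue(["JOHN", "Where were you?", "MEG", "Out."]))
# [('L01', 'JOHN'), ('L02', 'MEG')]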

Season 2 has a ton of flashback sequences. Did you have to tackle that in a different way?
One of the things that I appreciate about the way that flashbacks are handled on Bloodline is that they don’t talk down to the audience. We aren’t throwing some sepia tone effect on the flashbacks. There aren’t any flashy visual transitions in and out of the flashbacks. We are assuming that the audience will know pretty quickly where they are at within the timeline based on context.

In watching Bloodline, it almost seems sweat and alcohol play characters on the show.
Yes, it’s deeply threaded throughout the world created for the show. The sweat, I suspect, is simply unavoidable when it comes to shooting in the Keys. It is an element, however, that really helps sell the sense of place. It feels real because it is. It’s part of the atmosphere and is also something that perfectly dovetails into John and his siblings’ growing feelings of stress and dread.

The drinking was something that came up in conversation with producers and directors when we were chatting in the cutting room. All reports I heard are that there is indeed a real drinking culture down there. It's just another detail that adds to the strong sense of place the show excels at presenting.

Is there a particular scene that stands out to you as more challenging than others?
In Season 2, there is an episode where (Rayburn siblings) Kevin and Meg were being interrogated again by (police officer) Marco. They are very long scenes, just back and forth between Kevin and Marco and then between Meg and Marco.

The audience thinks that Kevin is going to blow it and that Meg has her shit together, but she is the one that steps in it and puts everything at risk. I really liked those sequences, but they are hard to cut because it’s just two people in a room talking and yet the scenes had to have a shape. They had to twist and turn based on subtleties of performance. I often find those to be among the more challenging scenes because they can begin in a very unruly state and it’ll take time to carve them out just right.

When Glenn first saw those scenes he wanted to put a slightly different emphasis and rhythm to some of the performance — a slightly different intent to come across in the interaction between the characters. These were scenes that we worked on and modified quite a bit for the better as we went along.

Do you have a general philosophy about editing?
I guess my philosophy is not to get lost in the weeds and to try to keep perspective on the story. Try to always keep the big picture in mind, because there is all the minutiae of editing: How can I get this cut to match? Or, how can I cut out this section I'm being asked to remove without making the whole thing seem clunky? That type of thing is important, but it can't be the only way you look at the cut.

What’s next for you?
I’m working on a new show starring Pierce Brosnan based on a novel called The Son. It’ll air on AMC next year. It’s like a great American history lesson that follows a South Texas family over the course of multiple generations. It begins with the story of the family patriarch who builds a financial empire first in the cattle industry and later in oil. The storyline goes all the way up to the present.

Is there anything you would like to add?
I would like to emphasize how collaborative Bloodline was. Lynne and Louise and the other editors on the show are phenomenally talented. I was honored to be among them. I learned a lot working with them. That collaboration, for me, is one of the high points of working on the series. I’ve worked on shows where I hardly ever talked to the other editors, but we got to talk shop and bounce ideas off each other a bit. That was a really fun part of the show for me.

Quick Chat: Lost Planet editor Federico Brusilovsky

By Randi Altman

Buenos Aires native Federico Brusilovsky works as an editor at Lost Planet’s Los Angeles office, leading efforts on campaigns for Cadillac, Dodge, Heineken and HP. He joined Lost Planet’s New York studio four years ago as an intern, with production and assistant editorial experience already under his belt. From there, he worked his way up, learning from experienced editors, including the company’s Oscar-nominated editor and owner Hank Corwin (The Big Short).

Cadillac

We reached out to Brusilovsky, who studied film at New York’s City College after coming to the US, to talk about his path, the way he likes to work and tips for those just starting out.

How long have you been editing?
That’s a tough question. I guess my first experiences editing were all in-camera when I was shooting movies on VHS as a kid. Working in that linear format taught me a lot, especially how to recognize happy accidents: those coincidences and subconscious decisions that end up being some of the cooler parts of a film.

Once I was in college, I was working with Super 8 and 16mm and cutting with a guillotine. Working with your hands with such delicate materials teaches you a lot, too. There’s a craftsmanship to physically altering and moving around film that requires really sophisticated organization and patience. Those experiences are important to me now that I edit using digital, nonlinear systems. So, the short answer is probably “as long as I can remember.”

How did you get started in this business?
I was lucky to know editor Julie Monroe (Lolita, The Patriot), who offered me the chance to intern on the film she was working on, Mud. From there, Julie introduced me to Saar Klein (The Bourne Identity, Almost Famous), who introduced me to the studio that he’s on roster with, Hank Corwin’s Lost Planet.

After joining Lost Planet as an intern, it was easy to stay focused on my goal of becoming an editor. I knew an internship wouldn't naturally evolve into a lucrative editing career, so I did lots of technical training on my own so that the moment they needed me to step into a more challenging role, I would already have the skills.

Heineken

How early did you know this would be your path?
I didn’t recognize it as a path early on, even though it was in front of me for a while, and I was always editing one way or another. It wasn’t until after meeting and talking to a bunch of feature editors that I was able to see it.

Was it much of a transition in terms of editing moving from Argentina to the US?
Speaking strictly in the commercial world, the biggest difference between the US and Argentina (and I guess most other countries as well) is the role of the editor and the director. For some reason, directors in the US are not as involved from start to finish as they are abroad. They tend to shoot and walk away, not always because they want to but because they have to.

Although we work with them at the beginning of the process, a lot of decisions that may be directorial end up in the hands of the editor, which is great for me from a creative standpoint.

Overseas, directors are a good deal more involved in post production, and editors in production, which can be an advantage for strong collaborations but a disadvantage for editing objectively. It’s fun to get a feel for each shot while it’s happening on set, then work closely with the director in post, but that experience can make you married to footage that you would otherwise toss out in the edit room.

What system do you use?
Mostly Avid Media Composer. It's the least user-friendly piece of software, probably because it predates functions like drag-and-drop being available in every single app. But with patience and the right training, it's the most robust and attractive piece of nonlinear software out there.

What’s your favorite shortcut?
CMD+Z (or CTRL+Z for the Windows crowd). Not only is “undo” possibly the most useful hotkey on a technical level, it has huge symbolic importance. Undo is a digital safety net. Knowing that I’m never more than two keys away from reverting to a previous version gives me the freedom and efficiency to take risks and experiment with different styles.

Cadillac

Do you use plug-ins?
With Avid I try not to, unless a specific project calls for it. If I’m using plug-ins, that means I’ve probably moved to After Effects.

What are some recent projects you have worked on?
Heineken’s “Soccer is Here” and Cadillac’s “CT6 Forward” are my two most recent campaigns. Cadillac was fantastic to be a part of. The material I had to work with was so rich and delicate. Each individual shot was successful on so many levels — color, light, movement, composition and so on. Some projects require a bit of strategy in the edit so high quality shots don’t call attention to less successful ones, but not with this project. It was like playing chess with a board full of queens and no pawns.

Do you have a favorite genre? If so, why?
Comedy. I know it’s a cliché to say that it’s my favorite because it’s the hardest genre to work in, but that’s probably why. One unique challenge to comedy is that the margin for error is so slim compared to other genres. The impact of a not-so-exciting fight scene is much less than the impact of a not-so-funny joke. Getting an audience to laugh is one of the biggest challenges in the industry, and it’s so satisfying when you’re successful.

It’s also easy to get caught in an echo chamber of bad comedy in a writers’ room or on set. When you’re working with comedic premises and characters, trying out concepts and laughing with your coworkers, you can easily lose objectivity and convince yourself something is funny when it definitely isn’t.

I read somewhere that on the set of Sydney Pollack’s Tootsie, there was a real sense of seriousness and tension, despite it being one of Dustin Hoffman and Bill Murray’s funniest movies. Maybe that’s a secret to great comedy… approach it seriously.

Any tips for new editors just starting out?
Listen! Not listening is a classic rookie mistake. Also, it’s more important to be in the right place than to be doing the most glamorous or rewarding tasks. Better to be low on the totem pole for a good film with good people than at the top directing for a bad film. Try to be around companies and projects you admire, then work hard to grow in those communities.

Stitch edits spot for London’s Times, Sunday Times

Based in Soho in London, Stitch Editing has finished a project for The Times and The Sunday Times’ first-ever joint brand campaign. Directed by Rattling Stick’s Ivan Bird, the 30-second TV and cinema commercial features a montage of significant headline moments from around the world. Using the technique of isolating one unexpected sound from each scene, the commercial is designed to show how The Times and The Sunday Times get to the heart of each story they cover.

Stitch’s Richard Woolway cut the spot using an Avid Media Composer, making this his commercial editing debut: “It was a very interesting exercise to source footage that is extremely familiar and mix it with original scenarios of a similar vein that we developed and shot, creatively isolating an element of each scene that forces viewers to think differently on the subject,” he says. “I enjoyed the process of sourcing existing footage and shooting additional scenarios that we felt were missing at the same time. It was a new and exciting way to work and we’re very proud of the power of the end result.”

He continues: “Both Ivan and I worked closely with the agency and client from day one, which, whilst at times challenging, I think benefitted the project as a whole, meaning we all knew the footage we had, in detail, and felt confident we were making the best possible version of the film.”

Woolway started his career cutting showreels at Smuggler, where he met the guys at Stitch and, as he says, “The rest is history, good history of course.”

Offline editing for the spot was done at Big Buoy with producer Jonathan Murray. Rattling Stick’s Emily Atterton produced. Sound and music were produced at Wave Studios by sound engineer Tony Rapaccioli and composer/arranger Nick Rapaccioli.

Editing an ‘Empire’ for Fox

Zack Arnold talks about his process on the first two seasons of this hit show.

By Ellen Wixted

Zack Arnold dreamed of being an editor before he even knew the job existed. As a kid, he and his brother shot short films with a VHS camcorder — but it wasn’t until his brother added music that the process captured his imagination. Before long, he was using two VCRs to re-mix existing movies, and loving every minute of the process.

Happily for Arnold, that’s still true today. After studying film at the University of Michigan, Arnold moved to LA, where he quickly landed a job editing trailers. That led to work on indie features, and eventually episodic TV. Editing Burn Notice was a breakthrough moment (one that lasted four seasons), and the experience opened other doors.

Arnold worked with show runner Ilene Chaiken on a medical drama that was cancelled, but the partnership was solid. When Chaiken was named an executive producer on Fox’s Empire, Arnold was brought on board as well. He edited the entire first season and was on through Episode 15 of season two. He’s currently working on a new series for Paramount called Shooter. “I’m an additional editor on the pilot and will be editing three additional episodes, including the season finale. It’s a 10-episode first season.”

I asked Arnold, who cut the show on Avid Media Composer, what makes Empire different from other shows, and his response was lightning-quick: “Empire requires an unusually varied skill set of its editors. Every episode combines elements of performance, comedy and drama. The musical numbers are so high-energy and require a lot of style. To make sure the jokes hit, you need to have great comedic timing. And weaving it all together is a performance-driven character drama, which requires a different approach.”

Arnold’s experience editing across multiple genres gave him a clear advantage here. Most editors tend to work within a single genre, but Arnold’s work editing trailers, combined with his experience editing a range of episodic content, means he’s able to draw from a wealth of experience.

Musical numbers are central to Empire, and Arnold described the show’s accelerated editing process in some detail: “I’ll end up with eight hours of dailies from one day of shooting. After one of my assistants organizes the footage into multicam groups — there can be 50 to 60 different angles within a group — we’ll break the edit down into story-driven elements and the performance itself. With every music number, you have to tell a specific story, using rhythm to hold the viewer’s attention. Watching Cookie watch the performance may be critical to the story, and I’ve found if you watch everything while keeping the big picture in mind, it’ll wash over you, making it easy to decide what’s essential.”

Which edit is Arnold’s favorite? Episode 109. “The whole first season is about the family dynamic — the relationships between the parents and the children, and the current of homophobia in that community. In one cut, we went from a warm family scene to a terrifying psychotic break.” Arnold edited the club scene first and tackled the bipolar break sequence a few days later. The transition from one to the other didn’t work initially because they were too different emotionally, so a buffer scene with family members was inserted. “The studio didn’t respond well initially, but we fought back because it worked powerfully on an emotional level, and in the end the audience response on Twitter validated our approach.”

More Than Editing
In addition to his work as an editor, Arnold is the developer of Fitness in Post, a podcast that helps newcomers to the industry find creative ways to stay fit and healthy despite long hours sitting in darkened rooms. He also likes to give back to the community, and offers this bit of advice for those just starting out: “Figure out what you’re interested in doing right now, and go after it wholeheartedly. Starting at the bottom of a ladder you don’t want to climb makes it harder to jump off later. And as an editor, you should always be honing your storytelling craft — we are visual writers.”

Behind the Title: Editor Christian Jhonson

NAME: Christian Jhonson (@avidusersecuado)

COMPANY: Ecuador’s Teleamazonas

CAN YOU DESCRIBE YOUR COMPANY?
Teleamazonas is a TV station — the first television station in my country. We produce shows like The Voice Ecuador and broadcast international series like The Simpsons. We have also built a huge reputation for shows featuring local stories.

WHAT’S YOUR JOB TITLE?
Editor. I’m also an Avid-certified trainer — I run The Cut Center, a future learning center. In addition, I volunteer for Avid as an ambassador, providing solutions for Latin America and beyond.

CAN YOU DESCRIBE YOUR JOB AT TELEAMAZONAS?
As an editor at the TV station, I create promos highlighting what the TV station has planned for the month.

YOU CUT ON MEDIA COMPOSER, BUT DO YOU HAVE A FAVORITE PLUG-IN?
For video, I use Boris FX’s Continuum Complete.

WHAT’S YOUR FAVORITE PART OF THE JOB?
No one project is the same; every promo is its own production and allows for creativity.

WHAT’S YOUR LEAST FAVORITE?
Digitizing and selecting shots.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Midday, because it is the time to take a rest, do some re-edits, watch what I’ve done and recreate the editing and creative “process.”

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I couldn’t see myself doing any other job.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
When I was 20, I had an experience with some film tests, which were homework from my university. Since then, I have taken this profession seriously. That passion and dedication are part of me.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Trailers for Latin America, which promote romance stories like La Prepago and La Promesa.

WHAT IS THE PROJECT YOU ARE MOST PROUD OF?
The La Prepago trailer. It is the story of a woman who has trouble making money. She is a student, but her parents are poor. She is desperate and decides to use her good looks to make money, agreeing to become a “prepago,” a girl who sells her body. That is a very “illegal” way to earn money in Colombia. The film is produced by Sony Pictures Television for the Latin American market.

When I had to edit for the trailer, I found myself with a big problem — everything was in different places. I had to search for the music, ask for the shots, sound effects… everything had to be rebuilt. It was a surgeon-like job.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Computer, cell phone and my car.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Well, I’m an admin of an Avid editors group on Facebook, I manage Avid Pro Tools’ Facebook page for Latin America, and I manage Avid ISIS’ Facebook page. I also enjoy reading and commenting on Avid blogs, Videoguys.com and postperspective.com.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I don’t feel stress at all. I love what I do.

Behind the Title: Editor Bob Mori

NAME: Bob Mori

WHAT’S YOUR JOB TITLE?
I’m a film editor working in commercials, feature films and experiential media projects. My experience is with brands such as Nike, BMW, Time Warner, Ford, Jeep, Coca-Cola, Scion, Altoids, LG, Sega, Dyson, Warner Bros., USPS, Chrysler, L’Oréal, Neutrogena, Budweiser, LucasArts, Dodge, Adidas, Toyota and AT&T.

CAN YOU DESCRIBE YOUR COMPANY?
I am currently working freelance in the Los Angeles and New York markets. I have worked with Jump, Optimus, Cosmo Street and Red Car.

WHAT DO YOU EDIT ON?
I use Avid Media Composer and Adobe Premiere Pro — visual effects, sound design and music are all-inclusive.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT YOUR ROLE AS AN EDITOR?
I nurture projects from award to final mix and delivery. Once on board a project, I’m there every step of the way.

Bob Mori in his edit suite.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I have a few: working with a creative team that understands everyone’s participation is important to success; elevation of the concept even at the smallest level; dedication, conceptual thought and grace through years of my own personal experience; and watching an audience’s reaction to all that hard work!

WHAT DOES THAT ENTAIL?
Being nimble and wearing many hats throughout the process. Having aesthetic and technical knowledge that includes different approaches. Starting again never scares me.

WHAT’S YOUR LEAST FAVORITE?
Running out of time and knowing that we could have made it better. Everyone has experienced this, and it’s a reality we’ve all come to accept.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Watching dailies. Although, now we call them “media assets”… right? Usually it’s at double speed due to time constraints.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I never had a “Plan B” mapped out. This is bliss for me. I love what I do and can’t dream of doing anything else.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
That die was cast early on for me. I shot and edited films on 8mm film in high school. I was a film major at Columbia College in Chicago.

CAN YOU NAME SOME RECENT PROJECTS YOU’VE WORKED ON?
In late 2015, my commercial projects included Equifax, Home Advisor and South of Wilshire.

I also did work for On the Record With Mick Rock. Rock is a British photographer, best known for his iconic shots of rock and roll legends such as Queen, David Bowie, Syd Barrett, Lou Reed, Iggy Pop and The Sex Pistols.

WHAT RECENT PROJECT ARE YOU MOST PROUD OF?
Who’s Driving Doug just had its world premiere at the Santa Barbara International Film Festival. It opened in theaters and VOD on February 26. The film stars RJ Mitte of Breaking Bad, Paloma Kwiatkowski of Bates Motel, Ray William Johnson of Equals Three, and Daphne Zuniga.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Premiere Pro CC, Avid Media Composer, Photoshop and, of course, my iPhone.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Every. Single. One. Facebook, Instagram, you name it. I’m easy to find and communicate with everywhere in the world.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
Music is essential to picture cutting. Not just popular music, but every genre. Some of my best friends are music composers. And usually (with rare exception) music is half of what you are emotionally feeling while watching.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Yoga. Hiking. The beach. Walking my dog Bruno (who is in our main photo). And escaping into film. As Frank Capra famously said, “As with heroin, the antidote to film is more film.”

72andSunny adds audio and editing suites

Creative company 72andSunny has added 13 new uncompressed-4K video editing suites to its Los Angeles office. Part of the large-scale project also includes three new audio mixing and sound editing suites, two of which feature RedNet Dante network interfaces from Focusrite. The installation was done by MW Audio Visual.

The company’s video editing suites vary in terms of tools. “We designed and built our edit suites to work in a number of editorial pipelines,” says John Keaney, director of Hecho En 72 operations at 72andSunny, which is their in-house studio and bespoke production unit. “Because certain projects are better suited for one editorial platform than another or the talent for a particular job may be stronger in one system than another, we built our systems to handle a few different operations.”

Their two main editing systems are Avid Media Composer 8 and Adobe Premiere Pro 9.2 (Creative Cloud 2015), but they will call on other editorial, motion graphics and audio applications as needed.

“With 12-core Mac Pros, ultra widescreen LG monitors for our project files, HP DreamColor monitors for editor playback, 55-inch Pluras for client playback, Blackmagic Ultrastudio 4K capture/playout devices and a Quantum StorNext SAN with SAN Fusion, we have a systems infrastructure that is compatible in either Avid or Adobe environments,” reports Keaney.

In terms of sound, one of the new audio suites is a full cinema post production and mixing studio, featuring a JBL 4722 cinema monitoring system and a Panasonic laser projector. A second studio, used for recording and mixing, uses the same Avid S6 console and Pro Tools HDX system as the cinema studio. Flanking the suites are two voiceover booths.

“The client wanted to be able to route any audio from either studio or either booth to any other location, instantly,” says Michael Warren, president of MW Audio Visual. “There is a RedNet 5 in the console rack in each studio and a RedNet 4 in each V-O booth, with a digital snake attached to the DB25 inputs of the RedNet 4 units. These are also connected, via shielded Cat-6 cabling, to Cisco switches that we have in each control room, V-O booth and rack. So we can matrix the audio from anywhere to anywhere, through the RedNet units.”

In fact, adds Warren, 72andSunny can send its audio anywhere in the world from there, through a sharable SAN that connects its entire campus and out to any other location via the Internet. “This is the new face of media workflow,” he says. “People are creating content for television, cinema, online — it doesn’t matter. It’s all about their ability to connect with each other and share the process, between rooms, across a campus, or globally.”

72andSunny’s clients include Google, Samsung, Activision, ESPN and Starbucks.

New version of Media Composer supports HDR

Avid’s latest version of its Avid Media Composer editing system offers support for high dynamic range (HDR) workflows, so users can now edit and grade projects using new color specs that display a greater dynamic range than standard video. The new version also features a more intuitive interface, making the tool more friendly to those editors who also work on Premiere Pro or Final Cut.

With what the company calls “better-organized tools,” Media Composer’s tools and interface elements are now organized in a more logical order, enabling editors to work more efficiently and intuitively, regardless of the system they have worked on before.

The new version of Media Composer also enables editors to work with up to 64 audio tracks — 250 percent more than previously available — and delivers more power under the hood, allowing editors to work faster and focus on the creative.

The new version of Media Composer — which allows users to work in HD, 2K, 4K, 8K and HDR — is now available to purchase from the Avid Store and as a free download to current Media Composer customers with an active Upgrade and Support Plan or Subscription.

Here are some more details of the new version:
• Faster editing and better-organized tools – Users can access key tools and features faster thanks to several updates to menus and drop downs that make accessing tools more intuitive, productive, and fun. This delivers even greater efficiency, enabling editors to focus more time on their creative storytelling.
• Better visual feedback when editing – Users can edit with more precision thanks to high-visibility feedback displayed as they edit in the timeline.
• Ability to straighten images quickly with FrameFlex rotation – Users can rotate images a little or a lot by rotating the framing box in FrameFlex.
• Better performance – All played frames — and all effects applied to those clips — are now cached in RAM. This allows for a smoother, stutter-free edit when scrubbing or playing back a complex sequence multiple times. (A generic illustration of the caching idea follows this list.)
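
To make that caching idea concrete, here is a generic least-recently-used frame cache in Python. It is purely illustrative (the class, keys and capacity are invented for this sketch, and this is not Avid’s implementation): repeat playback asks the cache first and only re-renders on a miss.

```python
# A generic LRU frame cache, illustrative only (not Avid's implementation).
# Rendered frames are keyed by clip, frame number and a hash of the applied
# effects, so changing an effect naturally invalidates the cached frame.
from collections import OrderedDict

class FrameCache:
    def __init__(self, capacity: int = 500):
        self.capacity = capacity
        self._frames: OrderedDict = OrderedDict()

    def get(self, clip_id: str, frame_no: int, effects_hash: str):
        """Return a cached rendered frame, or None on a miss."""
        key = (clip_id, frame_no, effects_hash)
        if key in self._frames:
            self._frames.move_to_end(key)   # mark as most recently used
            return self._frames[key]
        return None

    def put(self, clip_id: str, frame_no: int, effects_hash: str, pixels: bytes):
        """Store a rendered frame, evicting the oldest entry when full."""
        key = (clip_id, frame_no, effects_hash)
        self._frames[key] = pixels
        self._frames.move_to_end(key)
        if len(self._frames) > self.capacity:
            self._frames.popitem(last=False)  # evict least recently used
```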

Other enhancements include:
• Full-screen playback mode
• Sync Broadcast Wave audio files with video clips with subframe accuracy
• Add one or more custom columns to a bin easily through a contextual menu
• Copy and paste frame numbers in bin columns
• Find and filter effects faster and easier with the updated Effects palette
• Rename, edit and delete custom project presets with the Preset Manager
• Use Media Composer on OS X 10.11 (El Capitan) and Windows 10 computers
• Group master clips using audio waveform analysis
• Start a frame count at “1” instead of “0” (zero) for timecode burn-in segments
• Resize and configure the Audio Mixer for the project at hand
• Preserve field recorder metadata across multiple audio tracks when importing/linking

Review: Blackmagic’s DaVinci Resolve 12

This working editor is impressed with the color correction tool’s NLE offerings.

By Brady Betzel

If you’re looking for a nonlinear editor alternative to Adobe Premiere, Apple FCP X or Avid Media Composer you must check out Blackmagic Design’s DaVinci Resolve 12 Studio. The best part about the continuing evolution of Resolve is that Blackmagic has been adding NLE functionality to its color correction software, instead of building an editor from the ground up.

In terms of editing systems, Avid Media Composer has been in my life from the very first day I started working in television. At school we edited on Final Cut Pro 7 and Adobe Premiere, but once I hit the big time it was all Media Composer all the time. Now, of course, that is changing with Adobe Premiere Pro projects popping up more and more.

Many of today’s editors want to work on an NLE offering the latest and greatest features, such as resolution independence, wide codec support, occasional VFX integration and the almighty color correction. So that leaves us with Adobe, Avid and the newest player in the NLE game, Blackmagic and its Resolve product.

Resolve’s multicam capabilities.

Adobe realizes how important color is to an editor’s workflow and has added color correction inside of Premiere by incorporating Lumetri Color. In fact, Adobe’s After Effects also features Lumetri Color. But even with these new additions some are still wanting more. This is where Blackmagic’s DaVinci Resolve 12 is making its move into the nonlinear editing world.

Inside Resolve 12
With Version 12, Blackmagic has reinvented its internal NLE environment to catch the eye of any editor looking to make a change from their current editing system. In this review I’m looking at Resolve 12 from an editor’s perspective, not a colorist’s. Some NLEs say you can stay inside of their environment from offline to online, but oftentimes that’s not the case.

I think you will really like what Blackmagic is doing in Resolve 12 Studio — you will also like their visual effects and compositing app Fusion, which recently released its Version 8 public beta.

Blackmagic offers two versions of Resolve: Resolve Studio and Resolve. They also offer the DaVinci Resolve Advanced Panel, which retails for $29,995. Resolve Studio sells for $995, while plain Resolve is free — and you get a lot of horsepower at no cost. If $30K is too pricey for your budget, remember that a lot of high-end colorists use the Tangent Element coloring panels, which retail for under $3,500. (You can check out my review of the Tangent Element panels here.) Color panels will change the way you look at color correcting. Coloring by mouse or tablet compared to panels is like playing baseball with one arm tied behind your back.

The Resolve Panel

The main differences between Resolve Studio and Resolve are Studio’s realtime noise reduction and motion blur parameters (using CUDA and OpenCL GPUs) and stereoscopic 3D grading. The free version has mastering limitations: very limited GPU and Red Rocket support, no collaborative teamwork-based features, no remote grading, proxy generation and project frame sizes limited to UHD, and no ability to render to the Sony XAVC codec. But keep in mind that even the free Resolve will support the Tangent Element panel if you have it.

Powering It Up
Technically, you should have a pretty beefy workstation at your disposal to run Resolve, especially if you want to take advantage of the enhanced GPU processing and realtime playback of high-resolution sources. One common debate question is, “Do I transcode to a mezzanine format or stay native?” Personally, I like to transcode to a mezzanine format like DNxHD or ProRes; however, with systems becoming the powerhouses they are today, that need is slowly dying. Even though Resolve can chew through native codecs such as AVCHD, it will definitely be to your advantage to use a common intraframe codec such as ProRes 4444, Cineform or DNxHD/HR rather than an interframe codec such as XDCAM, which is very processor-intensive and can slow your system down during an edit.
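
To make that mezzanine step concrete, here is a minimal Python sketch that shells out to ffmpeg (an assumption: ffmpeg is installed and on your PATH; the filenames and the DNxHD 36 flavor are just one valid example, not a house standard):

```python
# transcode_to_mezzanine.py -- a minimal sketch, assuming ffmpeg is on PATH.
# Rewraps an interframe camera original (XDCAM, AVCHD, etc.) as intraframe
# DNxHD 36, i.e. 1920x1080p24, 8-bit 4:2:2 -- one of the fixed
# resolution/rate/bitrate combinations ffmpeg's dnxhd encoder accepts.
import subprocess

def to_dnxhd36(src: str, dst: str) -> None:
    subprocess.run(
        [
            "ffmpeg", "-i", src,
            "-c:v", "dnxhd",        # intraframe mezzanine codec
            "-b:v", "36M",          # DNxHD 36 bitrate
            "-s", "1920x1080",      # DNxHD requires fixed frame sizes...
            "-r", "24",             # ...and fixed frame rates per bitrate
            "-pix_fmt", "yuv422p",  # 8-bit 4:2:2
            "-c:a", "pcm_s16le",    # uncompressed audio for editorial
            dst,
        ],
        check=True,  # raise CalledProcessError if ffmpeg fails
    )

if __name__ == "__main__":
    to_dnxhd36("camera_original.mp4", "mezzanine_dnxhd36.mov")
```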

A very thorough explanation of intraframe versus interframe compression can be found at Sareesh Sudhakaran’s website: http://wolfcrow.com/blog/intra-frame-vs-inter-frame-compression.

On the Mac, Resolve requires OS X 10.10.3 Yosemite; on Windows, it requires Windows 8.1 Pro 64-bit with SP1. On both platforms, Blackmagic recommends 16GB of system memory, with 8GB the minimum supported.

In addition, you will need up-to-date drivers for your GPU and, if I were you, a high-end GPU (or two or three) with as much memory as possible. Many people report a couple of prosumer Nvidia GTX 980 Ti cards to be a great value if you aren’t able to jump up to the Quadro family of GPUs. AMD and Intel GPUs are also supported.

Let’s be real, you should either have a sweet X99 system with as much RAM as you can afford or something on the level of an HP z840 or recent Mac Pro to run smoothly. You will also want an SSD boot drive and a RAID (SSD if possible) to get the most out of your editing and color experience with minimal lag, especially when adding Power Windows, motion blur and grain.

The Interface
My immediate reaction to Resolve’s updated interface is that it looks and feels like an amalgamation of FCP X and Adobe Premiere CC 2015. If you like the way Adobe separates out its assembly, color and NLE interfaces, then you will be right at home with Resolve’s Media, Edit, Color and Deliver pages. In the timeline you will see a similar look to FCP X, with rounded corners and an otherwise intriguing graphical user interface. I’m not going to lie, it felt a little shiny at first, but coming from Media Composer almost every NLE interface will feel shiny and new. So the question is: will it perform on the same level as a tried and true behemoth like Avid’s Media Composer?

Testing the NLE
There are a few key functions that I test on every NLE I jump into: trimming, multicam editing and media management. For the most part, every NLE can insert edit, assemble edit and replace an edit, but most can’t replicate Avid’s trimming and media management functionality.

Jumping into trim mode, there are your standard ripple, roll, slip and slide trims. You can perform a multitrack asymmetric trim to pull time between those huge acts, and there’s even one type of trim that I really wish Avid would steal — the ability to trim the durations of multiple clips simultaneously. The best way I can describe this is when you are building credits and you need to shorten them all by one frame. Typically, you would go in card by card and remove one frame from each card until you are done. In Resolve 12, you can trim multiple clips at the same time and in the same direction, i.e. trim one frame from every credit in a sequence simultaneously. It’s really a remarkable addition to a trim workflow, not to mention a time saver.
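
To picture what that multi-clip trim is doing, here is a toy Python model (the names are invented; this is not Resolve’s API): every clip loses a frame and everything downstream ripples up so the sequence stays gapless.

```python
# A toy model of trimming multiple clips at once: shorten every credit
# card by one frame and ripple the downstream start times. Illustrative
# only -- not Resolve's API.
from dataclasses import dataclass

@dataclass
class Clip:
    name: str
    duration: int  # in frames

def ripple_trim_all(clips, frames=1):
    """Shorten every clip by `frames` and recompute start times so the
    sequence stays gapless. Returns (name, start, duration) tuples."""
    timeline, start = [], 0
    for clip in clips:
        new_duration = max(1, clip.duration - frames)
        timeline.append((clip.name, start, new_duration))
        start += new_duration  # downstream clips ripple earlier
    return timeline

credits = [Clip("card_01", 48), Clip("card_02", 48), Clip("card_03", 48)]
print(ripple_trim_all(credits))  # every card is now 47 frames, starts rippled
```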

Second on my checklist for running an NLE is its ability to work smoothly with multiple camera angles in a grouped set of footage, sometimes referred to as groups. One of my personal pet peeves with Media Composer is the inability to change a group after it has been created (and by pet peeve I mean bane of my existence when I was an assistant editor and a 12-hour group was off by one or two frames… but I digress).

Luckily, Blackmagic has given us a solution inside of Resolve. After a group has been created, you can step “inside” of that group, add angles, add a final mix and even change sync. All of these changes ripple through the edits; it’s very impressive. My two favorite features in Resolve’s new multicam group abilities are mixing frame rates within a group and auto-syncing audio and video based on waveforms. If you’ve ever needed Red Giant’s PluralEyes because there was no jam-sync timecode on footage you received, then you will feel right at home inside of Resolve’s auto sync. Plus, you can adjust the group after it’s been created! I love this… a lot.
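
For the curious, waveform-based auto-sync boils down to cross-correlating the audio from two recordings of the same event and picking the lag where they line up best. A minimal NumPy/SciPy sketch (assuming mono WAV files at the same sample rate; the filenames are hypothetical):

```python
# A minimal sketch of waveform-based sync, the idea behind auto-sync and
# tools like PluralEyes: find the offset that best aligns two recordings
# of the same event by cross-correlating their audio.
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate

def sync_offset_seconds(ref_path: str, other_path: str) -> float:
    rate_a, a = wavfile.read(ref_path)
    rate_b, b = wavfile.read(other_path)
    assert rate_a == rate_b, "resample first if sample rates differ"
    a = a.astype(np.float64)
    b = b.astype(np.float64)
    xcorr = correlate(a, b, mode="full")   # similarity at every possible lag
    lag = np.argmax(xcorr) - (len(b) - 1)  # lag (in samples) of best alignment
    return lag / rate_a                    # positive: second file starts later

print(sync_offset_seconds("camera_scratch.wav", "lav_mic.wav"))
```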

Media management

Last on my list is media management. I have pretty high expectations when it comes to media management because I was an assistant editor for a little over four years working on Media Composer, and for the most part that system’s media management is rock solid (if you need to vent about how I am wrong, you can tweet me @allbetzroff), especially when used in conjunction with an Avid shared storage product like the Unity or ISIS. What I realize is that while Avid’s way of media organization is a little bit antiquated, it is reliable.

So what I’ve really started to embrace within the last year is metadata, and I now recognize just how valuable it is with NLEs like FCP X and now Resolve. Metadata is only valuable, however, if someone actually enters it and enters it correctly.

If you have kept your metadata game up to date in Resolve, you can quickly and efficiently organize your media using Smart Bins. Smart Bins are incredible if they are set up properly; you can apply metadata filtering criteria to different bins so that, say, interview shots or shots from a particular date automatically populate. This is a huge time saver for assistant editors, and for editors without assistant editors; another feature I really love.
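
Conceptually, a Smart Bin is a saved query over clip metadata that repopulates whenever it is re-run against the library. A toy Python illustration (the clip records and field names are made up, and this is not Resolve’s scripting API):

```python
# A toy illustration of the Smart Bin idea: a saved metadata query.
# Conceptual only -- the records and field names below are invented.
clips = [
    {"name": "A001_C003", "scene": "interview", "shoot_date": "2015-10-02"},
    {"name": "A001_C007", "scene": "b-roll",    "shoot_date": "2015-10-02"},
    {"name": "B002_C001", "scene": "interview", "shoot_date": "2015-10-03"},
]

def smart_bin(clips, **criteria):
    """Return every clip whose metadata matches all of the criteria."""
    return [c for c in clips if all(c.get(k) == v for k, v in criteria.items())]

# The "bin" repopulates automatically whenever the query is re-run.
print(smart_bin(clips, scene="interview"))
print(smart_bin(clips, shoot_date="2015-10-02"))
```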

I couldn’t cover everything within Resolve in this space, but believe me when I tell you that the features not covered are just as great as the ones I have covered. In addition to the newly updated audio engine under the hood, there is a command to decompose a nested timeline in place — think of a nested sequence that you want to revisit, but without having to find the original and recut it into the sequence; one click and magically your nested sequence is un-nested. There is also compatibility with OpenFX plug-ins, such as GenArts Sapphire and Boris FX BCC Continuum Complete, plus remote rendering and grading and many, many more features. One of my favorite resources is the Blackmagic DaVinci Resolve 12 manual written by Alexis Van Hurkman (@Hurkman on Twitter), who also wrote Color Correction Handbook: Professional Techniques for Video and Cinema, a phenomenal book widely regarded as the manual for color correction.

Summing Up
In the end, I can’t begin to touch on the power of Resolve 12 in this relatively small review; it’s constantly being updated! The latest 12.2 update can even bring plug-in effects like NewBlue Titler across from Media Composer via an AAF! I didn’t even get a chance to mention Resolve’s integration of Bezier curve adjustments to transitions and keyframe-able movements.

If you are looking for an upgrade in your color correction experience, you need to download the free version immediately. If you’re an editor and have never taken Resolve for a test drive, now is the time. With features ranging from greatly improved dynamic trimming to the extremely useful and easy-to-set-up Smart Bins to the new 3D tracker and foreground color matching, Resolve is quickly overtaking the color and NLE market in one solid and useful package.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim-Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

When video editors need to deliver a CALM-compliant mix

Outpost Worldwide is a Kansas City-based production and post company that creates content for a variety of TV series, network game shows, reality shows, commercials and corporate videos.

Their television work includes shows like Strange: Exorcist, Garden Earth and Project Runway Latin America. Documentaries they have worked on include The Barber’s Diaries, No Shortcuts and Let Freedom Ring: The Lessons Are Priceless. Films include Fight Night, Dogs of Eden and Last Ounce of Courage.

With the passage of the CALM (Commercial Advertisement Loudness Mitigation) Act by Congress, the responsibility to ensure audio mixes conform to the loudness standard falls not only on audio mixers, but also on video editors for shows that did not budget for a separate audio post session.

For Mark Renz, senior video editor at Outpost Worldwide, the task of delivering compliant mixes directly from his Avid Media Composer system was an extra burden in time and effort that would otherwise go to creative editing. If a show gets rejected by their Extreme Reach content management and delivery system, further delays and costs are incurred, either to send the show to audio post or to have the Extreme Reach system fix the loudness itself.

“While many of our shows have budget for audio post, I frequently will also work on shows that have no separate audio budget, so it’s down to me to make sure audio coming out of my Media Composer system is compliant with the CALM Act,” explains Renz. “Since the majority of my time is spent putting a compelling story together, you can imagine that worrying about loudness is not something I really have a lot of time for.”

This is where iZotope’s RX Loudness Control comes in. “There’s not a whole lot to say, because it just works,” he says. “It’s two mouse clicks, and it’s much faster than realtime. The first click quickly analyzes the audio and displays a graph showing any problem areas. If I have time, I can quickly go in and manually adjust an area if I want; otherwise, clicking ‘Render’ is all that’s required to generate a compliant final mix.”
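
For context, the measurement behind that compliance check is integrated loudness per ITU-R BS.1770, which the CALM Act (via the ATSC A/85 recommended practice) pegs at -24 LKFS. A minimal sketch of the same measurement using the open-source pyloudnorm library (an assumption on our part; this is not the iZotope plug-in, and the filename is hypothetical):

```python
# Measure integrated loudness (ITU-R BS.1770) and nudge a mix toward the
# ATSC A/85 target of -24 LKFS. Uses the open-source pyloudnorm library,
# not the iZotope plug-in; "final_mix.wav" is a hypothetical file.
import soundfile as sf
import pyloudnorm as pyln

data, rate = sf.read("final_mix.wav")       # float samples, any channel count
meter = pyln.Meter(rate)                    # BS.1770 loudness meter
loudness = meter.integrated_loudness(data)  # in LUFS/LKFS

print(f"Integrated loudness: {loudness:.1f} LKFS (target -24)")
if abs(loudness - (-24.0)) > 2.0:
    # Simple gain normalization toward the target; a real delivery tool
    # would also check true peak and short-term loudness.
    fixed = pyln.normalize.loudness(data, loudness, -24.0)
    sf.write("final_mix_normalized.wav", fixed, rate)
```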

Mark is the first to admit he’s not an “audio guy,” so being able to rely on a tool that guarantees a compliant audio mix has been liberating. “I don’t have to worry what someone else might be doing to the audio to force it into compliance,” says Renz.

Mick Audsley: Editing ‘Everest’

This veteran editor walks us through his process

By Randi Altman

Mount Everest, the highest peak in the world and the Holy Grail for many climbers, often is the symbol of a personal struggle to achieve an incredibly difficult task. It also happens to be the subject of a new film from director Baltasar Kormákur that is based on the (sometimes contradictory) true story of two climbers who, in the spring of 1996, got caught in a violent blizzard — and fought to survive.

The goal of the filmmakers behind Universal’s Everest was to tell this story of tragedy and survival and, in doing so, make the audience feel the desperation of the characters on screen. To give us a glimpse into the process, we reached out to Everest’s editor, Mick Audsley, whose work includes Harry Potter and the Goblet of Fire, Twelve Monkeys, Interview with the Vampire: The Vampire Chronicles, Dangerous Liaisons and many more.

Starting 4th from left: Director Baltasar Kormákur, Glenn Freemantle and Mick Audsley, with the audio post crew.

He acknowledges that, “from a storytelling point, there was a huge responsibility to make a film that worked, but also to be as honest as possible. We wanted to help preserve the legacy of the story for the families and climbers who are still alive.”

Audsley cut the film — shot with the Arri Alexa by DP Salvatore Totino — on an Avid Media Composer in DNX36 over 55 weeks, which, amazingly, isn’t the longest edit he’s been involved in. Goblet of Fire takes that award, coming in at 73 weeks.

Let’s find out more about Audsley’s process on this film and his philosophy on editing.

How did you give the audience that “you are here” feeling of peril in your edit?
There’s a montage early on, which shows the sorts of dangers they had on the way up, including the altitude, which has a huge impact on your health. There’s a great deal of peril in the sheer physics of it all, but as the story unfolds, we never felt we had to overly dramatize what went wrong, because it’s a series of small, rather human, mistakes and misjudgments with catastrophic consequences. Editorially, we felt it should just relentlessly build up, tightening its grip around the audience’s throat, if possible, in order to engage them.

How did you work with director Baltasar Kormákur, and how early did you get involved in the film?
I began at the start of shooting, although we weren’t together. Baltasar and the crew spent 10 shooting days in Nepal while we were setting up in the mountains in Northern Italy — basically at a ski resort — where we were for about six to eight weeks doing the photography… with climbers in real snow. We were accessible to everybody and would show the work as it progressed. We then split up, because they built a base camp in Cinecittà Studios in Rome. That was only going to last two weeks, so it made sense to come back to London for the rest of the schedule, which was completed at Pinewood Studios on the big 007 stage.

We were all very busy, and I didn’t see a great deal of Baltasar during shooting, but we would meet. It was a very tough shoot, as you can imagine, and he was kind enough to trust me just to carry on.

When did you get into a room with him?
After they finished shooting and Balt had gone back home to Reykjavik. We were meeting every day, based at RVX (https://www.rvx.is/), his visual effects company, in the center of Reykjavik. We then spent the best part of 14 weeks working together in Iceland.

Fourteen weeks, just in Iceland?
It was the director’s cut period, which is normally 10 weeks, but we stayed longer since it worked so well for Balt — he was able to carry on with things and visit my team and me almost every day. We would get together in the afternoon, and I would show the work I’d done the day before, discuss it and make the plan for the next day.

Were you given direction in terms of how it was going to be edited? Or were you given free rein?
I was given a large amount of free rein. Balt is extremely trusting of me, and we would just bat ideas around and constantly try to move the film to where we felt it was functioning in the way we needed it to. There were many strands of the story that were shot, which then had to be whittled down or re-balanced or changed or taken out. The editorial process was not just cutting; there was a certain amount of changing of dialogue, rewording things and re-recording things in order to make the narrative move forward in the right way. It was a big job, and we were constantly throwing things at each other.

I obviously had the task of doing the actual editorial work of realizing it, cutting the scenes and putting it all together, but I was given an enormous amount of freedom and could express myself very freely. So it was very much a joint venture.

Can you describe your editing set up?
We had three Avid Media Composers with shared storage. Actually, we had four because my visual effects editor, Keith Mason, joined us in Iceland for that period. We had to turn over material as quickly as we could so the visual effects work could be started and run in parallel with us as the cut progressed.

I had two assistants on Everest because it was a very labor-intensive film. There was a lot of material. On average, I was receiving between five and six hours of footage from each day’s shooting. So over a period of 16 to 18 weeks, that builds up quite a big library of material to be evaluated, understood and cut. It worked very smoothly and efficiently.

Do you typically cut on a Media Composer?
Yes, it’s a very good tool, one that I’ve been using for the last… God knows. What we need is something that’s reliable and fast and allows us the freedom to think and to store the many versions and permutations we need. A lot of the work that we do is keeping a very tidy cutting room in terms of organization of material and the different versions and what we’re doing and where we’re putting our efforts.

How do you work with your assistant editors, specifically on this film?
Pani Ahmadi-Moore is my first assistant, and we’ve worked together for about six years now. But she’s much more than just an assistant editor; she’s a filmmaker in her own right, in the sense of being a collaborator in the project. Although she’s not actually cutting the movie, she’s very much involved in it.

So, all of the prep work, and making things available to me, is handled by Pani and her assistant. They present bins for each scene of material. I keep an absolutely precise log of what comes in and when and what it relates to, which is also presented by Pani. This frees me up to concentrate on cutting the scenes, putting the film together and aiming towards a first cut. We generally present this within two weeks of the end of principal photography.

The film was released in 3D stereo and Imax. Can you talk about that?
We didn’t put on 3D glasses, or anything like that, in the cutting room. When we got back to London and we had a cut, we started sending sections to the 3D house, Stereo D, and the stereo process began to run in parallel with us; those scenes would be updated as the cut changed.

It’s a bit like a VFX process in its own right. There are three strands of things going along in parallel on the pictorial side: the cut developing and being shaped and editorializing in the traditional way; the turning over of visual effects within that cut and keeping them up-to-date with the changes editorially; and, similarly, the same process going on for turnovers to Stereo D in Burbank, California.

After the conversions are made, do you get to see the final product?
Yes, we do. In fact, though, in this case, I was so busy with the cut that Baltasar, bless him, took the lion’s share of directing the 3D. We had to divide our labor, as it were, and I was still very busy shaping the content of the film. It comes to a point when it’s “How do we best use our time and what is the best distribution of our time?”

You mentioned VFX, were you working with temp VFX? How did that work?
We did have temp VFX, and we would be given early versions of critical shots. A lot of the summit material, where we had the climbers on the set without the backgrounds, took quite a while for us to get. For me, it was quite hard to judge the impact of some of these shots until they were available in a form where we could see how good they were going to be and how extreme the environment was. It takes time… it’s a slow-cooked meal.

Can you talk about the film’s sound?
We had extremely difficult audio. There was a high percentage of ADR and replacement on this film — we had wind machines, we had people on the real mountains with clothes blowing and making noise, so the audio in its early stages was very difficult to hear and use. It wasn’t until we got substantial ADR and tracks back that were clean that we could build it all up again. That was very challenging.

Who worked on the sound?
The sound designer was the wonderful Glenn Freemantle, and the dialogue editor was my old friend Nina Hartstone. She did an amazing job scheduling the artists to come back for ADR. They also had to do very physical things — now in a studio environment — in order to get the vocalization and the physicality to sound convincing. The sound is quite extraordinary.

It wasn’t until we had a temp dub and temporary visuals that we started to feel that the film was being realized how we had intended it to be, and we could start to read it properly.

Is there any one scene or section that you found the most challenging?
I think the whole film was challenging. Obviously, the storm sequence is a visceral experience that you want to have the audience experience — the complexity of what was going on apart from the physical hardship, and the way in which the tragedy unfolded.

We had filmmaking issues to resolve as well. The identification of people was one — actually seeing the climbers’ faces, since they were hidden most of the time. There were lots of issues that we had to understand, to decide whether to solve or to accept. For example, the story of what happened with the oxygen is confusing, and nobody really understands exactly what went wrong. In filmmaking terms, that can be tricky to communicate. Whether we got away with that, I don’t know. Everybody was very confused about the oxygen, and that’s how it was.

It goes back to what I was saying at the beginning, Randi, which is this responsibility in the storytelling towards those who survived and the reality of what happened.

What’s next for you?
I was to be making another film for Terry Gilliam (we worked together previously on The Imaginarium of Doctor Parnassus), which is his long-awaited Don Quixote movie, but this has been postponed until the spring.

In the meantime, I’m helping set up a film networking organization here in London called Sprocket Rocket Soho (@srsoho). It’s an international endeavor aimed at bringing young filmmakers together with older filmmakers, because in the digital world we’re all feeling a bit isolated, and people need to get into a room and talk. I’m very pro-education for young filmmakers, and this is part of that initiative.

Review: Lenovo ThinkPad W550s Ultrabook mobile workstation

By Brady Betzel

Over the last few years, I’ve done a lot of workstation reviews, including ones for HP’s z800 and z840, Dell’s mobile workstations and now the Lenovo ThinkPad W550s mobile workstation.

After each workstation review goes live, I’m always asked the same question: Why would anyone pay the extra money for a professional workstation when you can buy something that performs almost as well, if not better, for half the price? That’s a great question.

What separates workstations from consumer or DIY systems is primarily ISV (Independent Software Vendor) certifications. Many companies, including Lenovo, work directly with software manufacturers like Autodesk and Adobe to ensure that the product you are receiving will work with the software you use, including drivers, displays, keypads and ports (like the Mini DisplayPort). So while you are paying a premium to ensure compatibility, you are really paying for the peace of mind that your system will work with the software you use most. The Lenovo W550s has ISV-certified drivers for Autodesk, Dassault, Nemetschek Vectorworks, PTC and Siemens applications, all relating to the Nvidia Quadro K620M graphics card.

Beyond ISV driver certifications, the Lenovo ThinkPad W550s is a lightweight powerhouse with the longest battery life I have ever seen in a mobile workstation — all for around $2,500.

Out of the box I noticed two batteries charging when I powered on Windows 8.1 — you can choose Windows 7 (64-bit) or 8.1 (64-bit). One of the best features I have seen in a mobile workstation is the ability to swap batteries without powering down (I guess that’s the old man in me coming out), and Lenovo has found a way to do it without charging an arm and a leg and physically only showing one battery. For $50 (included in the $2,500 price), you can have a three-cell (44Whr) battery in the front and a six-cell (72Whr) battery in the back. I was able to work about three days in a row without charging.

This was intermittent work ranging from sending out tweets with 10 tabs up in Chrome to encoding a 4K H.264 for YouTube in Adobe Media Encoder. It was a very welcome surprise, and if I had a second battery I could swap them out without losing power because of the battery in the front (built-in).

Under the Hood
The battery life is the biggest feature in my opinion, but let’s lay out the rest of the specs:
• Processor: Intel Core i7-5600U (4MB cache, up to 3.2GHz; mine ran at 2.6GHz)
• OS: Windows 8.1 Pro 64-bit
• Display: 15.5-inch 3K (2880×1620) IPS multi-touch, with WWAN
• Graphics: Nvidia Quadro K620M 2GB
• Memory: 16GB PC3-12800 DDR3L
• Keyboard: backlit, with number keypad
• Pointing devices: TrackPoint (the little red joystick-looking mouse), touchpad and fingerprint reader
• Camera: 720p
• Hard drive: 512GB Serial ATA3 SSD
• Battery: three-cell Li-Polymer 44Whr (front), six-cell Li-ion 72Whr Cyl HC (rear)
• Power cord: 65W AC adapter
• Wireless: Intel 7265 AC/B/G/N dual-band plus Bluetooth
• Warranty: one-year carry-in (diagnosed by phone first)

The W550s has a bunch of great inputs, like the mini display port, which I got to work instantly with an external monitor; three USB 3.0 ports with one of them always on for charging of devices; a smart card reader, which I used a lot; and even a VGA port.

In terms of power, I received a nice Intel i7-5600U dual-core (four-thread) CPU running at 2.6GHz or higher. Combined with the Nvidia Quadro K620M and 16GB of DDR3L, the Intel i7-5600U delivered enough power to encode my GoPro Hero 3+ Black Edition 4K timelapses quickly using the GoPro software and Adobe Media Encoder.

Encoding and layering effects is what really bogs a video editing system down, so what better way to see what the W550s is made of than by removing the fisheye on my clip with an effect on the image sequence containing about 2,400 stills in Adobe Premiere, speeding up the timelapse by 1,000 percent and sending the sequence to Adobe Media Encoder? In the end, the W550s chewed through the render and spit out a 4K YouTube-compatible H.264 in around 15 minutes. The CUDA cores in the Nvidia Quadro K620M really helped, although this did kick the fans on. I did about six of these timelapses to verify that my tests were conclusive. If you want to see them you can check them out on YouTube.

The Quadro K620M is on the lower end of the mobile Quadro family but boasts 384 CUDA cores that help with the encoding and transcoding of media using the Adobe Creative Suite. In fact, I needed a laptop to use in a presentation I did for the Editors’ Lounge. I wanted to run After Effects CC 2014 along with Video Copilot’s Element 3D V1.6 plug-in, Maxon Cinema 4D Studio R16 and Avid Media Composer 6.5, all while running Camtasia (screen capture software) the entire time. That’s a lot to run at once, and I decided to give the W550s the task.

In terms of processing power the W550s worked great — I even left After Effects running while I was inside of Cinema 4D doing some simple demos of House Builder and MoText work. I have to say I was expecting some lag when switching between the two powerhouse software programs, but I was running Element 3D without a hiccup, even replicating the text particle and adding materials and lighting to them – both a testament to a great plug-in as well as a great machine.

While the power was not a problem for the W550s, I did encounter some interesting problems with the screen resolution. I have to preface this by saying that it is definitely NOT Lenovo’s problem that I am describing, it has to do with Avid Media Composer not being optimized for this high resolution of a screen. Avid Media Composer was almost unusable on the 15.5-inch 3K (2880×1620), IPS, multi-touch screen. The user interface has not been updated for today’s high-resolution screens, including the W550s. It is something to definitely be aware of when purchasing a workstation like this.

I did a few benchmarks for this system using Maxon Cinebench R15 software, which tests the OpenGL and CPU performances as compared to other systems with similar specs. The OpenGL test revealed a score of 35.32fps while the CPU test revealed a score of 265cb. You can download Cinebench R15 here and test your current set-up against my tests of the W550s.

There are a couple of things cosmetically that I am not as fond of on the W550s. When you purchase the larger rear battery, keep in mind that it adds about a ¼- to ½-inch lift — the laptop will no longer sit flat. The keyboard is very nice, and I found myself really liking the addition of the numeric keypad, especially when typing in exact frames in Premiere. The touchpad, however, has its three buttons on top instead of underneath, as I have typically encountered. On one hand, if you retrain yourself to use the three buttons with your left hand while your right hand works the touchpad, it may be more efficient. On the other hand, it can get annoying. I like the idea of a touchscreen in theory — it’s nice to move windows around — but practically speaking, from a video and motion graphics standpoint, it probably isn’t worth the extra money, and I would stick to a non-touch screen for a mobile workstation.

The last item to cover is the warranty. Typically, workstations have a pretty good warranty. Lenovo gives you a one-year carry-in warranty with the purchase of the W550s, which to me is short. This really hurts the price of the workstation, because upgrading to a three-year warranty — one that will actually help you within a business day if a crisis arises — will cost you at least a few hundred dollars more.

Summing Up
In the end, the price and awesome battery life make the Lenovo ThinkPad W550s a lightweight mobile workstation that can crunch through renders quickly. If I were ordering one for myself, I would probably max out the memory at 32GB, get rid of the touchscreen (maybe even keep the 1920×1080 resolution version) and keep everything else… oh, and I would also upgrade to a better warranty.

Before you leave, take these highlights with you: extreme battery life, lightweight and durable, and powerful enough for multimedia use.

AMD FirePro supports Avid Media Composer 8.4 for HD, 4K workflows

AMD has certified Avid Media Composer 8.4 for HD and 4K broadcast and digital content creation workflows powered by AMD FirePro professional graphics on Microsoft Windows and Mac Pro workstations. The new support for Avid Media Composer enables AMD FirePro professional graphics customers to take advantage of 4K display, media management and editing capabilities throughout the video production process.

Avid Media Composer nonlinear video editing software is used extensively by professional editors in moviemaking, television, broadcast and streaming media. AMD FirePro professional graphics enable Avid Media Composer to support editing high volumes of disparate file-based media for accelerated high-res and HD workflows, real-time collaboration and solid media management.