Category Archives: Editing

Avid at IBC with new post workflows based on MediaCentral 

At IBC2017, Avid introduced new MediaCentral solutions for post production. Dubbed MediaCentral for Post, the solutions integrate Media Composer video editing software with a new collaborative asset management module, new video I/O hardware and Avid Nexis software-defined storage.

MediaCentral for Post is a scalable solution for small and mid-sized creative teams, designed to enhance collaboration so teams can work more efficiently with 4K and other demanding formats and deliver their best work faster. Avid collected feedback from working editors while developing a collaborative workflow that goes beyond bin-locking and project-sharing to include integrated storage, editing, I/O acceleration and media management.

Besides Media Composer, MediaCentral solutions for post integrate Avid’s products in a single, open platform that includes:

• MediaCentral Editorial Management: This new media asset management tool enables everyone in a creative organization to collaborate in secure, reliable and simply configured media workflows from a web browser. MediaCentral Editorial Management gives a view into an entire organization’s media assets. Without needing an NLE, assistants and producers can ingest files, create bins, add locators and metadata, create subclips and perform other asset management tasks — all from a simple browser interface. Users can collaborate using the new MediaCentral Panel for Media Composer, which provides direct access to MediaCentral content right in the Media Composer interface.

• MediaCentral Cloud UX: An easy-to-use and task-oriented graphical user interface, MediaCentral Cloud UX runs on any OS or mobile device, and is available to everyone connected to the platform. Creative team members can easily collaborate with each other from wherever they are.

• Artist DNxIV: This video interface offers a wide range of analog and digital I/O to plug into diverse media productions. It works with a broad spectrum of Avid and third-party video editing, audio, visual effects and graphics software.

• MediaCentral Panel for Media Composer: Within the Media Composer user interface, MediaCentral Panel allows users to see media outside of their active project as well as drag and drop assets from MediaCentral directly into any Media Composer project, bin or sequence.

• More Storage: Avid Nexis Pro now scales to 160 terabytes — twice its previous capacity — to give small post facilities the ease-of-use, security and performance advantages that larger Avid Nexis customers have access to. Avid Nexis E2 now supports SSD drives to deliver the extreme performance required when working with multiple streams of ultra-high-resolution media in real time. Additionally, Avid Nexis Enterprise now leverages 100 terabyte media packs to scale up to 4.8 petabytes.

LumaForge offering support for shared projects in Adobe Premiere

LumaForge, which designs and sells high-performance servers and shared storage appliances for video workflows, will be at IBC this year showing full support for new collaboration features in Adobe Premiere Pro CC. When combined with LumaForge’s Jellyfish or ShareStation post production servers, the new Adobe features — including multiple open projects and project locking — allow production groups and video editors to work more effectively with shared projects and assets. This is something that feature film and TV editors have been asking for from Adobe.

Project locking allows multiple users to work with the same content. In a narrative workflow, an editing team can divide their film into shared projects per reel or scene. An assistant editor can get to work synchronizing and logging one scene, while the editor begins assembling another. Once the assistant editor is finished with their scene, the editor can refresh their copy of the scene’s Shared Project and immediately see the changes.

An added benefit of using Shared Projects on productions with large amounts of footage is the significantly reduced load time of master projects. When a master project is broken into multiple shared project bins, footage from those shared projects is only loaded once that shared project is opened.
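The load-time win comes from deferral: a master project holds only lightweight references to its shared projects, and footage is read only when a given shared project is actually opened. A minimal sketch of that lazy-loading idea (the class and method names here are hypothetical illustrations, not Premiere Pro's actual API):

```python
class SharedProject:
    """Holds references to footage but defers loading until opened."""
    def __init__(self, name, clip_names):
        self.name = name
        self._clip_names = clip_names   # lightweight references only
        self._clips = None              # nothing loaded yet

    def open(self):
        # Footage is loaded only on first open (lazy loading).
        if self._clips is None:
            self._clips = [f"loaded:{c}" for c in self._clip_names]
        return self._clips

class MasterProject:
    """A master project is just a list of shared-project references."""
    def __init__(self, shared_projects):
        self.shared_projects = shared_projects

    def loaded_clip_count(self):
        return sum(len(p._clips) for p in self.shared_projects
                   if p._clips is not None)

# Opening the master project loads nothing; opening one reel loads only its clips.
master = MasterProject([SharedProject("Reel 1", ["a", "b"]),
                        SharedProject("Reel 2", ["c", "d", "e"])])
print(master.loaded_clip_count())   # 0 clips loaded at master-project open
master.shared_projects[0].open()
print(master.loaded_clip_count())   # only Reel 1's 2 clips are loaded
```

On a production with thousands of clips, this is the difference between paying the load cost for the entire show up front and paying it one reel or scene at a time.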

“Adobe Premiere Pro facilitates a broad range of editorial collaboration scenarios,” says Sue Skidmore, partner relations for Adobe Professional Video. “The LumaForge Jellyfish shared storage solution complements and supports them well.”

All LumaForge Jellyfish and LumaForge ShareStation servers will support the Premiere Pro CC collaboration features for both Mac OS and Windows users, connecting over 10Gb Ethernet.

Check out their video on the collaboration here.


Editor and Exile co-owner Eric Zumbrunnen has passed away at 52

Award-winning editor Eric Zumbrunnen has lost his battle with cancer. Known for his intelligence, kindness and dry wit as much as his editorial talent, Zumbrunnen left a memorable imprint that spanned the worlds of feature films, commercials, music videos, short films and documentaries.

In 2014, Zumbrunnen co-founded Santa Monica and New York-based post company Exile with partners Kirk Baxter, Matt Murphy and Carol Lynn Weaver. Zumbrunnen valued the opportunity to learn from the gifted writers, directors and editors he worked with. While he held himself and his co-workers to a high standard, he was also committed to encouraging and mentoring other editors and would-be editors.

Known to many as “EZ,” Zumbrunnen graduated from USC with a degree in journalism, and began his professional life in video post. A proficient guitarist, he brought his affinity for music to his early work editing music videos, including classics such as Weezer’s Buddy Holly, Smashing Pumpkins’ Tonight, Tonight, Beck’s Where It’s At and Bjork’s It’s Oh So Quiet.

He developed close collaborative relationships with a few directors, and subsequently expanded into commercials for clients such as Nike, Xbox and Apple, and ultimately into feature films. He had worked many times with director Spike Jonze. The pair’s collaboration spanned two decades and resulted in the Oscar-nominated Her, Adaptation, Where the Wild Things Are and Being John Malkovich, which earned Zumbrunnen an ACE Award for Best Edited Feature Film. More recently he was awarded a Bronze Lion for Editing at the Cannes Lions International Festival of Creativity for his work on Jonze’s Kenzo World fragrance ad My Mutant Brain. He was invited to join the Academy of Motion Picture Arts and Sciences this year.

Zumbrunnen is survived by his wife Suzanne and children Henry and Greta.



Editor William Hoy — working on VFX-intensive War for the Planet of the Apes

By Mel Lambert

For William Hoy, ACE, story and character come first. He also likes to use visual effects “to help achieve that idea.” This veteran film editor points to director Alex Proyas’ VFX-heavy I, Robot, director Matt Reeves’ 2014 Dawn of the Planet of the Apes and Reeves’ new installment, War for the Planet of the Apes, as “excellent examples of this tenet.”

War for the Planet of the Apes, the final part of the current reboot trilogy, follows a band of apes and their leader as they are forced into a deadly conflict with a rogue paramilitary faction known as Alpha-Omega. After the apes suffer unimaginable losses, their leader begins a quest to avenge his kind, setting in motion an epic battle that will determine the fate of both species and the future of the planet.

Marking the picture editor’s second collaboration with Reeves, Hoy recalls that he initially secured an interview with the director through industry associates. “Matt and I hit it off immediately. We liked each other,” Hoy recalls. “Dawn of the Planet of the Apes had a very short schedule for such a complicated film, and Matt had his own ideas about the script — particularly how the narrative ended. He was adamant that he ‘start over’ when he joined the film project.

“The previous Dawn script, for example, had [the lead ape character] Caesar and his followers gaining intelligence and driving motorized vehicles,” Hoy says. “Matt wanted the action to be incremental which, it turned out, was okay with the studio. But a re-written script meant that we had a very tight shoot and post schedule. Swapping its release date with X-Men: Days of Future Past gave us an additional four or five weeks, which was a huge advantage.”

William Hoy, ACE (left), Matt Reeves (right).

Such a close working relationship on Dawn of the Planet of the Apes meant that Hoy came to the third installment in the current trilogy with a good understanding of the way that Reeves likes to work. “He has his own way of editing from the dailies, so I can see what we will need on rough cut as the filmed drama is unfolding. We keep different versions in Avid Media Composer, with trusted performances and characters, and can see where they are going” with the narrative. Having worked with Reeves over the past two decades, Stan Salfas, ACE, served as co-editor on the project, joining prior to the Director’s Cut.

A member of the Academy of Motion Picture Arts and Sciences, Hoy also worked with director Randall Wallace on We Were Soldiers and The Man in the Iron Mask, with director Phillip Noyce on The Bone Collector and director Zack Snyder on Watchmen, a film “filled with emotional complexity and heavy with visual effects,” he says.

An Evolutionary Editing Process
“Working scene-by-scene with motion capture images and background artwork laid onto the Avid timeline, I can show Matt my point of view,” explains Hoy. “We fill in as we go — it’s an evolutionary process. I will add music and some sound effects for that first cut so we can view it objectively. We ask, ‘Is it working?’ We swap around ideas and refine the look. This is a film that we could definitely not have cut on film; there are simply too many layers as the characters move through these varied backgrounds. And with the various actors in motion capture suits giving us dramatic performances, with full face movements [CGI-developed facial animation], I can see how they are interacting.”

To oversee the dailies on location, Hoy set up a Media Composer editing system in Vancouver, close to the film locations used for principal photography. “War for the Planet of the Apes was shot on Arri Alexa 65 digital cameras that deliver 6K images,” the editor recalls. “These files were down-sampled to 4K and delivered to Weta Digital [in New Zealand] as source material, where they were further down-sampled to 2K for CGI work and then up-sampled back to 4K for the final release. I also converted our camera masters to 2K DNxHD 32/36 for editing color-timed dailies within my Avid workstation.”

In terms of overall philosophy, “we did not want to give away Caesar’s eventual demise. From the script, I determined that the key arc was the unfolding mystery of ‘What is going on?’ And ‘Where will it take us?’ We hid that Caesar [played by Andy Serkis] is shot with an arrow, and initially just showed the blood on the hand of the orangutan, Maurice [Karin Konoval]; we had to decide how to hide that until the key moment.”

Because of the large number of effect-heavy films that Hoy has worked on, he is considered an action/visual effects editor. “But I am drawn to performances of actors and their characters,” he stresses. “If I’m not invested in their fate, I cannot be involved in the action. I like to bring an emotional value to the characters, and visualize battle scenes. In that respect Matt and I are very much in tune. He doesn’t hide his emotion as we work out a lot of the moves in the editing room.”

For example, in Dawn of The Planet of The Apes, Koba, a human-hating Bonobo chimpanzee who led a failed coup against Caesar, is leading apes against the human population. “It was unsatisfying that the apes would be killing humans while the humans were killing apes. Instead, I concentrated on the POV of Caesar’s oldest son, Blue Eyes. We see the events through his eyes, which changed the overall idea of the battle. We shot some additional material but most of the scene — probably 75% — existed; we also spoke with the FX house about the new CGI material,” which involved re-imaged action of horses and backgrounds within the Virtual Sets that were fashioned by Weta Digital.

Hoy utilized VFX tools on various layers within his Media Composer sessions that carried the motion capture images, plus the 3D channels, in addition to different backgrounds. “Sometimes we could use one background version and other times we might need to look around for a new perspective,” Hoy says. “It was a trial-and-error process, but Matt was very receptive to that way of working; it was very collaborative.”

Twentieth Century Fox’s War for the Planet of the Apes.

Developing CGI Requests for Key Scenes
By working closely with Weta Digital, the editor could develop new CGI requests for key scenes and then have them rendered as necessary. “We worked with the post-viz team to define exactly what we needed from a scene — maybe to put a horse into a blizzard, for example. Ryan Stafford, the film’s co-producer and visual effects producer, was our liaison with the CGI team. On some scenes I might have as many as a dozen or more separate layers in the Avid, including Caesar, rendered backgrounds, apes in the background, plus other actors in middle and front layers” that could be moved within the frame. “We had many degrees of freedom so that Matt and I could develop alternate ideas while still preserving the actors’ performances. That way of working could be problematic if you have a director who couldn’t make up his mind; happily, Matt is not that way!”

Hoy cites one complex scene that needed to be revised dramatically. “There is a segment in which Bad Ape [an intelligent chimpanzee who lived in the Sierra Zoo before the Simian Flu pandemic] is seen in front of a hearth. That scene was shot twice because Matt did not consider it frenetic enough. The team returned to the motion capture stage and re-shot the scene [with actor Steve Zahn]. That allowed us to start over again with new, more frantic physical performances against resized backgrounds. We drove the downstream activities – asking Weta to add more snow in another scene, for example, or maybe bring Bad Ape forward in the frame so that we can see him more clearly. Weta was amazing during that collaborative process, with great input.”

The editor also received a number of sound files for use within his Avid workstation. “In the beginning, I used some library effects and some guide music — mostly some cues of composer Michael Giacchino’s Dawn score music from music editor Paul Apelgren. Later, when the picture was in one piece, I received some early sketches from the sound design team. For the Director’s Cut we had a rough cut with no CGI from Weta Digital. But when we received more sound design, I would create temp mixes on the Avid, with a 5.1-channel mix for the sound-editorial team using maybe 24 tracks of effects, dialog and music elements. It was a huge session, but Media Composer is versatile. After turning over that mix to Will Files, the film’s sound designer, supervising sound editor and co-mixer, I was present with Matt on the re-recording stage for maybe six weeks of the final mix as the last VFX elements came in. We were down to the wire!”

Hoy readily concedes that while he loves to work with new directors — “and share their point of view” — returning to a director with whom he has collaborated previously is a rewarding experience. “You develop a friendly liaison because it becomes easier once you understand the ways in which a director works. But I do like to be challenged with new ideas and new experiences.” He may get to work again with Reeves on the director’s next outing, The Batman, “but since Matt is still writing the screenplay, time will tell!”


Mel Lambert is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA. He is also a long-time member of the UK’s National Union of Journalists.

 


Big Block adds comedy director Richard Farmer

Who doesn’t like to laugh? No one. Well, hardly anyone. So when a director is able to evoke that sort of response from an audience, it’s amazing. That was the thinking behind Big Block’s addition of comedy director Richard Farmer.

This Oklahoma native began his career as an agency producer in Los Angeles after spending time post-college living in London, Seattle and Prague, working on indie films and videos. After a few years, he went on to produce for Mindfield, a production, editorial, and animation company for commercial television and music videos.

Since stepping behind the lens, Farmer has directed a prolific number of commercials, each featuring his absurdist humor. Whether it’s carnivorous bunnies for Wendy’s, a magically appearing Fancy Bear for Free Credit Score or ’90s R&B songs about iconic memes for LG V20 phones, Farmer knows just how to create a memorable and compelling spot.

The recent LG spots are an example of Farmer’s style. Shot exclusively on the LG V20 phone, Farmer took well-known memes, from Double Rainbow to Damn Daniel and “remastered” them in high quality, showcasing a mash-up of his skills across the realms of narrative, music and VFX. Farmer has already hit the ground running at Big Block, having just booked a job for Simon Malls.

We asked Farmer what he likes about working with editors on his projects: “I love it when the editor really embraces that they are a partner in the process and know they have the freedom to take risks. Editors that are brave enough to push the creative and what was shot to new areas. Freak me out. Open it up and move the boundary. Show me new possibilities. I’m always blown away when that magic happens.”

 


Timecode ships UltraSync One wireless sync units

First shown at NAB earlier this year, UltraSync One is the latest addition to Timecode Systems’ range of timecode generators and transceivers. It is now shipping for $299.

Measuring 2.2in x 1.7in x 0.7in and weighing 1.4 ounces, with a battery life of more than 25 hours, it is designed to provide hassle-free sync for long shooting days. This generator and transceiver provides timecode, genlock for camera sync and word clock for sound.

With the demands of multi-camera filming driving the requirement for more than just timecode to guarantee a reliable sync, Timecode Systems treats genlock and word clock as a necessity for filming today.

UltraSync One earned an honorable mention from the panel of judges for the Post Production Technology category of the Cine Gear Expo 2017 Technical Awards in recognition of the huge benefits it offers for edit workflows. Thousands of hours of content can be recorded during the course of filming a multi-camera television series. UltraSync One makes it easy to capture, log, search and synchronize this content, which means significant time and cost savings throughout the production process, from acquisition to post.
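The synchronization that saves all that edit time reduces to arithmetic on timecode: once every camera and recorder is stamping the same running timecode, clips can be aligned by converting each start timecode to an absolute frame count. A rough sketch of that underlying math (not Timecode Systems code; it assumes a simple non-drop-frame 25fps rate for clarity):

```python
def timecode_to_frames(tc, fps=25):
    """Convert 'HH:MM:SS:FF' to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def sync_offsets(start_timecodes, fps=25):
    """Frame offset of each clip relative to the earliest one."""
    frames = {name: timecode_to_frames(tc, fps)
              for name, tc in start_timecodes.items()}
    earliest = min(frames.values())
    return {name: f - earliest for name, f in frames.items()}

# Three cameras rolling against a shared timecode source:
clips = {"camA": "10:00:00:00", "camB": "10:00:02:12", "camC": "10:00:05:00"}
print(sync_offsets(clips))  # {'camA': 0, 'camB': 62, 'camC': 125}
```

An NLE performs essentially this computation when it auto-syncs a multicam group, which is why accurate timecode across devices eliminates manual eye-matching of thousands of hours of footage.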

According to Paul Scurrell, CEO of Timecode Systems, production teams who have adopted the system saved a ton of time when the footage gets to the edit suite. He says it’s not just a few hours saved; it’s often days or even weeks of edit time.


Michael Kammes’ 5 Things – Video editing software

By Randi Altman

Technologist Michael Kammes is back with a new episode of 5 Things, which focuses on simplifying film, TV and media technology. The web series answers, according to Kammes, the “five burning tech questions” people might have about technologies and workflows in the media creation space. This episode tackles professional video editing software being used (or not used) in Hollywood.

Why is now the time to address this segment of the industry? “The market for NLEs is now more crowded than it has been in over 20 years,” explains Kammes. “Not since the dawn of modern NLEs have there been this many questions over what tools should be used. In addition, the massive price drop of NLEs, coupled with the pricing shift (monthly/yearly, as opposed to outright) has created more confusion in the market.”

In his video, Kammes focuses on Avid Media Composer, Adobe Premiere, Apple Final Cut Pro, Lightworks, Blackmagic Resolve and others.

Considering its history and use on some major motion pictures (such as The Wolf of Wall Street), why hasn’t Lightworks made more strides in the Hollywood community? “I think Lightworks has had massive product development and marketing issues,” shares Kammes. “I rarely see the product pushed online, at user groups or in forums. EditShare, the parent company of Lightworks, also deals heavily in storage, so one can only assume the marketing dollars are being spent on larger-ticket items like professional and enterprise storage over a desktop application.”

What about Resolve, considering its updated NLE tools and the acquisition of audio company Fairlight? Should we expect to see more Resolve being used as a traditional NLE? “I think in Hollywood, adoption will be very, very slow for creative editorial, and unless something drastic happens to Avid and Adobe, Resolve will remain in the minority. For dailies, transcodes or grading, I can see it only getting bigger, but I don’t see larger facilities adopting Resolve for creative editorial. Outside of Hollywood, I see it gaining more traction. Those outlets have more flexibility to pivot and try different tools without the locked-in TV and feature film machine in Hollywood.”

Check it out:


Jimmy Helm upped to editor at The Colonie

The Colonie, the Chicago-based editorial, visual effects and motion graphics shop, has promoted Jimmy Helm to editor. Helm has honed his craft over the past seven years, working with The Colonie’s senior editors on a wide range of projects. Most recently, he has been managing ongoing social media work with Facebook and conceptualizing and editing short format ads. Some clients he has collaborated with include Lyft, Dos Equis, Capital One, Heineken and Microsoft. He works on both Avid Media Composer and Adobe Premiere.

A filmmaking major at Columbia College Chicago, Helm applied for an internship at The Colonie in 2010. Six months later he was offered a full-time position as an assistant editor, working alongside veteran cutter Tom Pastorelle on commercials for McDonald’s, Kellogg’s, Quaker and Wrangler. During this time, Helm edited numerous projects on his own, including broadcast commercials for Centrum and Kay Jewelers.

“Tom is incredible to work with,” says Helm. “Not only is he a great editor but a great person. He shared his editorial methods and taught me the importance of bringing your instinctual creativity to the process. I feel fortunate to have had him as a mentor.”

In 2014, Helm was promoted to senior assistant editor and continued to hone his editing skills while taking on a leadership role.

“My passion for visual storytelling began when I was young,” says Helm. “Growing up in Memphis, I spent a great deal of time watching classic films by great directors. I realize now that I was doing more than watching — I was studying their techniques and, particularly, their editing styles. When you’re editing a scene, there’s something addictive about the rhythm you create and the drama you build. I love that I get to do it every day.”

Helm joins The Colonie’s editorial team, which includes Joe Clear, Keith Kristinat, Pastorelle and Brian Salazar, along with editors and partners Bob Ackerman and Brian Sepanik.

 

 


Baby Driver editors — Syncing cuts to music

By Mel Lambert

Writer/director Edgar Wright’s latest outing is a major departure from his normal offering of dark comedies. Unlike his Three Flavours Cornetto film trilogy — Shaun of the Dead, Hot Fuzz and The World’s End — and Scott Pilgrim vs. the World, TriStar Pictures’ Baby Driver has been best described as a romantic musical disguised as a car-chase thriller.

Wright’s regular pair of London-based picture editors, Paul Machliss, ACE, and Jonathan Amos, ACE, also brought a special brand of magic to the production. Machliss, who had worked with Wright on Scott Pilgrim, The World’s End and his TV series Spaced for Channel 4, recalls that, “very early on, Edgar decided that I should come along on the shoot in Atlanta to ensure that we had the material he’d already storyboarded in a series of complex animatics for the film [using animator Steve Markowski and editor Evan Schiff]. Jon Amos joined us when we returned to London for sound and picture post production, primarily handling the action sequences, at which he excels.”

Developed by Wright over the past two decades, Baby Driver tells the story of an eponymous getaway driver (Ansel Elgort), who uses earphones to drown out the “hum-in-the-drum” of tinnitus — the result of a childhood car accident — and to orchestrate his life to carefully chosen music. But now indebted to a sinister kingpin named Doc (Kevin Spacey), Baby becomes part of a seriously focused gang of bank robbers, including Buddy and Darling (Jon Hamm and Eiza González), Bats (Jamie Foxx) and Griff (Jon Bernthal). Debora, Baby’s love interest (Lily James), dreams of heading west “in a car I can’t afford, with a plan I don’t have.” Imagine, in a sense, Jim McBride’s Breathless rubbing metaphorical shoulders with Tony Scott’s True Romance.

The film also is indebted to Wright’s 2003 music video for Mint Royale’s Blue Song, during which UK comedian/actor Noel Fielding danced in a stationary getaway car. In that same vein, Baby Driver comprises a sequence of linked songs that tightly choreograph the action and underpin the dramatic arcs being played out, often keying off the songs’ lyrics.

The film’s opener, for example, features Elgort partly lipsyncing to “Bellbottoms,” by the Jon Spencer Blues Explosion, as the villains commit their first robbery. In subsequent scenes, our hero’s movements follow the opening bass riffs of The Damned’s “Neat Neat Neat,” then later to Golden Earring’s “Radar Love” before Queen’s “Brighton Rock” adds complex guitar cacophony to a key encounter scene.

Even the film’s opening titles are accompanied by Baby performing a casual coffee run in a continuous three-minute take to Bob & Earl’s “Harlem Shuffle” — a scene that reportedly took 28 takes on the first day of practical photography in Atlanta. And the percussion and horns of “Tequila” provide syncopation for a protracted gunfight. Fold in “Egyptian Reggae,” “Unsquare Dance,” and “Easy,” followed by “Debora,” and it’s easy to appreciate that Wright is using music as a key and underpinning component of this film. The director also brought in music video choreographer Ryan Heffington to achieve the timing precision he needed.

The swift action is reflected in a fast style of editing, including whip pans and crash zooms, with cuts that are tightly synchronized to the music. “Whereas the majority of Edgar’s previous TV series and films have been parodies, for Baby Driver he had a very different idea,” explains Machliss. Wright had accumulated a playlist of over 30 songs that would inspire various scenes in his script. “It’s something that’s very much a part of my previous films,” says director Wright, “and I thought of this idea of how to take that a stage further by having a character who listens to music the entire time.”

“Edgar had organized a table read of his script in the spring of 2012 in Los Angeles, at which he recorded all of the dialog,” says Machliss. “Taking that recording, some sound effects and the music tracks, I put together a 100-minute ‘radio play’ that was effectively the whole film in audio-only form that Edgar could then use as a selling tool to convince the studios that he had a viable idea. Remember, Baby Driver was a very different format for him and not what he is traditionally known for.”

Australia-native Machliss was on set to ensure that the gunshots, lighting effects, actors and camera movements, plus car hits, all happened to the beat of the accompanying music. “We were working with music that we could not alter or speed up or slow down,” he says. “We were challenged to make sure that each sequence fit in the time frame of the song, as well as following the cadence of the music.”

Almost 95% of the music included in the first draft of Wright’s script made it into the final movie, according to Machliss. “I laid up the relevant animatic as a video layer in my Avid Media Composer and then confirmed how each take worked against the choreographed timeline. This way I always had a reference to it as we were filming. It was a very useful guide to see if we were staying on track.”
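Because the songs could not be altered, cutting to them can be thought of as snapping candidate cut points onto a beat grid derived from each track's fixed tempo. A toy illustration of that timing math (this is not how Media Composer works internally, just the arithmetic an editor is effectively doing by ear):

```python
def beat_grid(bpm, duration_s):
    """Timestamps (in seconds) of every beat in a fixed-tempo track."""
    spb = 60.0 / bpm                       # seconds per beat
    n = int(duration_s / spb) + 1
    return [i * spb for i in range(n)]

def snap_to_beat(cut_times, bpm, duration_s):
    """Move each rough cut point to the nearest beat of the track."""
    grid = beat_grid(bpm, duration_s)
    return [min(grid, key=lambda b: abs(b - t)) for t in cut_times]

# Rough cut points against a 120 BPM track (one beat every 0.5s):
print(snap_to_beat([1.3, 2.76, 4.1], bpm=120, duration_s=10))
# [1.5, 3.0, 4.0]
```

The constraint the editors describe runs the other way, too: since the grid is immovable, the action and coverage had to supply material that lands on those timestamps, rather than the music bending to the picture.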

Editing On Location
During the Atlanta shoot, Machliss used Apple ProRes digital files captured by an In2Core QTake video assist that was recording taps from the production’s 35mm cameras. “I connected to my Mac via Ethernet so I could create a network to the video assist’s storage. I had access to his QuickTime files the instant he stopped recording. I could use Avid’s AMA function to place the clip in the timeline without the need for transcoding. This allowed almost instantaneous feedback to Edgar as the sequence was built up.”

Paul Machliss on set.

While on location, Machliss used a 15-inch MacBook Pro, Avid Mojo DX and a JVC video monitor “which could double as a second screen for the Media Composer or show full-screen video output via the Mojo DX.” He also had a Wacom tablet, an 8TB Thunderbolt drive, a LaCie 500GB rugged drive — “which would shuttle my media between set and editorial” — and a UPS “so that I wouldn’t lose power if the supply was shut down by the sparks!”

LA’s Fotokem handled film processing, with negative scanning by Efilm. DNxHD files were sent to Company 3 in Atlanta for picture editorial, “where we would also review rushes in 2K sent down the line from Efilm,” says Machliss. “All DI on-lining and grading took place at Molinare in London.” Bill Pope, ASC, was the film’s director of photography.

Picture and Sound Editorial in London
Instead of hiring out editorial suites at a commercial facility in London, Wright and his post teams opted for a different approach. Like an increasing number of London-based productions, they elected to rent an entire floor in an office building.

They located a suitable location on Berners Street, north of the Soho-based film community. As Machliss recalls: “That allowed us to have the picture editorial team in the same space as the sound crew,” which was headed up by Wright’s long-time collaborator Julian Slater, who served as sound designer, supervising sound editor and re-recording engineer on Baby Driver. “Having ready access to Julian and his team meant that we could collaborate very closely — as we had on Edgar’s other films — and share ideas on a regular basis,” as the 10-week Director’s Cut progressed.

British-born Slater then moved across Soho to Goldcrest Films for sound effects pre-dubs, while his co-mixer, Tim Cavagin, worked on dialog and Foley pre-mixes at Twickenham Studios. Print mastering of the Dolby Atmos soundtrack occurred in February 2017 at Goldcrest, with Slater handling music and SFX, while Cavagin oversaw dialog and Foley. “Following Edgar’s concept of threading together the highly choreographed songs with linking scenes, Jon and I began the cut in London against the pre-assembled material from Atlanta,” says Machliss.

To assist Machliss during his picture cut, the film’s sound designer had provided a series of audio stems for his Avid. “Julian [Slater] had been working on his sound effects and dialog elements since principal photography ended in Atlanta. He had prepared separate, color-coded left-center-right stems of the music, dialog and SFX elements he was working on. I laid these [high-quality tracks] into Media Composer so I could better appreciate the intricacies of Julian’s evolving soundtrack. It worked a lot better than a normal rough mix of production dialog, rough sound effects and guide music.”

“From its inception, this was a movie for which music and sound design worked together as a whole piece,” Slater recalls. “There is a large amount of syncopation of the diegetic sounds [implied by the film’s action] to the music track Baby is listening to. Sometimes it’s obvious because the action was filmed with that purpose in mind. For example, walking in tempo to the music track or guns being fired in tempo. But many times it’s more subtle, including police sirens or distant trains that have been pitched and timed to the music,” and hence blend into the overall musical journey. “We strived to always do this to support the story, and to never distract from it.”

Because of the lead character’s tinnitus, Slater worked with pitch changes to interweave elements of the film’s soundtrack. “Whenever Baby is not listening to music, his tinnitus is present to some degree. But it became apparent very soon in our design process that strident, high-pitched ‘whistle tones’ would not work for a sustained period of time. Working closely with composer Steven Price, we developed a varied set of methods to convey the tinnitus — it’s rarely the same sound twice. Much of the time, the tinnitus is pitched according to either the outgoing or incoming music track. This then enabled us to use more of it, yet at the same time be quite subtle.”

Meticulous Planning for Set Pieces and Car Chases
Picture editor Amos joined the project at the start of the Director’s Cut to handle the film’s set pieces. He says, “These set pieces were conceptually very different from the vast majority of action scenes in that they were literally built up around the music and then visualized. Meticulous development and planning went into these sequences before the shoot even began, which was decisive in making the action become musical. For example, the ‘Tequila’ gunfight started as a piece of music by Button Down Brass. It was then laced with gunfire and SFX pitched to the music, and in time with the drum hits — this was done at the script stage by Mark Nicholson (aka Osymyso, a UK musician/DJ), who specializes in mashup/bastard pop and breakbeat.”

Storyboards then grew around this scripted sound collage, which became a precise shot list for the filmed sequences. “Guns were rigged to go off in time with the music; it was all a very deliberate thing,” adds Amos. “Clearly, there was a lot of editing still to be done, but this approach illustrates that there’s a huge difference between something that is shot and edited to music, and something that is built around the music.”

“All the car chases for Baby Driver were meticulously planned, and either previsualized or storyboarded,” Amos explains. “This ensured that the action would always fit into the time slot permitted within the music. The first car chase [against the song ‘Bellbottoms’] is divided into 13 sections, to align to different progressions in the music. One of the challenges resulted from the decision to never edit the music, which meant that none of these could overrun. Stunts were tested and filmed by second unit director Darrin Prescott, and the footage passed back to editorial to test against the timing allowed in the animatic. If a stunt couldn’t be achieved in the time allowed, it was revised and tweaked until it worked. This detailed planning gave the perfect backbone to the sequences.”

Amos worked on the sequences sequentially, “using the animatic and Paul’s on-set assembly as reference,” and began to break down all the footage into rolls that aligned to specific passages of the music. “There was a vast amount of footage for all the set pieces, and things are not always shot in order. So generally I spent a lot of time breaking the material down very methodically. I then began to make selects and started to build the sequences from scratch, section by section. Once I completed a pass, I spent some time building up my sound layers. I find this helps evolve the cut, generating another level of picture ideas that further tighten the syncopation of sound and picture.”

Amos’ biggest challenge, despite all the planning, was finding ways to condense the material into its pre-determined time slot. “The real world never moves quite like animatics and boards. We had very specific points in every track where certain actions had to take place; we called these anchor points. When working on a section, we would often work backwards from the anchor point knowing, for instance, that we only had 20 seconds to tell a particular part of the story. Initially, it can seem quite restrictive, but the edits become so precise.


“The time restriction led to a level of kineticism and syncopation that became a defining feature of the movie. While the music may be the driving force of the action scenes, editorial choices were always rooted in the story and the characters. If you lose sight of the characters, the audience will disengage with the sequence, and you’ll lose all the tension you’ve worked so hard to create. Every shot choice was therefore very considered, and we worked incredibly hard to ensure we never wasted a frame, telling the story in the most compelling, rhythmic and entertaining way we could.”
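The backwards-from-the-anchor arithmetic Amos describes is easy to sketch. Here is a toy illustration with entirely hypothetical values (it assumes a 24fps timeline; none of these numbers come from the production):

```python
# Toy sketch of "anchor point" timing: given a musical hit that a piece of
# action must land on, and a fixed amount of screen time for the story beat
# leading up to it, work backwards to where the section must start.
# All values are hypothetical; assumes a 24fps timeline.

FPS = 24

def tc_to_frames(tc: str) -> int:
    """Convert 'HH:MM:SS:FF' timecode to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

def frames_to_tc(frames: int) -> str:
    """Convert an absolute frame count back to 'HH:MM:SS:FF'."""
    s, f = divmod(frames, FPS)
    m, s = divmod(s, 60)
    h, m = divmod(m, 60)
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

# Hypothetical anchor: a gunshot that must land on a drum hit at 00:01:32:12.
anchor = tc_to_frames("00:01:32:12")

# The beat leading up to it gets exactly 20 seconds of screen time,
# so the section must start 20 * 24 frames earlier.
section_start = anchor - 20 * FPS

print(frames_to_tc(section_start))  # 00:01:12:12
```

The constraint cuts both ways: the start point is fixed by simple subtraction, and everything the section has to accomplish must fit inside those frames.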

“Once we had our cut,” Machliss summarizes, “we could return the tracks to Julian for re-conforming,” to accommodate edit changes. “It was an excellent way of working, with full-sounding edit mixes.”

Summing up his experience on Baby Driver, Machliss considers the film to be “the hardest job I’ve ever done, but the most fun I’ve ever had. Ultimately, our task was to create a film that on one level could be purely enjoyed as an exciting/dramatic piece of cinema, but, on repeated viewing, would reveal all the little elements ‘under the surface’ that interlock together — which makes the film unique. It’s a testament to Edgar’s singular vision and, in that regard, he is a tremendously exciting director to work with.”


Mel Lambert has been involved with production industries on both sides of the Atlantic for more years than he cares to remember. He is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. He is also a long-time member of the UK’s National Union of Journalists.

Dailies and post for IFC’s Brockmire

By Randi Altman

When the name Brockmire first entered my vocabulary, it was thanks to a very naughty and extremely funny short video that I saw on YouTube, starring Hank Azaria. It made me laugh-cry.

Fast-forward about seven years, and the tale of the plaid-jacket-wearing, old-school baseball play-by-play man — who discovers his beloved wife’s infidelity and melts down in an incredibly dirty and curse-fueled way on air — was picked up by IFC as the series, aptly named Brockmire. It stars Azaria and Amanda Peet, and features cameos from sportscasters like Joe Buck and Tim Kurkjian.

The Sim Group was called on to provide multiple services for Brockmire: Sim provided camera rentals, Bling Digital provided dailies and workflow services, and Chainsaw provided offline editorial facilities, post finishing services, and deliverables.

We reached out to Chainsaw’s VP of business development, Michael Levy, and Bling Digital’s workflow producer, James Koon, with some questions about workflow. First up is Levy.

Michael Levy

How early did you get involved on Brockmire?
Our role with Brockmire started from the very beginning stages of the project. This was through a working relationship I had with Elizabeth Baquet, who is a production executive at Funny or Die (which produces the show).

What challenges did you have to overcome?
One of the biggest challenges was related to scaling a short to a multi-episode series and having multiple episodes in both production and in post at the same time. However, all the companies that make up Sim Group have worked on many episodic series over the years, so we were in a really good position to offer advice in terms of how to plan a workflow strategy, how to document things properly and how to coordinate getting their camera and dailies offline media from Atlanta to post editorial in Los Angeles.

What tools did they need for post and how involved was Chainsaw?
Chainsaw worked very hard with our Sim Group colleagues in Atlanta to provide a level of coordination that I believe made life much simpler for the Brockmire production/editorial team.

Offline editing for the series was done on our Avid Media Composer systems in cutting rooms here in the Chainsaw/Sim Group studio in Los Angeles at the Las Palmas Building.

The Avid dailies media created by Bling-Atlanta, our partner company in the Sim Group, was piped over the Internet each day to Chainsaw. When the Brockmire editorial crew walked into their cutting rooms, their offline dailies media was ready to edit with on their Avid ISIS server workspace. Whenever needed, they were also able to access their full-res Arri Alexa dailies media that had been shipped on Bling drives from Atlanta.

Bling-Atlanta’s workflow supervisor for Brockmire, James Koon, remained fully involved, and was able to supervise the pulling of any clips needed for VFX, or respond to any other dailies related needs.

Deb Wolfe, Funny or Die’s post producer for Brockmire, also had an office here at Chainsaw. She consulted regularly with Annalise Kurinsky (Chainsaw’s in-house producer for Brockmire) and me as they moved along locking cuts and getting ready for post finishing.

In preparation for the finishing work, we were able to set up color tests with Chainsaw senior colorist Andy Lichtstein, who handled final color for the series in one of our FilmLight Baselight color suites. I should note that all of our Chainsaw finishing rooms were right downstairs on the second floor of the same Sim Group Las Palmas Building.

How closely did you work with Deb Wolfe?
Very closely, especially in dealing with an unexpected production problem. Co-star Amanda Peet was accidentally hit in the head by a thrown beer can (how Brockmire! as they would say in the series). We quickly called in Boyd Stepan, Chainsaw’s senior VFX artist, and came up with a game plan to do Flame paint fixes on all of the affected Amanda Peet shots. We also provided additional VFX compositing for other planned VFX shots in several of their episodes.

What about the HD online finish?
That was done on Avid Symphony and Baselight by staff online editor Jon Pehlke, making full use of Chainsaw’s Avid/Baselight clip-based AAF workflow.

The last stop in the post process was the Chainsaw deliverables department, which took care of QC, requested videotape dubs, and the creation and digital upload of specified delivery files.

James Koon


James, what challenges did you have to overcome if any?
I would say that the biggest challenge overall with Brockmire was the timeframe. Twenty-four days to shoot eight episodes is ambitious. While in general this doesn’t pose a specific problem for dailies, the tight shooting schedule meant that certain elements of the workflow were going to need more attention. The color workflow, in particular, created a fair amount of discussion. With the tight schedules on set, the DP (Jeffrey Waldron) wanted to get his look, but wasn’t going to have much time, if any, for on-set coloring. So we worked with him to set up looks before shooting started that could be stored in the camera and monitored on set, then applied and tweaked as needed back at the dailies lab with notes from the DP.

Episode information from set to editorial was also an important consideration as they were shooting material from all eight episodes at once. Making sure to cross reference and double check which episode a shot was for was important to make sure that editorial could quickly find what they needed.

Can you walk us through the workflow, and how you worked with the producers?
They shot with the Arri Amira and Alexa Mini, monitoring with the LUTs created before production. This material was offloaded to an on-set backup and a shuttle drive (we generally use 4TB G-Tech G-RAID Thunderbolt or USB 3 drives; for local storage, a Promise Pegasus drive with a backup on our Facilis TerraBlock SAN) that was sent to the lab along with camera notes and any notes from the DP and/or the DIT regarding the look of the material. Once received at the lab, we would offload the footage to our local storage and process it in the dailies software, syncing the material to the audio mixer’s recordings and logging the episode, scene and take information for every take, using camera notes, script notes and audio logs to make sure the information was correct and consistent.
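The cross-checking Koon describes, reconciling camera notes, script notes and audio logs before media goes to editorial, amounts to a simple consistency pass per take. A minimal sketch of the idea (field names and records are hypothetical, not Bling’s actual tooling):

```python
# Toy consistency check across camera, script and audio logs for one take.
# Field names and example records are hypothetical, not Bling's actual tooling.

def check_take(camera: dict, script: dict, audio: dict) -> list[str]:
    """Return a list of discrepancies among the three logs for one clip."""
    problems = []
    for field in ("episode", "scene", "take"):
        values = {src: log.get(field) for src, log in
                  (("camera", camera), ("script", script), ("audio", audio))}
        if len(set(values.values())) > 1:
            problems.append(f"{field} mismatch: {values}")
    return problems

# Hypothetical logs for one clip; the audio log disagrees on the episode.
camera_log = {"clip": "A012C004", "episode": "103", "scene": "12", "take": "3"}
script_log = {"clip": "A012C004", "episode": "103", "scene": "12", "take": "3"}
audio_log  = {"clip": "A012C004", "episode": "104", "scene": "12", "take": "3"}

for issue in check_take(camera_log, script_log, audio_log):
    print(issue)  # flags the episode disagreement for a human to resolve
```

With all eight episodes shooting at once, a pass like this catches a mislabeled clip before an editor goes hunting for footage in the wrong episode’s bin.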

We also applied the correct LUT based on camera reports and tweaked color as needed to match cameras and make any adjustments called for in the DP’s notes. Once all of that was completed, we would render Avid media for editorial, create Internet streaming files for IFC’s Box service and create DVDs.

We would bring in the Avid files and organize them into bins per the editorial specs, then upload the files and bins to the editorial location in LA. These files were delivered directly to a dailies partition on their ISIS, so once editorial arrived in the morning, everything was waiting for them.

Once dailies were completed, LTO backups of the media and dailies were written as well as additional temporary backups of the source material as a safety. These final backups were completed and verified by the following morning, and editorial and production were both notified, allowing production to clear cards from the previous day if needed.
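“Completed and verified” typically means a checksum pass: each backup copy is hashed and compared against its source before anyone clears a camera card. A generic sketch of that idea (this is the standard approach, not Bling’s specific software or file layout):

```python
# Generic checksum verification of a backup tree against its source.
# Illustrative only; not Bling's actual software or file layout.
import hashlib
from pathlib import Path

def file_hash(path: Path, algo: str = "md5") -> str:
    """Hash a file in chunks so large camera files don't fill memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(source_dir: Path, backup_dir: Path) -> list[str]:
    """Return relative paths whose backup copy is missing or differs."""
    bad = []
    for src in source_dir.rglob("*"):
        if src.is_file():
            rel = src.relative_to(source_dir)
            dst = backup_dir / rel
            if not dst.exists() or file_hash(src) != file_hash(dst):
                bad.append(rel.as_posix())
    return bad

# Policy: only notify production to clear cards once verify_backup(...)
# comes back empty for every backup destination.
```

The same comparison works whether the destination is a shuttle drive, a SAN partition or an LTO restore; the card is only cleared once every copy checks out.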

What tools did you use for dailies?
We used DaVinci Resolve to set original looks with the DP before the show began shooting, Colorfront Express Dailies for dailies processing, Media Composer for Avid editorial prep and bin organization, and Imagine Products’ PreRoll Post for LTO writing and verification.