Tag Archives: Avid Media Composer

AJA and Avid intro Avid Artist | DNxIP hardware interface

AJA has collaborated with Avid to develop Avid Artist | DNxIP, a new hardware interface option for Avid Media Composer users that supports high frame rate, deep color and HDR IP workflows. It is a Thunderbolt 3-equipped I/O device that enables the transfer of SMPTE standard HD video over 10 GigE IP networks, with high-quality local monitoring over 3G-SDI and HDMI 2.0.

Based on the new AJA Io IP, Avid Artist | DNxIP is custom engineered to Avid’s specifications and includes an XLR audio input on the front of the device for microphone or line-level sources. Avid Artist | DNxIP uses Thunderbolt 3 to enable simple, fast HD/SD video and audio ingest/output from/to IP networks. It features dual Thunderbolt 3 ports for daisy chaining and two SFP+ cages for video and audio routing over 10 GigE IP networks. The portable, aluminum encased device also supports SMPTE 2022-6 uncompressed video, audio and VANC data over IP, as well as SMPTE 2022-7 for redundancy protection.

“The increased agility and efficiency of IP workflows is a must-have for content creators and broadcasters in today’s competitive climate,” says Alan Hoff, VP of market solutions for Avid. “We’ve collaborated with AJA on the newest addition to our Avid Artist product line, Avid Artist DNxIP, which offers broadcasters and post production facilities a portable, yet powerful, video interface for IP workflows.”

Avid Artist | DNxIP feature highlights include:
– Laptop or desktop HD/SD capture and playback over IP across Thunderbolt 3
– Audio input for analog microphone to record single-channel 16-bit A/D analog audio, 48 kHz sample rate, balanced, using industry standard XLR
– Backwards compatibility with existing Thunderbolt hosts
– SMPTE 2022-6 and 2022-7 I/O
– Dual 10 GigE connectivity via two SFP+ cages compatible with 10 GigE SFP transceiver modules from leading third-party providers
– Two Thunderbolt 3 ports for daisy chaining of up to six Thunderbolt devices
– 3G-SDI and HDMI 2.0 video monitoring
– Audio I/O: 16-channel embedded SDI; 8-channel embedded HDMI; 4-channel analog audio in and 4-channel audio out via XLR breakout
– Small, rugged design suited for a variety of production environments
– Downstream keyer
– Standard 12V 4-pin XLR for AC or battery power

Detroit editors William Goldenberg, ACE, and Harry Yoon

By Chris Visser

Kathryn Bigelow’s Detroit is not an easy film to watch. It deals with some very ugly moments in our nation’s history — specifically, Detroit’s 1967 12th Street Riot — and the challenge of adapting that history into a narrative feature film was no easy task. What do you show? What perspective do you give space to, and which ones do you avoid?

I sat down to talk to William “Billy” Goldenberg, ACE, and Harry Yoon, the editors of the film Detroit, to tackle these and other questions related to the film and their careers.

Billy Goldenberg

First, here are some details about the edit: Detroit was cut on Avid Media Composer 8.5.3 using an ISIS 5000 shared storage solution. The film was shot on Alexa Mini in ARRIRAW. Dailies were delivered at DNxHD 36 and then swapped for identical DNxHD 115 media at the end of each production week.

In addition to Goldenberg and Yoon, other members of the Detroit editorial team were additional editor Brett Reed, VFX editor Justin Yates, first assistant editor Peter Dudgeon and apprentice editor Jun Kim. The film will be available digitally on November 28 and on DVD/Blu-ray December 12.

Ok, let’s dig in…

How did this project come about?
Billy: Kathryn called to meet several months before the project started shooting. She sent me the script, but it soon became clear that I wouldn’t be able to start the film because I was still finishing Ben Affleck’s Live by Night. Kathryn said, “Look, let’s bring another editor on until you’re available, and then both of you can finish together.”

I thought of Harry because he had done some great work on The Newsroom, he knew Kathryn, and I knew he was a smart and talented guy. I ran it by Kathryn and she thought it was a great idea. So Harry was able to start the film.

At the beginning, half of the time I was at an editing room for Live by Night down the hall from the editing room for Detroit. I would sort of run back and forth throughout the day cutting and doing the stuff for both films. We did that for two months, and then I came onto Detroit full time. We did the rest of it together up until the end of the director’s cut. I finished the film from there.

Harry Yoon

How did you guys approach tackling the project together?
Harry: We had our key assistant Peter Dudgeon — who had worked in Billy’s cutting room on Live by Night and on a couple of other projects — there to prep dailies for Billy. It was fortunate because the way Billy likes to organize his bins and media is very, very akin to what I like as well.

In the mornings we would get dailies, and Billy and I would talk about who would take different scenes. Sometimes Billy wanted to really work on a particular sequence. Sometimes, I did. We would split scenes in the morning, and then go off and do our work. In the evening we’d come back together. This was my favorite part of the day because we would watch each other’s scenes. I would learn so much by seeing what he’d done and how he would approach the material. Getting critique and feedback from someone like Billy was like a master class for me.

I was also impressed that as Billy was working, he would ask the opinion of not just me, but the assistant as well. To have somebody be so transparent in his process was not only incredibly instructive personally, but really helped us to have a consistent style and approach to the material as we were working day by day. That consistent approach is apparent, especially during the entire Algiers Motel sequence in the film. It was one of the most visceral and emotionally draining things I’ve ever seen.

When I saw the film for a second time, I timed that sequence at 42 minutes. Seeing it the first time, I remember thinking that the sequence felt realtime. It felt like you were living through 42 minutes of these people’s lives. How did you approach something of that magnitude?
Billy: They shot that in sequence order, for the most part, for about three weeks. But they did shoot sections at a time that ultimately had to be mixed together. We got everything cut individually and then sat down together and decided how to work all the simultaneous action. We used the benefit of having two heads as opposed to one and talked about where things should be. What we would see and what we wouldn’t see. How to make this all feel simultaneous, but at a certain point, it’s just a feel thing.

Harry: One of the interesting challenges of this segment was that because Kathryn was shooting in realtime, and because the annex building was an actual building — it wasn’t a stage — camera people would be positioned in areas of overlapping action because Kathryn really wanted to make sure that the actors were in the moment every step of the way.

We would often finish a scene but then get new material for that scene that we could mine for better moments. Or, it might make sense to use the new coverage instead of the coverage from the day before to better show which character was where at what time. It was like having a puzzle where you would keep getting new pieces every day. It was definitely difficult, especially as the scene started to take shape. It was impossible not to feel a kind of resonance with everyday events that we were seeing on the news or on YouTube. I think it was tough to grapple with, but at the same time incredibly motivating for Billy, Kathryn and me — really everybody involved with the project — to say, “We have to get this right.” But, also, you’re adapting history. This is historical fiction; it’s not a documentary.

At the end of the film it says, “No one knows fully what happened. This has been pieced together through testimonials and interviews.”
Billy: I don’t know that I’m objective about what happened, obviously, but I did feel like I was just trying to portray the events as they occurred. And, Kathryn and Mark [Boal, Detroit’s screenwriter] did extensive research. They had police reports and ballistic reports, and this is what happened to the best of anybody’s recollection.

I tried to tell it as it happened and not bring my own feelings to it. We wanted people to experience that hallway sequence and the film, as a whole, in a way so they could draw their own conclusions. Let the audience experience it and understand this is why attention needs to be paid to this kind of violence.

Harry: Our conversations with Kathryn were critical throughout that process. She and Mark did extensive interviews with eyewitnesses. So, I think she was relying upon them for some of the factual elements, or at least what they remembered. But, I think any time where there was some ambiguity we tried to be true to that to a certain extent. We checked in with her about what to show and what not to show through that process.
As Billy said, what we didn’t want was to be manipulative for cinematic effect. The nature of the events was so tragic and so brutal that it was still a very difficult thing to go through. Even though we tried to be as measured as possible while we were putting it together, it was a tough balancing act.

What kind of prep work was involved in this for you?
Billy: A lot of movies in my career are based on true events and true stories. With the first couple, I did a tremendous amount of research, and it seemed to get me into a little bit of trouble. I would start to think, “Well, it really happened like this, or it really went like that. How come we’re not using this part of the book or that part of the book?” It took my mind away from the story we were telling. So, I’ve learned over the years to do just enough research to where I feel like I have an understanding of the subject at that time in history.

But with the specific events of the Algiers, because they’re disputed somewhat, I tried to learn as much as I could about that time in history in 1967. What was happening in the country, and how we got there. In terms of the specific events, I tried not to learn too much. I relied on Kathryn and Mark’s research to have gotten it as close as they could get it. It’s not a documentary; I was still trying to tell the story. It’s a little bit of a balancing act in these types of movies.

Harry: I agree with Billy: it’s best not to focus on research for the particular story at hand, but to understand the context. One thing that impacted our editorial process was that we received several reels of stock footage from the Michigan Film Archive. It was a lot of news footage from that time — aerials of fires, street-level shots of the National Guard stopping people, store fronts and things like that. That was really inspiring to see because it felt so real and very of the moment. This led us into an additional hunt for material that took us through YouTube and a lot of period films, including documentaries that were done either during or right after the rebellion that focused on questions of, “How did this happen?”

It was a really wonderful way to sort of deep dive into that moment. We actually ended up using some footage from those documentaries throughout the film. Not just adding original film from the archives, but using it as source material as well. It was a great way for us to sort of hear the voices and see the footage of the time versus through the distance of history.

Let’s pivot away from the film a little bit. Let’s talk about mentorship. What does it mean to you? How has both being a mentor and a mentee been beneficial for your careers?
Billy: I assisted Michael Kahn, Steven Spielberg’s editor, for four years. To say that he was my mentor is sort of short-changing it. He was like my graduate professor. Being his first assistant taught me almost everything that I needed to know about editing and how to be an editor. Obviously, he couldn’t give me talent, but he made me realize I had talent. At least he thought I did. He taught me how to handle myself politically, how to take criticism and how to approach scenes. If it wasn’t for his mentorship, I certainly wouldn’t be where I am right now.

He saw something in me that I didn’t see in myself. I’ve in turn tried to help others. Brett Reed has been with me for 17 years. He started out as my PA and has been my first assistant for about 11 years. He just got his first job as a film editor, so I’m losing him. I hope that I’ve done for him what Michael did for me.

At the end of my assistant career with Michael, he called up Phil Gersh of the Gersh Agency and said, “You know, you should sign this guy. He’s going to be a really talented editor.” He signed me when I was still an assistant. I was able to do the same thing for Brett at ICM. They signed him without him ever having cut a film. It makes me so happy that I was able to do something for somebody that worked so hard and deserved it. Brett made my editing better. He’s smart and he was able to be a bit more objective sometimes since he wasn’t the one working with the footage all day long.

The people I have working for me are really good at running the room and prepping the dailies, but I also picked them because they have a lot of creative talent and they help me. Harry touched on it earlier about me having the generosity of having other people in the room. Well, it’s a little generosity, but it’s also a lot that I value their opinions, and it makes my editing better to hear other smart, talented people’s opinions. It really is a give-and-take relationship. I don’t think that there’s ever been a more important relationship in my professional life than the editor/assistant mentorship one.

Harry: After a couple of years working here in LA, I was lucky enough to be part of a mentorship program called “Project Involve” at Film Independent. I was paired up with Stephen Mirrione. To be able to speak to someone of his level and with his dedication to the craft — and his understanding of not just the hard skills of editing but also the people skills — was an amazing introduction. It gave me a very vivid picture of the kind of things that I needed to learn in order to get to that place. And consistently through my career, I’ve been given timely, incredible advice from people that I’ve sought out to be my mentors, including Troy Takaki and Lisa Lassek and, most recently, Billy. We worked as colleagues, but he modeled it every day.

So much of what you don’t know is the soft skills. You can be a good editor in front of your Avid, or whatever system, but so much of what determines success is how you are in a room… your people skills, your work ethic.  Understanding when to speak and when not to. When is it appropriate for you to give a note? How to read the dynamic going on in a particular room. These are things that are probably as critical or more critical than whether or not you can make a good cut.

I could listen to you guys talk all day, but I want to be respectful of your time. Anything you want to leave our audience with?
Billy: I know this sounds cheesy, but I think it’s how lucky I feel getting to work with someone like Kathryn on Detroit. Or to work with some of the directors I’ve gotten to work with, and I put Kathryn at the top of that list. I can’t believe how fortunate I have been to have the career that I have.

Harry: What that speaks to in relation to Detroit is what I’ve seen consistently in the people that I’ve been mentored by, and whose careers I’ve most admired — how important it is to continue to love the craft. I find it inspiring and endlessly fascinating. What I see in people is they’re motivated by this sense that there’s always more to learn. The sequence could always be better. The scene can always be better. That’s something that I definitely saw in Billy through this process.


Chris Visser is a Wisconsin kid who works and lives in LA. He’s currently an assistant editor in scripted TV, as well as the VP of BCPCWest, the Los Angeles-based chapter of the Blue Collar Post Collective. You can find him on Twitter (@chrisvisser).

Red Giant Universe 2.2 gets 11 new transitions, supports Media Composer

Red Giant is now offering Universe 2.2, which features 11 new transition tools — 76 transitions and effects in total — for editors and motion graphics artists. In addition to brand new transitions, Red Giant has made updates to two existing plugins and added support for Avid Media Composer. The Universe toolset, and more, can be seen in action in the brand new short film Hewlogram, written and directed by Red Giant’s Aharon Rabinowitz, and starring David Hewlett from the Stargate: Atlantis series.

In the latest update to Red Giant’s collection of GPU-accelerated plugins, Universe 2.2’s transitions range from Retrograde, which creates an authentic film strip transition using real scans from 16mm and 8mm film, to Channel Surf, which creates the effect of changing channels on an old CRT TV.

This release brings the complete set of Universe tools to Avid Media Composer, which means that all 76 Red Giant Universe effects and transitions now run in eight host applications, including: Adobe Premiere Pro CC, After Effects CC, Apple Final Cut Pro X, Blackmagic DaVinci Resolve and more.

Retrograde

Brand-new transition effects in Red Giant Universe 2.2 include:
• VHS Transition: A transition that mimics the effect that occurs when a VCR has been used to record over pre-existing footage.
• Retrograde Transition: A transition that uses real scans of 16mm and 8mm film to create an authentic film strip transition.
• Carousel Transition: A transition that mimics advancing to the next slide in an old slide projector.
• Flicker Cut: A transition that rapidly cuts between two clips or a solid color, and which can invert the clips or add fades.
• Camera Shake Transition: A transition that mimics camera shake while it transitions between clips.
• Channel Surf: A transition that mimics the distortion you’d get by changing the channel on a cathode ray tube TV.
• Channel Blur: A transition that blurs each of the RGB channels separately for a unique chromatic effect.
• Linear Wipe: A classic linear wipe with the addition of wipe mirroring, as well as an inner/outer stroke with glow on the wipe border.
• Shape Wipe: A transition that uses an ellipse, rectangle or star shape to move between two pieces of footage. Includes control over points, size, stroke and fill.
• Color Mosaic: A transition that overlays a variety of colors in a mosaic pattern as it transitions between two clips.
• Clock Wipe: A classic radial wipe transition with feathering and the option for a dual clock wipe.

Updates to existing effects in Universe 2.2 include:
• VHS: This update includes new VHS noise samples, VHS style text, timecode and function icons (like play, fast-forward, rewind), updated presets, and updated defaults for better results upon application.
• Retrograde: This update includes a small but valuable addition that allows Retrograde to use the original aspect ratio of your footage for the effect.

Existing Universe customers can download the new tools directly by launching Red Giant Link. Universe is available as an annual subscription ($99/year) or as a monthly subscription ($20/month). Red Giant Universe is available in Red Giant’s Volume Program, the flexible and affordable solution for customers who need five or more floating licenses.

Raising money and awareness for childhood cancer via doc short

Pablove One Another is a documentary short film produced by Riverstreet and directed by the company’s co-founders Tracy Pion and Michael Blum. The film explores Pablove’s Shutterbug program for children undergoing cancer treatment and its connection to the cancer research work that Pablove funds.

Blum and Pion spoke with us about the project, including the release of its title track “Spark” and the importance of giving back.

How did you become involved in the project?
Pion: We have known Pablove’s founders, Jo Ann Thrailkill and Jeff Castelaz, for almost 11 years. Our sons were dear friends and classmates in preschool. When Jeff and Jo Ann lost their son Pablo to cancer eight years ago, they set out to start a foundation named Pablove in his honor. We’ve been committed to helping Pablove whenever we can along the way by doing PSAs and other short films and TV spots in order to help raise awareness for the organization’s mission, including the Shutterbugs program and research funding.

Michael Blum, Mady and Tracy Pion.

What was the initial goal of the documentary?
Blum: The goal was always about awareness and fundraising. It first debuted at the annual Pablove Foundation gala fundraiser and helped raise over $500,000 in an hour. It continues to live online and hopefully it inspires people to connect with Pablove and support its amazing programs.

Beyond the amazing cause, why was this project a good fit for Riverstreet?
Pion: At the core of what we do — campaigns, commercials, interstitials, network specials — is emotionally driven storytelling. We do development, scripting, design, animation, live-action production, editorial and completion for a variety of brands and networks, and when possible we try to apply this advertising and production expertise to philanthropic causes. Our collaboration with Pablove came out of a deeply personal connection, but above and beyond that, we think that our industry has an obligation to use our resources to help raise awareness. Why not use our power of persuasion for the betterment of others?

How did you decide on the approach and the interweaving of stories?
Blum: The film tells the Pablove story from three experiences: a young girl who is being treated for cancer who is part of Pablove’s Shutterbug photography program; an instructor with Shutterbugs who is a cancer survivor; and a researcher whose innovation is supported in part by Pablove’s grants. We thought it was important to tell the human impact of the work of the Pablove Foundation through different vantage points to reflect the scope of what they do. We worked with a fundraising expert (Benevon) who advised Pablove and Riverstreet on how to design the film from a high-impact standpoint.

What were some unexpected or unique moments in the production of the film?
Pion: Well, for us it was a couple of things. Firstly, the power of the kids’ photos really caught us, especially those by Mady, who we were featuring. When she pulled out her “Light the End of the Tunnel” image we were doubly struck by the simple power of the image and its obvious meaning for her, and, as filmmakers, we knew we had our ending. We were also grateful for how sensitive our crew was with Mady and Miles. Everyone was working for hardly any money and yet they didn’t want to be anywhere else. It was a moment of gratitude for the amazing crews that we have gathered together over the years.

What were some of the editing challenges to the above?
Pion: We had several hours of footage and some very emotional interviews with our subjects, so it was a real but familiar challenge: how to pick the most salient footage, weave the threads together and capture the emotion.

What was the documentary edited on?
Pion: We use Avid Media Composer on an ISIS server.

How did the song come to be?
Blum: While working on the film, we were looking for a music track that would effectively unite these interweaving stories. We heard a girl singing on our daughter’s phone — a classmate — and thought, wouldn’t it be great to have a young teenager’s voice on a spot that is for and about children. The Bird & The Bee’s “Spark,” paired with the luminous voice of Gracie Abrams, perfectly carries through the message of the Foundation’s impact on the lives of children through creativity and research funding. The song was written by Inara George and Greg Kurstin, and the music production was handled by composer/producer Rob Cairns, who has worked with Riverstreet on numerous projects.

Pion: At the fundraiser, people were buzzing about the song, trying to Shazam it. We loved the song, and thought it was amazing for the film, but this reaction made us stop and consider, “Is there something more we can do with it to help Pablove?” Fortunately, everyone who worked on it felt the same way, and agreed to release the track with proceeds going to Pablove Foundation.

Cabin Editing Company opens in Santa Monica focusing on editing, VFX

Cabin Editing Company has opened in Santa Monica, started by four industry veterans: managing partner Carr Schilling and award-winning editors Chan Hatcher, Graham Turner and Isaac Chen.

“We are a company of film editors with a passion for storytelling who are committed to mentoring talent and establishing lasting relationships with directors and agencies,” says Schilling, who formerly worked alongside Hatcher, Turner and Chen at NO6.

L-R: Isaac Chen, Carr Schilling, Graham Turner and Chan Hatcher.

Cabin, which also features creative director/Flame artist Verdi Sevenhuysen and editor Lucas Spaulding, will offer creative editorial, visual effects, finishing, graphics and color. The boutique’s work spans mediums across broadcast, branded content, web, film and more.

Why was now the right time to open a studio? “Everything aligned to make it possible, and at Cabin we have a collective of top creative talent where each of us brings our individual style to our projects to create great work with our clients,” says Schilling.

The boutique studio has already been busy working with agencies such as 215 McCann, BBDO, CP+B, Deutsch, GSD&M, Mekanism and Saatchi & Saatchi.

In terms of tools, Cabin uses Avid Media Composer and Autodesk Flame Premium, both connected to the Facilis TerraBlock shared storage system via Fibre Channel.

Michael Kammes’ 5 Things – Video editing software

By Randi Altman

Technologist Michael Kammes is back with a new episode of 5 Things, which focuses on simplifying film, TV and media technology. The web series answers, according to Kammes, the “five burning tech questions” people might have about technologies and workflows in the media creation space. This episode tackles professional video editing software being used (or not used) in Hollywood.

Why is now the time to address this segment of the industry? “The market for NLEs is now more crowded than it has been in over 20 years,” explains Kammes. “Not since the dawn of modern NLEs have there been this many questions over what tools should be used. In addition, the massive price drop of NLEs, coupled with the pricing shift (monthly/yearly, as opposed to outright) has created more confusion in the market.”

In his video, Kammes focuses on Avid Media Composer, Adobe Premiere, Apple Final Cut Pro, Lightworks, Blackmagic Resolve and others.

Considering its history and use on some major motion pictures (such as The Wolf of Wall Street), why hasn’t Lightworks made more strides in the Hollywood community? “I think Lightworks has had massive product development and marketing issues,” shares Kammes. “I rarely see the product pushed online, at user groups or in forums. EditShare, the parent company of Lightworks, also deals heavily in storage, so one can only assume the marketing dollars are being spent on larger ticket items like professional and enterprise storage over a desktop application.”

What about Resolve, considering its updated NLE tools and the acquisition of audio company Fairlight? Should we expect to see more Resolve being used as a traditional NLE? “I think in Hollywood, adoption will be very, very slow for creative editorial, and unless something drastic happens to Avid and Adobe, Resolve will remain in the minority. For dailies, transcodes or grading, I can see it only getting bigger, but I don’t see larger facilities adopting Resolve for creative editorial. Outside of Hollywood, I see it gaining more traction. Those outlets have more flexibility to pivot and try different tools without the locked-in TV and feature film machine in Hollywood.”

Check it out:

Jimmy Helm upped to editor at The Colonie

The Colonie, the Chicago-based editorial, visual effects and motion graphics shop, has promoted Jimmy Helm to editor. Helm has honed his craft over the past seven years, working with The Colonie’s senior editors on a wide range of projects. Most recently, he has been managing ongoing social media work with Facebook and conceptualizing and editing short format ads. Some clients he has collaborated with include Lyft, Dos Equis, Capital One, Heineken and Microsoft. He works on both Avid Media Composer and Adobe Premiere.

A filmmaking major at Columbia College Chicago, Helm applied for an internship at The Colonie in 2010. Six months later he was offered a full-time position as an assistant editor, working alongside veteran cutter Tom Pastorelle on commercials for McDonald’s, Kellogg’s, Quaker and Wrangler. During this time, Helm edited numerous projects on his own, including broadcast commercials for Centrum and Kay Jewelers.

“Tom is incredible to work with,” says Helm. “Not only is he a great editor but a great person. He shared his editorial methods and taught me the importance of bringing your instinctual creativity to the process. I feel fortunate to have had him as a mentor.”

In 2014, Helm was promoted to senior assistant editor and continued to hone his editing skills while taking on a leadership role.

“My passion for visual storytelling began when I was young,” says Helm. “Growing up in Memphis, I spent a great deal of time watching classic films by great directors. I realize now that I was doing more than watching — I was studying their techniques and, particularly, their editing styles. When you’re editing a scene, there’s something addictive about the rhythm you create and the drama you build. I love that I get to do it every day.”

Helm joins The Colonie’s editorial team, which includes Joe Clear, Keith Kristinat, Pastorelle and Brian Salazar, along with editors and partners Bob Ackerman and Brian Sepanik.


Baby Driver editors — Syncing cuts to music

By Mel Lambert

Writer/director Edgar Wright’s latest outing is a major departure from his normal offering of dark comedies. Unlike his Three Flavours Cornetto film trilogy — Shaun of the Dead, Hot Fuzz and The World’s End — and Scott Pilgrim vs. the World, TriStar Pictures’ Baby Driver has been best described as a romantic musical disguised as a car-chase thriller.

Wright’s regular pair of London-based picture editors, Paul Machliss, ACE, and Jonathan Amos, ACE, also brought a special brand of magic to the production. Machliss, who had worked with Wright on Scott Pilgrim, The World’s End and his TV series Spaced for Channel 4, recalls that, “very early on, Edgar decided that I should come along on the shoot in Atlanta to ensure that we had the material he’d already storyboarded in a series of complex animatics for the film [using animator Steve Markowski and editor Evan Schiff]. Jon Amos joined us when we returned to London for sound and picture post production, primarily handling the action sequences, at which he excels.”

Developed by Wright over the past two decades, Baby Driver tells the story of an eponymous getaway driver (Ansel Elgort), who uses earphones to drown out the “hum-in-the-drum” of tinnitus — the result of a childhood car accident — and to orchestrate his life to carefully chosen music. But now indebted to a sinister kingpin named Doc (Kevin Spacey), Baby becomes part of a seriously focused gang of bank robbers, including Buddy and Darling (Jon Hamm and Eiza González), Bats (Jamie Foxx) and Griff (Jon Bernthal). Debora, Baby’s love interest (Lily James), dreams of heading west “in a car I can’t afford, with a plan I don’t have.” Imagine, in a sense, Jim McBride’s Breathless rubbing metaphorical shoulders with Tony Scott’s True Romance.

The film also is indebted to Wright’s 2003 music video for Mint Royale’s Blue Song, during which UK comedian/actor Noel Fielding danced in a stationary getaway car. In that same vein, Baby Driver comprises a sequence of linked songs that tightly choreograph the action and underpin the dramatic arcs being played out, often keying off the songs’ lyrics.

The film’s opener, for example, features Elgort partly lipsyncing to “Bellbottoms,” by the Jon Spencer Blues Explosion, as the villains commit their first robbery. In subsequent scenes, our hero’s movements follow the opening bass riffs of The Damned’s “Neat Neat Neat,” then later to Golden Earring’s “Radar Love” before Queen’s “Brighton Rock” adds complex guitar cacophony to a key encounter scene.

Even the film’s opening titles are accompanied by Baby performing a casual coffee run in a continuous three-minute take to Bob & Earl’s “Harlem Shuffle” — a scene that reportedly took 28 takes on the first day of practical photography in Atlanta. And the percussion and horns of “Tequila” provide syncopation for a protracted gunfight. Fold in “Egyptian Reggae,” “Unsquare Dance,” and “Easy,” followed by “Debora,” and it’s easy to appreciate that Wright is using music as a key and underpinning component of this film. The director also brought in music video choreographer Ryan Heffington to achieve the timing precision he needed.

The swift action is reflected in a fast style of editing, including whip pans and crash zooms, with cuts that are tightly synchronized to the music. “Whereas the majority of Edgar’s previous TV series and films have been parodies, for Baby Driver he had a very different idea,” explains Machliss. Wright had accumulated a playlist of over 30 songs that would inspire various scenes in his script. “It’s something that’s very much a part of my previous films,” says director Wright, “and I thought of this idea of how to take that a stage further by having a character who listens to music the entire time.”

“Edgar had organized a table read of his script in the spring of 2012 in Los Angeles, at which he recorded all of the dialog,” says Machliss. “Taking that recording, some sound effects and the music tracks, I put together a 100-minute ‘radio play’ that was effectively the whole film in audio-only form that Edgar could then use as a selling tool to convince the studios that he had a viable idea. Remember, Baby Driver was a very different format for him and not what he is traditionally known for.”

An Australian native, Machliss was on set to ensure that the gunshots, lighting effects, actors and camera movements, plus car hits, all happened to the beat of the accompanying music. “We were working with music that we could not alter or speed up or slow down,” he says. “We were challenged to make sure that each sequence fit in the time frame of the song, as well as following the cadence of the music.”

Almost 95% of the music included in the first draft of Wright’s script made it into the final movie, according to Machliss. “I laid up the relevant animatic as a video layer in my Avid Media Composer and then confirmed how each take worked against the choreographed timeline. This way I always had a reference to it as we were filming. It was a very useful guide to see if we were staying on track.”

Editing On Location
During the Atlanta shoot, Machliss used Apple ProRes digital files captured by an In2Core QTake video assist that was recording taps from the production’s 35mm cameras. “I connected my Mac via Ethernet so I could create a network to the video assist’s storage. I had access to his QuickTime files the instant he stopped recording. I could use Avid’s AMA function to place the clip in the timeline without the need for transcoding. This allowed almost instantaneous feedback to Edgar as the sequence was built up.”

Paul Machliss on set.

While on location, Machliss used a 15-inch MacBook Pro, Avid Mojo DX and a JVC video monitor “which could double as a second screen for the Media Composer or show full-screen video output via the Mojo DX.” He also had a Wacom tablet, an 8TB Thunderbolt drive, a LaCie 500GB rugged drive — “which would shuttle my media between set and editorial” — and a UPS “so that I wouldn’t lose power if the supply was shut down by the sparks!”

LA’s Fotokem handled film processing, with negative scanning by Efilm. DNX files were sent to Company 3 in Atlanta for picture editorial, “where we would also review rushes in 2K sent down the line from Efilm,” says Machliss. “All DI on-lining and grading took place at Molinare in London.” Bill Pope, ASC, was the film’s director of photography.

Picture and Sound Editorial in London
Instead of hiring out editorial suites at a commercial facility in London, Wright and his post teams opted for a different approach. Like an increasing number of London-based productions, they elected to rent an entire floor in an office building.

They found a suitable spot on Berners Street, north of the Soho-based film community. As Machliss recalls: “That allowed us to have the picture editorial team in the same space as the sound crew,” which was headed up by Wright’s long-time collaborator Julian Slater, who served as sound designer, supervising sound editor and re-recording engineer on Baby Driver. “Having ready access to Julian and his team meant that we could collaborate very closely — as we had on Edgar’s other films — and share ideas on a regular basis,” as the 10-week Director’s Cut progressed.

British-born Slater then moved across Soho to Goldcrest Films for sound effects pre-dubs, while his co-mixer, Tim Cavagin, worked on dialog and Foley pre-mixes at Twickenham Studios. Print mastering of the Dolby Atmos soundtrack occurred in February 2017 at Goldcrest, with Slater handling music and SFX, while Cavagin oversaw dialog and Foley. “Following Edgar’s concept of threading together the highly choreographed songs with linking scenes, Jon and I began the cut in London against the pre-assembled material from Atlanta,” says Machliss.

To assist Machliss during his picture cut, the film’s sound designer had provided a series of audio stems for his Avid. “Julian [Slater] had been working on his sound effects and dialog elements since principal photography ended in Atlanta. He had prepared separate, color-coded left-center-right stems of the music, dialog and SFX elements he was working on. I laid these [high-quality tracks] into Media Composer so I could better appreciate the intricacies of Julian’s evolving soundtrack. It worked a lot better than a normal rough mix of production dialog, rough sound effects and guide music.”

“From its inception, this was a movie for which music and sound design worked together as a whole piece,” Slater recalls. “There is a large amount of syncopation of the diegetic sounds [implied by the film’s action] to the music track Baby is listening to. Sometimes it’s obvious because the action was filmed with that purpose in mind. For example, walking in tempo to the music track or guns being fired in tempo. But many times it’s more subtle, including police sirens or distant trains that have been pitched and timed to the music,” and hence blend into the overall musical journey. “We strived to always do this to support the story, and to never distract from it.”

Because of the lead character’s tinnitus, Slater worked with pitch changes to interweave elements of the film’s soundtrack. “Whenever Baby is not listening to music, his tinnitus is present to some degree. But it became apparent very soon in our design process that strident, high-pitched ‘whistle tones’ would not work for a sustained period of time. Working closely with composer Steven Price, we developed a varied set of methods to convey the tinnitus — it’s rarely the same sound twice. Much of the time, the tinnitus is pitched according to either the outgoing or incoming music track. This then enabled us to use more of it, yet at the same time be quite subtle.”

Meticulous Planning for Set Pieces and Car Chases
Picture editor Amos joined the project at the start of the Director’s Cut to handle the film’s set pieces. He says, “These set pieces were conceptually very different from the vast majority of action scenes in that they were literally built up around the music and then visualized. Meticulous development and planning went into these sequences before the shoot even began, which was decisive in making the action become musical. For example, the ‘Tequila’ gunfight started as a piece of music by Button Down Brass. It was then laced with gunfire and SFX pitched to the music, and in time with the drum hits — this was done at the script stage by Mark Nicholson (aka Osymyso, a UK musician/DJ), who specializes in mashup/bastard pop and breakbeat.”

Storyboards then grew around this scripted sound collage, which became a precise shot list for the filmed sequences. “Guns were rigged to go off in time with the music; it was all a very deliberate thing,” adds Amos. “Clearly, there was a lot of editing still to be done, but this approach illustrates that there’s a huge difference between something that is shot and edited to music, and something that is built around the music.”

“All the car chases for Baby Driver were meticulously planned, and either prevised or storyboarded,” Amos explains. “This ensured that the action would always fit into the time slot permitted within the music. The first car chase [against the song ‘Bellbottoms’] is divided into 13 sections, to align to different progressions in the music. One of the challenges resulted from the decision to never edit the music, which meant that none of these could overrun. Stunts were tested and filmed by second unit director Darrin Prescott, and the footage passed back to editorial to test against the timing allowed in the animatic. If a stunt couldn’t be achieved in the time allowed, it was revised and tweaked until it worked. This detailed planning gave the perfect backbone to the sequences.”

Amos worked on the sequences sequentially, “using the animatic and Paul’s on-set assembly as reference,” and began to break down all the footage into rolls that aligned to specific passages of the music. “There was a vast amount of footage for all the set pieces, and things are not always shot in order. So generally I spent a lot of time breaking the material down very methodically. I then began to make selects and started to build the sequences from scratch, section by section. Once I completed a pass, I spent some time building up my sound layers. I find this helps evolve the cut, generating another level of picture ideas that further tighten the syncopation of sound and picture.”

Amos’ biggest challenge, despite all the planning, was finding ways to condense the material into its pre-determined time slot. “The real world never moves quite like animatics and boards. We had very specific points in every track where certain actions had to take place; we called these anchor points. When working on a section, we would often work backwards from the anchor point knowing, for instance, that we only had 20 seconds to tell a particular part of the story. Initially, it can seem quite restrictive, but the edits become so precise.

Jonathan Amos

“The time restriction led to a level of kineticism and syncopation that became a defining feature of the movie. While the music may be the driving force of the action scenes, editorial choices were always rooted in the story and the characters. If you lose sight of the characters, the audience will disengage with the sequence, and you’ll lose all the tension you’ve worked so hard to create. Every shot choice was therefore very considered, and we worked incredibly hard to ensure we never wasted a frame, telling the story in the most compelling, rhythmic and entertaining way we could.”

“Once we had our cut,” Machliss summarizes, “we could return the tracks to Julian for re-conforming,” to accommodate edit changes. “It was an excellent way of working, with full-sounding edit mixes.”

Summing up his experience on Baby Driver, Machliss considers the film to be “the hardest job I’ve ever done, but the most fun I’ve ever had. Ultimately, our task was to create a film that on one level could be purely enjoyed as an exciting/dramatic piece of cinema, but, on repeated viewing, would reveal all the little elements ‘under the surface’ that interlock together — which makes the film unique. It’s a testament to Edgar’s singular vision and, in that regard, he is a tremendously exciting director to work with.”


Mel Lambert has been involved with production industries on both sides of the Atlantic for more years than he cares to remember. He is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. He is also a long-time member of the UK’s National Union of Journalists.

WWE adds iPads, iPhones to production workflow

By Nick Mattingly

Creating TV-style productions is a big operation. Lots of equipment, lots of people and lots of time. World Wrestling Entertainment (WWE) is an entertainment company and the largest professional wrestling organization in the world. Since its inception, it has amassed a global audience of over 36 million.

Each year, WWE televises over 100 events via its SmackDown, WWE Raw and Pay-Per-View events. That doesn’t include the hundreds of arena shows that the organization books in venues around the world.

“Putting this show on in one day is no small feat. Our shows typically begin load-in around 4:00am, and everything must be up and ready for production by 2:00pm,” explained Nick Smith, WWE’s director of remote IT and broadcast engineering. “We travel everything from the lighting, PA, screens, backstage sets, television production facilities, generators and satellite transmission facilities, down to catering. Everyone [on our team] knows precisely what to do and how to get it done.”

Now the WWE is experimenting with a new format for the some 300 events it hosts that are currently not captured on video. The goal? To see if using Switcher Studio with a few iPhones and iPads can achieve TV-style results. A key part of testing has been defining a workflow using mobile devices while meeting WWE’s high standard of quality. One of the first requirements was moving beyond the four-camera setup. As a result, the Switcher Studio team produced a special version of Switcher that allows unlimited sources. The only limitation is network bandwidth.

Adding more cameras was an untested challenge. To help prevent bottlenecks over the local network, we lowered the resolution and bitrate on preview video feeds. We also hardwired the primary iPad used for switching using Apple dongles. Using the “Director Mode” function in Switcher Studio, WWE then triggered a recording on all devices.
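
Lowering the preview resolution and bitrate matters because every additional wireless camera multiplies the traffic the primary iPad has to receive over the same WiFi network. As a rough sketch (the camera count, preview bitrates and usable WiFi throughput below are illustrative assumptions, not figures from WWE or Switcher Studio), a back-of-envelope calculation shows how quickly preview feeds can eat up a local network:

```python
# Back-of-envelope check of how many wireless camera previews a local WiFi
# network can carry before it saturates. All figures here are illustrative
# assumptions for this sketch, not WWE's or Switcher Studio's actual numbers.

def aggregate_preview_mbps(num_cameras: int, preview_bitrate_mbps: float) -> float:
    """Total traffic the switching iPad must receive for all preview feeds."""
    return num_cameras * preview_bitrate_mbps


def fits_on_network(num_cameras: int, preview_bitrate_mbps: float,
                    usable_wifi_mbps: float = 100.0) -> bool:
    """True if combined preview traffic stays under the usable WiFi budget.
    100 Mb/s is a conservative guess at real-world local WiFi throughput."""
    return aggregate_preview_mbps(num_cameras, preview_bitrate_mbps) < usable_wifi_mbps


if __name__ == "__main__":
    cameras = 12  # hypothetical multi-camera setup beyond the four-camera default
    for bitrate in (10.0, 4.0, 1.5):  # full-quality vs. progressively reduced previews
        total = aggregate_preview_mbps(cameras, bitrate)
        verdict = "fits" if fits_on_network(cameras, bitrate) else "risks saturating"
        print(f"{cameras} cameras @ {bitrate:>4} Mb/s previews -> {total:5.1f} Mb/s total ({verdict})")
```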

For the first test using Switcher Studio, the WWE had a director and operator at the main iPad. The video from the iPad was output to an external TV monitor using Apple’s AirPlay. This workflow allowed the director to see a live video feed from all sources. They were also able to talk with the camera crew and “direct” the operator when to cut to each camera.

The WWE crew had three camera operators from their TV productions to run iPhones in and around the ring. To ensure the devices had enough power to make it through the four-hour-long event, iPhones were attached to batteries. Meanwhile, two camera operators captured wide shots of the ring. Another camera operator captured performer entrances and crowd reaction shots.

WWE set up a local WiFi network for the event to wirelessly sync cameras. The operator made edits in realtime to generate a line cut. After the event, the line cut and an ISO from each angle were sent to the WWE post team in the United Kingdom.

Moving forward, we plan to make further improvements to the post workflow. This will be especially helpful for editors using tools like Adobe Premiere or Avid Media Composer.

If future tests prove successful, WWE could use this new mobile setup to provide more content to their fans, building new revenue streams along the way.


Nick Mattingly is the CEO/co-founder of Switcher Studio. He has over 10 years of experience in video streaming, online monetization and new technologies. 

Bluefish444 releases IngeSTore 1.1, adds edit-while-record capability

Bluefish444 was at NAB with Version 1.1 of its IngeSTore multichannel capture software, which is now available free from the Bluefish444 website. Compatible with all Bluefish444 video cards, IngeSTore captures multiple simultaneous channels of 3G/HD/SD-SDI to popular media files for archive, edit, encoding or analysis. IngeSTore improves efficiency in the digitization workflow by enabling multiple simultaneous recordings from VTRs, cameras and any other SDI source.

The new version of IngeSTore software also adds “Edit-While-Record” functionality and additional support for shared storage, including Avid. Bluefish444 has partnered with Drastic Technologies to bring additional codec options to IngeSTore v1.1, including XDCAM, DNxHD, JPEG 2000, AVCi and more. Uncompressed, DV, DVCPro and DVCPro HD codecs will be made available free to Bluefish444 customers in the IngeSTore update.

The Edit-While-Record functionality allows editors to access captured files while they are still being recorded to disk. Content creation tools such as Avid Media Composer, Adobe Premiere Pro CC and Assimilate Scratch can output SDI and HDMI with Bluefish444 video cards while IngeSTore is recording and the files are growing in size and length.