Category Archives: TV Series

Emmy Awards: OJ: Made in America composer Gary Lionelli

By Jennifer Walden

The aftermath of a tragic event plays out in front of the eyes of the nation. OJ Simpson, wanted for the gruesome murder of his ex-wife and her friend, fails to turn himself in to the authorities. News helicopters track the police pursuit that trails Simpson back to his Rockingham residence, where officers plan to take him into custody. Decades later, three-time Emmy-winning composer Gary Lionelli is presented with the opportunity to score that iconic Bronco chase.

Here, Lionelli talks about his approach to scoring ESPN’s massive documentary OJ: Made in America. His score on Part 3 is currently up for Emmy consideration for Outstanding Music Composition for a Limited Series. The entire OJ: Made in America score is available digitally through Lakeshore Records.

Gary Lionelli

Scoring OJ: Made in America seems like such a huge undertaking. It’s a five-part series, and each part is over 90 minutes long. How did you tackle this beast?
I’d never scored anything that long within such a short timeframe. Because each part was so long, it wasn’t like doing a TV series but more like scoring five 90-minute films back-to-back. I just focused on one cue at a time, putting one foot in front of the other so I wouldn’t feel overwhelmed by the full scope of the work and could relax enough to write the score! I knew I’d get to the finish line at some point, but it seemed so far away most of the time that I just didn’t want to dwell on that.

When you got this project, did they deliver it as one crazy, long piece? Or did they give it to you in its separate parts?
I got everything at once, which was totally mind-boggling. When you get any project, you need to watch it before you start working on it. For this one, it meant watching a seven-and-a-half-hour film, which was a feat in and of itself. The scale was just huge on this. Looking back, my eyelids still twitch.

It was a pretty nerve-racking time because the schedule was really tight. That was one of the most challenging parts of doing this project. I could have used a year to write this music, because five films are ordinarily what I’d do in a year, not six months. But all of us who write music for film know that you have to work within extreme deadlines as a matter of course. So you say yes, and you find a way to do it.

So you basically locked yourself up for 14 hours a day, and just plugged away at it?
Right, except it was actually about 15 hours a day, seven days a week, with no breaks. I finished the score 11 days before its theatrical release, which is insane. But, hey, that part is all in the past now, and it’s great to see the film out there getting such attention. One thing that made it worthwhile to me in the end was the quality of the filmmaking — I was riveted by the film the whole time I was working on it.

When composing, you worked only on one part at a time and not with an overall story arc in mind?
I watched all five parts over the course of four days. Once I’d watched the first two parts, I couldn’t wait to start writing so I did that for a bit and then went back to watch the rest.

The director Ezra Edelman wanted me to first score the infamous Bronco chase, which is in Part 3. It’s a 30-minute segment of that particular episode. It was a long sequence of events, all having to do with the chase itself, the events leading up to it and the aftermath of it. So that is what I scored first. It’s kind of strange to dive into a film by first scoring such a pivotal, iconic event. But it worked out — what I wrote for that segment stuck.

It was strange to be writing music for something I had seen on television 20 years before – just to think that there I was, watching the Bronco chase on TV along with everyone else, not having the remotest idea that 20 years down the line I was going to be writing music for this real-life event. It’s just a very odd thing.

The Bronco chase wasn’t a high-speed chase. It was a long police escort back to OJ’s house. The music you wrote for this segment was so brooding and it fit perfectly…
I loved when Zoey Tur, the helicopter pilot, said they were giving OJ a police motorcade. That’s exactly what he got. So I didn’t want to score the sequence by commenting literally on what was happening — what people were doing, or the fact that this was a “chase.” What I tried to do was focus on the subtext, which was the tragedy of the circumstances, and have that direct the course of the music, supplying an overarching musical commentary.

For your instrumentation, did the director let you be carried away by your own muse? Or did he request specific instruments?
He was specific about two things: one, that there would be a trumpet in the score, and two, he wanted an oboe. Other than those two instruments, it was up to me. I have a trumpet player, Jeff Bunnell, who I’ve worked with before. It’s a great partnership because he’s a gifted improviser, and sometimes he knows what I want even when I don’t. He did a fantastic job on the score.

I also had a 40-piece string section recorded at the Eastwood Scoring Stage at Warner Bros. Studios. We used players here in town and they added a lot, really bringing the score to life.

Were you conducting the orchestra? Or did you stay close to the engineer in the booth?
I wanted to be next to the recording engineer so I could hear everything as it was being recorded. I had a conductor instead. Besides, I’m a terrible conductor.

What instruments did you choose for the Bronco chase score?
For one of the scenes, I used layers of distorted electric guitars. Another element of the score was musical sound manipulation of acoustic instruments through electronics. It’s a time-consuming way to conjure up sounds, with all the trial and error involved, but the results can sometimes give a film an identity beyond what you can do with an orchestra alone.

So you recorded real instruments and then processed them? Can you share an example of your processing chain?
Sometimes I will get my guitar out and play a phrase. I’ll take that phrase and play it backwards, drop it two octaves, put it through a ring modulator, and then I’ll chop it up into short segments and use that to create a rhythmic pattern. The result is nothing like a real guitar. I don’t necessarily know what I’m going for at the start, but then I’ll end up with this cool beat. Then I’ll build a cue around that.

The original sound could be anything. I could tap a pencil on a desk and then drop that three octaves, time compress it and do all sorts of other processing. The result is a weird drum sound that no one’s ever heard before. It’s all sorts of experimentation, with the end result being a sound that has some originality and that piques the interest of the person watching the film.
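The chain Lionelli describes (reverse, drop a couple of octaves, ring-modulate, chop into a rhythmic pattern) can be roughly illustrated in a few lines of NumPy. This is a toy sketch, not his actual setup: the carrier frequency, slice length and reorder pattern below are invented for the example, and real pitch shifters are far more sophisticated than simple resampling.

```python
import numpy as np

def ring_mod(signal, sr, carrier_hz=220.0):
    """Multiply the signal by a sine carrier (classic ring modulation)."""
    t = np.arange(len(signal)) / sr
    return signal * np.sin(2 * np.pi * carrier_hz * t)

def drop_octaves(signal, octaves=2):
    """Naive pitch drop by resampling: each octave down doubles the length."""
    factor = 2 ** octaves
    idx = np.arange(0, len(signal), 1.0 / factor)  # slow the playback down
    return np.interp(idx, np.arange(len(signal)), signal)

def chop(signal, sr, slice_ms=120, pattern=(0, 2, 1, 3)):
    """Cut the sound into short slices and reorder them into a beat."""
    n = int(sr * slice_ms / 1000)
    slices = [signal[i * n:(i + 1) * n] for i in pattern]
    return np.concatenate(slices)

sr = 44100
# Stand-in for a recorded guitar phrase: one second of a 440Hz tone
phrase = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
# Reverse, drop two octaves, ring-modulate, then chop into a pattern
processed = chop(ring_mod(drop_octaves(phrase[::-1]), sr), sr)
```

The point of the sketch is the order of operations: each stage feeds the next, so the end result bears little audible resemblance to the source phrase.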

To break that down a little further, what program do you work in?
I work in Pro Tools. I went from Digital Performer to Logic, and then to Pro Tools. I think most film composers use Logic or Cubase, but there is a growing number who actually use Pro Tools. I don’t need MIDI to jump through a lot of hoops; I just need to record basic lines, because most of that stuff gets replaced by real players anyhow.

When you work in Pro Tools, it’s already the delivery format for the orchestra, so you eliminate a conversion step. I’ve been using Pro Tools for the past four years, and so far it’s been working out great. It has some limitations in MIDI, but not that many and nothing that I can’t work around.

What are some of your favorite processing plug-ins?
For pitching, I use Melodyne by Celemony and Serato’s Pitch ‘n’ Time. There’s a new pitch shifter in Pro Tools called X-Form that’s also good. I also use Waves SoundShifter — whatever seems to do a better job for what I’m working on. I always experiment to see which one works the best to give me the sound I’m looking for.

Besides pitch shifters, I use GRM Tools by INA GRM. They make weird plug-ins, like one called Shift, that really convolute sound to the point where you can take a drum or rhythmic guitar and turn it into a hi-hat sound. It doesn’t sound like a real hi-hat, it sounds like a weird one. You never know what you’re going to get from this plug-in, and that’s why I like it so much.

I also use a lot of Soundtoys plug-ins, like Crystallizer, which can really change sounds in unexpected ways. Soundtoys has great distortion plug-ins too. I’m always on the hunt for something new.

A lot of times I use hardware, like guitar pedals. It’s great to turn real knobs and get results and ideas from that. Sometimes the hardware will have a punchier sound, and maybe you can do more extreme things with it. It’s all about experimentation.

You’ve talked before about using a Guitarviol. Was that how you created the long, suspended bass notes in the Bronco chase score?
Yes, I did use the Guitarviol in that and in other places in the score, too. It’s a very weird instrument, because it looks like a cello but doesn’t sound like one, and it definitely doesn’t sound like a guitar. It has a weird, almost Middle Eastern sound to it, and that makes you want to play in that scale sometimes. Sometimes I’ll use it to write an idea, and then I’ll have my cellist play the same thing on cello.

The Guitarviol is built by Jonathan Wilson, who lives in Los Angeles. He had no idea when he invented this thing that it was going to get adopted by the film composer community here in town. But it has, and he can’t make them fast enough.

Do you end up layering the Guitarviol and the cello in the mix? Or do you just go with straight cello?
It’s usually just straight cello. There are a couple of cellists I use who are great. I don’t want to dilute their performance by having mine in the background. The Guitarviol is an inspiration to write something for the cellists to hear, and then I’ll just have them take over from there.

The overall sound of Part 3 is very brooding, and the percussion choices have complementary deep tones. Can you tell me about some of the choices you made there?
Those are all real drums. I don’t use any samples. I love playing real drums. I have a real timpani, a big Brazilian Surdo drum, a gigantic steel bass drum that sounds like a Caribbean steel drum but only two octaves lower (it has a really odd sound), and I have a classic Ludwig Beatles drum kit. I have a marimba and a collection of small percussion instruments. I use them all.

Sometimes I will pitch the recordings down to make them sound bigger. The Surdo by itself sounds huge, and when you pitch that down half an octave it’s even bigger. So I used all of those instruments and I played them. I don’t think I used a single drum sample on the entire score.
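For reference, the arithmetic behind that kind of pitch drop is simple under equal temperament: a half octave is six semitones, and each semitone scales frequency by a factor of 2^(1/12). A tiny illustrative sketch (not any particular plug-in's implementation):

```python
def pitch_ratio(semitones):
    """Frequency ratio for a shift of the given number of semitones (equal temperament)."""
    return 2.0 ** (semitones / 12.0)

# Half an octave down (-6 semitones): every frequency drops to ~70.7% of the original
ratio = pitch_ratio(-6)

# With a resampling-style shift, the audio also gets correspondingly longer
new_length_factor = 1.0 / ratio  # ~1.41x the original duration
```

That length change is why a pitched-down Surdo hit sounds "bigger": the attack and decay stretch out along with the drop in pitch, unless a time-preserving shifter is used instead.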

When you use percussion samples, you have to hunt around your entire hard drive for a great tom-tom or a Taiko drum. It’s so much easier to run over to one in your studio and just play it. You never know how it’s going to sound, depending on how you mic it that day. And it’s more inspiring to play the real thing. You get great variation: every time you hit the drum it sounds different, but a sample pretty much sounds the same every time you trigger it.

For striking, did you choose mallets, brushes, sticks, your hands, or other objects?
For the Surdo, I used my hands. I use marimba mallets and timpani mallets for the other instruments. For example, I’ll use timpani mallets for the big steel bass drum. Sometimes I’ll use timpani mallets on my drum kit’s bass drum, because it gives a different sound. It has a more orchestral sound, not like a kick drum from a rock band.

I’m always experimenting. I use brushes a lot on cymbals, and I use the brushes on the steel drum because it gives it a weird sound. You can even use brushes on the timpani, and that creates a strange sound. There are definitely no rules. Whatever you think or can imagine having an effect on the drum, you just try it out. You never know what you’ll get — it’s always good to give it a chance.

In addition to the Bronco chase scene, are there any other tracks that stood out for you in Part 3?
When you score something this long, at a certain point everything starts to run together in your mind. You don’t remember what cue belongs to what scene. But there are many that I can remember. During the jury section of that episode, I used an oboe for Johnnie Cochran speaking to the jury. That was an interesting pairing, the oboe and Johnnie Cochran. In a way, the oboe became an extension of his voice during his closing argument. I can’t really explain why it worked, but somehow it was the right match.

For the beginning of Part 3, when the police arrive because there was a phone call from Nicole Brown Simpson saying she was afraid of OJ, the cue there was very understated. It had a lot of strange, low sounds to it. That one comes to mind.

At the end of Part 3, they go to OJ’s Rockingham residence, and his lawyers had staged the setting. I did a cue there that was sort of quizzical in a way, just to show the ridiculousness of the whole thing. It was like a farce, the way they set up his residence. So I made the score take a right turn into a different area for that part. It gets away from the dark, brooding undercurrent that the rest of Part 3’s score had.

Of all the parts you could have submitted for Emmy consideration, why did you choose Part 3?
It was a toss-up between Part 2 and Part 3. Part 2 had some of the more major trumpet themes, more of the signature sound with the trumpet and the orchestra. But there were a few examples of that in Part 3, too.

I just felt the Bronco chase, score-wise, had a lot of variation to it, and that it moved in a way that was unpredictable. I ultimately thought that was the way to go, though it was a close race between Part 2 and Part 3.

I found out later that ESPN had submitted Part 3 for Emmy consideration in other categories, so there is a bit of synergy there.

—————-

Jennifer Walden is a New Jersey-based audio engineer and writer.

Barry Sonnenfeld on Netflix’s A Series of Unfortunate Events

By Iain Blair

Director/producer/showrunner Barry Sonnenfeld has a gift for combining killer visuals with off-kilter, broad and often dark comedy, as showcased in such monster hits as the Men in Black and The Addams Family franchises.

He did learn from the modern masters of black comedy, the Coen brothers, beginning his prolific career as their DP on their first feature film, Blood Simple, and then shooting such classics as Raising Arizona and Miller’s Crossing. He continued his comedy training as the DP on such films as Penny Marshall’s Big, Danny DeVito’s Throw Momma from the Train and Rob Reiner’s When Harry Met Sally.

So maybe it was just a matter of time before Sonnenfeld — whose directing credits include Get Shorty, Wild Wild West, RV and Nine Lives — gravitated toward helming the acclaimed new Netflix show A Series of Unfortunate Events, based on the beloved and best-selling “Lemony Snicket” children’s series by Daniel Handler. After all, with the series’ rat-a-tat dialogue, bizarre humor and dark comedy, it’s a perfect fit for the director’s own strengths and sensibilities.

I spoke with Sonnenfeld, who won a 2007 Primetime Emmy and a DGA Award for his directorial achievement on Pushing Daisies, about making the series, the new golden age of TV, his love of post — and the real story behind why he never directed the film version of A Series of Unfortunate Events.

Weren’t you originally set to direct the 2004 film, and you even hired Handler to write the screenplay?
That’s true. I was working with producer Scott Rudin, who had done the Addams Family films with me, and Paramount decided they needed more money, so they brought in another studio, DreamWorks. But the DreamWorks producer — who had done the Men in Black films with me — and I don’t really get along. So when they came on board, Daniel and I were let go. I’d been very involved with it for a long time. I’d already hired a crew, sets were all designed, and it was very disappointing as I loved the books.

But there’s a happy ending. You are doing the Netflix TV series, which seems much closer to the original books than the movie version. How important was finding the right tone?
The single most important job of a director is both finding and maintaining the right tone. Luckily, the tone of the books is exactly in my wheelhouse — creating worlds that are real, but also with some artifice in them, like the Men in Black and Addams Family movies, and Pushing Daisies. I tend to like things that are a bit dark, slightly quirky.

What did you think of the film version?
I thought it was slightly too big and loud, and I wanted to do something more like a children’s book, for adults.

The film version had to stuff Handler’s first three novels into a single movie, but the TV format, with its added length, must work far better for the books?
Far better, and the other great thing is that once Netflix hired me — and it was a long auditioning process — they totally committed. They take a long time finding the right material and pairing it with the right filmmaker but once they do, they really trust their judgment.

I really wanted to shoot it all on stages, so I could control everything. I didn’t want sun or rain. I wanted gloomy, overcast skies. So we shot it all in Vancouver, and Netflix totally bought into that vision. I have an amazing team — the great production designer Bo Welch, who did Men in Black and other films with me, and DP Bernard Couture.

Patrick Warburton’s deadpan delivery as Lemony Snicket, the books’ unreliable narrator, is a great move compared with having just the film’s voiceover. How early on did you make that change?
When I first met with Netflix, I told them that Lemony should be an on-screen character. That was my goal. Patrick’s just perfect for the role. He’s the sort of Rod Serling/Twilight Zone presence — only more so, as he’s involved in the actual visual style of the show.

How early on do you deal with post and CG for each episode?
Even before we’re shooting. You don’t want to wait until you lock picture to start all that work, or you’ll never finish in time. I’m directing most of it — half the first season and over a third of the second. Bo’s doing some episodes, and we bring in the directors at least a month before the shoot, which is long for TV, to do a shot list. These shows, both creatively and in terms of budget, are made in prep. There should be very few decisions being made in the shoot or surprises in post because basically every two episodes equal one book, and they’re like feature films but on one-tenth of the budget and a quarter of the schedule.

We only have 24 days to do two hours’ worth of feature film. Our goal is to make it look as good as any feature, and I think we’ve done that. So once we have sequences we’re happy with, we show them to Netflix and start post, as we have a lot of greenscreen. We do some CGI, but not as much as we expected.

Do you also post in Vancouver?
No. We began doing post there for the first season, but we discovered that with our TV budget and my feature film demands and standards, it wasn’t working out. So now we work with several post vendors in LA and San Francisco. All the editorial is in LA.

Do you like the post process?
I’ve always loved it. As Truffaut said, the day you finish filming is the worst it’ll ever be, and then in post you get to make it great again, separating the wheat from the chaff, adding all the VFX and sound. I love prep and post — especially post as it’s the least stress and you have the most time to just think. Production is really tough. Things go wrong constantly.

You used two editors?
Yes, Stuart Bass and Skip MacDonald, and each edits two episodes/one book as we go. I’m very involved, but in TV the director gets a very short time to do their cut, and I like to give notes and then leave. My problem is I’m a micro-manager, so it’s best if I leave because I drive everyone crazy! Then the showrunner — which is also me — takes over. I’m very comfortable in post, with all the editing and VFX, and I represent the whole team and end up making all the post decisions.

Where did you mix the sound?
We did all the mixing on the Sony lot with the great Paul Ottosson who won Oscars for Zero Dark Thirty and The Hurt Locker. We go way back, as he did Men in Black 3 and other shows for me, and what’s so great about him is that he both designs the sound and then also mixes.

The show uses a lot of VFX. Who did them?
We used three main houses — Shade and Digital Sandbox in LA and Tippett in San Francisco. We also used EDI, an Italian company, who came in late to do some wire removal and clean up.

How important was the DI on this and where did you do it?
We did it all at Encore LA, and the colorist on the first season was Laura Jans Fazio, who was fantastic. It’s the equivalent of a movie DI, where you do all the final color timing, and getting the right look was crucial. The DP created very good LUTs, and our rough cut was very close to where we wanted it, and then the DP and I piggy-backed sessions with the colorist. It’s a painful experience for me as it’s so slow, and like editing, I micro-manage. So I set looks for scenes and then leave.

Barry Sonnenfeld directs Joan Cusack.

Is it a golden age for TV?
Very much so. The writing’s a very high standard, and now that everyone has wide-screen TVs, there’s no more protecting the 4:3 image, which is almost square. When I began doing TV, there was no such thing as a wide shot. Executives would look at my cut, and the first thing they’d always say was, “Do you have a close-up of so and so?” Now it’s all changed. But TV is so different from movies. I look back fondly at movie schedules!

How important are the Emmys and other awards?
They’re very important for Netflix and all the new platforms. If you have critical success, then they get more subscribers, more money and then they develop more projects. And it’s great to be acknowledged by your peers.

What’s next?
I’ll finish season two and we’re hopeful about season three, which would keep us busy through fall 2018. And Vancouver’s a perfect place to be as long as you’re shooting on stage and don’t have to deal with the weather.

Will there be a fourth Men in Black?
If there is, I don’t think Will or I will be involved. I suspect there won’t be one, as it might be just too expensive to make now, with all the back-end deals for Spielberg and Amblin and so on. But I hope there’s one.

Images: Joe Lederer/Netflix


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


Is television versioning about to go IMF?

By Andy Wilson

If you’ve worked in the post production industry for the last 20 years, you’ll have seen the exponential growth of feature film versioning. What was once a language track dub, subtitled version or country specific compliance edit has grown into a versioning industry that has to feed a voracious number of territories, devices, platforms and formats — from airplane entertainment systems to iTunes deliveries.

Of course, this rise in movie versioning has been helped by the shift over the last 10 years to digital cinema and file-based working. In 2013, SMPTE ratified ST 2067-2, which created the Interoperable Master Format (IMF). IMF was designed to help manage the complexity of storing high-quality master rushes inside a file structure flexible enough to generate multiple variants of a film, by constraining what is included in the output and the desired output formats.

Like any workflow and format change, IMF has taken time to be adopted, but it is now becoming the preferred way to share high-quality file masters between media organizations. These masters are all delivered in the J2K codec to support cinema resolutions and playback technologies.

Technologists in the broadcast community have been monitoring the growth in popularity and flexibility of IMF, with its distinctive solution to the challenge of multiple versioning. Most broadcasters have moved away from tape-based playout and are instead using air-ready playout files. These are medium-sized files (50-100Mb/s), derived from high quality rushes that can be used on playout servers to create broadcast streams. The most widespread of these includes the native XDCAM file format, but it is fast being overtaken by the AS-11 format. This format has proved very popular in the United Kingdom, where all major broadcasters made a switch to AS-11 UK DPP in 2014. AS-11 is currently rolling out in the US via the AS-11 X8 and X9 variants. However, these remain air-ready playout files, output from the 600+Mb/s ProRes and RAW files used in high-end productions. AS-11 brings some uniformity, but it doesn’t solve the versioning challenge.

Versioning is rapidly becoming as big an issue for high-end broadcast content as for feature films. Broadcasters are now seeing the sales lifecycle of some of their programs running for more than 10 years. The BBC’s Planet Earth is a great example of this, with dozens of versions being made over several years. So the need to keep high-quality files for re-versioning for new broadcast and online deliveries has become increasingly important. It is crucial for long-tail sales revenue, and productions are starting to invest in higher-resolution recordings for exactly this reason.

So, as the international high-end television market continues to grow, producers are having to look at ways that they can share much higher quality assets than air-ready files. This is where IMF offers significant opportunity for efficiencies in the broadcast and wider media market and why it is something that has the attention of producers, such as the BBC and Sky. Major broadcasters such as these have been working with global partners through the Digital Production Partnership (DPP) to help develop a new specification of IMF, specifically designed for television and online mastering.

The DPP, in partnership with the North American Broadcasters Association (NABA) and the European Broadcasting Union (EBU), has been exploring the business requirements for a mastering format for broadcasting. The outcome of this work was published in June 2017.

The work explored three different user requirements: Program Acquisitions (incoming), Program Sales (outgoing) and Archive. The sales and acquisition of content can be significantly transformed with the ability to build new versions on the fly, via the Composition Playlist (CPL) and an Output Profile List (OPL). The ability to archive master rushes in a suitably high-quality package will be extremely valuable to broadcast archives. The ability to store ProRes as part of an IMF package is also being welcomed, as many broadcasters’ archives are already full of ProRes material.
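The versioning idea at the heart of IMF can be sketched abstractly: a composition playlist references shared track files rather than copying essence, so a new version of a program is just a new playlist over the same assets. The classes, fields and asset names below are illustrative only, not the SMPTE schema:

```python
from dataclasses import dataclass, field

@dataclass
class TrackFile:
    file_id: str       # one essence asset, stored once in the IMF package
    kind: str          # "video", "audio" or "subtitles"
    language: str = ""

@dataclass
class Composition:
    """Stands in for a CPL: it references track files and stores no essence itself."""
    title: str
    tracks: list = field(default_factory=list)

def build_version(title, assets, audio_language, subtitle_language=None):
    """Assemble a new version by picking assets; the video is reused, never copied."""
    tracks = [a for a in assets if a.kind == "video"]
    tracks += [a for a in assets if a.kind == "audio" and a.language == audio_language]
    if subtitle_language:
        tracks += [a for a in assets if a.kind == "subtitles" and a.language == subtitle_language]
    return Composition(title, tracks)

assets = [
    TrackFile("v1", "video"),
    TrackFile("a-en", "audio", "en"),
    TrackFile("a-fr", "audio", "fr"),
    TrackFile("s-de", "subtitles", "de"),
]
french_dub = build_version("Program (FR dub)", assets, "fr")
german_sub = build_version("Program (DE subs)", assets, "en", "de")
```

Each new territory or platform deliverable is only a small playlist plus any genuinely new components, which is what makes building versions "on the fly" economical.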

The EBU-QC group has already started to look at how to manage program quality from a broadcast IMF package, and how technical assessments can be carried out during the outputting of materials, as well as on the component assets. This work paves the way for some innovative solutions to future QC checks, whether carried out locally in the post suite or in the cloud.

The DPP will be working with SMPTE and its partners to fast track a constrained version of IMF ready for use in the broadcast and online delivery market in the first half of 2018.

As OTT video services rely heavily on the ability to output multiple different versions of the source content, this new variant of IMF could play a particularly important role in automatic content versioning and automated processes for file creation and delivery to distribution platforms — not to mention in advertising, where commercials are often re-versioned for multiple territories and states.

The DPP’s work will include the ability to add ProRes- and H.264-derived materials into the IMF package, as well as the inclusion of delivery specific metadata. The DPP are working to deliver some proof-of-concept presentations for IBC 2017 and will host manufacturer and supplier briefing days and plugfests as the work progresses on the draft version of the IMF specification. It is hoped that the work will be completed in time to have the IMF specification for broadcast and online integrated into products by NAB 2018.

It’s exciting to think about how IMF and Internet-enabled production and distribution tools will work together as part of the architecture of the future content supply chain. This supply chain will enable media companies to respond more quickly and effectively to the ever-growing and changing demands of the consumer. The DPP sees this shift to more responsive operational design as the key to success for media suppliers in the years ahead.


Andy Wilson is head of business development at DPP.


Dailies and post for IFC’s Brockmire

By Randi Altman

When the name Brockmire first entered my vocabulary, it was thanks to a very naughty and extremely funny short video that I saw on YouTube, starring Hank Azaria. It made me laugh-cry.

Fast forward about seven years and the tale of the plaid-jacket-wearing, old-school baseball play-by-play man — who discovers his beloved wife’s infidelity and melts down in an incredibly dirty and curse-fueled way on air — is picked up by IFC, in the series aptly named Brockmire. It stars Azaria, Amanda Peet and features cameos from sportscasters like Joe Buck and Tim Kurkjian.

The Sim Group was called on to provide multiple services for Brockmire: Sim provided camera rentals, Bling Digital provided dailies and workflow services, and Chainsaw provided offline editorial facilities, post finishing services, and deliverables.

We reached out to Chainsaw’s VP of business development, Michael Levy, and Bling Digital’s workflow producer, James Koon, with some questions about workflow. First up is Levy.

Michael Levy

How early did you get involved on Brockmire?
Our role with Brockmire started from the very beginning stages of the project. This was through a working relationship I had with Elizabeth Baquet, who is a production executive at Funny or Die (which produces the show).

What challenges did you have to overcome?
One of the biggest challenges was related to scaling a short to a multi-episode series and having multiple episodes in both production and in post at the same time. However, all the companies that make up Sim Group have worked on many episodic series over the years, so we were in a really good position to offer advice in terms of how to plan a workflow strategy, how to document things properly and how to coordinate getting their camera and dailies offline media from Atlanta to Post Editorial in Los Angeles.

What tools did they need for post and how involved was Chainsaw?
Chainsaw worked very hard with our Sim Group colleagues in Atlanta to provide a level of coordination that I believe made life much simpler for the Brockmire production/editorial team.

Offline editing for the series was done on our Avid Media Composer systems in cutting rooms here in the Chainsaw/Sim Group studio in Los Angeles at the Las Palmas Building.

The Avid dailies media created by Bling-Atlanta, our partner company in the Sim Group, was piped over the Internet each day to Chainsaw. When the Brockmire editorial crew walked into their cutting rooms, their offline dailies media was ready to edit with on their Avid ISIS server workspace. Whenever needed, they were also able to access their ARRI Alexa full-res dailies media that had been shipped on Bling drives from Atlanta.

Bling-Atlanta’s workflow supervisor for Brockmire, James Koon, remained fully involved and was able to supervise the pulling of any clips needed for VFX, or respond to any other dailies-related needs.

Deb Wolfe, Funny or Die’s post producer for Brockmire, also had an office here at Chainsaw. She consulted regularly with Annalise Kurinsky (Chainsaw’s in-house producer for Brockmire) and me as the production moved along, locking cuts and getting ready for post finishing.

In preparation for the finishing work, we were able to set up color tests with Chainsaw senior colorist Andy Lichtstein, who handled final color for the series in one of our FilmLight Baselight color suites. I should note that all of our Chainsaw finishing rooms were right downstairs on the second floor of the same Sim Group Las Palmas Building.

How closely did you work with Deb Wolfe?
Very closely, especially in dealing with an unexpected production problem. Co-star Amanda Peet was accidentally hit in the head by a thrown beer can (how Brockmire! as they would say in the series). We quickly called in Boyd Stepan, Chainsaw’s Senior VFX artist, and came up with a game plan to do Flame paint fixes on all of the affected Amanda Peet shots. We also provided additional VFX compositing for other planned VFX shots in several of their episodes.

What about the HD online finish?
That was done on Avid Symphony and Baselight by staff online editor Jon Pehlke, making full use of Chainsaw’s Avid/Baselight clip-based AAF workflow.

The last stop in the post process was the Chainsaw Deliverables Department, which took care of QC, any requested videotape dubs, and the creation and digital upload of specified delivery files.

James Koon

Now for James Koon…

James, what challenges did you have to overcome if any?
I would say that the biggest challenge overall with Brockmire was the timeframe. Twenty-four days to shoot eight episodes is ambitious. While in general this doesn’t pose a specific problem for dailies, the tight shooting schedule meant that certain elements of the workflow were going to need more attention. The color workflow, in particular, created a fair amount of discussion — with the tight schedule on set, the DP (Jeffrey Waldron) wanted to get his look but wasn’t going to have much time, if any, for on-set coloring. So we worked with the DP to set up looks before shooting began that could be stored in the camera and monitored on set, then applied and tweaked as needed back at the dailies lab with notes from the DP.

Episode information from set to editorial was also an important consideration, as they were shooting material from all eight episodes at once. Cross-referencing and double-checking which episode each shot belonged to was essential so editorial could quickly find what they needed.
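That kind of cross-check can be sketched in a few lines. This is a purely hypothetical illustration (the record layout and function name are assumptions, not Bling's actual tooling): the idea is simply that camera notes, script notes and audio logs should all agree on episode/scene/take before dailies ship.

```python
# Hypothetical sketch: flag clips whose episode/scene/take metadata
# disagrees across camera notes, script notes and audio logs.
# The dict layout here is illustrative, not any real lab's format.

def find_mismatches(camera_notes, script_notes, audio_logs):
    """Return (clip_id, source, problem) tuples for disagreeing clips."""
    mismatches = []
    for clip_id, cam in camera_notes.items():
        checks = (
            (script_notes.get(clip_id), "script"),
            (audio_logs.get(clip_id), "audio"),
        )
        for other, source in checks:
            if other is None:
                mismatches.append((clip_id, source, "missing"))
            elif (other["episode"], other["scene"], other["take"]) != (
                cam["episode"], cam["scene"], cam["take"]
            ):
                mismatches.append((clip_id, source, "conflict"))
    return mismatches
```

A clean report from a pass like this is what lets editorial trust that a clip labeled episode 3 really belongs to episode 3.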

Can you walk us through the workflow, and how you worked with the producers?
They shot with Arri’s Amira and Alexa Mini, monitoring with the LUTs created before production. This material was offloaded to an on-set backup and a shuttle drive — we generally use G-Tech G-RAID 4TB Thunderbolt or USB3 drives and, for local storage, a Promise Pegasus drive with a backup on our Facilis Terrablock SAN — that was sent to the lab along with camera notes and any notes from the DP and/or the DIT regarding the look for the material. Once received at the lab, we would offload the footage to our local storage and process it in the dailies software, syncing the material to the audio mixer’s recordings and logging the episode, scene and take information for every take, using camera notes, script notes and audio logs to make sure that the information was correct and consistent.

We also applied the correct LUT based on camera reports and tweaked color as needed to match cameras and make any adjustments called for by the DP’s notes. Once all of that was completed, we would render Avid media for editorial and create Internet streaming files for IFC’s Box service, as well as DVDs.

We would bring in the Avid files and organize them into bins per the editorial specs, and upload the files and bins to the editorial location in LA. These files were delivered directly to a dailies partition on their ISIS, so once editorial arrived in the morning, everything was waiting for them.

Once dailies were completed, LTO backups of the media and dailies were written as well as additional temporary backups of the source material as a safety. These final backups were completed and verified by the following morning, and editorial and production were both notified, allowing production to clear cards from the previous day if needed.
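The "completed and verified" step Koon describes boils down to comparing source material against its copy before anyone is allowed to clear a card. Here is a minimal sketch of that idea, assuming a simple mirrored folder layout; it is an illustration only, since in the actual workflow Imagine's PreRoll Post handled LTO writing and verification.

```python
# Hypothetical sketch of backup verification: hash every source file
# and compare against its mirrored copy before cards are cleared.
# Paths and layout are assumptions for illustration.
import hashlib
from pathlib import Path

def sha256(path, chunk=1024 * 1024):
    """Checksum a file incrementally so large camera files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def verify_backup(source_dir, backup_dir):
    """Return files that are missing from, or differ in, the backup."""
    problems = []
    for src in Path(source_dir).rglob("*"):
        if not src.is_file():
            continue
        dst = Path(backup_dir) / src.relative_to(source_dir)
        if not dst.is_file():
            problems.append((str(src), "missing"))
        elif sha256(src) != sha256(dst):
            problems.append((str(src), "checksum mismatch"))
    return problems
```

Only when a pass like this returns an empty list would production be told it is safe to wipe the previous day's cards.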

What tools did you use for dailies?
We used DaVinci Resolve to set original looks with the DP before the show began shooting, Colorfront Express Dailies for dailies processing, Media Composer for Avid editorial prep and bin organization and Imagine’s PreRoll Post for LTO writing and verification.


Harbor’s Bobby Johanson discusses ADR for TV and film

By Jennifer Walden

A lot of work comes in and out of the ADR department at New York City’s Harbor Picture Company. A lot.

Over the past year alone, ADR mixer Bobby Johanson has been cranking out ADR and loop group for films such as Beauty and the Beast, The Light Between Oceans, Patriots Day, The Girl on the Train, Triple 9, Hail, Caesar! and more.

His expertise goes beyond film, though. Johanson also does ADR for series, like Amazon’s Red Oaks and their upcoming The Marvelous Mrs. Maisel, and Netflix’s Master of None, which we will touch on lightly in a bit. First, let’s talk about the art of ADR.

According to Johanson, “Last week, I did full days on three different films. Some weeks we record full days, nights and weekends, depending on the season, film festivals, what’s in post, actor availability and everything else that goes on with scheduling. Some sessions will book for two hours out of a day, while another client will want eight hours because of actor availability.”

With so many projects passing through his studio, efficiency is essential, but not at the cost of a job well done. “You have an actor on the stage and the director in the room, and you have to make things efficient,” says Johanson. “You have to play lines back as they are going to be in the show. You want to play the line and hear, ‘Was that ADR?’ Instantly, it’s a whole new world. People have been burned by not so good ADR in the past, and I feel like that compromises the performance. It’s very important for the talent to feel like they’re in good hands, so they forget about the technical side and just focus on their acting.”

Johanson got his start in ADR at New York’s Sound One facility, first as a messenger running reels around, and then moving up to the machine room when there was an opening for Sound One’s new ADR stage. “We didn’t really have anyone teaching us. The job was shown to us once; then we just had to figure out how to thread the dubbers and the projector. Once we got those hung, we would sit in the ADR studio and watch. I picked up a lot of my skills old-school. I’ve learned to incorporate those techniques into current technology and that works well for us.”

Tools
Gear-wise, one staple of his ADR career has been the Soundmaster ADR control system. Johanson calls it an “old-school tool,” probably 25 years old at this point, but he hasn’t found anything faster for recording ADR. “I used it at Sound One, and I used it at Digital Cinema, and now I use it here at Harbor. Until someone can invent another ADR synchronizer, this is the best for me.”

Johanson integrates the Soundmaster system with Avid Pro Tools 12 and works as a two-man team with ADR recordist Mike Rivera. “You can’t beat the efficiency and the attention to detail that you can get with the two-man team.”

Rivera tags the takes and makes minor edits while Johanson focuses on the director and the talent. “Because we are working on a synchronizer, the ADR recordist can do things that you couldn’t do if you were just shooting straight to Pro Tools,” explains Johanson. “We can actually edit on the fly and instantly play back the line in sync. I have the time to get the reverb on it and sweeten it. I can mix the line in because I’m not cutting it or pulling it into the track. That is being done while the system is moving on the pre-roll for a playback.”

For reverb, Johanson chooses an outboard Lexicon PCM80. This puts the controls easily within reach, and he can quickly add or change the reverb on the fly, helping the clean ADR line to sync into the scene. “The reverb unit is pretty old, but it is single-handedly the easiest reverb unit that you can use. There are four room sizes, and then you can adjust the delay of the reverb four times. I have been using this reverb for so many years now that I can match any reverb from any movie or TV show because I know this unit so well.”

Another key piece of gear in his set-up is an outboard Eventide H3000 SE sampler, which Johanson uses to sample the dialogue line they need to replace and play it back over and over for the actor to re-perform. “We offer a variety of ways to do ADR, like using beeps and having the actor perform to picture, but many actors prefer an older method that goes back to ‘looping.’ Back in the day, you would just run a line over and over again and the actor would emulate it. Then we put the select take of that line to picture. It’s a method that 60 percent of our actors who come in here love to do, and I can do that using the sampler.”

He also uses the sampler for playback. By sampling background noise from the scene, he can play that under the ADR line during playback and it helps the ADR to sit in the scene. “I keep the sampler and reverb as outboard gear because I can control them quickly. I’m doing things freestyle and we don’t have to stop the session. We don’t have to stop the system and wait for a playback or wait to do a record pass. Because we are a two-man operation, I can focus on these pieces of gear while Mike is tagging the takes with their cue numbers and managing them in the Pro Tools session for delivery. I can’t find an easier or quicker way to do what I do.”

While Johanson’s set-up may lack the luster of newly minted audio tools, it’s hard to argue with results. It’s not a case of “if it’s not broke then don’t fix it,” but rather a case of “don’t mess with perfection.”

Master of None
The set-up served them well while recording ADR and loop group for Netflix’s Emmy-winning comedy series Master of None. “Kudos to production sound mixer Michael Barosky because there wasn’t too much dialogue that we needed to replace with ADR for Season 2,” says Johanson. “But we did do a lot of loop group — sweetening backgrounds and walla, and things like that.”

For the Italian episodes, they brought in bilingual actors to record Italian language loop group. One scene that stood out for Johanson was the wedding scene in Italy, where the guests start jumping into the swimming pool. “We have a nice-sized ADR stage and so that frees us up to do a lot of movement. We were directing the actors to jump in front of the mic and run by the mic, to give us the effect of people jumping into the pool. That worked quite nicely in the track.”


Netflix’s The Last Kingdom puts Foley to good use

By Jennifer Walden

What is it about long-haired dudes strapped with leather, wielding swords and riding horses alongside equally fierce female warriors charging into bloody battles? There is a magic to this bygone era that has transfixed TV audiences, as evidenced by the success of HBO’s Game of Thrones, History Channel’s Vikings series and one of my favorites, The Last Kingdom, now on Netflix.

The Last Kingdom, based on a series of historical fiction novels by Bernard Cornwell, is set in late 9th century England. It tells the tale of Saxon-born Uhtred of Bebbanburg who is captured as a child by Danish invaders and raised as one of their own. Uhtred gets tangled up in King Alfred of Wessex’s vision to unite the three separate kingdoms (Wessex, Northumbria and East Anglia) into one country called England. He helps King Alfred battle the invading Danish, but Uhtred’s real desire is to reclaim his rightful home of Bebbanburg from his duplicitous uncle.

Mahoney Audio Post
The sound of the series is gritty and rich with leather, iron and wood elements. The soundtrack’s tactile quality is the result of extensive Foley work by Mahoney Audio Post, who has been with the series since the first season. “That’s great for us because we were able to establish all the sound for each character, village, environment and more, right from the first episode,” says Foley recordist/editor/sound designer Arran Mahoney.

Mahoney Audio Post is a family-operated audio facility in Sawbridgeworth, Hertfordshire, UK. Arran Mahoney explains the studio’s family ties. “Clare Mahoney (mum) and Jason Swanscott (cousin) are our Foley artists, with over 30 years of experience working on high-end TV shows and feature films. My brother Billy Mahoney and I are the Foley recordists and editors/sound designers. Billy Mahoney, Sr. (dad) is the founder of the company and has been a dubbing mixer for over 40 years.”

Their facility, built in 2012, houses a mixing suite and two separate audio editing suites, each with Avid Pro Tools HD Native systems, Avid Artist mixing consoles and Genelec monitors. The facility also has a purpose-built soundproof Foley stage featuring 20 different surfaces including grass, gravel, marble, concrete, sand, pebbles and multiple variations of wood.

Foley artists Clare Mahoney and Jason Swanscott.

Their mic collection includes a Røde NT1-A cardioid condenser microphone and a Røde NTG3 supercardioid shotgun microphone, which they use individually for close-micing or in combination to create more distant perspectives when necessary. They also have two other studio staples: a Neumann U87 large-diaphragm condenser mic and a Sennheiser MKH-416 short shotgun mic.

Going Medieval
Over the years, the Mahoney Foley team has collected thousands of props. For The Last Kingdom specifically, they visited a medieval weapons maker and bought a whole armory of items: swords, shields, axes, daggers, spears, helmets, chainmail, armor, bridles and more. And it’s all put to good use on the series. Mahoney notes, “We cover every single thing that you see on-screen as well as everything you hear off of it.” That includes all the feet (human and horse), cloth, and practical effects like grabs, pick-ups/put-downs, and touches. They also cover the battle sequences.

Mahoney says they use 20 to 30 tracks of Foley just to create the layers of detail that the battle scenes need. Starting with the cloth pass, they cover the Saxon chainmail and the Vikings leather and fur armor. Then they do basic cloth and leather movements to cover non-warrior characters and villagers. They record a general weapons track, played at low volume, to provide a base layer of sound.

Next they cover the horses from head to hoof, with bridles and saddles, and Foley for the horses’ feet. When asked what’s the best way to Foley horse hooves, Mahoney asserts that it is indeed with coconuts. “We’ve also purchased horseshoes to add to the stable atmospheres and spot FX when required,” he explains. “We record any abnormal horse movements, e.g., crossing a drawbridge or moving across multiple surfaces, and sound designers take care of the rest. Whenever muck or gravel is needed, we buy fresh material from the local DIY stores and work it into our grids/pits on the Foley stage.”

The battle scenes also require Foley for all the grabs, hits and bodyfalls. For the blood and gore, they use a variety of fruit and animal flesh.

Then there’s a multitude of feet to cover the storm of warriors rushing at each other. All the boots they used were wrapped in leather to create an authentic sound that’s true to the time. Mahoney notes that they didn’t want to capture “too much heel in the footsteps, while also trying to get a close match to the sync sound in the event of ADR.”

Surfaces include stone and marble for the Saxon castles of King Alfred and the other noble lords. For the wooden palisades and fort walls, Mahoney says they used a large wooden base accompanied by wooden crates, plinths, boxes and an added layer of controlled creaks to give an aged effect to everything. On each series, they used 20 rolls of fresh grass, lots of hay for the stables, leaves for the forest, and water for all the sea and river scenes. “There were many nights cleaning the studio after battle sequences,” he says.

In addition to the aforementioned props of medieval weapons, grass, mud, bridles and leather, Mahoney says they used an unexpected prop: “The Viking cloth tracks were actually done with samurai suits. They gave us the weight needed to distinguish the larger size of a Danish man compared to a Saxon.”

Their favorite scenes to Foley, and by far the most challenging, were the battle scenes. “Those need so much detail and attention. It gives us a chance to shine on the soundtrack. The way that they are shot/edited can be very fast paced, which lends itself well to micro details. It’s all action, very precise and in your face,” he says. But if they had to pick one favorite scene, Mahoney says it would be “Uhtred and Ragnar storming Kjartan’s stronghold.”

Another challenging-yet-rewarding opportunity for Foley was during the slave ship scenes. Uhtred and his friend are sold into slavery as rowers on a Viking ship, which holds a crew of nearly 30 men. The Mahoney team brought the slave ship to life by building up layers of detail. “There were small wood creaks with small variations of wood and big creaks with larger variations of wood. For the big creaks, we used leather and a broomstick to work into the wood, creating a deep creak sound by twisting the three elements against each other. Then we would pitch shift or EQ to create size and weight. When you put the two together it gives detail and depth. Throw in a few tracks of rigging and pulleys for good measure and you’re halfway there,” says Mahoney.

For the sails, they used a two-mic setup to record huge canvas sheets to create a stereo wrap-around feel. For the rowing effects, they used sticks, brooms and wood rubbing, bouncing, or knocking against large wooden floors and solid boxes. They also covered all the characters’ shackles and chains.

Foley is a very effective way to draw the audience in close to a character or to help the audience feel closer to the action on-screen. For example, near the end of Season 2’s finale, a loyal subject of King Alfred has fallen out of favor. He’s eventually imprisoned and prepares to take his own life. The sound of his fingers running down the blade and the handling of his knife make the gravity of his decision palpable.

Mahoney shares another example of using Foley to draw the audience in — during the scene when Sven is eaten by Thyra’s wolves (following Uhtred and Ragnar storming Kjartan’s stronghold). “We used oranges and melons for Sven’s flesh being eaten and for the blood squirts. Then we created some tracks of cloth and leather being ripped. Specially manufactured claw props were used for the frantic, ravenous wolf feet,” he says. “All the action was off-screen so it was important for the audience to hear in detail what was going on, to give them a sense of what it would be like without actually seeing it. Also, Thyra’s reaction needed to reflect what was going on. Hopefully, we achieved that.”


The long, strange trip of Amir Bar-Lev’s new Dead doc

Deadheads take note — Long Strange Trip, director Amir Bar-Lev’s four-hour documentary on rock’s original jam band, the Grateful Dead, is now available for viewing. While the film had a theatrical release in New York and Los Angeles on May 26, the doc was made available on Amazon Video as a six-episode series.

L-R: Jack Lewars and Keith Jenson.

Encompassing the band’s rise and decades-long career, the film, executive produced by Martin Scorsese, was itself 14 years in the making. That included three months of final post at Technicolor PostWorks New York, where colorist Jack Lewars and online editor Keith Jenson worked with Bar-Lev to finalize the film’s form and look.

The documentary features scores of interviews conducted by Bar-Lev with band members and their associates, as well as a mountain of concert footage and other archival media. All that made editorial conforming complex as Jenson (using Autodesk Flame) had to keep the diverse source material organized and make it fit properly into a single timeline. “We had conversions that were made from old analog tapes, archival band footage, DPX scans from film and everything in between,” he recalls. “There was a lot of cool stuff, which was great, but it required attention to detail to ensure it came out nice and smooth.”

The process was further complicated as creative editorial was ongoing throughout post. New material was arriving constantly. “We do a lot of documentary work here, so that’s something we’re used to,” Jenson says. “We have workflows and failsafes in place for all formats and know how to translate them for the Lustre platform Jack uses. Other than the sheer amount, nothing took us by surprise.”

Lewars faced a similar challenge during grading as he was tasked with bringing consistency to material produced over a long period of time by varying means. The overall visual style, he says, recalls the band’s origins in the psychedelic culture of the 1960s. “It’s a Grateful Dead movie, so there are a lot of references to their experiments with drugs,” he explains. “Some sections have a trippy feel where the visuals go in and out of different formats. It almost gives the viewer the sense of being on acid.”

The color palette, too, has a psychedelic feel, reflecting the free-spirited essence of the band and its co-founder. “Jerry Garcia’s life, his intention and his outlook, was to have fun,” Lewars observes. “And that’s the look we embraced. It’s very saturated, very colorful and very bright. We tried to make the movie as fun as possible.”

The narrative is frequently punctuated by animated sequences where still photographs, archival media and other elements are blended together in kaleidoscopic patterns. Finalizing those sequences required a few extra steps. “For the animation sequences, we had to cut in the plates and get them to Jack to grade,” explains Jenson. “We’d then send the color-corrected plates to the VFX and animation department for treatment. They’d come back as completed elements that we’d cut into the conform.”

The documentary climaxes with the death of Garcia and its aftermath. The guitarist suffered a heart attack in 1995 after years of struggling with diabetes and drug addiction. As those events unfold, the story undergoes a mood change that is mirrored in shifts in the color treatment. “There is a four-minute animated sequence in the last reel where Jerry has just passed and they are recapping the film,” Lewars says. “Images are overlaid on top of images. We colored those plates in hyper saturation, pushing it almost to the breaking point.

“It’s a very emotional moment,” he adds. “The earlier animated sequences introduced characters and were funny. But it’s tied together at the end in a way that’s sad. It’s a whiplash effect.”

Despite the length of the project and the complexity of its parts, it came together with few bumps. “Supervising producer Stuart Macphee and his team were amazing,” says Jenson. “They were very well organized, incredibly so. With so many formats and conversions coming from various sources, it could have snowballed quickly, but with this team it was a breeze.”

Lewars concurs. Long Strange Trip is an unusual documentary both in its narrative style and its looks, and that’s what makes it fascinating for Deadheads and non-fans alike. “It’s not a typical history doc,” Lewars notes. “A lot of documentaries go with a cold, bleach-bypass look and gritty feel. This was the opposite. We were bumping the saturation in parts where it felt unnatural, but, in the end, it was completely the right thing to do. It’s like candy.”

You can binge it now on Amazon Video.

Pixelogic acquires Sony DADC NMS’ creative services unit

Pixelogic, a provider of localization and distribution services, has completed the acquisition of the creative services business unit of Sony DADC New Media Solutions, which specializes in 4K, UHD, HDR and IMF workflows for features and episodics. The move brings an expansion of Pixelogic’s significant services to the media and entertainment industry and provides additional capabilities, including experienced staff, proprietary technology and an extended footprint.

According to John Suh, co-president of Pixelogic, the acquisition “expands our team of expert media engineers and creative talent, extends our geographic reach by providing a fully established London operation and further adds to our capacity and capability within an expansive list of tools, technologies, formats and distribution solutions.”

Seth Hallen

Founded less than a year ago, Pixelogic currently employs over 240 worldwide and is led by industry veterans Suh and Rob Seidel. While the company is headquartered in Burbank, California, it has additional operations in Culver City, California, London and Cairo.

Sony DADC NMS Creative Services was under the direction of Seth Hallen, who joins Pixelogic as senior VP of business development and strategy. All Sony DADC NMS Creative Services staff, technology and operations are now part of Pixelogic. “Our business model is focused on the deep integration of localization and distribution services for movies and television products,” says Hallen. “This supply chain will require significant change in order to deliver global day and date releases with collapsed distribution windows, and by partnering closely with our customers we are setting out to innovate and help lead this change.”


FX’s Fargo features sounds as distinctive as its characters

By Jennifer Walden

In Fargo, North Dakota, in the dead of winter, there’s been a murder. You might think you’ve heard this story before, but Noah Hawley keeps coming up with a fresh, new version of it for each season of his Fargo series on FX. Sure, his inspiration was the Coen brothers’ Oscar-winning Fargo film, but with Season 3 now underway it’s obvious that Hawley’s series isn’t simply a spin-off.

Martin Lee and Kirk Lynds.

Every season of the Emmy-winning Fargo series follows a different story, with its own distinct cast of characters, set in its own specified point in time. Even the location isn’t always the same — Season 3 takes place in Minnesota. What does link the seasons together is Hawley’s distinct black humor, which oozes from these disparate small-town homicides. He’s a writer and director on the series, in addition to being the showrunner and an executive producer. “Noah is very hands-on,” confirms re-recording mixer Martin Lee at Tattersall Sound & Picture in Toronto, part of the SIM Group family of companies, who has been mixing the show with re-recording mixer Kirk Lynds since Season 2.

“Fargo has a very distinct look, feel and sound that you have to maintain,” explains Lee. “The editors, producers and Noah put a lot of work into the sound design and sound ideas while they are cutting the picture. The music is very heavily worked while they are editing the show. By the time the soundtrack gets to us there is a pretty clear path as to what they are looking for. It’s up to us to take that and flesh it out, to make it fill the 5.1 environment. That’s one of the most unique parts of the process for us.”

Season 3 follows rival brothers, Emmit and Ray Stussy (both played by Ewan McGregor). Their feud over a rare postage stamp leads to a botched robbery attempt that ultimately ends in murder (don’t worry, neither McGregor character meets his demise… yet?).

One of the most challenging episodes to mix this season, so far, was Episode 3, “The Law of Non-Contradiction.” The story plays out across four different settings, each with unique soundscapes: Minnesota, Los Angeles in 2010, Los Angeles in 1975 and an animated sci-fi realm. As police officer Gloria Burgle (Carrie Coon) unravels the homicide in Eden Valley, Minnesota, her journey leads her to Los Angeles. There the story dives into the past, to 1975, to reveal the life story of science fiction writer Thaddeus Mobley (Thomas Mann). The episode side-trips into animation land when Gloria reads Mobley’s book titled The Planet Wyh.

One sonic distinction between Los Angeles in 2010 and Los Angeles of 1975 was the density of traffic. Lee, who mixed the dialogue and music, says, “All of the scenes that were taking place in 2010 were very thick with traffic and cars. That was a technical challenge, because the recordings were very heavy with traffic.”

Another distinction is the pervasiveness of technology in social situations, like the bar scene where Gloria meets up with a local Los Angeles cop to talk about her stolen luggage. The patrons are all glued to their cell phones. As the camera pans down the bar, you hear different sounds of texting playing over a contemporary, techno dance track. “They wanted to have those sounds playing, but not become intrusive. They wanted to establish with sound that people are always tapping away on their phones. It was important to get those sounds to play through subtly,” explains Lynds.

In the animated sequences, Gloria’s voice narrates the story of a small android named MNSKY whose spaceman companion dies just before they reach Earth. The robot carries on the mission and records an eon’s worth of data on Earth. The robot is eventually reunited with members of The Federation of United Planets, who cull the android’s data and then order it to shut down. “Because it was this animated sci-fi story, we wanted to really fill the room with the environment much more so than we can when we are dealing with production sound,” says Lee. “As this little robotic character is moving through time on Earth, you see something like the history of man. There’s voiceover, sound effects and music through all of it. It required a lot of finesse to maintain all of those elements with the right kind of energy.”

The animation begins with a spaceship crashing into the moon. MNSKY wakes and approaches the injured spaceman who tells the android he’s going to die. Lee needed to create a vocal process for the spaceman, to make it sound as though his voice is coming through his helmet. With Audio Ease’s Altiverb, Lee tweaked the settings on a “long plastic tube” convolution reverb. Then he layered that processed vocal with the clean vocal. “It was just enough to create that sense of a helmet,” he says.

At the end, when MNSKY rejoins the members of the Federation on their spaceship it’s a very different environment from Earth. The large, ethereal space is awash in long, warm reverbs which Lynds applied using plug-ins like PhoenixVerb 5.1 and Altiverb. Lee also applied a long reverb treatment to the dialogue. “The reverbs have quite a significant pre-delay, so you almost have that sense of a repeat of the voice afterwards. This gives it a very distinctive, environmental feel.”
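The pre-delay effect Lee describes can be illustrated with a toy convolution: offsetting the reverb tail by a significant pre-delay leaves the dry voice intact, followed by a gap and then a distinct echo, which is what reads as "a repeat of the voice." This is only a conceptual sketch; the function and signals below are hypothetical, not the Altiverb or PhoenixVerb processing used on the mix.

```python
# Toy illustration of pre-delay: a reverb impulse response offset by
# silence makes the tail arrive as a distinct, separated echo.
import numpy as np

def reverb_with_predelay(dry, impulse_tail, predelay_samples):
    """Mix a dry signal with a reverb tail delayed by a pre-delay."""
    # Prepend silence to the impulse response to create the pre-delay gap.
    ir = np.concatenate([np.zeros(predelay_samples), impulse_tail])
    wet = np.convolve(dry, ir)
    out = np.zeros(len(wet))
    out[: len(dry)] += dry  # dry voice, untouched
    out += wet              # delayed reverb tail follows
    return out
```

Feeding a single impulse through this, the output shows the dry click first, then silence for the pre-delay length, then the tail: exactly the "repeat" character Lee points to.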

Lynds and Lee spend two days premixing their material on separate dub stages. For the premix, Lynds typically has all the necessary tracks from supervising sound editor Nick Forshager while Lee’s dialogue and music tracks come in more piecemeal. “I get about half the production dialogue on day one and then I get the other half on day two,” says Lee. “ADR dribbles in the whole time, including well into the mixing process. ADR comes in even after we have had several playbacks already.”

Fortunately, the show doesn’t rely heavily on ADR. Lee notes that they put a lot of effort into preserving the production. “We use a combination of techniques. The editors find the cleanest lines and takes (while still keeping the performance), then I spend a lot of time cleaning them up,” he says.

This season Lee relies more on Cedar’s DNS One plug-in for noise reduction and less on the iZotope RX5 (Connect version). “I’m finding with Fargo that the showrunners are uniquely sensitive to the effects of the iZotope processing. This year it took more work to find the right sound. It ends up being a combination of both the Cedar and the RX5,” reports Lee.

After premixing, Lee and Lynds bring their tracks together on Tattersall’s Stage 1. They have three days for the 5.1 final mix. They spend one (very) long day building the episode in 5.1 and then send their mix to Los Angeles for Forshager and co-producer Gregg Tilson to review. Lee and Lynds address the first round of notes the next morning and send the mix back to Los Angeles for another playback. Each successive playback goes out to a wider group, with the final playback, for Hawley, on the third day.

“One of the big challenges with the workflow is mixing an episode in one day. It’s a long mix day. At least the different time zones help. We send them a mix to listen to typically around 6-7pm PST, so it’s not super late for them. We start at 8am EST the next morning, which is three hours ahead of their time. By the time they’re in the studio and ready to listen, it is 10am their time and we’ve already spent three or four hours handling the revisions. That really works to our advantage,” says Lee.

Sound in the Fargo series is not an afterthought. It’s used to build tension, like a desk bell that rings for an uncomfortably long time, or to set the mood of a space, like an overly noisy fish tank in a cheap apartment. By the time the tracks have made it to the mixers, there’s been “a lot of time and effort spent thinking about what the show was going to sound like,” says Lynds. “From that sense, the entire mix for us is a creative opportunity. It’s our chance to re-create that in a 5.1 environment, and to make that bigger and better.”

You can catch new episodes of Fargo on FX Networks, Wednesdays at 10pm EST.


Jennifer Walden is a New Jersey-based audio engineer and writer.

Assistant Editors’ Bootcamp coming to Burbank in June

The new Assistant Editors’ Bootcamp, founded by assistant/lead editors Noah Chamow (The Voice) and Conor Burke (America’s Got Talent), is a place for assistant editors and aspiring assistants to learn and collaborate with one another in a low-stakes environment. The next Assistant Editors’ Bootcamp classes will be held on June 10-11, along with a Lead Assistant Editors’ class geared toward understanding troubleshooting and system performance on June 24-25. All classes, sponsored by AlphaDogs’ Editor’s Lounge, will be held at Skye Rentals in Burbank.

The classes will cover such topics as The Fundamentals of Video, Media Management, Understanding I/O and Drive Speed, Prepping Footage for Edit, What’s New in Media Composer, Understanding System Performance Bottlenecks and more. Cost is $199 for two days for the Assistant Editor class, and $299 for two days for the Lead Assistant Editor class. Space is on a first-come, first-served basis and is limited to 25 participants per course. You can register here.

A system with Media Composer 8.6 or later and an external hard drive is required to take the class (a 30-day Avid trial is available). 8GB of system memory and Windows 7/OS X 10.9 or later are needed to run Media Composer 8.6. Computer rentals are available for as little as $54 a week from Hi-Tech Computer Rental in Burbank.

Chamow and Burke came up with the idea for Assistant Editors’ Bootcamp when they realized how challenging it is to gain any real on-the-job experience in today’s workplace. With the focus primarily on doing things faster and more efficiently, it’s almost impossible to find the time to figure out why one method is faster than another. Having worked extensively in reality television and created “The Super Grouper,” a multi-grouping macro for Avid that is now widely used in reality post workflows, Chamow understands first-hand the landscape of the assistant editor’s world. “One of the most difficult things about working in the entertainment industry, especially in a technical position, is that there is never time to learn,” he says. “I’m very passionate about education and hope that by hosting these classes I can help other assistants hone their skills, as well as help those who are new to the business get the experience they need.”

Having worked as both an assistant editor and lead assistant editor, Burke has created workflows and overseen post for up to 10 projects at a time, before moving into his current position at NBC’s America’s Got Talent. “In my years of experience working on grueling deadlines, I completely understand how difficult the job of an assistant editor can be, having little or no time to learn anything other than what’s right in front of you,” he says. “In teaching this class, I hope to make peers feel more confident and have a better understanding of their work, taking them to the next level in their careers.”

Main Image (L-R): Noah Chamow and Conor Burke.