

Showtime’s Homeland: Producer/director Lesli Linka Glatter

By Iain Blair

Since premiering in 2011, the provocative, edgy and timely spy thriller Homeland has been a huge hit with audiences and critics alike. It has also racked up dozens of awards, including Primetime Emmys and Golden Globes.

The show, which features an impressive cast led by Claire Danes and Mandy Patinkin, is Showtime’s number one drama series. It is produced by Fox 21 Television Studios and was developed for American television by Alex Gansa and Howard Gordon. Homeland is based on the Israeli series Prisoners of War from Gideon Raff.

Lesli Linka Glatter

Producer Lesli Linka Glatter is an award-winning director of film and episodic dramas. Her TV work includes The Newsroom, The Walking Dead, Justified, Ray Donovan, Masters of Sex, Nashville, True Blood, Mad Men, The Good Wife, House, The West Wing, NYPD Blue, ER and Freaks and Geeks, just to name a few. Most recently, she directed the first two episodes of Dick Wolf’s limited series Law & Order: True Crime — The Menendez Murders for NBC.

Glatter was nominated for a fifth Emmy for directing the Homeland episode “America First,” and in 2015 and 2016 she was also among the producers acknowledged when Homeland received back-to-back Emmy nominations for Best Drama. 

Glatter began her directing career through American Film Institute’s Directing Workshop for Women, and her short film Tales of the Meeting and Parting was nominated for an Academy Award. Her first series was Amazing Stories, followed by Twin Peaks, for which she received her first Directors Guild Award nomination. She made her feature film directorial debut with Now and Then, followed by The Proposition. For HBO she directed State of Emergency, Into the Homeland and The Promise.

To say her career has been prolific is an understatement. I recently spoke with Glatter about making Homeland, the Emmys, her love of post and mentoring other women.

Have you started Season 8?
Not yet. We’re not even prepping yet since we just finished Season 7. The first thing that happens is the writers, myself, Claire, Mandy and the DP go to DC to meet with the intelligence community, and what we find out from talking to these people then becomes the next season.

Is it definitely the final one?
I think that’s still unclear. It might go on.

Do you like being a showrunner?
I love it. As a producing director I love being involved with the whole novel, the whole big picture of the season, as well as the individual chapters. There’s an overall look and feel and tone to each season, and I also get to direct four of the 12 episodes. We have other amazing directors who come in, and that creates energy and brings in a different point of view, yet it fits into the whole, overall storyline and feel of the season. We have this wonderful working environment on the show where the best idea wins, so it’s very creative. Then every year we reinvent the wheel, with a new look and feel for the show.

What are the big challenges of showrunning?
A complex show like this is filled with all sorts of challenges and joys, in equal parts. Obviously, everything starts with the material and the script, then I have my partners in crime — Claire and Mandy — who’re so creative and collaborative. The big challenge is that we try to make each season new and fresh. People might look at one of Season 7’s shows and think we have it all dialed in with the same sets, the same crew in place and so on, but we’re always going to a new place with a new crew and new sets, and we shoot for 11 days, but nine of those are usually on location, so we have very few on stage. In terms of logistics, that is really challenging. Every episode’s different, but that’s generally how it works. Then we’re exploring very relevant and timely issues. We just dealt with “a nation divided” and Russian meddling, and these are things that everyone’s talking about right now.

As mentioned, you direct a lot of shows. Do you prefer doing that?
It’s more that I see myself as a director first and foremost, although I love showrunning and producing as well. I want to be the producer that every director would love to have, since I try to give them whatever they need to tell their best stories. I have a great line producer/partner named Michael Klick. He’s the magic man who makes it all happen. The key in TV is to have great partners, and our core creative team — DPs David Klein and Giorgio Scali, our editors, production designers, costume designers — are all so talented. You want the smartest team you can get, and then let the best idea win, and we always aim for a very cinematic look.

Where do you post?
We did all the editing on the Fox lot and all the sound mixing at Universal. Encore does the VFX.

Do you like the post process?
I love it. It’s where it all comes together, and you get to look at everything you’ve done and re-shape it and make it the best it can be. Along with everyone else, I have my idea of what each episode will be, and then we have our editing team and they bring all their ideas to it, so it’s very exciting to watch it evolve.

Talk about editing. You have several editors, I assume because of the time factor. How does that work?
We have three editors — Jordan Goldman, Harvey Rosenstock and Philip Carr Neel — because of the tight schedule, and they each handle different episodes and focus solely on those… unless we run into a problem.

You have a big cast and a lot of stuff going on in each episode. What are the big editing challenges?
Telling the best possible story and staying true to the theme and subtext and intent of that story. The show really lives in shades of gray with a lot of ambiguity. A classic Homeland scene will feature two characters on completely opposing sides of an issue, and they’re both right and both wrong. So maybe that makes you think more about that issue and question your beliefs, and I love that about the show.

This show has a great score by Sean Callery, as well as great sound design. Can you talk about the importance of sound and music?
Sean’s an amazing storyteller and brilliant at what he does, as the show has a huge amount of anxiety in it, and he captures that and helps amplify it — but without making it obvious. He’s been on the show since the start, and we’ve also worked with the same sound team for a long time, and sound design’s such a key element in our show. We spend a lot of time on all the little details that you may not notice in a scene.

How important are the Emmys to you and a show like this?
You can’t ever think about awards while you’re working. You just focus on trying to tell the best possible story, but in this golden age of TV it’s great to be recognized by your peers. It’s huge!

There’s been a lot of talk about lack of opportunity for women in movies. Are things better in TV?
Things are changing and improving. I’ve been involved with mentoring women directors for many, many years, and I hope we soon get to a point where gender is no longer an issue. If you’d asked me back when I began directing over 20 years ago if we’d still be discussing all this today, I’d have said, “Absolutely not!” But here we still are. The truth is, showrunning and directing are hard and challenging jobs, but women should have the same opportunities as men. Simple as that.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and The Boston Globe.

Luke Scott to run newly created Ridley Scott Creative Group

Filmmaker Ridley Scott has brought all of his RSA Films-affiliated companies together in a multi-business restructure to form the Ridley Scott Creative Group. The group aims to strengthen the network across its related companies to take advantage of emerging opportunities across all entertainment genres, as well as the companies’ existing work in film, television, branded entertainment, commercials, VR, short films, documentaries, music video, design and animation, and photography.

Ridley Scott

Luke Scott will assume the role of global CEO, working with founder Ridley Scott and partners Jake and Jordan Scott to oversee the future strategic direction of the newly formed group.

“We are in a new golden age of entertainment,” says Ridley Scott. “The world’s greatest brands, platforms, agencies, new entertainment players and studios are investing hugely in entertainment. We have brought together our talent, capabilities and creative resources under the Ridley Scott Creative Group, and I look forward to maximizing the creative opportunities we now see unfolding with our executive team.”

The companies that make up the RSCG will continue to operate autonomously but will now offer clients synergy under the group offering.

The group includes commercial production company RSA Films, which produced such ads as Apple’s 1984, Budweiser’s Super Bowl favorite Lost Dog and, more recently, Adidas Originals’ Original is Never Finished campaign, as well as branded content for Johnnie Walker, HBO, Jaguar, Ford, Nike and the BMW Films series; the music video production company founded by Jake Scott, Black Dog Films (Justin Timberlake, Maroon 5, Nicki Minaj, Beyoncé, Coldplay, Björk and Radiohead); the entertainment marketing company 3AM; commercial production company Hey Wonderful, founded by Michael Di Girolamo; newly founded UK commercial production company Darling Films; and film and television production company Scott Free (Gladiator, Taboo, The Martian, The Good Wife), which continues to be led by David W. Zucker, president, US television; Kevin J. Walsh, president, US film; and Ed Rubin, managing director, UK television/film.

“Our Scott Free Films and Television divisions have an unprecedented number of movies and shows in production,” reports Luke Scott. “We are also seeing a huge appetite for branded entertainment from our brand and agency partners to run alongside high-quality commercials. Our entertainment marketing division 3AM is extending its capabilities to all our partners, while Black Dog is moving into short films and breaking new, world-class talent. It is a very exciting time to be working in entertainment.”



Netflix’s Lost in Space: New sounds for a classic series

By Jennifer Walden

Netflix’s Lost in Space series, a remake of the 1965 television show, is a playground for sound. In the first two episodes alone, the series introduces at least five unique environments, including an alien planet, a whole world of new tech — from wristband communication systems to medical analysis devices — new modes of transportation, an organic-based robot lifeform and its correlating technologies, a massive explosion in space and so much more.

It was a mission not easily undertaken, but if anyone could manage it, it was four-time Emmy Award-winning supervising sound editor Benjamin Cook of 424 Post in Culver City. He’s led the sound teams on series like Starz’s Black Sails, Counterpart and Magic City, as well as HBO’s The Pacific, Rome and Deadwood, to name a few.

Benjamin Cook

Lost in Space was a reunion of sorts for members of the Black Sails post sound team. Making the jump from pirate ships to spaceships were sound effects editors Jeffrey Pitts, Shaughnessy Hare, Charles Maynes, Hector Gika and Trevor Metz; Foley artists Jeffrey Wilhoit and Dylan Tuomy-Wilhoit; Foley mixer Brett Voss; and re-recording mixers Onnalee Blank and Mathew Waters.

“I really enjoyed the crew on Lost in Space. I had great editors and mixers — really super-creative, top-notch people,” says Cook, who also had help from co-supervising sound editor Branden Spencer. “Sound effects-wise there was an enormous amount of elements to create and record. Everyone involved contributed. You’re establishing a lot of sounds in those first two episodes that are carried on throughout the rest of the season.”

Soundscapes
So where does one begin on such a sound-intensive show? The initial focus was on the soundscapes, such as the sound of the alien planet’s different biomes, and the sound of different areas on the ships. “Before I saw any visuals, the showrunners wanted me to send them some ‘alien planet sounds,’ but there is a huge difference between Mars and Dagobah,” explains Cook. “After talking with them for a bit, we narrowed down some areas to focus on, like the glacier, the badlands and the forest area.”

For the forest area, Cook began by finding interesting snippets of animal, bird and insect recordings, like a single chirp or little song phrase that he could treat with pitching or other processing to create something new. Then he took those new sounds and positioned them in the sound field to build up beds of creatures to populate the alien forest. In that initial creation phase, Cook designed several tracks, which he could use for the rest of the season. “The show itself was shot in Canada, so that was one of the things they were fighting against — the showrunners were pretty conscious of not making the crash planet sound too Earthly. They really wanted it to sound alien.”
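Purely as an illustration of the pitch-and-scatter idea Cook describes (and nothing to do with his actual tools), the technique can be sketched in a few lines of Python: resample a short call to shift its pitch, then drop pitched copies at random offsets to build up a bed of "creatures."

```python
import numpy as np

def pitch_shift(snippet: np.ndarray, semitones: float) -> np.ndarray:
    """Crude pitch shift by resampling: positive semitones raise the pitch
    (and shorten the clip), negative lower it."""
    factor = 2 ** (semitones / 12.0)  # playback-speed ratio
    n_out = int(len(snippet) / factor)
    # linear interpolation onto the new time grid
    src_idx = np.linspace(0, len(snippet) - 1, n_out)
    return np.interp(src_idx, np.arange(len(snippet)), snippet)

def layer_bed(snippets, shifts, length):
    """Scatter pitched copies of short animal/bird calls across a bed."""
    rng = np.random.default_rng(0)
    bed = np.zeros(length)
    for snip, st in zip(snippets, shifts):
        voiced = pitch_shift(snip, st)
        start = rng.integers(0, length - len(voiced))
        bed[start:start + len(voiced)] += voiced
    return bed
```

A real sound editor would of course do this with far more control in a DAW or sampler; the sketch only shows why a single chirp, pitched and layered, stops sounding like any Earthly animal.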

Another huge aspect of the series’ sound is the communication systems. The characters talk to each other through the headsets in their spacesuit helmets, and through wristband communications. Each family has its own personal ship, called a Jupiter, which can contact other Jupiter ships through shortwave radios. They use the same radios to communicate with their all-terrain vehicles, called rovers. Cook notes these ham radios had an intentional retro feel. The Jupiters can send/receive long-distance transmissions from the planet’s surface to the main ship, called Resolute, in space. The families can also communicate with their own Jupiter’s onboard systems.

Each mode of communication sounds different and was handled differently in post. Some processing was handled by the re-recording mixers, and some was created by the sound editorial team. For example, in Episode 1 Judy Robinson (Taylor Russell) is frozen underwater in a glacial lake. Whenever the shot cuts to Judy’s face inside her helmet, the sound is very close and claustrophobic.

Judy’s voice bounces off the helmet’s face-shield. She hears her sister through the headset and it’s a small, slightly futzed speaker sound. The processing on both Judy’s voice and her sister’s voice sounds very distinct, yet natural. “That was all Onnalee Blank and Mathew Waters,” says Cook. “They mixed this show, and they both bring so much to the table creatively. They’ll do additional futzing and treatments, like on the helmets. That was something that Onna wanted to do, to make it really sound like an ‘inside a helmet’ sound. It has that special quality to it.”

On the flipside, the ship’s voice was a process that Cook created. Co-supervisor Spencer recorded the voice actor’s lines in ADR and then Cook added vocoding, EQ futz and reverb to sell the idea that the voice was coming through the ship’s speakers. “Sometimes we worldized the lines by playing them through a speaker and recording them. I really tried to avoid too much reverb or heavy futzing knowing that on the stage the mixers may do additional processing,” he says.

In Episode 1, Will Robinson (Maxwell Jenkins) finds himself alone in the forest. He tries to call his father, John Robinson (Toby Stephens — a Black Sails alumnus as well), via his wristband comm system, but the transmission is interrupted by a strange, undulating, vocal-like sound. It’s interference from an alien ship that had crashed nearby. Cook notes that the interference sound required thorough experimentation. “That was a difficult one. The showrunners wanted something organic and very eerie, but it also needed to be jarring. We did quite a few versions of that.”

For the main element in that sound, Cook chose whale sounds for their innate pitchy quality. He manipulated and processed the whale recordings using Symbolic Sound’s Kyma sound design workstation.

The Robot
Another challenging set of sounds were those created for Will Robinson’s Robot (Brian Steele). The Robot makes dying sounds, movement sounds and face-light sounds when it’s processing information. It can transform its body to look more human. It can use its hands to fire energy blasts or as a tool to create heat. It says, “Danger, Will Robinson,” and “Danger, Dr. Smith.” The Robot is sometimes a good guy and sometimes a bad guy, and the sound needed to cover all of that. “The Robot was a job in itself,” says Cook. “One thing we had to do was to sell emotion, especially for his dying sounds and his interactions with Will and the family.”

One of Cook’s trickiest feats was to create the proper sense of weight and movement for the Robot, and to portray the idea that the Robot was alive and organic but still metallic. “It couldn’t be earthly technology. Traditionally for robot movement you will hear people use servo sounds, but I didn’t want to use any kind of servos. So, we had to create a sound with a similar aesthetic to a servo,” says Cook. He turned to the Robot’s Foley sounds, and devised a processing chain to heavily treat those movement tracks. “That generated the basic body movement for the Robot and then we sweetened its feet with heavier sound effects, like heavy metal clanking and deeper impact booms. We had a lot of textures for the different surfaces like rock and foliage that we used for its feet.”

The Robot’s face lights change color to let everyone know if it’s in good-mode or bad-mode. But there isn’t any overt sound to emphasize the lights as they move and change. If the camera is extremely close-up on the lights, then there’s a faint chiming or tinkling sound that accentuates their movement. Overall though, there is a “presence” sound for the Robot, an undulating tone that’s reminiscent of purring when it’s in good-mode. “The showrunners wanted a kind of purring sound, so I used my cat purring as one of the building block elements for that,” says Cook. When the Robot is in bad-mode, the sound is anxious, like a pulsing heartbeat, to set the audience on edge.

It wouldn’t be Lost in Space without the Robot’s iconic line, “Danger, Will Robinson.” Initially, the showrunners wanted that line to sound as close to the original 1960s delivery as possible. “But then they wanted it to sound unique too,” says Cook. “One comment was that they wanted it to sound like the Robot had metallic vocal cords. So we had to figure out ways to incorporate that into the treatment.” The vocal processing chain used several tools, from EQ, pitching and filtering to modulation plug-ins like Waves Morphoder and Dehumaniser by Krotos. “It was an extensive chain. It wasn’t just one particular tool; there were several of them,” he notes.
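Cook doesn’t spell out the exact chain, but ring modulation is a time-honored way to get a “metallic vocal cords” color out of a voice. The sketch below is purely illustrative — a single-stage toy, not the show’s actual Morphoder/Dehumaniser stack:

```python
import numpy as np

SR = 48000  # assumed sample rate

def metallic_voice(voice: np.ndarray, carrier_hz: float = 90.0) -> np.ndarray:
    """Ring-modulate a mono voice track against a low sine carrier,
    a classic trick for robotic, metallic-sounding speech."""
    t = np.arange(len(voice)) / SR
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    wet = voice * carrier
    # blend some dry signal back in so the words stay intelligible
    return 0.6 * wet + 0.4 * voice
```

In practice this would sit alongside EQ, pitching and formant processing, as Cook describes; on its own it just demonstrates the sideband-heavy, bell-like timbre that reads as “metal.”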

There are other sound elements that tie into the original 1960s series. For example, when Maureen Robinson (Molly Parker) and husband John are exploring the wreckage of the alien ship, they discover a virtual map room that lets them see into the solar system where they’ve crashed and into the galaxy beyond. The sound design during that sequence features sound material from the original show. “We treated and processed those original elements until they’re virtually unrecognizable, but they’re in there. We tried to pay tribute to the original when we could, when it was possible,” says Cook.

Other sound highlights include the Resolute exploding in space, which caused massive sections of the ship to break apart and collide. For that, Cook says contact microphones were used to capture the sound of tin cans being ripped apart. “There were so many fun things in the show for sound. From the first episode with the ship crash and it sinking into the glacier to the black hole sequence and the Robot fight in the season finale. The show had a lot of different challenges and a lot of opportunities for sound.”

Lost in Space was mixed in the Anthony Quinn Theater at Sony Pictures in 7.1 surround. Interestingly, the show was delivered in Dolby’s Home Atmos format. Cook explains, “When they booked the stage, the producers weren’t sure if we were going to do the show in Atmos or not. That was something they decided to do later, so we had to figure out a way to do it.”

They mixed the show in Atmos while referencing the 7.1 mix and then played those mixes back in a Dolby Home Atmos room to check them, making any necessary adjustments and creating the Atmos deliverables. “Between updates for visual effects and music as well as the Atmos mixes, we spent roughly 80 days on the dub stage for the 10 episodes,” concludes Cook.


JoJo Whilden/Hulu

Color and audio post for Hulu’s The Looming Tower

Hulu’s limited series, The Looming Tower, explores the rivalries and missed opportunities that beset US law enforcement and intelligence communities in the lead-up to the 9/11 attacks. Based on the Pulitzer Prize-winning book by Lawrence Wright, who also shares credit as executive producer with Dan Futterman and Alex Gibney, the show’s 10 episodes paint an absorbing, if troubling, portrait of the rise of Osama bin Laden and al-Qaida, and offer fresh insight into the complex people who were at the center of the fight against terrorism.

For The Looming Tower’s sound and picture post team, the show’s sensitive subject matter and blend of dramatizations and archival media posed significant technical and creative challenges. Colorist Jack Lewars and online editor Jeff Cornell of Technicolor PostWorks New York were tasked with integrating grainy, run-and-gun news footage dating back to 1998 with crisply shot, high-resolution original cinematography. Supervising sound designer/effects mixer Ruy García and re-recording mixer Martin Czembor from PostWorks, along with a Foley team from Alchemy Post Sound, were charged with helping to bring disparate environments and action to life, but without sensationalizing or straying from historical accuracy.

L-R: colorist Jack Lewars and editor Jeff Cornell

Lewars and Cornell mastered the series in Dolby Vision HDR, working from the production’s camera original 2K and 3.4K ArriRaw files. Most of the color grading and conforming work was done with a light touch, according to Lewars, as the objective was to adhere to a look that appeared real and unadulterated. The goal was for viewers to feel they are behind the scenes, watching events as they happened.

Where more specific grades were applied, it was done to support the narrative. “We developed different look sets for the FBI and CIA headquarters, so people weren’t confused about where we were,” Lewars explains. “The CIA was working out of the basement floors of a building, so it’s dark and cool — the light is generated by fluorescent fixtures in the room. The FBI is in an older office building — its drop ceiling also has fluorescent lighting, but there is a lot of exterior light, so it’s greener, warmer.”

The show adds to the sense of realism by mixing actual news footage and other archival media with dramatic recreations of those same events. Lewars and Cornell help to cement the effect by manipulating imagery to cut together seamlessly. “In one episode, we matched an interview with Osama bin Laden from the late ‘90s with new material shot with an Arri Alexa,” recalls Lewars. “We used color correction and editorial effects to blend the two worlds.”

Cornell degraded some scenes to make them match older, real-world media. “I took the Alexa material and ‘muddied’ it up by exporting it to compressed SD files and then cutting it back into the master timeline,” he notes. “We also added little digital hits to make it feel like the archival footage.”
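Cornell’s round trip through a compressed SD generation can be approximated, purely as an illustration (this is a hypothetical sketch, not his actual pipeline), by softening a frame’s detail and then coarsely re-quantizing it:

```python
import numpy as np

def degrade_to_sd(frame: np.ndarray, block: int = 4, levels: int = 32) -> np.ndarray:
    """Roughly mimic a down-convert/re-up round trip: average block x block
    pixel groups (softening detail), blow them back up, then coarsely
    quantize the result to suggest compression banding."""
    h, w = frame.shape[:2]
    h2, w2 = h - h % block, w - w % block
    crop = frame[:h2, :w2].astype(float)
    # downscale by block averaging
    small = crop.reshape(h2 // block, block, w2 // block, block, -1).mean(axis=(1, 3))
    # upscale by nearest-neighbour repeat
    big = small.repeat(block, axis=0).repeat(block, axis=1)
    # quantize to a limited number of levels
    return (np.round(big / 255.0 * (levels - 1)) / (levels - 1) * 255.0).astype(np.uint8)
```

A real conform would do this with an actual SD codec pass (plus the “little digital hits” Cornell mentions); the sketch only shows the two ingredients — lost resolution and lost bit depth — that make new footage read as archival.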

While the color grade was subtle and adhered closely to reality, it still packed an emotional punch. That is most apparent in a later episode that includes the attack on the Twin Towers. “The episode starts off in New York early in the morning,” says Lewars. “We have a series of beauty shots of the city and it’s a glorious day. It’s a big contrast to what follows — archival footage after the towers have fallen where everything is a white haze of dust and debris.”

Audio Post
The sound team also strove to remain faithful to real events. García recalls his first conversations about the show’s sound needs during pre-production spotting sessions with executive producer Futterman and editor Daniel A. Valverde. “It was clear that we didn’t want to glamorize anything,” he says. “Still, we wanted to create an impact. We wanted people to feel like they were right in the middle of it, experiencing things as they happened.”

García says that his sound team approached the project as if it were a documentary, protecting the performances and relying on sound effects that were authentic in terms of time and place. “With the news footage, we stuck with archival sounds matching the original production footage and accentuating whatever sounds were in there that would connect emotionally to the characters,” he explains. “When we moved to the narrative side with the actors, we’d take more creative liberties and add detail and texture to draw you into the space and focus on the story.”

He notes that the drive for authenticity extended to crowd scenes, where native speakers were used as voice actors. Crowd sounds set in the Middle East, for example, were from original recordings from those regions to ensure local accents were correct.

Much like Lewars’ approach to color, García and his crew used sound to underscore environmental and psychological differences between CIA and FBI headquarters. “We did subtle things,” he notes. “The CIA has more advanced technology, so everything there sounds sharper and newer versus the FBI, where you hear older phones and computers.”

The Foley provided by artists and mixers from Alchemy Post Sound further enhanced differences between the two environments. “It’s all about the story, and sound played a very important role in adding tension between characters,” says Leslie Bloome, Alchemy’s lead Foley artist. “A good example is the scene where CIA station chief Diane Marsh is berating an FBI agent while casually applying her makeup. Her vicious attitude toward the FBI agent combined with the subtle sounds of her makeup created a very interesting juxtaposition that added to the story.”

In addition to footsteps, the Foley team created incidental sounds used to enhance or add dimension to explosions, action and environments. For a scene where FBI agents are inspecting a warehouse filled with debris from the embassy bombings in Africa, artists recorded brick and metal sounds on a Foley stage designed to capture natural ambience. “Normally, a post mixer will apply reverb to place Foley in an environment,” says Foley artist Joanna Fang. “But we recorded the effects in our live room to get the perspective just right as people are walking around the warehouse. You can hear the mayhem as the FBI agents are documenting evidence.”

“Much of the story is about what went wrong, about the miscommunication between the CIA and FBI,” adds Foley mixer Ryan Collison, “and we wanted to help get that point across.”

The soundtrack to the series assumed its final form on a mix stage at PostWorks. Czembor spent weeks mixing dialogue, sound and music elements into what he described as a cinematic soundtrack.

L-R: Martin Czembor and Ruy García

Czembor notes that the sound team provided a wealth of material, but for certain emotionally charged scenes, such as the attack on the USS Cole, the producers felt that less was more. “Danny Futterman’s conceptual approach was to go with almost no sound and let the music and the story speak for themselves,” he says. “That was super challenging, because while you want to build tension, you are stripping it down so there’s less and less and less.”

Czembor adds that music, from composer Will Bates, is used with great effect throughout the series, even though it might go by unnoticed by viewers. “There is actually a lot more music in the series than you might realize,” he says. “That’s because it’s not so ‘musical;’ there aren’t a lot of melodies or harmonies. It’s more textural…soundscapes in a way. It blends in.”

Czembor says that as a longtime New Yorker, working on the show held special resonance for him, and he was impressed with the powerful, yet measured way it brings history back to life. “The performances by the cast are so strong,” he says. “That made it a pleasure to work on. It inspires you to add to the texture and do your job really well.”


Showrunner Dan Pyne — Amazon Studios’ Bosch

By Iain Blair

How popular is Amazon’s Emmy-nominated detective show Bosch? So popular that the streaming service ordered up Season 5 before Season 4 even debuted in April. This critically acclaimed hour-long series is Amazon’s longest-running Prime Original.

Based on the best-selling novels by Michael Connelly, the show stars Titus Welliver (Lost) as LAPD homicide detective Harry Bosch, alongside a large ensemble cast that includes Jamie Hector (The Wire), Amy Aquino (Being Human), Madison Lintz (The Walking Dead) and Lance Reddick (The Wire).

Dan Pyne

Season 4 kicked off with the murder of a high-profile attorney on the eve of his civil rights trial against the LAPD. Bosch is assigned to lead a task force, one whose suspects include fellow cops, to solve the crime before the city erupts in a riot. He must pursue every lead, even if it turns the spotlight back on his own department. One murder intertwines with another, and Bosch must reconcile his not-so-simple past to find a justice that has long eluded him.

Bosch was developed for television by Eric Overmyer (Treme, The Wire, Homicide: Life on the Street) and is executive produced by Dan Pyne, whose film credits include The Manchurian Candidate, Pacific Heights, The Sum of All Fears and Fracture. He also co-created and co-produced The Street, a syndicated police procedural starring Stanley Tucci.

I recently spoke with Pyne about making the show, the Emmys, production and post.

Eric Overmyer, who took a break to work on another Amazon production, The Man in the High Castle, is coming back to act as co-showrunner with you on Season 5. How will you split duties?
Good question! We’re making it up as we go along. I’d never worked with him before, but I did have a longtime partner before. Basically, we talk a lot and come to an agreement about any issues. The great thing about this show is that every season is its own entity, with its own rhythm and voice.

Have you started on Season 5?
We have, and we have almost six episodes plotted out and we start shooting in early August.

Where do you shoot?
We’re based at Red Studios in Hollywood, which isn’t far from the local police station, and we recreated that interior on a set, and it’s so uncannily similar — apart from a few details — that it’s hard to tell them apart. We shoot a lot in Hollywood and then locations all over the city and further afield.

Bosch has a very cinematic feel and look.
Yes, and that’s in part due to our producer, Pieter Jan Brugge, who comes from film and who’s worked a lot with Michael Mann on movies like Heat and Miami Vice — this is his first TV show. I come from a film background too, so we take more of a film approach and discuss things like the sound and visuals, what they should be like and how a scene should play, before we even start shooting.

Do you like being a showrunner?
I do, and I was well-trained. I came up intending to write movies and then fell into TV and got lucky with the show Hard Copy. I became a specialist in 10-episode arcs (like Bosch). I got to work with several legendary showrunners, including Richard Levinson and William Link, who created such classic TV shows as Columbo and Murder, She Wrote, and William Sackheim who did Gidget and The Flying Nun. I haven’t done showrunning that much, but I always ran my own shows. And it’s the closest thing to being a film director because you have control and get to collaborate with everyone else, including post, which is so much fun.

Where do you post?
At Red Studios. We have a great team, including post producer Mark Douglas and post supervisor Tayah Geist, and we do color correction at Warner Bros. with colorist Scott Klein. He works closely with Pieter and the DPs, and sometimes we’ll make an entire season cooler or warmer. We’ll get inspiration from movies — maybe Japanese films or Blade Runner and so on — to help us with the color palette.

Do you like the post process?
Very much. It’s so true when people say, like with movies, there are three TV shows — the one you write, the one you shoot and the one you make in post. It’s in post where you start all over again each time, see what you’ve got, what works and doesn’t. I’ve always enjoyed collaborating with editors and all the sound people. I love what they bring to storytelling: the way they can help say things and elevate the material and help make stuff clear for the audience, and show or hide emotionality.

Let’s talk about editing. You have several editors, I assume because of the time factor. How does that work?
Yes, it’s the tight schedule, and in order to get it done we rotate three editors: one cuts four shows and the other two cut three each. That way they have enough time to cut a show and pretty much finish it before moving on to the next one. If you only have two editors the workload’s overwhelming, so we use three — Steve Cohen, Jacque Toberen and Kevin Casey. There are also three assistant editors — Rafael Nur, Judy Guerrero and Knar Kitabjian.

You have a big cast, and a lot of stuff going on in each episode. What are the big editing challenges?
We have two big ones. Because it’s an investigative show it can be very detailed, so clarity is always a priority — making sure all the story and plot points are set up the right way and land correctly. Mike’s books are very twisty, with a lot of tiny clues, so as the detectives walk through scenes they’ll notice things; there’s a lot of visual foreshadowing of things that come back later. The other challenge is making sure the episodes have pace. Police procedurals tend to fall into a walk-and-talk pattern, and we try to avoid that.

Who does the VFX, and what’s involved?
Moving Target; Alan Munro is our VFX supervisor. We use a lot more VFX than you would imagine. One of the show’s hallmarks is making everything as real as possible, so when we recently did a show with scenes in tunnels we used a lot of masking and CGI, as we were limited in how we could light and shoot the actual tunnels. Visual effects really make TV a lot easier. We shoot some plates depending on the situation, but often it’s really small stuff and cleanup to make it all even more realistic.

This show has a great score by Jesse Voccia and great sound design. Can you talk about the importance of sound and music?
Sound is always difficult because it’s TV. It’s the nature of the beast. TV often shoots so fast that the production sound can be problematic, what with traffic noise and so on, and you have to fix all that. But I love playing with sound and working with the sound designers and ADR guys. We do all the mixing at Technicolor at Paramount, and we have a great crew. There’s not a lot of music in the show, but we try to make it all count and not use short little stingers like TV usually does, or score a chase. We’ll use sound design or natural sound instead.

How important are the Emmys to a show like this, which seems a little underrated?
It would be great to be nominated, although maybe the fans don’t care that much. We do fly under the radar a bit, I think, so more recognition would be very welcome.

Thanks to the ongoing source material, the show could easily run for many more years. Will you do more seasons of the show?
I’d love to. What’s so great is that every season is brand new and different, with its own beginning, middle and end. It plays like a book, so we can really work on the tone and feel.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


Netflix’s Godless offers big skies and big sounds

By Jennifer Walden

One of the great storytelling advantages of non-commercial television is that content creators are not restricted by program lengths or episode counts. A season can have 13 episodes, or 10, or fewer, and an episode can run 75 minutes or 33. This certainly was the case for writer/director/producer Scott Frank when creating his series Godless for Netflix.

Award-winning sound designer Wylie Stateman of Twenty Four Seven Sound explains why this worked to their advantage. “Godless at its core is a story-driven ‘big-sky’ Western. The American Western is often as environmentally beautiful as it is emotionally brutal. Scott Frank’s goal for Godless was to create a conflict between good and evil set around a town of mostly female disaster survivors and their complex and intertwined pasts. The Godless series is built like a seven-and-a-half-hour feature film.”

Without the constraints of having to squeeze everything into a two-hour film, Frank could make the most of his ensemble of characters and still include the ride-up/ride-away beauty shots that show off the landscape. “That’s where Carlos Rafael Rivera’s terrific orchestral music and elements of atmospheric sound design really came together,” explains Stateman.

Stateman has created sound for several Westerns in his prodigious career. His first was The Long Riders back in 1980. Most recently, he designed and supervised the sound on writer/director Quentin Tarantino’s Django Unchained (which earned a 2013 Oscar nom for sound, an MPSE nom and a BAFTA film nom for sound) and The Hateful Eight (nominated for a 2016 Association of Motion Picture Sound Award).

For Godless, Stateman, co-supervisor/re-recording mixer Eric Hoehn and their sound team have already won a 2018 MPSE Award for Sound Editing for their effects and Foley work, as well as a nomination for editing the dialogue and ADR. And don’t be surprised if you see them acknowledged with an Emmy nom this fall.

Capturing authentic sounds: (L-R) Jackie Zhou, Wylie Stateman and Eric Hoehn.

Capturing Sounds On Set
Since program length wasn’t a major consideration, Godless takes time to explore the story’s setting and allows the audience to live with the characters in this space that Frank had purpose-built for the show. In New Mexico, Frank had practical sets constructed for the town of La Belle and for Alice Fletcher’s ranch. Stateman, Hoehn and sound team members Jackie Zhou and Leo Marcil camped out at the set locations for a couple weeks, capturing recordings of everything from environmental ambience to gunfire echoes to horse hooves on dirt.

To avoid the craziness that is inherent to a production, the sound team would set up camp in a location where the camera crew was not. This allowed them to capture clean, high-quality recordings at various times of the day. “We would record at sunrise, sunset and the middle of the night — each recording geared toward capturing a range of authentic and ambient sounds,” says Stateman. “Essentially, our goal was to sonically map each location. Our field recordings were wide in terms of channel count, and broad in terms of how we captured the sound of each particular environment. We had multiple independent recording setups, each capable of recording up to eight channels of high bandwidth audio.”

Near the end of the season, there is a big shootout in the town of La Belle, so Stateman and Hoehn wanted to capture the sounds of gunfire and the resulting echoes at that location. They used live rounds, shooting the same caliber of guns used in the show. “We used live rounds to achieve the projectile sounds. A live round sounds very different than a blank round. Blanks just go pop-pop. With live rounds you can literally feel the bullet slicing through the air,” says Stateman.

Eric Hoehn

Recording on location not only supplied the team with a wealth of material to draw from back in the studio, it also gave them an intensive working knowledge of the actual environments. Says Hoehn, “It was helpful to have real-world references when building the textures of the sound design for these various locations and to know firsthand what was happening acoustically, like how the wind was interacting with those structures.”

Stateman notes how quiet and lifeless the location was, particularly at Alice’s ranch. “Part of the sound design’s purpose was to support the desolate dust bowl backdrop. Living there, eating breakfast in the quiet without anybody from the production around was really a wonderful opportunity. In fact, Scott Frank encouraged us to look deep and listen for that feel.”

From Big Skies to Big City
Sound editorial for Godless took place at Light Iron in New York, which is also where the show was picture edited — by Michelle Tesoro, assisted by Hilary Peabody and Charlie Greene. There, Hoehn had a Pro Tools HDX 3 system connected to the picture department’s Avid Media Composer via the Avid Nexis. They could quickly pull in the picture editorial mix, balance out the dialogue and add properly leveled sound design, sending that mix back to Tesoro.

“Because there were so many scenes and so much material to get through, we really developed a creative process that centered around rapid prototype mixing,” says Hoehn. “We wanted to get scenes from Michelle and her team as soon as possible and rapidly prototype dialogue mixing and that first layer of sound design. Through the prototyping process, we could start to understand what the really important sounds were for those scenes.”

Using this prototyping audio workflow allowed the sound team to very quickly share concepts with the other creative departments, including the music and VFX teams. This workflow was enhanced through a cloud-based film management/collaboration tool called Pix. Pix let the showrunners, VFX supervisor, composer, sound team and picture team share content and share notes.

“The notes feature in Pix was so important,” explains Hoehn. “Sometimes there were conversations between the director and editor that we could intuitively glean information from, like notes on aesthetic or pace or performance. That created a breadcrumb trail for us to follow while we were prototyping. It was important for us to get as much information as we could so we could be on the same page and have our compass pointed in the right direction when we were doing our first pass prototype.”

Often their first pass prototype was simply refined throughout the post process to become the final sound. “Rarely were we faced with the situation of having to re-cut a whole scene,” he continues. “It was very much in the spirit of the rolling mix and the rolling sound design process.”

Stateman shares an example of how the process worked. “When Michelle first cut a scene, she might cut to a beauty shot that would benefit from wind gusts and/or enhanced VFX, and maybe additional dust blowing. We could then rapidly prototype that scene with leveled dialogue and sound design before it went to composer Carlos Rafael Rivera. Carlos could hear where and when we were possibly leveraging high-density sound. This insight could influence his musical thinking — whether he needed to come in before, on or after the sound effects. Early prototyping informed what became a highly collaborative creative process.”

The Shootout
Another example of the usefulness of Pix was the shootout in La Belle in Episode 7. The people of the town position themselves in the windows and doorways of the buildings lining the street, essentially surrounding Frank Griffin (Jeff Daniels) and his gang. There is a lot of gunfire, much of it bridging action on and off camera, and that needed to be represented well through sound.

Hoehn says they found it best to approach the gun battle like a piece of music, playing with repeated rhythms; breaking the anticipated rhythm helped catch the audience off-guard. They built a sound prototype for the scene and shared it via Pix, which gave the VFX department access to it.

“A lot of what we did with sound helped the visual effects team by allowing them to understand the density of what we were doing with the ambient sounds,” says Hoehn. “If we found that rhythmically it was interesting to have a wind gust go by, we would eventually see a visual effect for that wind going by.”

It was a back-and-forth collaboration. “There are visual rhythms and sound rhythms and the fact that we could prototype scenes early led us to a very efficient way of doing long-form,” says Stateman. “It’s funny that features used to be considered long-form but now ‘long-form’ is this new, time-unrestrained storytelling. It’s like we were making a long-form feature, but one that was seven and a half hours. That’s really the beauty of Netflix. Because the shows aren’t tethered to a theatrical release timeframe, we can make stories that linger a little bit and explore the wider eccentricities of character and the time period. It’s really a wonderful time for this particular type of filmmaking.”

While program length may be less of an issue, production schedule lengths still need to be kept in line. With the help of Pix, editorial was able to post the entire show with one team. “Everyone on our small team understood and could participate in the mission,” says Stateman. Additionally, the sound design rapid prototype mixing process allowed everyone in editorial to carry all their work forward, from day one until the last day. The Pro Tools session that they started with on day one was the same Pro Tools session that they used for print mastering seven months later.

“Our sound design process was built around convenient creative approval and continuous refinement of the complete soundtrack. At the end of the day, the thing that we heard most often was that this was a wonderful and fantastic way to work, and why would we ever do it any other way,” Stateman says.

Creating a long-form feature like Godless in an efficient manner required a fluid, collaborative process. “We enjoyed a great team effort,” says Stateman. “It’s always people over devices. What we’ve come to say is, ‘It’s not the devices. It’s people left to their own devices who will discover really novel ways to solve creative problems.’”


Jennifer Walden is a New Jersey-based audio engineer and writer. Follow her on Twitter at @audiojeney.


The Duffer Brothers: Showrunners on Netflix’s Stranger Things

By Iain Blair

Kids in jeopardy! The Demogorgon! The Hawkins Lab! The Upside Down! Thrills and chills! Since they first pitched their idea for Stranger Things, a love letter to 1980s genre films set in 1983 Indiana, twin brothers Matt and Ross Duffer have quickly established themselves as masters of suspense in the science-fiction and horror genres.

The series was picked up by Netflix, premiered in the summer of 2016, and went on to become a global phenomenon, with the brothers at the helm as writers, directors and executive producers.

The Duffer Brothers

The atmospheric drama, about a group of nerdy misfits and strange events in an outwardly average small town, nailed its early ’80s vibe and overt homages to that decade’s master pop storytellers: Steven Spielberg and Stephen King. It quickly made stars out of its young ensemble cast — Millie Bobby Brown, Natalia Dyer, Charlie Heaton, Joe Keery, Gaten Matarazzo, Caleb McLaughlin, Noah Schnapp, Sadie Sink and Finn Wolfhard.

It also quickly attracted a huge, dedicated fan base, critical plaudits and has won a ton of awards, including Emmys, a SAG Award for Best Ensemble in a Drama Series and two Critics Choice Awards for Best Drama Series and Best Supporting Actor in a Drama Series. The show has also been nominated for a number of Golden Globes.

I recently talked with the Duffers, who are already hard at work on the highly anticipated third season (which will premiere on Netflix in 2019), about making the ambitious hit series, their love of post and editing, and VFX.

How’s the new season going?
Matt Duffer: We’re two weeks into shooting, and it’s going great. We’re very excited about it as there are some new tones and it’s good to be back on the ground with everyone. We know all the actors better and better, the kids are getting older and are becoming these amazing performers — and they were great before. So we’re having a lot of fun.

Are you shooting in Atlanta again?
Ross Duffer: We are, and we love it there. It’s really our home base now, and we love all these pockets of neighborhoods that have not changed at all since the ‘80s, and there is an incredible variety of locations. We’re also spreading out a lot more this season and not spending so much time on stages. We have more locations to play with.

Will all the episodes be released together next year, like last time? That would make binge-watchers very happy.
Matt: Yes, but we like to think of it more as like a big movie release. To release one episode per week feels so antiquated now.

The show has a very cinematic look and feel, so how do you balance that with the demands of TV?
Ross: It’s interesting, because we started out wanting to make movies and we love genre, but with a horror film they want big scares every few minutes. That leaves less room for character development. But with TV, it’s always more about character, as you just can’t sustain hours and hours of a show if you don’t care about the people. So ‘Stranger Things’ was a world where we could tell a genre story, complete with the monster, but also explore character in far more depth than we could in a movie.

Matt: Movies and TV are almost opposites in that way. In movies, it’s all plot and no character, and in TV it’s about character and you have to fight for plot. We wanted this to have pace and feel more like a movie, but still have all the character arcs. So it’s a constant balancing act, and we always try and favor character.

Where do you post the show?
Matt: All in Hollywood, and the editors start working while we’re shooting. After we shoot in Atlanta, we come back to our offices and do all the post and VFX work right there. We do all the sound mix and all the color timing at Technicolor down the road. We love post. You never have enough time on the set, and there’s all this pressure if you want to redo a shot or scene, but in post if a scene isn’t working we can take time to figure it out.

Tell us about the editing. I assume you’re very involved?
Ross: Very. We have two editors this season. We brought back one of our original editors, Dean Zimmerman, from season one. We are also using Nat Fuller, who was on season two. He was Dean’s assistant originally and then moved up, so they’ve been with us since the start. Editing’s our favorite part of the whole process, and we’re right there with them because we love editing. We’re very hands on and don’t just give notes and walk away. We’re there the whole time.

Aren’t you self-taught in terms of editing?
Matt: (Laughs) I suppose. We were taught the fundamentals of Avid at film school, but you’re right. We basically taught ourselves to edit as kids, starting off editing in-camera, stopping and starting, and playing the music from a tape recorder. Those early efforts weren’t very good, but we got better.

When iMovie came out we learned how to put scenes together, so in college the transition to Avid wasn’t that hard. We fell in love with editing and just how much you can elevate your material in post. It’s magical what you can do with the pace, performances, music and sound design, and then you add all the visual effects and see it all come together in post. We love seeing the power of post as you work to make your story better and better.

How early on do you integrate post and VFX with the production?
Ross: On day one now. The biggest change from season one to two was that we integrated post far earlier in the second season — even in the writing stage. We had concept artists and the VFX guys with us the whole time on set, and they were all super-involved. So now it all kind of happens together.

The VFX are a much bigger deal now. Last season we had a lot more VFX than the first year — about 1,400 shots, which is a huge amount, like a big movie. The first season it wasn’t a big deal; it was a very old-school approach, with mainly practical effects. Then in the middle we realized we were being a bit naïve, so we brought in Paul Graff as our VFX supervisor on season two, and he’s very experienced. He’s worked on big movies like The Wolf of Wall Street as well as Game of Thrones and Boardwalk Empire, and he’s doing this season too. He’s in Atlanta with us on the shoot.

We have two main VFX houses on the show — Atomic Fiction and Rodeo — they’re both incredible, and I think all the VFX are really cinematic now.

But isn’t it a big challenge in terms of a TV show’s schedule?
Ross: You’re right, and it’s always a big time crunch. Last year we had to meet that Halloween worldwide release date and we were cutting it so close trying to finish all the shots in time.

Matt: Everyone expects movie-quality VFX — just in a quarter of the time, or less. So it’s all accelerated.

The show has a very distinct, eerie, synth-heavy score by Kyle Dixon and Michael Stein, the Grammy-nominated duo. How important are the music and sound, which won several Emmys last year?
Ross: It’s huge. We use it so much for transitions, and we have great sound designers — including Brad North and Craig Henighan — and great mixers, and we pay a lot of attention to all of it. I think TV has always put less emphasis on great sound compared to film, and again, you’re always up against the scheduling, so it’s always this balancing act.

You can’t mix it for a movie theater as very few people have that set up at home, so you have to design it for most people who’re watching on iPhones, iPads and so on, and optimize it for that, so we mostly mix in stereo. We want the big movie sound, but it’s a compromise.

The DI must be vital?
Matt: Yes, and we work very closely with colorist Skip Kimball (who recently joined Efilm), who’s been with us since the start. He was very influential in terms of how the show ended up looking. We’d discussed the kind of aesthetic we wanted, and things we wanted to reference and then he played around with the look and palette. We’ve developed a look we’re all really happy with. We have three different LUTs on set designed by Skip and the DP Tim Ives will choose the best one for each location.

Everyone’s calling this the golden age of TV. Do you like being showrunners?
Ross: We do, and I feel we’re very lucky to have the chance to do this show — it feels like a big family. Yes, we originally wanted to be movie directors, but we didn’t come into this industry at the right time, and Netflix has been so great and given us so much creative freedom. I think we’ll do a few more seasons of this, and then maybe wrap it up. We don’t want to repeat ourselves.




Capturing, creating historical sounds for AMC’s The Terror

By Jennifer Walden

It’s September 1846. Two British ships — the HMS Erebus and HMS Terror — are on an exploration to find the Northwest Passage to the Pacific Ocean. The expedition’s leader, British Royal Navy Captain Sir John Franklin, leaves the Erebus to dine with Captain Francis Crozier aboard the Terror. A small crew rows Franklin across the frigid, ice-choked Arctic Ocean that lies north of Canada’s mainland to the other vessel.

The opening overhead shot of the two ships in AMC’s new series The Terror (Mondays 9/8c) gives the audience an idea of just how large those ice chunks are in comparison with the ships. It’s a stunning view of the harsh environment, a view that was completely achieved with CGI and visual effects because this series was actually shot on a soundstage at Stern Film Studio, north of Budapest, Hungary.

 Photo Credit: Aidan Monaghan/AMC

Emmy- and BAFTA-award-winning supervising sound editor Lee Walpole of Boom Post in London, says the first cut he got of that scene lacked the VFX, and therefore required a bit of imagination. “You have this shot above the ships looking down, and you see this massive green floor of the studio and someone dressed in a green suit pushing this boat across the floor. Then we got the incredible CGI, and you’d never know how it looked in that first cut. Ultimately, mostly everything in The Terror had to be imagined, recorded, treated and designed specifically for the show,” he says.

Sound plays a huge role in the show. Literally everything you hear (except dialogue) was created in post — the constant Arctic winds, the footsteps out on the packed ice and walking around on the ship, the persistent all-male murmur of 70 crew members living in a 300-foot space, the boat creaks, the ice groans and, of course, the creature sounds. The pervasive environmental sounds sell the harsh reality of the expedition.

Thanks to the sound and the CGI, you’d never know this show was shot on a soundstage. “It’s not often that we get a chance to ‘world-create’ to that extent and in that fashion,” explains Walpole. “The sound isn’t just there in the background supporting the story. Sound becomes a principal character of the show.”

Bringing the past to life through sound is one of Walpole’s specialties. He’s created sound for The Crown, Peaky Blinders, Klondike, War & Peace, The Imitation Game, The King’s Speech and more. He takes a hands-on approach to historical sounds, like recording location footsteps in Lancaster House for the Buckingham Palace scenes in The Crown, and recording the sounds on-board the Cutty Sark for the ships in To the Ends of the Earth (2005). For The Terror, his team spent time on-board the Golden Hind, which is a replica of Sir Francis Drake’s ship of the same name.

During a 5am recording session, the team — equipped with a Sound Devices 744T recorder and a Schoeps CMIT 5U mic — captured footsteps in all of the rooms on-board, pick-ups and put-downs of glasses and cups, drops of various objects on different surfaces, gun sounds and a selection of rigging, pulleys and rope moves. They even recorded hammering. “We took along a wooden plank and several hammers,” describes Walpole. “We laid the plank across various surfaces on the boat so we could record the sound of hammering resonating around the hull without causing any damage to the boat itself.”

They also recorded footsteps in the ice and snow and reached out to other sound recordists for snow and ice footsteps. “We wanted to get an authentic snow creak and crunch, to have the character of the snow marry up with the depth and freshness of the snow we see at specific points in the story. Any movement from our characters out on the pack ice was track-laid, step-by-step, with live recordings in snow. No studio Foley feet were recorded at all,” says Walpole.

In The Terror, the ocean freezes around the two ships, immobilizing them in pack ice that extends for miles. As the water continues to freeze, the ice grows and it slowly crushes the ships. In the distance, there’s the sound of the ice growing and shifting (almost like tectonic plates), which Walpole created from sourced hydrophone recordings from a frozen lake in Canada. The recordings had ice pings and cracking that, when slowed and pitched down, sounded like massive sheets of ice rubbing against each other.
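The slowing-and-pitching trick Walpole describes works because stretching a recording in time divides every frequency in it by the stretch factor: play material back at quarter speed and everything drops two octaves. A minimal NumPy sketch, purely illustrative rather than the team’s actual tooling, with a sine tone standing in for a hydrophone ice ping:

```python
import numpy as np

SR = 48000  # sample rate in Hz

def tone(freq, seconds, sr=SR):
    """Generate a test sine tone standing in for a field recording."""
    t = np.arange(int(sr * seconds)) / sr
    return np.sin(2 * np.pi * freq * t)

def slow_down(audio, factor):
    """Resample so playback at the original rate is `factor` times slower.
    Stretching the waveform this way lowers every frequency by `factor`."""
    idx = np.arange(0, len(audio) - 1, 1.0 / factor)
    return np.interp(idx, np.arange(len(audio)), audio)

def dominant_freq(audio, sr=SR):
    """Estimate the strongest frequency via the FFT magnitude peak."""
    spectrum = np.abs(np.fft.rfft(audio))
    return np.fft.rfftfreq(len(audio), 1 / sr)[np.argmax(spectrum)]

ping = tone(880.0, 1.0)     # a bright "ice ping"
groan = slow_down(ping, 4)  # 4x slower, so two octaves lower

print(round(dominant_freq(ping)))   # 880
print(round(dominant_freq(groan)))  # 220
```

Real-world tools add interpolation filtering and optional formant preservation on top of this, but the underlying scale shift, small cracks becoming massive groans, is the same idea.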

Effects editor Saoirse Christopherson capturing sounds on board a kayak on the Thames River.

The sounds of the ice rubbing against the ships were captured by one of the show’s sound effects editors, Saoirse Christopherson, who, along with an assistant, boarded a kayak and paddled out onto the frozen Thames River. Using a Røde NT2 and a Roland R26 recorder with several contact mics strapped to the kayak’s hull, they spent the day grinding through, over and against the ice. “The NT2 was used to directionally record both the internal impact sounds of the ice on the hull and also any external ice creaking sounds they could generate with the kayak,” says Walpole.

He slowed those recordings down significantly and used EQ and filters to bring out the low-mid to low-end frequencies. “I also fed them through custom settings on my TC Electronic reverbs to bring them to life and to expand their scale,” he says.

The pressure of the ice is slowly crushing the ships, and as the season progresses the situation escalates to the point where the crew can’t imagine staying there another winter. To tell that story through sound, Walpole began with recordings of windmill creaks and groans. “As the situation gets more dire, the sound becomes shorter and sharper, with close, squealing creaks that sound as though the cabins themselves are warping and being pulled apart.”

In the first episode, the Erebus runs aground on the ice and the crew tries to hack and saw the ice away from the ship. Those sounds were recorded by Walpole attacking the frozen pond in his backyard with axes and a saw. “That’s my saw cutting through my pond, and the axe material is used throughout the show as they are chipping away around the boat to keep the pack ice from engulfing it.”

Whether the crew is on the boat or on the ice, the sound of the Arctic is ever-present. Around the ships, the wind rips over the hulls and howls through the rigging on deck. It gusts and moans outside the cabin windows. Out on the ice, the wind constantly groans or shrieks. “Outside, I wanted it to feel almost like an alien planet. I constructed a palette of designed wind beds for that purpose,” says Walpole.

He treated recordings of wind howling through various cracks to create a sense of blizzard winds outside the hull. He also sourced recordings of wind at a disused Navy bunker. “It’s essentially these heavy stone cells along the coast. I slowed these recordings down a little and softened all of them with EQ. They became the ‘holding airs’ within the boat. They felt heavy and dense.”

Below Deck
In addition to the heavy-air atmospheres, another important sound below deck was that of the crew. The ships were entirely occupied by men, so Walpole needed a wide and varied palette of male-only walla to sustain a sense of life on-board. “There’s not much available in sound libraries, or in my own library — and certainly not enough to sustain a 10-hour show,” he says.

So they organized a live crowd recording session with a group of men from CADS — an amateur dramatics society from Churt, just outside of London. “We gave them scenarios and described scenes from the show and they would act it out live in the open air for us. This gave us a really varied palette of worldized effects beds of male-only crowds that we could sit the loop group on top of. It was absolutely invaluable material in bringing this world to life.”

Visually, the rooms and cabins are sometimes quite similar, so Walpole uses sound to help the audience understand where they are on the ship. In his cutting room, he had the floor plans of both ships taped to the walls so he could see their layouts. Life on the ship is mainly concentrated on the lower deck — the level directly below the upper deck. Here is where the men sleep. It also has the canteen area, various cabins and the officers’ mess.

Below that is the Orlop deck, where there are workrooms and storerooms. Then below that is the hold, which is permanently below the waterline. “I wanted to be very meticulous about what you would hear at the various levels on the boat and indeed the relative sound level of what you are hearing in these locations,” explains Walpole. “When we are on the lower two decks, you hear very little of the sound of the men above. The soundscapes there are instead focused on the creaks and the warping of the hull and the grinding of the ice as it crushes against the boat.”

One of Walpole’s favorite scenes is the beginning of Episode 4. Capt. Francis Crozier (Jared Harris) is sitting in his cabin listening to the sound of the pack ice outside, and the room sharply tilts as the ice shifts the ship. The scene offers an opportunity to tell a cause-and-effect story through sound. “You hear the cracks and pings of the ice pack in the distance and then that becomes localized with the kayak recordings of the ice grinding against the boat, and then we hear the boat and Crozier’s cabin creak and pop as it shifts. This ultimately causes his bottle to go flying across the table. I really enjoyed having this tale of varying scales. You have this massive movement out on the ice and the ultimate conclusion of it is this bottle sliding across the table. It’s very much a sound moment because Crozier is not really saying anything. He’s just sitting there listening, so that offered us a lot of space to play with the sound.”

The Tuunbaq
The crew in The Terror isn’t just battling the elements, scurvy, starvation and mutiny. They’re also being killed off by a polar bear-like creature called the Tuunbaq — part animal, part mythical being, tied to the land and spirits around it. The creature is largely unseen for the first part of the season, so Walpole created sonic hints as to its make-up.

Walpole worked with showrunner David Kajganich to find the creature’s voice. Kajganich wanted the creature to convey a human intelligence, and he shared recordings of human exorcisms as reference material. They hired voice artist Atli Gunnarsson to perform parts to picture, which Walpole then fed into the Dehumaniser plug-in by Krotos. “Some of the recordings we used raw as well,” says Walpole. “This guy could make these crazy sounds. His voice could go so deep.”

Those performances were layered into the track alongside recordings of real bears, which gave the sound the correct diaphragm, weight, and scale. “After that, I turned to dry ice screeches and worked those into the voice to bring a supernatural flavor and to tie the creature into the icy landscape that it comes from.”
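Dehumaniser’s actual processing is proprietary and combines several modules; as a rough sketch of just the pitch-dropping idea described here — not the plug-in’s algorithm — naive resampling in NumPy lowers a recorded voice while stretching it, which is part of why pitched-down voices sound heavier and slower:

```python
import numpy as np

def pitch_down(samples, semitones):
    """Naive pitch drop by resampling: reading the waveform back more
    slowly lowers the pitch (and stretches the duration with it)."""
    ratio = 2 ** (semitones / 12.0)        # 12 semitones = one octave
    n_out = int(len(samples) * ratio)      # output is proportionally longer
    src = np.arange(n_out) / ratio         # fractional read positions
    return np.interp(src, np.arange(len(samples)), samples)

sr = 8000                                  # sample rate (Hz), for the demo
t = np.arange(sr) / sr                     # one second of time
tone = np.sin(2 * np.pi * 440 * t)         # 440Hz stand-in for a voice
low = pitch_down(tone, 12)                 # drop one octave -> ~220Hz
print(len(tone), len(low))
```

Real tools use phase vocoders or granular techniques to shift pitch without changing duration; this linear-interpolation version is only the simplest possible illustration.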

Lee Walpole

In Episode 3, an Inuit character named Lady Silence (Nive Nielsen) is sitting in her igloo and the Tuunbaq arrives snuffling and snorting on the other side of the door flap. Then the Tuunbaq begins to “sing” at her. To create that singing, Walpole reveals that he pulled Lady Silence’s performance of The Summoning Song (the song her people use to summon the Tuunbaq to them) from a later episode and fed that into Dehumaniser. “This gave me the creature’s version. So it sounds like the creature is singing the song back to her. That’s one for the diehards who will pick up on it and recognize the tune,” he says.

Since the series is shot on a soundstage, there’s no usable bed of production sound to act as a jumping off point for the post sound team. But instead of that being a challenge, Walpole finds it liberating. “In terms of sound design, it really meant we had to create everything from scratch. Sound plays such a huge role in creating the atmosphere and the feel of the show. When the crew is stuck below decks, it’s the sound that tells you about the Arctic world outside. And the sound ultimately conveys the perils of the ship slowly being crushed by the pack ice. It’s not often in your career that you get such a blank canvas of creation.”


Jennifer Walden is a New Jersey-based audio engineer and writer. You can follow her on Twitter at @audiojeney.


Netflix’s Altered Carbon: the look, the feel, the post

By Randi Altman

Netflix’s Altered Carbon is a new sci-fi series set in a dystopian future where people are immortal thanks to something called “stacks,” which contain their entire essence — their personalities, their memories, everything. The one catch is that unless you are a Meth (one of the rich and powerful), you need to buy a “sleeve” (a body) for your stack, and it might bear no resemblance to your former self. It could be a different color, a different sex, a different age, a different everything. You have to take what you can get.

Based on a 2002 novel by Richard K. Morgan, it stars Swedish actor Joel Kinnaman.

Jill Bogdanowicz

We reached out to the show’s colorist, Jill Bogdanowicz, as well as post producer Allen Marshall Palmer to find out more about the show’s varied and distinctive looks.

The look has a very Blade Runner-type feel. Was that in homage to the films?
Bogdanowicz: The creators wanted a film noir look. Blade Runner is the same genre, but the show isn’t specifically an homage to Blade Runner.

Palmer: I’ll leave that for fans to dissect.

Jill, can you talk about your process? What tools did you use?
Bogdanowicz: I designed a LUT to create that film noir look before shooting. I actually provided a few options, and they chose my favorite one and used it throughout. After they shot everything and I had all 10 episodes in my bay, I got familiar with the content, wrapped my head around the story and came up with ideas to tell that story with color.

The show covers many different times and places, so scenes needed to be treated visually to show audiences where the story is and what’s happened. I colored both HDR (Dolby Vision) and SDR passes using DaVinci Resolve.

I worked very closely with both DPs — Martin Ahlgren and Neville Kidd — in pre-timing the show, and they gave me a nice idea of what they were looking for so I had a great starting point. They were very close knit. The entire team on this project was an absolute pleasure, and it was a great creative collaboration, which comes through in the final product of the show.

The show is shot and posted like a feature and has a feature feel. Was that part of your marching orders?
Bogdanowicz: I’m primarily a features colorist, so I’m very familiar with the film noir look and heavy VFX, and that’s one reason I was included on this project. It was right up my alley.

Palmer: We approached Altered Carbon as a 10-part feature rather than a television series. I coined the term “feature episodic entertainment,” which describes what we were aspiring to — destination viewing instead of something merely disposable. In a world with so many viewing options, we wanted to command the viewer’s full attention, and fans are rewarded for that attention.

We were very concerned about how images, especially VFX, were going to look in HDR so we had weekly VFX approval sessions with Jill, our mastering colorist, in her color timing bay.

The executive producers and studio, along with the VFX and post teams, were able to sit together — adjusting color corrections if needed before giving final approval on shots. This gave us really good technical and creative quality control. Despite our initial concerns about VFX shots in HDR, we found that with vendors like Double Negative and Milk, with their robust 16-bit EXR pipelines, we weren’t “breaking” VFX shots when color correcting for HDR.

How did the VFX affect the workflow?
Bogdanowicz: Because I was brought on so early, the LUT I created was shared with the VFX vendors so they had a good estimation of the show’s contrast. That really helped them visualize the look of the show so that the look of the shots was pretty darn close by the time I got them in my bay.

Was there a favorite scene or scenes?
Bogdanowicz: There are so many spectacular moments, but the emotional core for me is in episode 104 when we see the beginning of the Kovacs and Quell love story in the past and how that love gives Kovacs the strength to survive in the present day.

Palmer: That’s a tough question! There are so many, it’s hard to choose. I think the episode that really jumps out is the one in which Joel Kinnaman’s character is being tortured and the content skips back and forth in time, changes and alternates between VR and reality. It was fun to create a different visual language for each space.

Can you talk about challenges in the process and how you overcame them?
Bogdanowicz: The show features a lot of VFX and they all need to look as real as possible, so I had to make sure they felt part of the worlds. Fortunately, VFX supervisor Everett Burrell and his team are amazing and the VFX is top notch. Coming up with different ideas and collaborating with producers James Middleton and Laeta Kalogridis on those ideas was a really fun creative challenge. I used the Sapphire VFX plugin for Resolve to heavily treat and texture VR looks in different ways.

Palmer: In addition to the data management challenges on the picture side, we were dealing with mixing in Dolby Atmos. It was very easy to get distracted by how great the Atmos mix sounds — the downmixes generally translated very well, but monitoring in 5.1 and 2.0 did reveal some small details that we wanted to adjust. Generally, we’re very happy with how both the picture and sound are translating into viewers’ homes.

Dolby Vision HDR is great at taking what’s in the color bay into the home viewing environment, but there are still so many variables in viewing set-ups that you can still end up chasing your own tail. It was also great to see firsthand Netflix’s dedication to providing the best picture and sound quality through the service.

The look of the AI hotel was so warm. I wanted to live there. Can you talk about that look?
Bogdanowicz: The AI hotel look was mostly done in design and lighting. I saw the warm practical lights and rich details in the architecture and throughout the hotel and ran with it. I just aimed to keep the look filmic and inviting.

What about the look of where the wealthy people lived?
Bogdanowicz: The Meth houses are above the clouds, so we kept the look very clean and cool with a lot of true whites and elegant color separation.

Seems like there were a few different looks within the show?
Bogdanowicz: The same LUT for the film noir look is used throughout the show, but the VR looks are very different. I used Sapphire to come up with different concepts and textures for the different VR looks, from the rich quality of high-end VR to the cheap VR found underneath a noodle bar.

Allen, can you walk us through the workflow from production to post?
Palmer: With the exception of specialty shots, the show was photographed on Alexa 65 — mostly in 5K mode, but occasionally in 6.5K and 4K for certain lenses. The camera is beautiful and a large part of the show’s cinematic look, but it generates a lot of data (about 1.9TB/hour for 5K) so this was the first challenge. The camera dictates using the Codex Vault system, and Encore Vancouver was up to the task for handling this material. We wanted to get the amount of data down for post, so we generated 4096×2304 ProRes 4444XQ “mezzanine” files, which we used for almost all of the show assembly and VFX pulls.
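The 1.9TB/hour figure is roughly what back-of-the-envelope math predicts for uncompressed 5K raw capture. The exact parameters below (5120×2880 frame, 12 bits per photosite, 24fps) are assumptions for illustration, not figures from Palmer:

```python
# Rough data-rate estimate for uncompressed 5K raw capture.
# Assumed parameters (not stated in the article): 5120x2880 frame,
# 12 bits per photosite, 24 frames per second.
width, height = 5120, 2880
bits_per_sample = 12
fps = 24

bytes_per_frame = width * height * bits_per_sample / 8   # ~22.1 MB/frame
bytes_per_hour = bytes_per_frame * fps * 3600
terabytes_per_hour = bytes_per_hour / 1e12

print(round(terabytes_per_hour, 2))  # ~1.91 TB/hour
```

Under these assumptions the estimate lands right on the quoted figure, which is why a compressed ProRes 4444XQ mezzanine makes such a difference downstream.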

During production and post, all of our 4K files were kept online at Efilm using their portal system. This allowed us fast, automated access to the material so we could quickly do VFX pulls, manage color, generate 16-bit EXR frames and send those off to VFX vendors. We knew that time saved there was going to give us more time on the back end to work creatively on the shots so the Portal was a very valuable tool.

How many VFX shots did you average per episode? Seems like a ton, especially with the AI characters. Who provided those and what were those turnarounds like?
Palmer: There were around 2,300 visual effects shots during this season — probably less than most people would think because we built a large Bay City street inside a former newspaper printing facility outside of Vancouver. The shot turnaround varied depending on the complexity and where we were in the schedule. We were lucky that something like episode 1’s “limo ride” sequence was started very early on because it gave us a lot of time to refine our first grand views of Bay City. Our VFX supervisor Everett Burrell and VFX producer Tony Meagher were able to get us out in front of a lot of challenges like the amount of 3D work in the last two episodes by starting that work early on since we knew we would need those shots from the script and prep phase.

The 16th annual VES Award winners

The Visual Effects Society (VES) celebrated artists and their work at the 16th Annual VES Awards, which recognize outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues.

Seven-time host, comedian Patton Oswalt, presided over more than 1,000 guests at the Beverly Hilton. War for the Planet of the Apes was named photoreal feature film winner, earning four awards. Coco was named top animated film, also earning four awards. Game of Thrones was named best photoreal episode and garnered five awards — the most wins of the night. Samsung Do What You Can’t: Ostrich won top honors in the commercial field, scoring three awards. These top four contenders collectively garnered 16 of the 24 awards for outstanding visual effects.

President of Marvel Studios Kevin Feige presented the VES Lifetime Achievement Award to producer/writer/director Jon Favreau. Academy Award-winning producer Jon Landau presented the Georges Méliès Award to Academy Award-winning visual effects master Joe Letteri, VES. Awards presenters included fan-favorite Mark Hamill, Coco director Lee Unkrich, War for the Planet of the Apes director Matt Reeves, Academy Award-nominee Diane Warren, Jaime Camil, Dan Stevens, Elizabeth Henstridge, Sydelle Noel, Katy Mixon and Gabriel “Fluffy” Iglesias.

Here is a list of the winners:

Outstanding Visual Effects in a Photoreal Feature

War for the Planet of the Apes

Joe Letteri

Ryan Stafford

Daniel Barrett

Dan Lemmon

Joel Whist

 

Outstanding Supporting Visual Effects in a Photoreal Feature

Dunkirk

Andrew Jackson

Mike Chambers

Andrew Lockley

Alison Wortman

Scott Fisher

 

Outstanding Visual Effects in an Animated Feature

Coco

Lee Unkrich

Darla K. Anderson

David Ryu

Michael K. O’Brien

 

Outstanding Visual Effects in a Photoreal Episode

Game of Thrones: Beyond the Wall

Joe Bauer

Steve Kullback

Chris Baird

David Ramos

Sam Conway

 

Outstanding Supporting Visual Effects in a Photoreal Episode

Black Sails: XXIX

Erik Henry

Terron Pratt

Yafei Wu

David Wahlberg

Paul Dimmer

 

Outstanding Visual Effects in a Real-Time Project

Assassin’s Creed Origins

Raphael Lacoste

Patrick Limoges

Jean-Sebastien Guay

Ulrich Haar

 

Outstanding Visual Effects in a Commercial

Samsung Do What You Can’t: Ostrich

Diarmid Harrison-Murray

Tomek Zietkiewicz

Amir Bazazi

Martino Madeddu

 

 

Outstanding Visual Effects in a Special Venue Project

Avatar: Flight of Passage

Richard Baneham

Amy Jupiter

David Lester

Thrain Shadbolt

 

Outstanding Animated Character in a Photoreal Feature

War for the Planet of the Apes: Caesar

Dennis Yoo

Ludovic Chailloleau

Douglas McHale

Tim Forbes

 

Outstanding Animated Character in an Animated Feature

Coco: Hèctor

Emron Grover

Jonathan Hoffman

Michael Honsel

Guilherme Sauerbronn Jacinto

 

Outstanding Animated Character in an Episode or Real-Time Project

Game of Thrones The Spoils of War: Drogon Loot Train Attack

Murray Stevenson

Jason Snyman

Jenn Taylor

Florian Friedmann

 

Outstanding Animated Character in a Commercial

Samsung Do What You Can’t: Ostrich

David Bryan

Maximilian Mallmann

Tim Van Hussen

Brendan Fagan

 

Outstanding Created Environment in a Photoreal Feature

Blade Runner 2049: Los Angeles

Chris McLaughlin

Rhys Salcombe

Seungjin Woo

Francesco Dell’Anna

 

Outstanding Created Environment in an Animated Feature

Coco: City of the Dead

Michael Frederickson

Jamie Hecker

Jonathan Pytko

Dave Strick

 

Outstanding Created Environment in an Episode, Commercial, or Real-Time Project

Game of Thrones: Beyond the Wall; Frozen Lake

Daniel Villalba

Antonio Lado

José Luis Barreiro

Isaac de la Pompa

 

Outstanding Virtual Cinematography in a Photoreal Project

Guardians of the Galaxy Vol. 2: Groot Dance/Opening Fight

James Baker

Steven Lo

Alvise Avati

Robert Stipp

 

Outstanding Model in a Photoreal or Animated Project

Blade Runner 2049: LAPD Headquarters

Alex Funke

Steven Saunders

Joaquin Loyzaga

Chris Menges

 

Outstanding Effects Simulations in a Photoreal Feature

War for the Planet of the Apes

David Caeiro Cebrián

Johnathan Nixon

Chet Leavai

Gary Boyle

 

Outstanding Effects Simulations in an Animated Feature

Coco

Kristopher Campbell

Stephen Gustafson

Dave Hale

Keith Klohn

 

Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project 

Game of Thrones: The Dragon and the Wolf; Wall Destruction

Thomas Hullin

Dominik Kirouac

Sylvain Nouveau

Nathan Arbuckle

  

Outstanding Compositing in a Photoreal Feature

War for the Planet of the Apes

Christoph Salzmann

Robin Hollander

Ben Warner

Beck Veitch

 

Outstanding Compositing in a Photoreal Episode

Game of Thrones The Spoils of War: Loot Train Attack

Dom Hellier

Thijs Noij

Edwin Holdsworth

Giacomo Matteucci

 

Outstanding Compositing in a Photoreal Commercial

Samsung Do What You Can’t: Ostrich

Michael Gregory

Andrew Roberts

Gustavo Bellon

Rashabh Ramesh Butani

 

Outstanding Visual Effects in a Student Project

Hybrids

Florian Brauch

Romain Thirion

Matthieu Pujol

Kim Tailhades 

 


Capturing Foley for Epix’s Berlin Station

Now in its second season on Epix, the drama series Berlin Station centers on undercover agents, diplomats and whistleblowers inhabiting a shadow world inside the German capital.

Leslie Bloome

Working under the direction of series supervising sound editor Ruy Garcia, Westchester, New York-based Foley studio Alchemy Post Sound is providing Berlin Station with cinematic sound. Practical effects, like the clatter of weapons and clinking glass, are recorded on the facility’s main Foley stage. Certain environmental effects are captured on location at sites whose ambience resembles the show’s settings. Interior footsteps, meanwhile, are recorded in the facility’s new “live” room, a 1,300-square-foot space with natural reverb that’s used to replicate the environment of rooms with concrete, linoleum and tile floors.

“Garcia wants a soundtrack with a lot of detail and depth of field,” explains lead Foley artist and Alchemy Post founder Leslie Bloome. “So, it’s important to perform sounds in the proper perspective. Our entire team of editors, engineers and Foley artists needs to be on point regarding the location and depth of field of the sounds we’re recording. Our aim is to make every setting feel like a real place.”

A frequent task for the Foley team is to come up with sounds for high-tech cameras, surveillance equipment and other spy gadgetry. Foley artist Joanna Fang notes that sophisticated wall safes appear in several episodes, each one featuring differing combinations of electronic, latch and door sounds. She adds that in one episode a character has a microchip concealed in his suit jacket and the Foley team needed to invent the muffled crunch the chip makes when the man is frisked. “It’s one of those little ‘non-sounds’ that Foley specializes in,” she says. “Most people take it for granted, but it helps tell the story.”

The team is also called on to create Foley effects associated with specific exterior and interior locations. This can include everything from seedy safe houses and bars to modern office suites and upscale hotel rooms. When possible, Alchemy prefers to record such effects on location at sites closely resembling those pictured on-screen. Bloome says that recording things like creaky wood floors on location results in effects that sound more real. “The natural ambiance allows us to grab the essence of the moment,” he explains, “and keep viewers engaged with the scene.”

Footsteps are another regular Foley task. Fang points out that there is a lot of cat-and-mouse action, with one character following another or being pursued, and the patter of footsteps adds to the tension. “The footsteps are kind of tough,” she says. “Many of the characters are either diplomats or spies and they all wear hard-soled shoes. It’s hard to build contrast, so we end up creating a hierarchy: dark, powerful heels for strong characters, lighter shoes for secondary roles.”

For interior footsteps, large theatrical curtains are used to adjust the ambiance in the live stage to fit the scene. “If it’s an office or a small room in a house, we draw the curtains to cut the room in half; if it’s a hotel lobby, we open them up,” Fang explains. “It’s amazing. We’re not only creating depth and contrast by using different types of shoes and walking surfaces, we’re doing it by adjusting the size of the recording space.”

Alchemy edits their Foley in-house and delivers pre-mixed and synced Foley that can be dropped right into the final mix seamlessly. “The things we’re doing with location Foley and perspective mixing are really cool,” says Foley editor and mixer Nicholas Seaman. “But it also means the responsibility for getting the sound right falls squarely on our shoulders. There is no ‘fix in the mix.’ From our point of view, the Foley should be able to stand on its own. You should be able to watch a scene and understand what’s going on without hearing a single line of dialogue.”

The studio used Neumann U87 and KMR81 microphones, a Millennia mic-pre and Apogee converter, all recorded into Avid Pro Tools on a C24 console. In addition to recording a lot of guns, Alchemy also borrowed a Doomsday prep kit for some of the sounds.

The challenge to deliver sound effects that can stand up to that level of scrutiny keeps the Foley team on its toes. “It’s a fascinating show,” says Fang. “One moment, we’re inside the station with the usual office sounds and in the next edit, we’re in the field in the middle of a machine gun battle. From one episode to the next, we never know what’s going to be thrown at us.”

House of Moves adds Selma Gladney-Edelman, Alastair Macleod

Animation and motion capture studio House of Moves (HOM) has strengthened its team with two new hires — Selma Gladney-Edelman was brought on as executive producer and Alastair Macleod as head of production technology. The two industry vets are coming on board as the studio shifts to offer more custom short- and long-form content, and expands its motion capture technology workflows to its television, feature film, video game and corporate clients.

Selma Gladney-Edelman was most recently VP of Marvel Television for their primetime and animated series. She has worked in film production, animation and visual effects, and was a producer on multiple episodic series at Walt Disney Television Animation, Cartoon Network and Universal Animation. As director of production management across all of the Discovery Channels, she oversaw thousands of hours of television and film programming, including the TLC projects Say Yes To the Dress, Little People, Big World and Toddlers and Tiaras. She also worked on the team that garnered an Oscar nomination for Werner Herzog’s Encounters at the End of the World and two Emmy wins for Best Children’s Animated Series for Tutenstein.

Scotland native Alastair Macleod is a motion capture expert who has worked in production, technology development and as an animation educator. His production experience includes work on films such as Lord of the Rings: The Two Towers, The Matrix Reloaded, The Matrix Revolutions, 2012, The Twilight Saga: Breaking Dawn — Part 2 and Kubo and the Two Strings for facilities that include Laika, Image Engine, Weta Digital and others.

Macleod pioneered full body motion capture and virtual reality at the research department of Emily Carr University in Vancouver. He was also the head of animation at Vancouver Film School and an instructor at Capilano University in Vancouver. Additionally, he developed PeelSolve, a motion capture solver plug-in for Autodesk Maya.

The sound of Netflix’s The Defenders

By Jennifer Walden

Netflix’s The Defenders combines the stories of four different Marvel shows already on the streaming service: Daredevil, Iron Fist, Luke Cage and Jessica Jones. In the new show, the previously independent superheroes find themselves all wanting to battle the same foe — a cultish organization called The Hand, which plans to destroy New York City. Putting their differences aside, the superheroes band together to protect their beloved city.

Supervising sound editor Lauren Stephens, who works at Technicolor at Paramount, has earned two Emmy nominations for her sound editing work on Daredevil. And she supervised the sound for each of the aforementioned Marvel series, with the exception of Jessica Jones. So when it came to designing The Defenders she was very conscious of maintaining the specific sonic characteristics they had already established.

“We were dedicated to preserving the palette of each of the previous Marvel characters’ neighborhoods and sound effects,” she explains. “In The Defenders, we wanted viewers of the individual series to recognize the sound of Luke’s Harlem and Daredevil’s Hell’s Kitchen, for example. In addition, we kept continuity for all of the fight material and design work established in the previous four series. I can’t think of another series besides Better Call Saul that borrows directly from its predecessors’ sound work.”

But it wasn’t all borrowed material. Eventually, Luke Cage (Mike Colter), Daredevil (Charlie Cox), Jessica Jones (Krysten Ritter), Iron Fist (Finn Jones) and Elektra Natchios (Elodie Yung) come together to fight The Hand’s leader Alexandra Reid (Sigourney Weaver). “We experience new locations, and new fighting techniques and styles,” says Stephens. “Not to mention that half the city gets destroyed by The Hand. We haven’t had that happen in the previous series.”

Even though these Netflix/Marvel series are based on superheroes, the sound isn’t overly sci-fi. It’s as though the superheroes have more practical superhuman abilities. Stephens says their fight sounds are all real punches and impacts, with some design elements added only when needed, such as when Iron Fist’s iron fist is activated. “At the heart of our punches, for instance, is the sound of a real fist striking a side of beef,” she says. “It sounds like you’d expect, and then we amp it up when we mix. We record a ton of cloth movement and bodies scraping and sliding and tumbling in Foley. Those elements connect us to the humans on-screen.”

Since most of the violence plays out in hand-to-hand combat, it takes a lot of editing to build those fight scenes, and it involves contributions from several sound departments. Stephens has her hard effects team — led by sound designer Jordon Wilby (who has worked on all the Netflix/Marvel series) — cut sound effects for every single punch, grab, flip, throw and land. In addition, they cut metal shings and whooshes, impacts and drops for weapons, crashes and bumps into walls and furniture, and all the gunshot material.

Stephens then has the Technicolor Foley team — Foley artists Zane Bruce and Lindsay Pepper and mixer Antony Zeller — cover all the footsteps, cloth “scuffle,” wall bumps, body falls and grabs. Additionally, she has dialogue editor Christian Buenaventura clean up any dialogue that occurs within or around the fight scenes. With group ADR, they replace every grunt and effort for each individual in the fight so that they have ultimate control over every element during the mix.

Stephens finds Gallery’s SpotStudio to be very helpful for cueing all the group ADR. “I shoot a lot of group ADR for the fights and to help create the right populated feel for NYC. SpotStudio is a slick program that interfaces well with Avid’s Pro Tools. It grabs timecode location of ADR cues and can then output that to many word processing programs. Personally, I use FileMaker Pro. I can make great cuesheets that are easy to format and use for engineers and talent.”

All that effort results in fight scenes that feel “relentless and painful,” says Stephens. “I want them to have movement, tons of detail and a wide range of dynamics. I want the fights to sound great wherever our fans are listening.”

The most challenging fight in The Defenders happens in the season finale, when the superheroes fight The Hand in the sublevels of a building. “That underground fight was the toughest simply because it was endless and shot with a 360-degree turn. I focused on what was on-screen and continued those sounds just until the action passed out of frame. This kept our tracks from getting too cluttered but still gives us the right idea that 60 people are going at it,” concludes Stephens.

Fear the Walking Dead: Encore colorist teams with DPs for parched look

The action of AMC’s zombie-infused Fear the Walking Dead this season is set in a brittle, drought-plagued environment, which becomes progressively more parched as the story unfolds. So when production was about to commence, the show’s principals were concerned that the normally arid shoot locations in Mexico had undergone record rainfall and were suffused with verdant vegetation as far as the eye could see.

Pankaj Bajpai of Encore, who has color graded the series from the start, and the two new cinematographers hired for this season — Christopher LaVasseur and Scott Peck — conferred early on about how best to handle this surprising development.

It wouldn’t have been practical to move locations or try to “dress” the scenes to look as described on the page, nor would the budget allow for addressing the issue through VFX. Bajpai, who, in addition to his colorist work also oversees new workflows for Encore, realized that although he could produce the desired effect in his FilmLight Baselight toolset through multiple keys and windows, that too would be less than practical.

Instead, he proposed using a technique that’s far from standard operating procedure for a series. “We got ‘under the hood’ of the Baselight,” he says, “and set up color suppression matrices,” which essentially use mathematical equations to define the degree to which each of the primary colors — red, green and blue — can be represented in an image. The technique, he explains, “allows you to be much more specific about suppressing certain hues without affecting everything else as much as you would by keying a color or manipulating saturation.”

Once designed, these restrictions on the green within the imagery could be dialed up or down, primarily affecting just the colors in the foliage that the filmmakers wanted to re-define, without collateral damage to skin tones and other elements that they didn’t want affected. “I knew that the cinematographers could shoot in the location and we could alter the environment as necessary in the grade,” Bajpai says. He showed the DPs how effective the technique was, and they quickly got on board. Peck, who was able to sit in on the grading for the first episode, recalls, “One of the things I was concerned with was this whole question about the green [foliage] because I knew in the story as the season progresses, water becomes less available. So this idea of changing the greens had to be a gradual process up to around episode nine. There was still a lot of discussion about how we are going to do this. But I knew just working with Pankaj at Encore for a day, that we could do it in the color grade.”
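As a rough illustration of the idea — a generic sketch, not Baselight’s actual implementation — a 3×3 color matrix whose rows sum to 1 can suppress saturated greens while leaving neutrals (and therefore most skin-tone content) largely untouched:

```python
import numpy as np

# Hypothetical green-suppression matrix: each output channel is a weighted
# mix of the input channels. Because every row sums to 1, neutral grays
# (R = G = B) pass through unchanged, while saturated greens are pulled down.
suppress_green = np.array([
    [1.0, 0.0, 0.0],   # red output unchanged
    [0.2, 0.6, 0.2],   # green output borrows from red and blue
    [0.0, 0.0, 1.0],   # blue output unchanged
])

def apply_matrix(rgb, m):
    """Apply a 3x3 color matrix to an (..., 3) float RGB array."""
    return rgb @ m.T

gray = np.array([0.5, 0.5, 0.5])      # neutral pixel
foliage = np.array([0.1, 0.8, 0.1])   # saturated green pixel

print(apply_matrix(gray, suppress_green))     # unchanged
print(apply_matrix(foliage, suppress_green))  # green channel reduced
```

Unlike a key or a global saturation pull, the matrix is a single smooth transform over the whole image, which is why it avoids the edge artifacts and collateral shifts the article describes.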

Of course, there was more to work out between Bajpai and the cinematographers, who’d been charged by the producers with taking the look in a somewhat new direction. “Wherever possible I wanted to plan as much with the cinematographers early on so that we’re all working toward a common goal,” he says.

Prior to this season’s start of production, Bajpai and the two DPs developed a shooting LUT to use in conjunction with the specific combination of lenses and the Arri sensors they would use to shoot the season. “Scott recommended using the Hawk T1 prime lenses,” says LaVasseur, “and I suggested going with a fairly low-contrast LUT.” Borrowing language from the photochemical days, he explains, “We wanted to start with a soft image and then ‘print’ really hard,” to yield the show’s edgy, contrasty look.

Bajpai calibrated the DPs’ laptops so that they’d be able to get the most out of sample-graded images that he would send them as he started coloring episodes. “We would provide notes when Pankaj had completed a pass,” says LaVasseur, “but it was usually just a few very small tweaks I was asking for. We were all working toward the same goal, so there weren’t surprises in the way he graded anything.”

“Pankaj had it done very quickly, especially the handling of the green,” Peck adds. “The show needed that look to build to a certain point and then stay there, but the actual locations weren’t cooperative. We were able to just shoot and we all knew what it needed to look like after Pankaj worked on it.”

“Communication is so important,” LaVasseur stresses. “You need to have the DPs, production designer and costume designer working together on the look. You need to know that your colorist is part of the discussion so they’re not taking the images in some other direction than intended. I come from the film days and we would always say, ‘Plan your shoot. Shoot your plan.’ That’s how we approached this season, and I think it paid off.”

The challenges of dialogue and ice in Game of Thrones ‘Beyond the Wall’

By Jennifer Walden

Fire-breathing dragons and hordes of battle-ready White Walkers are big attention grabbers on HBO’s Game of Thrones, but they’re not the sole draw for audiences. The stunning visual effects and sound design are just the gravy on the meat and potatoes of a story that has audiences asking for more.

Every line of dialogue is essential for following the tangled web of storylines. It’s also important to take in the emotional nuances of the actors’ performances. Striking the balance between clarity and dynamic delivery isn’t an easy feat. When a character speaks in a gruff whisper because, emotionally, it’s right for the scene, it’s the job of the production sound crew and the post sound crew to make that delivery work.

At Formosa Group’s Hollywood location, an Emmy-winning post sound team works together to put as much of the on-set performances on the screen as possible. They are supervising sound editor Tim Kimmel, supervising dialogue editor Paul Bercovitch and dialogue/music re-recording mixer Onnalee Blank.

Blank and the show’s mixing team picked up a 2018 Emmy for Outstanding Sound Mixing For a Comedy or Drama Series (One Hour) for their work on Season 7’s sixth episode, “Beyond the Wall.”

Tim Kimmel and Onnalee Blank

“The production sound crew does such a phenomenal job on the show,” says Kimmel. “They have to face so many issues on set, between the elements and the costumes. Even though we have to do some ADR, it would be a whole lot more if we didn’t have such a great sound crew on-set.”

On “Beyond the Wall,” the sound team faced a number of challenges. At the start of the episode, Jon Snow (Kit Harington) and his band of fighters trek beyond the wall to capture a White Walker. As they walk across a frozen, windy landscape, they pass the time by getting to know one another. Here the threads of their individual stories from past seasons start to weave together; important connections are made in each line of dialogue.

Those snowy scenes were shot in Iceland and the actors wore metal spikes on their shoes to help them navigate the icy ground. Unfortunately, the spikes also made their footsteps sound loud and crunchy, and that got recorded onto the production tracks.

Another challenge came from their costumes. They wore thick coats of leather and fur, which muffled their dialogue at times or pressed against the mic and created a scratchy sound. Wind was also a factor, sometimes buffeting across the mic and causing a low rumble on the tracks.

“What’s funny is that parts of the scene would be really tough to get cleaned up because the wind is blowing and you hear the spikes on their shoes — you hear costume movements. Then all of a sudden they stop and talk for a minute and the wind stops and it’s the most pristine, quiet, perfect recording you can think of,” explains Kimmel. “It almost sounded like it was shot on a soundstage. In Iceland, when the wind isn’t blowing and the actors aren’t moving, it’s completely quiet and still. So it was tough to get those two to match.”

As supervising sound editor, Kimmel is the first to assess the production dialogue tracks. He goes through an episode and marks priority sections for supervising dialogue editor Bercovitch to tackle first. “That helps Tim [Kimmel] put together his ADR plan,” says Bercovitch. “He wants to try to pare down that list as much as possible. For ‘Beyond the Wall,’ he wanted me to start with the brotherhood’s walk-and-talk north of the wall.”

Bercovitch began his edit by trying to clean up the existing dialogue. For that opening sequence, he used iZotope RX 6’s Spectral Repair to clean up the crunchy footsteps and the rumble of heavy winds. Next, he searched for usable alt takes from the lav and boom tracks, looking for a clean syllable or a full line to cut in as needed. Once Bercovitch was done editing, Kimmel could determine what still needed to be covered in ADR. “For the walk-and-talk beyond the wall, the production sound crew really did a phenomenal job. We didn’t have to loop that scene in its entirety. How they got as good of recordings as they did is honestly beyond me.”

Since most of the principal actors are UK- and Ireland-based, the ADR is shot in London at Boom Post with ADR supervisor Tim Hands. “Tim [Hands] records 90% of the ADR for each season. Occasionally, we’ll shoot it here if the actor is in LA,” notes Kimmel.

Hands had more lines than usual to cover on “Beyond the Wall” because of the battle sequence between the brotherhood and the army of the dead. The principal actors came in to record grunts, efforts and breaths, which were then cut to picture. The battle also included Bercovitch’s selects of usable production sound from that sequence.

Re-recording mixer Blank went through all of those elements on dub Stage 1 at Formosa Hollywood using an Avid S6 console to control the Pro Tools 12 session. She chose vocalizations that weren’t “too breathy, or sound like it’s too much effort because it just sounds like a whole bunch of grunts happening,” she says. “I try to make the ADR sound the same as the production dialogue choices by using EQ, and I only play sounds for whoever is on screen because otherwise it just creates too much confusion.”

One scene that required extensive ADR was for Arya (Maisie Williams) and Sansa (Sophie Turner) on the catwalk at Winterfell. In the seemingly peaceful scene, the sisters share an intimate conversation about their father as snow lightly falls from the sky. Only it wasn’t so peaceful. The snow was created by a loud snow machine that permeated the production sound, which meant the dialogue on the entire scene needed to be replaced. “That is the only dialogue scene that I had no hand in and I’ve been working on the show for three seasons now,” says Bercovitch.

For Bercovitch, his most challenging scenes to edit were ones that might seem like they’d be fairly straightforward. On Dragonstone, Daenerys (Emilia Clarke) and Tyrion (Peter Dinklage) are in the map room having a pointed discussion on succession for the Iron Throne. It’s a talk between two people in an interior environment, but Bercovitch points out that the change of camera perspective can change the sound of the mics. “On this particular scene and on a lot of scenes in the show, you have the characters moving around within the scene. You get a lot of switching between close-ups and longer shots, so you’re going between angles with a usable boom to angles where the boom is not usable.”

There’s a similar setup with Sansa and Brienne (Gwendoline Christie) at Winterfell. The two characters discuss Brienne’s journey to parley with Cersei (Lena Headey) in Sansa’s stead. Here, Bercovitch faced the same challenge of matching mic perspectives, and also had the added challenge of working around sounds from the fireplace. “I have to fish around in the alt takes — and there were a lot of alts — to try to get those scenes sounding a little more consistent. I always try to keep the mic angles sounding consistent even before the dialogue gets to Onnalee (Blank). A big part of her job is dealing with those disparate sound sources and trying to make them sound the same. But my job, as I see it, is to make those sound sources a little less disparate before they get to her.”

One tool that’s helped Bercovitch achieve great dialogue edits is iZotope’s RX 6. “It doesn’t necessarily make cleaning dialogue faster,” he says. “It doesn’t save me a ton of time, but it allows me to do so much more with my time. There is so much more that you can do with iZotope RX 6 that you couldn’t previously do. It still takes nitpicking and detailed work to get the dialogue to where you want it, but iZotope is such an incredibly powerful tool that you can get the result that you want.”

On the dub stage, Blank says one of her most challenging scenes was the opening walk-and-talk sequence beyond the wall. “Half of that was ADR, half was production, and to make it all sound the same was really challenging. Those scenes took me four days to mix.”

Her other challenge was the ADR scene with Arya and Sansa in Winterfell, since every line there was looped. To help the ADR sound natural, as if it’s coming from the scene, Blank processes and renders multiple tracks of fill and backgrounds with the ADR lines and then re-records that back into Avid Pro Tools. “That really helps it sit back into the screen a little more. Playing the Foley like it’s another character helps too. That really makes the scene come alive.”

Bercovitch explains that the final dialogue you hear in a series doesn’t start out that way. It takes a lot of work to get the dialogue to sound like it would in reality. “That’s the thing about dialogue. People hear dialogue all day, every day. We talk to other people, and it doesn’t take any work for us to understand when other people speak. Since it doesn’t take any work in one’s life, why would it require a lot of work when putting a film together? There’s a big difference between the sound you hear in the world and recorded sound. Once it has been recorded, you have to take a lot of care to get those recordings back to a place where your brain reads them as intelligible. And when you’re switching from angle to angle and changing mic placement and perspective, all those recordings sound different. You have to stitch those together and make them sound consistent so it sounds like dialogue you’d hear in reality.”

Achieving great sounding dialogue is a team effort — from production through post. “Our post work on the dialogue is definitely a team effort, from Paul’s editing and Tim Hands’ shooting the ADR so well to Onnalee getting the ADR to match with the production,” explains Kimmel. “We figure out what production we can use and what we have to go to ADR for. It’s definitely a team effort and I am blessed to be working with such an amazing group of people.”


Jennifer Walden is a New Jersey-based audio engineer and writer.

DP David Tattersall on shooting Netflix’s Death Note

Based on the manga series of the same name by Tsugumi Ohba and Takeshi Obata, Death Note stars Nat Wolff as Light Turner, a young man who obtains a supernatural notebook that gives him the power to kill any living person by writing his or her name in it. Willem Dafoe plays Ryuk, a demonic god of death and the creator of the Death Note. The stylized Netflix feature film was directed by Adam Wingard (V/H/S, You’re Next) and shot by cinematographer David Tattersall (The Green Mile, Star Wars: Episodes I, II and III) with VariCam 35s in 4K RAW with Codex VRAW recorders.

Tattersall had previously worked with Wingard on the horror television series Outcast. Tattersall says he wasn’t aware of the manga series, but during pre-production he was able to go through a visual treasure trove of manga material that the art department had compiled.

Instead of creating a “cartoony” look, Tattersall and Wingard were more influenced by classic horror films, as well as well-crafted movies by David Fincher and Stanley Kubrick. “Adam is a maestro of the horror genre, and he is very familiar with constructing scenes around scary moments and keeping tension,” explains Tattersall. “It wasn’t necessarily whole movies that influenced us — it was more about taking odd sequences that we thought might be relevant to what we were doing. We had a very cool extended foot chase for which we referenced The French Connection and Se7en, both of which have a mix of handheld, extreme wides and long-lens shots. Also, because of Adam’s love of Kubrick movies, we had composed, symmetrical frames reminiscent of The Shining, or crazy wide-angle stuff from A Clockwork Orange. It sounds like a mish-mash, but we did have rules.”

Dialogue scenes were covered in a realistic, non-flashy way. For Tattersall, one of the biggest challenges was dealing with the demon character, Ryuk, both physically and photographically. The team started with a huge puppet operated by puppeteers, but that wasn’t a practical approach since many of the scenes were shot in small spaces, such as Light’s bedroom.

“Eventually, the practical issue led to us using a mime artist in full costume with the intention of doing face replacement later,” explains Tattersall. “From our testing, the approach of ‘less is more’ became a thing — less light, more shadow and mystery, less visible, more effective. It worked well for this character who is mostly seen hiding in the shadows. It’s similar to the first Jaws movie. The shark is strangely more scary and ominous when you only get a few glimpses in the frame here and there — a suggestion. And that was our approach for the first 75% of the film. You might get a brief lean out of the shadows and a quick lean back in. Often, we would just shoot him out of focus. We’d keep the focus in the foreground for the Light character and Ryuk would be an out-of-focus blob in the background. It’s not until the very end — the final murder sequence — that you get to see him in full head-to-toe clarity.”

Tattersall shot the film with two VariCam 35s as his A and B cameras, with a VariCam LT for backup. He shot in 4K DCI (4096 x 2160), capturing VRAW files to Codex VRAW recorders. For lensing, he used Zeiss Master Primes with a 2.39:1 extraction. “This set has become a favorite of mine for the past few years and I’ve grown to love them,” says Tattersall. “They are a bit big and heavy, but they open to a T1.3 and they’re so velvety smooth. With this show having so much night work, that extra speed was very useful.”

In terms of RAW capture, Tattersall tried to keep it simple, using FotoKem’s nextLAB for the on-set workflow. “It was almost like using a one-light printing process,” he explains. “We had three basic looks — a fairly cool, dingy look, one that pulls back on the saturation and one that leans in the cold direction. I have a set of rules, but I occasionally break them. We tried as much as possible to shoot only in the shade — bringing in butterfly nets or shooting on the shady side of buildings during the day. It was Adam’s wish to keep this heavy, moody atmosphere.”

Tattersall used a few tools to capture unique visuals. To capture low angle shots, he used a P+S Skater Scope that lets you shoot low to the ground. “You can also incorporate floating Dutch angles with its motorized internal prism, so this was something we did throughout,” he says. “The horizon line would lean over to one side or the other.” He also used a remote rollover rig, which allowed the camera to roll 180-degrees when on a crane, giving Tattersall a dizzying visual.

“We also shot with a Phantom Flex to shoot 500fps,” continues Tattersall. “We would have low Dutch angles, an 8mm fish eye look and a Lensbaby to degrade the focus even more. The image could get quite wonky on occasion, which is counterpoint to the more classic coverage of the calmer dialogue moments.”

Although he did a lot of night work, Tattersall did not use the camera’s native 5,000 ISO. “I have warmed to a new range of LED lights — the Cineo Maverick, Matchbox and Matchstix. They’re all color balanced and can switch between daylight and tungsten, so it’s quick and easy to change the color temperature without the use of gels. We also made use of Arri SkyPanels. Outside, we used tried-and-tested old-school HMIs, or 9-light and 12-light MaxiBrutes. There’s nothing quite like them in terms of powerful source lights.”

Death Note was finished at Technicolor by colorist Skip Kimball on Blackmagic Resolve. “The grade was mostly about smoothing out the bumps and tweaking the contrast,” explains Tattersall. “Since it’s a dark feature, there was an emphasis on a heavy mood — keeping the blacks, with good contrast and saturated colors. But in the end, the photographic stylization came from the camera placement and lens choices working together with the action choreography.”

Emmy Awards: OJ: Made in America composer Gary Lionelli

By Jennifer Walden

The aftermath of a tragic event plays out in front of the eyes of the nation: OJ Simpson, wanted for the gruesome murders of his ex-wife and her friend, fails to turn himself in to the authorities. News helicopters track the police pursuit that trails Simpson back to his Rockingham residence, where police plan to take him into custody. Decades later, three-time Emmy-winning composer Gary Lionelli was presented with the opportunity to score that iconic Bronco chase.

Here, Lionelli talks about his approach to scoring ESPN’s massive documentary OJ: Made in America. His score on Part 3 is currently up for Emmy consideration for Outstanding Music Composition for a Limited Series. The entire OJ: Made in America score is available digitally through Lakeshore Records.

Gary Lionelli

Scoring OJ: Made in America seems like such a huge undertaking. It’s a five-part series, and each part is over 90 minutes long. How did you tackle this beast?
I’d never scored anything that long within such a short timeframe. Because each part was so long, it wasn’t like doing a TV series but more like scoring five 90-minute films back-to-back. I just focused on one cue at a time, putting one foot in front of the other so I wouldn’t feel overwhelmed by the full scope of the work and could relax enough to write the score! I knew I’d get to the finish line at some point, but it seemed so far away most of the time that I just didn’t want to dwell on that.

When you got this project, did they deliver it as one crazy, long piece? Or did they give it to you in its separate parts?
I got everything at once, which was totally mind-boggling. When you get any project, you need to watch it before you start working on it. For this one, it meant watching a seven-and-a-half-hour film, which was a feat in and of itself. The scale was just huge on this. Looking back, my eyelids still twitch.

It was a pretty nerve-racking time because the schedule was really tight. That was one of the most challenging parts of doing this project. I could have used a year to write this music, because five films are ordinarily what I’d do in a year, not six months. But all of us who write music for film know that you have to work within extreme deadlines as a matter of course. So you say yes, and you find a way to do it.

So you basically locked yourself up for 14 hours a day, and just plugged away at it?
Right, except it was actually about 15 hours a day, seven days a week, with no breaks. I finished the score 11 days before its theatrical release, which is insane. But, hey, that part is all in the past now, and it’s great to see the film out there getting such attention. One thing that made it worthwhile to me in the end was the quality of the filmmaking — I was riveted by the film the whole time I was working on it.

When composing, you worked only on one part at a time and not with an overall story arc in mind?
I watched all five parts over the course of four days. Once I’d watched the first two parts, I couldn’t wait to start writing so I did that for a bit and then went back to watch the rest.

The director Ezra Edelman wanted me to first score the infamous Bronco chase, which is in Part 3. It’s a 30-minute segment of that particular episode. It was a long sequence of events, all having to do with the chase itself, the events leading up to it and the aftermath of it. So that is what I scored first. It’s kind of strange to dive into a film by first scoring such a pivotal, iconic event. But it worked out — what I wrote for that segment stuck.

It was strange to be writing music for something I had seen on television 20 years before – just to think that there I was, watching the Bronco chase on TV along with everyone else, not having the remotest idea that 20 years down the line I was going to be writing music for this real-life event. It’s just a very odd thing.

The Bronco chase wasn’t a high-speed chase. It was a long police escort back to OJ’s house. The music you wrote for this segment was so brooding and it fit perfectly…
I loved when Zoe Tur, the helicopter pilot, said they were giving OJ a police motorcade. That’s exactly what he got. So I didn’t want to score the sequence by commenting literally on what was happening — what people were doing, or the fact that this was a “chase.” What I tried to do was focus on the subtext, which was the tragedy of the circumstances, and have that direct the course of the music, supplying an overarching musical commentary.

For your instrumentation, did the director let you be carried away by your own muse? Or did he request specific instruments?
He was specific about two things: one, that there would be a trumpet in the score, and two, he wanted an oboe. Other than those two instruments, it was up to me. I have a trumpet player, Jeff Bunnell, who I’ve worked with before. It’s a great partnership because he’s a gifted improviser, and sometimes he knows what I want even when I don’t. He did a fantastic job on the score.

I also had a 40-piece string section recorded at the Eastman Scoring Stage at Warner Bros. Studios. We used players here in town and they added a lot, really bringing the score to life.

Were you conducting the orchestra? Or did you stay close to the engineer in the booth?
I wanted to be next to the recording engineer so I could hear everything as it was being recorded. I had a conductor instead. Besides, I’m a terrible conductor.

What instruments did you choose for the Bronco chase score?
For one of the scenes, I used layers of distorted electric guitars. Another element of the score was musical sound manipulation of acoustic instruments through electronics. It’s a time-consuming way to conjure up sounds, with all the trial and error involved, but the results can sometimes give a film an identity beyond what you can do with an orchestra alone.

So you recorded real instruments and then processed them? Can you share an example of your processing chain?
Sometimes I will get my guitar out and play a phrase. I’ll take that phrase and play it backwards, drop it two octaves, put it through a ring modulator, and then I’ll chop it up into short segments and use that to create a rhythmic pattern. The result is nothing like a real guitar. I didn’t necessarily know what I was going for at the start, but then I’d end up with this cool beat. Then I’d build a cue around that.
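That chain — reverse, drop two octaves, ring-modulate, chop into a pattern — can be roughed out in a few lines of plain numpy on a toy signal. This is a hedged sketch of the general technique, not Lionelli's actual tools, and every parameter value below is an illustrative guess:

```python
import numpy as np

SR = 44100  # sample rate in Hz

def reverse(x):
    return x[::-1]

def drop_octaves(x, octaves=2):
    # crude pitch drop via resampling: 2 octaves down = 4x slower playback
    factor = 2 ** octaves
    pos = np.arange(len(x) * factor) / factor
    return np.interp(pos, np.arange(len(x)), x)

def ring_mod(x, freq=110.0):
    # multiply by a sine carrier -> metallic sum/difference tones
    t = np.arange(len(x)) / SR
    return x * np.sin(2 * np.pi * freq * t)

def chop(x, n_slices=8, pattern=(0, 3, 1, 3)):
    # slice the sound and re-sequence the slices into a rhythmic figure
    slices = np.array_split(x, n_slices)
    return np.concatenate([slices[i] for i in pattern])

# stand-in "guitar phrase": a one-second decaying 330 Hz tone
t = np.arange(SR) / SR
phrase = np.exp(-3 * t) * np.sin(2 * np.pi * 330 * t)
beat = chop(ring_mod(drop_octaves(reverse(phrase))))
```

The point, as in the interview, is that the output bears little resemblance to the source instrument: the pitch drop and ring modulation mangle the timbre, and the chop step turns the result into a loopable rhythmic cell.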

The original sound could be anything. I could tap a pencil on a desk and then drop that three octaves, time compress it and do all sorts of other processing. The result is a weird drum sound that no one’s ever heard before. It’s all sorts of experimentation, with the end result being a sound that has some originality and that piques the interest of the person watching the film.

To break that down a little further, what program do you work in?
I work in Pro Tools. I went from Digital Performer to Logic — I think most film composers use Logic or Cubase, but there are a growing number who actually use Pro Tools. I don’t need MIDI to jump through a lot of hoops. I just need to record basic lines because most of that stuff gets replaced by real players anyhow.

When you work in Pro Tools, it’s already the delivery format for the orchestra, so you eliminate a conversion step. I’ve been using Pro Tools for the past four years, and so far it’s been working out great. It has some limitations in MIDI, but not that many and nothing that I can’t work around.

What are some of your favorite processing plug-ins?
For pitching, I use Melodyne by Celemony and Serato’s Pitch ‘n’ Time. There’s a new pitch shifter in Pro Tools called X-Form that’s also good. I also use Waves SoundShifter — whatever seems to do a better job for what I’m working on. I always experiment to see which one works the best to give me the sound I’m looking for.

Besides pitch shifters, I use GRM Tools by Ina-GRM. They make weird plug-ins, like one called Shift, that really convolute sound to the point where you can take a drum or rhythmic guitar and turn it into a hi-hat sound — a weird hi-hat, not a real one. You never know what you’re going to get from this plug-in, and that’s why I like it so much.

I also use a lot of Soundtoys plug-ins, like Crystallizer, which can really change sounds in unexpected ways. Soundtoys has great distortion plug-ins too. I’m always on the hunt for something new.

A lot of times I use hardware, like guitar pedals. It’s great to turn real knobs and get results and ideas from that. Sometimes the hardware will have a punchier sound, and maybe you can do more extreme things with it. It’s all about experimentation.

You’ve talked before about using a Guitarviol. Was that how you created the long, suspended bass notes in the Bronco chase score?
Yes, I did use the Guitarviol in that and in other places in the score, too. It’s a very weird instrument, because it looks like a cello but doesn’t sound like one, and it definitely doesn’t sound like a guitar. It has a weird, almost Middle Eastern sound to it, and that makes you want to play in that scale sometimes. Sometimes I’ll use it to write an idea, and then I’ll have my cellist play the same thing on cello.

The Guitarviol is built by Jonathan Wilson, who lives in Los Angeles. He had no idea when he invented this thing that it was going to get adopted by the film composer community here in town. But it has, and he can’t make them fast enough.

Do you end up layering the Guitarviol and the cello in the mix? Or do you just go with straight cello?
It’s usually just straight cello. There are a couple of cellists I use who are great. I don’t want to dilute their performance by having mine in the background. The Guitarviol is an inspiration to write something for the cellists to hear, and then I’ll just have them take over from there.

The overall sound of Part 3 is very brooding, and the percussion choices have complementary deep tones. Can you tell me about some of the choices you made there?
Those are all real drums. I don’t use any samples. I love playing real drums. I have a real timpani, a big Brazilian Surdo drum, a gigantic steel bass drum that sounds like a Caribbean steel drum but only two octaves lower (it has a really odd sound), and I have a classic Ludwig Beatles drum kit. I have a marimba and a collection of small percussion instruments. I use them all.

Sometimes I will pitch the recordings down to make them sound bigger. The Surdo by itself sounds huge, and when you pitch that down half an octave it’s even bigger. So I used all of those instruments and I played them. I don’t think I used a single drum sample on the entire score.

When you use percussion samples, you have to hunt around your entire hard drive for a great tom-tom or a Taiko drum. It’s so much easier to walk over to one in your studio and just play it. You never know how it’s going to sound, depending on how you mic it that day, and it’s more inspiring to play the real thing. You get great variation — every time you hit the drum it sounds different, but a sample pretty much sounds the same every time you trigger it.

For striking, did you choose mallets, brushes, sticks, your hands, or other objects?
For the Surdo, I used my hands. I use marimba mallets and timpani mallets for the other instruments. For example, I’ll use timpani mallets for the big steel bass drum. Sometimes I’ll use timpani mallets on my drum kit’s bass drum, because it gives a different sound. It has a more orchestral sound, not like a kick drum from a rock band.

I’m always experimenting. I use brushes a lot on cymbals, and I use the brushes on the steel drum because it gives it a weird sound. You can even use brushes on the timpani, and that creates a strange sound. There are definitely no rules. Whatever you think or can imagine having an effect on the drum, you just try it out. You never know what you’ll get — it’s always good to give it a chance.

In addition to the Bronco chase scene, are there any other tracks that stood out for you in Part 3?
When you score something this long, at a certain point everything starts to run together in your mind. You don’t remember what cue belongs to what scene. But there are many that I can remember. During the jury section of that episode, I used an oboe for Johnnie Cochran speaking to the jury. That was an interesting pairing, the oboe and Johnnie Cochran. In a way, the oboe became an extension of his voice during his closing argument. I can’t really explain why it worked, but somehow it was the right match.

For the beginning of Part 3, when the police arrive because there was a phone call from Nicole Brown Simpson saying she was afraid of OJ, the cue there was very understated. It had a lot of strange, low sounds to it. That one comes to mind.

At the end of Part 3, they go to OJ’s Rockingham residence, and his lawyers had staged the setting. I did a cue there that was sort of quizzical in a way, just to show the ridiculousness of the whole thing. It was like a farce, the way they set up his residence. So I made the score take a right turn into a different area for that part. It gets away from the dark, brooding undercurrent that the rest of Part 3’s score had.

Of all the parts you could have submitted for Emmy consideration, why did you choose Part 3?
It was a toss-up between Part 2 and Part 3. Part 2 had some of the more major trumpet themes, more of the signature sound with the trumpet and the orchestra. But there were a few examples of that in Part 3, too.

I just felt the Bronco chase, score-wise, had a lot of variation to it, and that it moved in a way that was unpredictable. I ultimately thought that was the way to go, though it was a close race between Part 2 and Part 3.

I found out later that ESPN had submitted Part 3 for Emmy consideration in other categories, so there is a bit of synergy there.

—————-

Jennifer Walden is a New Jersey-based audio engineer and writer.

Barry Sonnenfeld on Netflix’s A Series of Unfortunate Events

By Iain Blair

Director/producer/showrunner Barry Sonnenfeld has a gift for combining killer visuals with off-kilter, broad and often dark comedy, as showcased in such monster hits as the Men in Black and The Addams Family franchises.

He learned from the modern masters of black comedy, the Coen brothers, beginning his prolific career as their DP on their first feature film, Blood Simple, and then shooting such classics as Raising Arizona and Miller’s Crossing. He continued his comedy training as the DP on such films as Penny Marshall’s Big, Danny DeVito’s Throw Momma From the Train and Rob Reiner’s When Harry Met Sally.

So maybe it was just a matter of time before Sonnenfeld — whose directing credits include Get Shorty, Wild Wild West, RV and Nine Lives — gravitated toward helming the acclaimed new Netflix show A Series of Unfortunate Events, based on the beloved and best-selling “Lemony Snicket” children’s series by Daniel Handler. After all, with the series’ rat-a-tat dialogue, bizarre humor and dark comedy, it’s a perfect fit for the director’s own strengths and sensibilities.

I spoke with Sonnenfeld, who won a 2007 Primetime Emmy and a DGA Award for his directorial achievement on Pushing Daisies, about making the series, the new golden age of TV, his love of post — and the real story behind why he never directed the film version of A Series of Unfortunate Events.

Weren’t you originally set to direct the 2004 film, and you even hired Handler to write the screenplay?
That’s true. I was working with producer Scott Rudin, who had done the Addams Family films with me, and Paramount decided they needed more money, so they brought in another studio, DreamWorks. But the DreamWorks producer — who had done the Men in Black films with me — and I don’t really get along. So when they came on board, Daniel and I were let go. I’d been very involved with it for a long time. I’d already hired a crew, sets were all designed, and it was very disappointing as I loved the books.

But there’s a happy ending. You are doing the Netflix TV series, which seems much closer to the original books than the movie version. How important was finding the right tone?
The single most important job of a director is both finding and maintaining the right tone. Luckily, the tone of the books is exactly in my wheelhouse — creating worlds that are real, but also with some artifice in them, like the Men in Black and Addams Family movies, and Pushing Daisies. I tend to like things that are a bit dark, slightly quirky.

What did you think of the film version?
I thought it was slightly too big and loud, and I wanted to do something more like a children’s book, for adults.

The film version had to stuff Handler’s first three novels into a single movie, but the TV format, with its added length, must work far better for the books?
Far better, and the other great thing is that once Netflix hired me — and it was a long auditioning process — they totally committed. They take a long time finding the right material and pairing it with the right filmmaker but once they do, they really trust their judgment.

I really wanted to shoot it all on stages, so I could control everything. I didn’t want sun or rain; I wanted gloomy, overcast skies. So we shot it all in Vancouver, and Netflix totally bought into that vision. I have an amazing team — the great production designer Bo Welch, who did Men in Black and other films with me, and DP Bernard Couture.

Patrick Warburton’s deadpan delivery as Lemony Snicket, the books’ unreliable narrator, is a great move compared with having just the film’s voiceover. How early on did you make that change?
When I first met with Netflix, I told them that Lemony should be an on-screen character. That was my goal. Patrick’s just perfect for the role. He’s the sort of Rod Serling/Twilight Zone presence — only more so, as he’s involved in the actual visual style of the show.

How early on do you deal with post and CG for each episode?
Even before we’re shooting. You don’t want to wait until you lock picture to start all that work, or you’ll never finish in time. I’m directing most of it — half the first season and over a third of the second. Bo’s doing some episodes, and we bring in the directors at least a month before the shoot, which is long for TV, to do a shot list. These shows, both creatively and in terms of budget, are made in prep. There should be very few decisions being made in the shoot or surprises in post because basically every two episodes equal one book, and they’re like feature films but on one-tenth of the budget and a quarter of the schedule.

We only have 24 days to do two hours’ worth of feature film. Our goal is to make it look as good as any feature, and I think we’ve done that. So once we have sequences we’re happy with, we show them to Netflix and start post, as we have a lot of greenscreen. We do some CGI, but not as much as we expected.

Do you also post in Vancouver?
No. We began doing post there for the first season, but we discovered that with our TV budget and my feature film demands and standards, it wasn’t working out. So now we work with several post vendors in LA and San Francisco. All the editorial is in LA.

Do you like the post process?
I’ve always loved it. As Truffaut said, the day you finish filming is the worst it’ll ever be, and then in post you get to make it great again, separating the wheat from the chaff, adding all the VFX and sound. I love prep and post — especially post as it’s the least stress and you have the most time to just think. Production is really tough. Things go wrong constantly.

You used two editors?
Yes, Stuart Bass and Skip MacDonald, and each edits two episodes/one book as we go. I’m very involved, but in TV the director gets a very short time to do their cut, and I like to give notes and then leave. My problem is I’m a micro-manager, so it’s best if I leave because I drive everyone crazy! Then the showrunner — which is also me — takes over. I’m very comfortable in post, with all the editing and VFX, and I represent the whole team and end up making all the post decisions.

Where did you mix the sound?
We did all the mixing on the Sony lot with the great Paul Ottosson, who won Oscars for Zero Dark Thirty and The Hurt Locker. We go way back, as he did Men in Black 3 and other shows for me, and what’s so great about him is that he both designs the sound and then also mixes.

The show uses a lot of VFX. Who did them?
We used three main houses — Shade and Digital Sandbox in LA and Tippett in San Francisco. We also used EDI, an Italian company, who came in late to do some wire removal and clean up.

How important was the DI on this and where did you do it?
We did it all at Encore LA, and the colorist on the first season was Laura Jans Fazio, who was fantastic. It’s the equivalent of a movie DI, where you do all the final color timing, and getting the right look was crucial. The DP created very good LUTs, and our rough cut was very close to where we wanted it, and then the DP and I piggy-backed sessions with the colorist. It’s a painful experience for me as it’s so slow, and like editing, I micro-manage. So I set looks for scenes and then leave.

Barry Sonnenfeld directs Joan Cusack.

Is it a golden age for TV?
Very much so. The writing is of a very high standard, and now that everyone has widescreen TVs, there’s no more protecting the 4:3 image, which is almost square. When I began doing TV, there was no such thing as a wide shot. Executives would look at my cut, and the first thing they’d always say was, “Do you have a close-up of so and so?” Now it’s all changed. But TV is so different from movies. I look back fondly at movie schedules!

How important are the Emmys and other awards?
They’re very important for Netflix and all the new platforms. If you have critical success, then they get more subscribers, more money and then they develop more projects. And it’s great to be acknowledged by your peers.

What’s next?
I’ll finish season two and we’re hopeful about season three, which would keep us busy through fall 2018. And Vancouver’s a perfect place to be as long as you’re shooting on stage and don’t have to deal with the weather.

Will there be a fourth Men in Black?
If there is, I don’t think Will or I will be involved. I suspect there won’t be one, as it might be just too expensive to make now, with all the back-end deals for Spielberg and Amblin and so on. But I hope there’s one.

Images: Joe Lederer/Netflix


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Is television versioning about to go IMF?

By Andy Wilson

If you’ve worked in the post production industry for the last 20 years, you’ll have seen the exponential growth of feature film versioning. What was once a language track dub, subtitled version or country-specific compliance edit has grown into a versioning industry that has to feed a voracious number of territories, devices, platforms and formats — from airplane entertainment systems to iTunes deliveries.

Of course, this rise in movie versioning has been helped by the shift over the last 10 years to digital cinema and file-based working. In 2013, SMPTE ratified ST 2067-2, which created the Interoperable Master Format (IMF). IMF was designed to manage the complexity of storing high-quality master material inside a file structure flexible enough to generate multiple variants of a film, by constraining what is included in each output and the formats it is delivered in.

Like any workflow and format change, IMF has taken time to be adopted, but it is now becoming the preferred way to share high-quality file masters between media organizations. These masters are all delivered in the JPEG 2000 (J2K) codec to support cinema resolutions and playback technologies.

Technologists in the broadcast community have been monitoring the growth in popularity and flexibility of IMF, with its distinctive solution to the challenge of multiple versioning. Most broadcasters have moved away from tape-based playout and are instead using air-ready playout files. These are medium-sized files (50-100Mb/s), derived from high-quality rushes, that can be used on playout servers to create broadcast streams. The most widespread of these is the native XDCAM file format, but it is fast being overtaken by AS-11. This format has proved very popular in the United Kingdom, where all major broadcasters switched to AS-11 UK DPP in 2014. AS-11 is currently rolling out in the US via the AS-11 X8 and X9 variants. However, these remain air-ready playout files, output from the 600+Mb/s ProRes and RAW files used in high-end productions. AS-11 brings some uniformity, but it doesn’t solve the versioning challenge.

Versioning is rapidly becoming as big an issue for high-end broadcast content as for feature films. Broadcasters are now seeing the sales lifecycle of some of their programs running for more than 10 years. The BBC’s Planet Earth is a great example of this, with dozens of versions being made over several years. So the need to keep high-quality files for re-versioning for new broadcast and online deliveries has become increasingly important. It is crucial for long-tail sales revenue, and productions are starting to invest in higher-resolution recordings for exactly this reason.

So, as the international high-end television market continues to grow, producers are having to look at ways that they can share much higher quality assets than air-ready files. This is where IMF offers significant opportunity for efficiencies in the broadcast and wider media market, and why it has the attention of broadcasters such as the BBC and Sky. Major broadcasters such as these have been working with global partners through the Digital Production Partnership (DPP) to help develop a new specification of IMF, specifically designed for television and online mastering.

The DPP, in partnership with the North American Broadcasters Association (NABA) and the European Broadcasting Union (EBU), has been exploring the business requirements for a mastering format for broadcasting. The outcome of this work was published in June 2017.

The work explored three different user requirements: Program Acquisitions (incoming), Program Sales (outgoing) and Archive. The sales and acquisition of content can be significantly transformed with the ability to build new versions on the fly, via the Composition Playlist (CPL) and an Output Profile List (OPL). The ability to archive master rushes in a suitably high-quality package will be extremely valuable to broadcast archives. The addition of the ability to store ProRes as part of an IMF is also being welcomed, as many broadcaster archives are already full of ProRes material.
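The CPL/OPL idea described above can be illustrated with a small sketch. This is not the real IMF schema (actual CPLs and OPLs are XML documents defined in the SMPTE ST 2067 family), and all names here are hypothetical; it only shows the core concept: a playlist references segments of shared essence, so a new version is just a new playlist, with no duplication of the underlying media.

```python
# Illustrative sketch of the CPL concept, NOT the SMPTE ST 2067 XML schema.
# All class and field names are hypothetical.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Segment:
    asset_id: str       # references a stored essence track file in the package
    start_frame: int
    end_frame: int


@dataclass
class CompositionPlaylist:
    """Orders segments of shared essence into one playable version."""
    title: str
    segments: List[Segment]


def build_version(cpl: CompositionPlaylist,
                  substitutions: Dict[str, Segment]) -> CompositionPlaylist:
    """Create a new version by swapping referenced segments (e.g. a
    localized title card) without touching the shared essence files."""
    new_segments = [substitutions.get(s.asset_id, s) for s in cpl.segments]
    return CompositionPlaylist(f"{cpl.title} (variant)", new_segments)


base = CompositionPlaylist("Planet Earth Ep1", [
    Segment("title_card_en", 0, 240),
    Segment("main_body", 240, 120000),
])

# A French version re-uses the main body and swaps only the title card.
french = build_version(base, {"title_card_en": Segment("title_card_fr", 0, 240)})
```

In real IMF, an Output Profile List would additionally constrain how that composition is rendered (resolution, codec, audio configuration) for a given delivery target.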

The EBU-QC group has already started to look at how to manage program quality from a broadcast IMF package, and how technical assessments can be carried out during the outputting of materials, as well as on the component assets. This work paves the way for some innovative solutions to future QC checks, whether carried out locally in the post suite or in the cloud.

The DPP will be working with SMPTE and its partners to fast track a constrained version of IMF ready for use in the broadcast and online delivery market in the first half of 2018.

As OTT video services rely heavily on the ability to output multiple different versions of the source content, this new variant of IMF could play a particularly important role in automatic content versioning and automated processes for file creation and delivery to distribution platforms — not to mention in advertising, where commercials are often re-versioned for multiple territories and states.

The DPP’s work will include the ability to add ProRes- and H.264-derived materials into the IMF package, as well as the inclusion of delivery-specific metadata. The DPP is working to deliver some proof-of-concept presentations for IBC 2017 and will host manufacturer and supplier briefing days and plugfests as the work progresses on the draft version of the IMF specification. It is hoped that the work will be completed in time to have the IMF specification for broadcast and online integrated into products by NAB 2018.

It’s exciting to think about how IMF and Internet-enabled production and distribution tools will work together as part of the architecture of the future content supply chain. This supply chain will enable media companies to respond more quickly and effectively to the ever-growing and changing demands of the consumer. The DPP sees this shift to more responsive operational design as the key to success for media suppliers in the years ahead.


Andy Wilson is head of business development at DPP.

Dailies and post for IFC’s Brockmire

By Randi Altman

When the name Brockmire first entered my vocabulary, it was thanks to a very naughty and extremely funny short video that I saw on YouTube, starring Hank Azaria. It made me laugh-cry.

Fast forward about seven years and the tale of the plaid-jacket-wearing, old-school baseball play-by-play man — who discovers his beloved wife’s infidelity and melts down in an incredibly dirty and curse-fueled way on air — is picked up by IFC, in the series aptly named Brockmire. It stars Azaria, Amanda Peet and features cameos from sportscasters like Joe Buck and Tim Kurkjian.

The Sim Group was called on to provide multiple services for Brockmire: Sim provided camera rentals, Bling Digital provided dailies and workflow services, and Chainsaw provided offline editorial facilities, post finishing services, and deliverables.

We reached out to Chainsaw’s VP of business development, Michael Levy, and Bling Digital’s workflow producer, James Koon, with some questions about workflow. First up is Levy.

Michael Levy

How early did you get involved on Brockmire?
Our role with Brockmire started from the very beginning stages of the project. This was through a working relationship I had with Elizabeth Baquet, who is a production executive at Funny or Die (which produces the show).

What challenges did you have to overcome?
One of the biggest challenges was scaling a short to a multi-episode series and having multiple episodes in both production and post at the same time. However, all the companies that make up Sim Group have worked on many episodic series over the years, so we were in a really good position to offer advice in terms of how to plan a workflow strategy, how to document things properly and how to coordinate getting their camera and dailies offline media from Atlanta to post editorial in Los Angeles.

What tools did they need for post and how involved was Chainsaw?
Chainsaw worked very hard with our Sim Group colleagues in Atlanta to provide a level of coordination that I believe made life much simpler for the Brockmire production/editorial team.

Offline editing for the series was done on our Avid Media Composer systems in cutting rooms here in the Chainsaw/Sim Group studio in Los Angeles at the Las Palmas Building.

The Avid dailies media created by Bling-Atlanta, our partner company in the Sim Group, was piped over the Internet each day to Chainsaw. When the Brockmire editorial crew walked into their cutting rooms, their offline dailies media was ready to edit with on their Avid ISIS server workspace. Whenever needed, they were also able to access their full-res Arri Alexa dailies media that had been shipped on Bling drives from Atlanta.

Bling-Atlanta’s workflow supervisor for Brockmire, James Koon, remained fully involved and was able to supervise the pulling of any clips needed for VFX, or respond to any other dailies-related needs.

Deb Wolfe, Funny or Die’s post producer for Brockmire, also had an office here at Chainsaw. She consulted regularly with Annalise Kurinsky (Chainsaw’s in-house producer for Brockmire) and me as they moved along locking cuts and getting ready for post finishing.

In preparation for the finishing work, we were able to set up color tests with Chainsaw senior colorist Andy Lichtstein, who handled final color for the series in one of our FilmLight Baselight color suites. I should note that all of our Chainsaw finishing rooms were right downstairs on the second floor of the same Sim Group Las Palmas Building.

How closely did you work with Deb Wolfe?
Very closely, especially in dealing with an unexpected production problem. Co-star Amanda Peet was accidentally hit in the head by a thrown beer can (how Brockmire! as they would say in the series). We quickly called in Boyd Stepan, Chainsaw’s senior VFX artist, and came up with a game plan to do Flame paint fixes on all of the affected Amanda Peet shots. We also provided additional VFX compositing for other planned VFX shots in several of their episodes.

What about the HD online finish?
That was done on Avid Symphony and Baselight by staff online editor Jon Pehlke, making full use of Chainsaw’s Avid/Baselight clip-based AAF workflow.

The last stop in the post process was the Chainsaw deliverables department, which took care of QC, requested videotape dubs, and the creation and digital upload of specified delivery files.

James Koon

Now for James Koon…

James, what challenges did you have to overcome if any?
I would say that the biggest challenge overall with Brockmire was the timeframe. Twenty-four days to shoot eight episodes is ambitious. While in general this doesn’t pose a specific problem in dailies, the tight shooting schedule meant that certain elements of the workflow were going to need more attention. The color workflow, in particular, was one that created a fair amount of discussion — with the tight schedules on set, the DP (Jeffrey Waldron) wanted to get his look, but wasn’t going to have much time, if any, for on-set coloring. So we worked with him to set up looks before they started shooting that could be stored in the camera and monitored on set, then applied and tweaked as needed back at the dailies lab with notes from the DP.

Episode information from set to editorial was also an important consideration as they were shooting material from all eight episodes at once. Making sure to cross reference and double check which episode a shot was for was important to make sure that editorial could quickly find what they needed.

Can you walk us through the workflow, and how you worked with the producers?
They shot with the Arri Amira and Alexa Mini, monitoring with the LUTs created before production. This material was offloaded to an on-set backup and a shuttle drive — we generally use G-Tech G-RAID 4TB Thunderbolt or USB 3 drives, with a Promise Pegasus drive for local storage and a backup on our Facilis Terrablock SAN — that was sent to the lab along with camera notes and any notes from the DP and/or the DIT regarding the look for the material. Once received at the lab, we would offload the footage to our local storage and process it in the dailies software, syncing the material to the audio mixer’s recordings and logging the episode, scene and take information for every take, using camera notes, script notes and audio logs to make sure that the information was correct and consistent.

We also applied the correct LUT based on camera reports and tweaked color as needed to match cameras and make any adjustments from the DP’s notes. Once all of that was completed, we would render Avid materials for editorial, create Internet streaming files for IFC’s Box service, and create DVDs.

We would bring in the Avid files and organize them into bins per the editorial specs, and upload the files and bins to the editorial location in LA. These files were delivered directly to a dailies partition on their Isis, so once editorial arrived in the morning, everything was waiting for them.

Once dailies were completed, LTO backups of the media and dailies were written as well as additional temporary backups of the source material as a safety. These final backups were completed and verified by the following morning, and editorial and production were both notified, allowing production to clear cards from the previous day if needed.
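The verification step Koon describes, where backups must be confirmed before production can clear camera cards, is commonly done with checksum-verified offloads. Below is a minimal sketch of that pattern, not Bling’s actual tooling; the filenames and function names are hypothetical. Each copy is compared byte-for-byte against the source by SHA-256 before the offload is declared safe.

```python
# Hedged sketch of a checksum-verified offload; not any vendor's actual tool.
import hashlib
import shutil
from pathlib import Path


def sha256(path, chunk=1 << 20):
    """Stream a file through SHA-256 in 1MB chunks (camera files are large)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()


def verified_offload(src, destinations):
    """Copy src to every destination and verify each copy by checksum.

    Returns the source checksum; raises if any copy mismatches, so the
    card is only cleared once every backup is confirmed intact.
    """
    ref = sha256(src)
    for dest_dir in destinations:
        dest = Path(dest_dir) / Path(src).name
        shutil.copy2(src, dest)          # preserves timestamps/metadata
        if sha256(dest) != ref:
            raise IOError(f"checksum mismatch on {dest}")
    return ref
```

In practice the same idea underlies dedicated offload tools used on set; the point is simply that a card is never wiped until every destination has been independently verified against the source.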

What tools did you use for dailies?
We used DaVinci Resolve to set original looks with the DP before the show began shooting, Colorfront Express Dailies for dailies processing, Media Composer for Avid editorial prep and bin organization and Imagine’s PreRoll Post for LTO writing and verification.

Harbor’s Bobby Johanson discusses ADR for TV and film

By Jennifer Walden

A lot of work comes in and out of the ADR department at New York City’s Harbor Picture Company. A lot.

Over the past year alone, ADR mixer Bobby Johanson has been cranking out ADR and loop group for films such as Beauty and the Beast, The Light Between Oceans, Patriots Day, The Girl on the Train, Triple 9, Hail, Caesar! and more.

His expertise goes beyond film, though. Johanson also does ADR for series, like Amazon’s Red Oaks and their upcoming The Marvelous Mrs. Maisel, and Netflix’s Master of None, which we will touch on in a bit. First, let’s talk about the art of ADR.

According to Johanson, “Last week, I did full days on three different films. Some weeks we record full days, nights and weekends, depending on the season, film festivals, what’s in post, actor availability and everything else that goes on with scheduling. Some sessions will book for two hours out of a day, while another client will want eight hours because of actor availability.”

With so many projects passing through his studio, efficiency is essential, but not at the cost of a job well done. “You have an actor on the stage and the director in the room, and you have to make things efficient,” says Johanson. “You have to play lines back as they are going to be in the show. You want to play the line and hear, ‘Was that ADR?’ Instantly, it’s a whole new world. People have been burned by not so good ADR in the past, and I feel like that compromises the performance. It’s very important for the talent to feel like they’re in good hands, so they forget about the technical side and just focus on their acting.”

Johanson got his start in ADR at New York’s Sound One facility, first as a messenger running reels around, and then moving up to the machine room when there was an opening for Sound One’s new ADR stage. “We didn’t really have anyone teaching us. The job was shown to us once; then we just had to figure out how to thread the dubbers and the projector. Once we got those hung, we would sit in the ADR studio and watch. I picked up a lot of my skills old-school. I’ve learned to incorporate those techniques into current technology and that works well for us.”

Tools
Gear-wise, one staple of his ADR career has been the Soundmaster ADR control system. Johanson calls it an “old-school tool,” probably 25 years old at this point, but he hasn’t found anything faster for recording ADR. “I used it at Sound One, and I used it at Digital Cinema, and now I use it here at Harbor. Until someone can invent another ADR synchronizer, this is the best for me.”

Johanson integrates the Soundmaster system with Avid Pro Tools 12 and works as a two-man team with ADR recordist Mike Rivera. “You can’t beat the efficiency and the attention to detail that you can get with the two-man team.”

Rivera tags the takes and makes minor edits while Johanson focuses on the director and the talent. “Because we are working on a synchronizer, the ADR recordist can do things that you couldn’t do if you were just shooting straight to Pro Tools,” explains Johanson. “We can actually edit on the fly and instantly play back the line in sync. I have the time to get the reverb on it and sweeten it. I can mix the line in because I’m not cutting it or pulling it into the track. That is being done while the system is moving on the pre-roll for a playback.”

For reverb, Johanson chooses an outboard Lexicon PCM80. This puts the controls easily within reach, and he can quickly add or change the reverb on the fly, helping the clean ADR line to sync into the scene. “The reverb unit is pretty old, but it is single-handedly the easiest reverb unit that you can use. There are four room sizes, and then you can adjust the delay of the reverb four times. I have been using this reverb for so many years now that I can match any reverb from any movie or TV show because I know this unit so well.”

Another key piece of gear in his set-up is an outboard Eventide H3000 SE sampler, which Johanson uses to sample the dialogue line they need to replace and play it back over and over for the actor to re-perform. “We offer a variety of ways to do ADR, like using beeps and having the actor perform to picture, but many actors prefer an older method that goes back to ‘looping.’ Back in the day, you would just run a line over and over again and the actor would emulate it. Then we put the select take of that line to picture. It’s a method that 60 percent of our actors who come in here love to do, and I can do that using the sampler.”

He also uses the sampler for playback. By sampling background noise from the scene, he can play that under the ADR line during playback and it helps the ADR to sit in the scene. “I keep the sampler and reverb as outboard gear because I can control them quickly. I’m doing things freestyle and we don’t have to stop the session. We don’t have to stop the system and wait for a playback or wait to do a record pass. Because we are a two-man operation, I can focus on these pieces of gear while Mike is tagging the takes with their cue numbers and managing them in the Pro Tools session for delivery. I can’t find an easier or quicker way to do what I do.”

While Johanson’s set-up may lack the luster of newly minted audio tools, it’s hard to argue with results. It’s not a case of “if it’s not broke then don’t fix it,” but rather a case of “don’t mess with perfection.”

Master of None
The set-up served them well while recording ADR and loop group for Netflix’s Emmy-winning comedy series Master of None. “Kudos to production sound mixer Michael Barosky because there wasn’t too much dialogue that we needed to replace with ADR for Season 2,” says Johanson. “But we did do a lot of loop group — sweetening backgrounds and walla, and things like that.”

For the Italian episodes, they brought in bilingual actors to record Italian language loop group. One scene that stood out for Johanson was the wedding scene in Italy, where the guests start jumping into the swimming pool. “We have a nice-sized ADR stage and so that frees us up to do a lot of movement. We were directing the actors to jump in front of the mic and run by the mic, to give us the effect of people jumping into the pool. That worked quite nicely in the track.”

Netflix’s The Last Kingdom puts Foley to good use

By Jennifer Walden

What is it about long-haired dudes strapped with leather, wielding swords and riding horses alongside equally fierce female warriors charging into bloody battles? There is a magic to this bygone era that has transfixed TV audiences, as evidenced by the success of HBO’s Game of Thrones, History Channel’s Vikings series and one of my favorites, The Last Kingdom, now on Netflix.

The Last Kingdom, based on a series of historical fiction novels by Bernard Cornwell, is set in late 9th century England. It tells the tale of Saxon-born Uhtred of Bebbanburg who is captured as a child by Danish invaders and raised as one of their own. Uhtred gets tangled up in King Alfred of Wessex’s vision to unite the three separate kingdoms (Wessex, Northumbria and East Anglia) into one country called England. He helps King Alfred battle the invading Danish, but Uhtred’s real desire is to reclaim his rightful home of Bebbanburg from his duplicitous uncle.

Mahoney Audio Post
The sound of the series is gritty and rich with leather, iron and wood elements. The soundtrack’s tactile quality is the result of extensive Foley work by Mahoney Audio Post, which has been with the series since the first season. “That’s great for us because we were able to establish all the sound for each character, village, environment and more, right from the first episode,” says Foley recordist/editor/sound designer Arran Mahoney.

Mahoney Audio Post is a family-operated audio facility in Sawbridgeworth, Hertfordshire, UK. Arran Mahoney explains the studio’s family ties. “Clare Mahoney (mum) and Jason Swanscott (cousin) are our Foley artists, with over 30 years of experience working on high-end TV shows and feature films. My brother Billy Mahoney and I are the Foley recordists and editors/sound designers. Billy Mahoney, Sr. (dad) is the founder of the company and has been a dubbing mixer for over 40 years.”

Their facility, built in 2012, houses a mixing suite and two separate audio editing suites, each with Avid Pro Tools HD Native systems, Avid Artist mixing consoles and Genelec monitors. The facility also has a purpose-built soundproof Foley stage featuring 20 different surfaces including grass, gravel, marble, concrete, sand, pebbles and multiple variations of wood.

Foley artists Clare Mahoney and Jason Swanscott.

Their mic collection includes a Røde NT1-A cardioid condenser microphone and a Røde NTG3 supercardioid shotgun microphone, which they use individually for close miking or in combination to create more distant perspectives when necessary. They also have two other studio staples: a Neumann U87 large-diaphragm condenser mic and a Sennheiser MKH-416 short shotgun mic.

Going Medieval
Over the years, the Mahoney Foley team has collected thousands of props. For The Last Kingdom specifically, they visited a medieval weapons maker and bought a whole armory of items: swords, shields, axes, daggers, spears, helmets, chainmail, armor, bridles and more. And it’s all put to good use on the series. Mahoney notes, “We cover every single thing that you see on-screen as well as everything you hear off of it.” That includes all the feet (human and horses), cloth, and practical effects like grabs, pick-ups/put downs, and touches. They also cover the battle sequences.

Mahoney says they use 20 to 30 tracks of Foley just to create the layers of detail that the battle scenes need. Starting with the cloth pass, they cover the Saxon chainmail and the Vikings’ leather and fur armor. Then they do basic cloth and leather movements to cover non-warrior characters and villagers. They record a general weapons track, played at low volume, to provide a base layer of sound.

Next they cover the horses from head to hoof, with bridles and saddles, and Foley for the horses’ feet. When asked what’s the best way to Foley horse hooves, Mahoney asserts that it is indeed with coconuts. “We’ve also purchased horseshoes to add to the stable atmospheres and spot FX when required,” he explains. “We record any abnormal horse movements, i.e. crossing a drawbridge or moving across multiple surfaces, and sound designers take care of the rest. Whenever muck or gravel is needed, we buy fresh material from the local DIY stores and work it into our grids/pits on the Foley stage.”

The battle scenes also require Foley for all the grabs, hits and bodyfalls. For the blood and gore, they use a variety of fruit and animal flesh.

Then there’s a multitude of feet to cover the storm of warriors rushing at each other. All the boots they used were wrapped in leather to create an authentic sound that’s true to the time. Mahoney notes that they didn’t want to capture “too much heel in the footsteps, while also trying to get a close match to the sync sound in the event of ADR.”

Surfaces include stone and marble for the Saxon castles of King Alfred and the other noble lords. For the wooden palisades and fort walls, Mahoney says they used a large wooden base accompanied by wooden crates, plinths, boxes and an added layer of controlled creaks to give an aged effect to everything. On each series, they used 20 rolls of fresh grass, lots of hay for the stables, leaves for the forest, and water for all the sea and river scenes. “There were many nights cleaning the studio after battle sequences,” he says.

In addition to the aforementioned props of medieval weapons, grass, mud, bridles and leather, Mahoney says they used an unexpected prop: “The Viking cloth tracks were actually done with samurai suits. They gave us the weight needed to distinguish the larger size of a Danish man compared to a Saxon.”

Their favorite scenes to Foley, and by far the most challenging, were the battle scenes. “Those need so much detail and attention. It gives us a chance to shine on the soundtrack. The way that they are shot/edited can be very fast paced, which lends itself well to micro details. It’s all action, very precise and in your face,” he says. But if they had to pick one favorite scene, Mahoney says it would be “Uhtred and Ragnar storming Kjartan’s stronghold.”

Another challenging-yet-rewarding opportunity for Foley was during the slave ship scenes. Uhtred and his friend are sold into slavery as rowers on a Viking ship, which holds a crew of nearly 30 men. The Mahoney team brought the slave ship to life by building up layers of detail. “There were small wood creaks with small variations of wood and big creaks with larger variations of wood. For the big creaks, we used leather and a broomstick to work into the wood, creating a deep creak sound by twisting the three elements against each other. Then we would pitch shift or EQ to create size and weight. When you put the two together it gives detail and depth. Throw in a few tracks of rigging and pulleys for good measure and you’re halfway there,” says Mahoney.
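Mahoney's creak-building trick — pitch-shifting or EQ'ing a recorded creak down to give it size and weight, then layering it with the original for detail plus depth — can be sketched with a naive resampling pitch shift. This is an illustrative NumPy sketch under assumed values, not the team's actual processing chain; the "creak" here is a synthetic placeholder signal.

```python
import numpy as np

def pitch_shift(signal, semitones):
    """Naive resampling pitch shift: shifting down also stretches the
    sound out in time, which for a wood creak reads as added size
    and weight."""
    ratio = 2.0 ** (semitones / 12.0)
    n_out = int(len(signal) / ratio)
    idx = np.linspace(0, len(signal) - 1, n_out)
    return np.interp(idx, np.arange(len(signal)), signal)

sr = 48000
# Placeholder for a recorded "small creak": half a second of shaped noise
small_creak = np.random.randn(sr // 2) * np.hanning(sr // 2)
# A copy dropped an octave supplies the "deep creak" layer
deep_creak = pitch_shift(small_creak, -12)
# Sum the two so the small creak gives detail and the big one gives depth
layered = small_creak + 0.7 * deep_creak[: len(small_creak)]
```

A real session would use a formant-preserving shifter and EQ rather than plain resampling, but the layering principle — detail track plus weight track — is the same.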

For the sails, they used a two-mic setup to record huge canvas sheets to create a stereo wrap-around feel. For the rowing effects, they used sticks, brooms and wood rubbing, bouncing, or knocking against large wooden floors and solid boxes. They also covered all the characters’ shackles and chains.

Foley is a very effective way to draw the audience in close to a character or to help the audience feel closer to the action on-screen. For example, near the end of Season 2’s finale, a loyal subject of King Alfred has fallen out of favor. He’s eventually imprisoned and prepares to take his own life. The sound of his fingers running down the blade and the handling of his knife make the gravity of his decision palpable.

Mahoney shares another example of using Foley to draw the audience in — during the scene when Sven is eaten by Thyra’s wolves (following Uhtred and Ragnar storming Kjartan’s stronghold). “We used oranges and melons for Sven’s flesh being eaten and for the blood squirts. Then we created some tracks of cloth and leather being ripped. Specially manufactured claw props were used for the frantic, ravenous wolf feet,” he says. “All the action was off-screen so it was important for the audience to hear in detail what was going on, to give them a sense of what it would be like without actually seeing it. Also, Thyra’s reaction needed to reflect what was going on. Hopefully, we achieved that.”

The long, strange trip of Amir Bar-Lev’s new Dead doc

Deadheads take note — Long Strange Trip, director Amir Bar-Lev’s four-hour documentary on rock’s original jam band, the Grateful Dead, is now available for viewing. While the film had a theatrical release in New York and Los Angeles on May 26, the doc is also available on Amazon Video as a six-episode series.

L-R: Jack Lewars and Keith Jenson.

Encompassing the band’s rise and decades-long career, the film, executive produced by Martin Scorsese, was itself 14 years in the making. That included three months of final post at Technicolor PostWorks New York, where colorist Jack Lewars and online editor Keith Jenson worked with Bar-Lev to finalize the film’s form and look.

The documentary features scores of interviews conducted by Bar-Lev with band members and their associates, as well as a mountain of concert footage and other archival media. All that made editorial conforming complex as Jenson (using Autodesk Flame) had to keep the diverse source material organized and make it fit properly into a single timeline. “We had conversions that were made from old analog tapes, archival band footage, DPX scans from film and everything in between,” he recalls. “There was a lot of cool stuff, which was great, but it required attention to detail to ensure it came out nice and smooth.”

The process was further complicated as creative editorial was ongoing throughout post. New material was arriving constantly. “We do a lot of documentary work here, so that’s something we’re used to,” Jenson says. “We have workflows and failsafes in place for all formats and know how to translate them for the Lustre platform Jack uses. Other than the sheer amount, nothing took us by surprise.”

Lewars faced a similar challenge during grading as he was tasked with bringing consistency to material produced over a long period of time by varying means. The overall visual style, he says, recalls the band’s origins in the psychedelic culture of the 1960s. “It’s a Grateful Dead movie, so there are a lot of references to their experiments with drugs,” he explains. “Some sections have a trippy feel where the visuals go in and out of different formats. It almost gives the viewer the sense of being on acid.”

The color palette, too, has a psychedelic feel, reflecting the free-spirited essence of the band and its co-founder. “Jerry Garcia’s life, his intention and his outlook, was to have fun,” Lewars observes. “And that’s the look we embraced. It’s very saturated, very colorful and very bright. We tried to make the movie as fun as possible.”

The narrative is frequently punctuated by animated sequences where still photographs, archival media and other elements are blended together in kaleidoscopic patterns. Finalizing those sequences required a few extra steps. “For the animation sequences, we had to cut in the plates and get them to Jack to grade,” explains Jenson. “We’d then send the color-corrected plates to the VFX and animation department for treatment. They’d come back as completed elements that we’d cut into the conform.”

The documentary climaxes with the death of Garcia and its aftermath. The guitarist suffered a heart attack in 1995 after years of struggling with diabetes and drug addiction. As those events unfold, the story undergoes a mood change that is mirrored in shifts in the color treatment. “There is a four-minute animated sequence in the last reel where Jerry has just passed and they are recapping the film,” Lewars says. “Images are overlaid on top of images. We colored those plates in hyper saturation, pushing it almost to the breaking point.

“It’s a very emotional moment,” he adds. “The earlier animated sequences introduced characters and were funny. But it’s tied together at the end in a way that’s sad. It’s a whiplash effect.”

Despite the length of the project and the complexity of its parts, it came together with few bumps. “Supervising producer Stuart Macphee and his team were amazing,” says Jenson. “They were very well organized, incredibly so. With so many formats and conversions coming from various sources, it could have snowballed quickly, but with this team it was a breeze.”

Lewars concurs. Long Strange Trip is an unusual documentary in both its narrative style and its look, and that’s what makes it fascinating for Deadheads and non-fans alike. “It’s not a typical history doc,” Lewars notes. “A lot of documentaries go with a cold, bleach-bypass look and gritty feel. This was the opposite. We were bumping the saturation in parts where it felt unnatural, but, in the end, it was completely the right thing to do. It’s like candy.”

You can binge it now on Amazon Video.

Pixelogic acquires Sony DADC NMS’ creative services unit

Pixelogic, a provider of localization and distribution services, has completed its acquisition of the creative services business unit of Sony DADC New Media Solutions, which specializes in 4K, UHD, HDR and IMF workflows for features and episodics. The move expands Pixelogic’s services to the media and entertainment industry and adds capabilities, including experienced staff, proprietary technology and an extended geographic footprint.

According to John Suh, co-president of Pixelogic, the acquisition “expands our team of expert media engineers and creative talent, extends our geographic reach by providing a fully established London operation and further adds to our capacity and capability within an expansive list of tools, technologies, formats and distribution solutions.”

Seth Hallen

Founded less than a year ago, Pixelogic currently employs over 240 people worldwide and is led by industry veterans Suh and Rob Seidel. While the company is headquartered in Burbank, California, it has additional operations in Culver City, California, London and Cairo.

Sony DADC NMS Creative Services was under the direction of Seth Hallen, who joins Pixelogic as senior VP of business development and strategy. All Sony DADC NMS Creative Services staff, technology and operations are now part of Pixelogic. “Our business model is focused on the deep integration of localization and distribution services for movies and television products,” says Hallen. “This supply chain will require significant change in order to deliver global day and date releases with collapsed distribution windows, and by partnering closely with our customers we are setting out to innovate and help lead this change.”

FX’s Fargo features sounds as distinctive as its characters

By Jennifer Walden

In Fargo, North Dakota, in the dead of winter, there’s been a murder. You might think you’ve heard this story before, but Noah Hawley keeps coming up with a fresh, new version of it for each season of his Fargo series on FX. Sure, his inspiration was the Coen brothers’ Oscar-winning Fargo film, but with Season 3 now underway it’s obvious that Hawley’s series isn’t simply a spin-off.

Martin Lee and Kirk Lynds.

Every season of the Emmy-winning Fargo series follows a different story, with its own distinct cast of characters, set in its own specified point in time. Even the location isn’t always the same — Season 3 takes place in Minnesota. What does link the seasons together is Hawley’s distinct black humor, which oozes from these disparate small-town homicides. He’s a writer and director on the series, in addition to being the showrunner and an executive producer. “Noah is very hands-on,” confirms re-recording mixer Martin Lee at Tattersall Sound & Picture in Toronto, part of the SIM Group family of companies, who has been mixing the show with re-recording mixer Kirk Lynds since Season 2.

“Fargo has a very distinct look, feel and sound that you have to maintain,” explains Lee. “The editors, producers and Noah put a lot of work into the sound design and sound ideas while they are cutting the picture. The music is very heavily worked while they are editing the show. By the time the soundtrack gets to us there is a pretty clear path as to what they are looking for. It’s up to us to take that and flesh it out, to make it fill the 5.1 environment. That’s one of the most unique parts of the process for us.”

Season 3 follows rival brothers Emmit and Ray Stussy (both played by Ewan McGregor). Their feud over a rare postage stamp leads to a botched robbery attempt that ultimately ends in murder (don’t worry, neither of McGregor’s characters meets his demise… yet).

One of the most challenging episodes to mix this season, so far, was Episode 3, “The Law of Non-Contradiction.” The story plays out across four different settings, each with unique soundscapes: Minnesota, Los Angeles in 2010, Los Angeles in 1975 and an animated sci-fi realm. As police officer Gloria Burgle (Carrie Coon) unravels the homicide in Eden Valley, Minnesota, her journey leads her to Los Angeles. There the story dives into the past, to 1975, to reveal the life story of science fiction writer Thaddeus Mobley (Thomas Mann). The episode side-trips into animation land when Gloria reads Mobley’s book titled The Planet Wyh.

One sonic distinction between Los Angeles in 2010 and Los Angeles of 1975 was the density of traffic. Lee, who mixed the dialogue and music, says, “All of the scenes that were taking place in 2010 were very thick with traffic and cars. That was a technical challenge, because the recordings were very heavy with traffic.”

Another distinction is the pervasiveness of technology in social situations, like the bar scene where Gloria meets up with a local Los Angeles cop to talk about her stolen luggage. The patrons are all glued to their cell phones. As the camera pans down the bar, you hear different sounds of texting playing over a contemporary, techno dance track. “They wanted to have those sounds playing, but not become intrusive. They wanted to establish with sound that people are always tapping away on their phones. It was important to get those sounds to play through subtly,” explains Lynds.

In the animated sequences, Gloria’s voice narrates the story of a small android named MNSKY whose spaceman companion dies just before they reach Earth. The robot carries on the mission and records an eon’s worth of data on Earth. The robot is eventually reunited with members of The Federation of United Planets, who cull the android’s data and then order it to shut down. “Because it was this animated sci-fi story, we wanted to really fill the room with the environment much more so than we can when we are dealing with production sound,” says Lee. “As this little robotic character is moving through time on Earth, you see something like the history of man. There’s voiceover, sound effects and music through all of it. It required a lot of finesse to maintain all of those elements with the right kind of energy.”

The animation begins with a spaceship crashing into the moon. MNSKY wakes and approaches the injured spaceman, who tells the android he’s going to die. Lee needed to create a vocal process for the spaceman, to make it sound as though his voice is coming through his helmet. With Audio Ease’s Altiverb, Lee tweaked the settings on a “long plastic tube” convolution reverb. Then he layered that processed vocal with the clean vocal. “It was just enough to create that sense of a helmet,” he says.
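Convolution reverbs like Altiverb work by convolving the dry signal with a recorded impulse response of a real space or object. Lee's helmet effect — the processed vocal layered under the clean one — can be sketched as follows. This is a minimal NumPy illustration with synthetic stand-in signals, not Altiverb's actual processing.

```python
import numpy as np

def helmet_voice(dry, ir, wet_gain=0.3):
    """Convolution 'reverb': convolve the dry vocal with an impulse
    response, then layer the wet result under the clean vocal --
    just enough to suggest a helmet, per Lee's approach."""
    wet = np.convolve(dry, ir)[: len(dry)]
    peak = np.max(np.abs(wet))
    if peak > 0:
        # Match levels before mixing so wet_gain behaves predictably
        wet = wet / peak * np.max(np.abs(dry))
    return dry + wet_gain * wet

sr = 48000
dry = np.random.randn(sr)  # stand-in for one second of clean vocal
# Toy stand-in for a "long plastic tube" impulse response: a short,
# noisy, exponentially decaying tail
ir = np.random.randn(sr // 10) * np.exp(-np.linspace(0, 8, sr // 10))
out = helmet_voice(dry, ir)
```

The wet/dry balance (here a hypothetical `wet_gain=0.3`) is what keeps the effect subtle rather than cavernous.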

At the end, when MNSKY rejoins the members of the Federation on their spaceship, it’s a very different environment from Earth. The large, ethereal space is awash in long, warm reverbs, which Lynds applied using plug-ins like PhoenixVerb 5.1 and Altiverb. Lee also applied a long reverb treatment to the dialogue. “The reverbs have quite a significant pre-delay, so you almost have that sense of a repeat of the voice afterwards. This gives it a very distinctive, environmental feel.”
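Pre-delay simply offsets the reverb return relative to the dry signal; push it long enough and, as Lee describes, the tail starts to read almost as a repeat of the voice. A minimal sketch with synthetic signals and a hypothetical pre-delay value:

```python
import numpy as np

def mix_with_predelay(dry, wet, predelay_ms, sr=48000):
    """Delay the reverb return by the pre-delay time before summing,
    so a long pre-delay almost reads as an echo of the voice."""
    offset = int(sr * predelay_ms / 1000)
    delayed = np.concatenate([np.zeros(offset), wet])[: len(dry)]
    return dry + delayed

sr = 48000
dry = np.random.randn(sr)  # stand-in for one second of dialogue
wet = 0.5 * dry            # stand-in for a reverb return
mixed = mix_with_predelay(dry, wet, predelay_ms=120, sr=sr)
```

Short pre-delays (tens of milliseconds) fuse with the source; past roughly 100ms the ear begins to separate the delayed tail into a distinct event, which is the "repeat of the voice" effect.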

Lynds and Lee spend two days premixing their material on separate dub stages. For the premix, Lynds typically has all the necessary tracks from supervising sound editor Nick Forshager while Lee’s dialogue and music tracks come in more piecemeal. “I get about half the production dialogue on day one and then I get the other half on day two,” says Lee. “ADR dribbles in the whole time, including well into the mixing process. ADR comes in even after we have had several playbacks already.”

Fortunately, the show doesn’t rely heavily on ADR. Lee notes that they put a lot of effort into preserving the production. “We use a combination of techniques. The editors find the cleanest lines and takes (while still keeping the performance), then I spend a lot of time cleaning that up,” he says.

This season Lee relies more on Cedar’s DNS One plug-in for noise reduction and less on the iZotope RX5 (Connect version). “I’m finding with Fargo that the showrunners are uniquely sensitive to the effects of the iZotope processing. This year it took more work to find the right sound. It ends up being a combination of both the Cedar and the RX5,” reports Lee.

After premixing, Lee and Lynds bring their tracks together on Tattersall’s Stage 1. They have three days for the 5.1 final mix. They spend one (very) long day building the episode in 5.1 and then send their mix to Los Angeles for Forshager and co-producer Gregg Tilson to review. Then Lee and Lynds address the first round of notes the next morning and send the mix back to Los Angeles for another playback. Each consecutive playback is played for more people. The last playback is for Hawley on the third day.

“One of the big challenges with the workflow is mixing an episode in one day. It’s a long mix day. At least the different time zones help. We send them a mix to listen to typically around 6-7pm PST, so it’s not super late for them. We start at 8am EST the next morning, which is three hours ahead of their time. By the time they’re in the studio and ready to listen, it is 10am their time and we’ve already spent three or four hours handling the revisions. That really works to our advantage,” says Lee.
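The head start Lee describes can be checked with Python's `zoneinfo` (the dates here are arbitrary stand-ins; Toronto is on Eastern time, three hours ahead of Los Angeles):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

LA = ZoneInfo("America/Los_Angeles")
TORONTO = ZoneInfo("America/Toronto")

# Toronto starts revisions at 8am Eastern the next morning...
start = datetime(2017, 5, 11, 8, 0, tzinfo=TORONTO)
# ...while LA sits down to listen at 10am Pacific
playback = datetime(2017, 5, 11, 10, 0, tzinfo=LA)

head_start = playback - start       # how long Toronto has been working
la_view = start.astimezone(LA)      # 8am Eastern is only 5am Pacific
```

By the time LA is listening at 10am its time, the Toronto stage has already had the morning to work through revisions, which is the advantage Lee points to.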

Sound in the Fargo series is not an afterthought. It’s used to build tension, like a desk bell that rings for an uncomfortably long time, or to set the mood of a space, like an overly noisy fish tank in a cheap apartment. By the time the tracks have made it to the mixers, there’s been “a lot of time and effort spent thinking about what the show was going to sound like,” says Lynds. “From that sense, the entire mix for us is a creative opportunity. It’s our chance to re-create that in a 5.1 environment, and to make that bigger and better.”

You can catch new episodes of Fargo on FX Networks, Wednesdays at 10pm EST.


Jennifer Walden is a New Jersey-based audio engineer and writer.

Assistant Editor’s Bootcamp coming to Burbank in June

The new Assistant Editors’ Bootcamp, founded by assistant/lead editors Noah Chamow (The Voice) and Conor Burke (America’s Got Talent), is a place for assistant editors and aspiring assistants to learn and collaborate with one another in a low-stakes environment. The next Assistant Editors’ Bootcamp classes will be held on June 10-11, along with a Lead Assistant Editors’ class geared toward understanding troubleshooting and system performance on June 24-25. All classes, sponsored by AlphaDogs’ Editor’s Lounge, will be held at Skye Rentals in Burbank.

The classes will cover such topics as The Fundamentals of Video, Media Management, Understanding I/O and Drive Speed, Prepping Footage for Edit, What’s New in Media Composer, Understanding System Performance Bottlenecks and more. Cost is $199 for two days for the Assistant Editor class, and $299 for two days for the Lead Assistant Editor class. Space is on a first-come, first-served basis and is limited to 25 participants per course. You can register here.

A system with Media Composer 8.6 or later and an external hard drive is required to take the class (a 30-day Avid trial is available). 8GB of system memory and Windows 7/OS X 10.9 or later are needed to run Media Composer 8.6. Computer rentals are available for as little as $54 a week from Hi-Tech Computer Rental in Burbank.

Chamow and Burke came up with the idea for Assistant Editors’ Bootcamp when they realized how challenging it is to gain any real on-the-job experience in today’s workplace. With today’s focus being primarily on doing things faster and more efficiently, it’s almost impossible to find the time to figure out why one method of doing something is faster than another. Having worked extensively in reality television and created “The Super Grouper,” a multi-grouping macro for Avid that is now widely used in reality post workflows, Chamow understands first-hand the landscape of the assistant editor’s world. “One of the most difficult things about working in the entertainment industry, especially in a technical position, is that there is never time to learn,” he says. “I’m very passionate about education and hope by hosting these classes, I can help other assistants hone their skills as well as helping those who are new to the business get the experience they need.”

Having worked as both an assistant editor and lead assistant editor, Burke has created workflows and overseen post for up to 10 projects at a time, before moving into his current position at NBC’s America’s Got Talent. “In my years of experience and working on grueling deadlines, I completely understand how difficult the job of an assistant editor can be, having little or no time to learn anything other than what’s right in front of you,” he says. “In teaching this class, I hope to make peers feel more confident and have a better understanding of their work, taking them to the next level in their careers.”

Main Image (L-R): Noah Chamow and Conor Burke.

The A-List: Director Ron Howard discusses National Geo’s Genius

By Iain Blair

Ron Howard has done it all in Hollywood. The former child star of The Andy Griffith Show and Happy Days not only successfully made the tricky transition to adult actor (at 22 he starred opposite John Wayne in The Shootist and was nominated for a Best Supporting Actor Oscar), but went on to establish himself as an Oscar-winning director and producer (A Beautiful Mind). He is also one of Hollywood’s most beloved and commercially successful and versatile helmers.

Since making his directorial debut in 1977 with Grand Theft Auto (when he was still on Happy Days), he’s made an eclectic group of films about boxers (Cinderella Man), astronauts (Apollo 13), mermaids (Splash), symbologists (The Da Vinci Code franchise), politicians (Frost/Nixon), firefighters (Backdraft), mathematicians (A Beautiful Mind), Formula One racing (Rush), whalers (In the Heart of the Sea) and the Fab Four (his first documentary, The Beatles: Eight Days a Week).

Born in Oklahoma with showbiz in his DNA — his parents were both actors — Howard “always wanted to direct” and notes that “producing gives you control.” In 1986, he co-founded Imagine Entertainment with Brian Grazer, a powerhouse in film and TV (Empire, Arrested Development) production. His latest project is the new Genius series for National Geographic.

The 10-part global event series — the network’s first scripted series — is based on Walter Isaacson’s book “Einstein: His Life and Universe” and tracks Albert Einstein’s rise from humble origins as an imaginative and rebellious thinker through his struggles to be recognized by the establishment, to his global celebrity status as the man who unlocked the mysteries of the cosmos with his theory of relativity.

But if you’re expecting a dry, intellectual by-the-numbers look at his life and career, you’re in for a big surprise.

With an impressive cast that includes Geoffrey Rush as the celebrated scientist in his later years, Johnny Flynn as Einstein in the years before he rose to international acclaim and Emily Watson as his second wife — and first cousin — Elsa Einstein, the show is full of sex, drugs and rock ‘n’ roll.

We’re mostly joking, but the series does balance the hard-to-grasp scientific theories with an entertaining exploration of a man with an often very messy private life as it follows Einstein’s alternately exhilarating emotions and heartlessness in dealing with his closest personal relationships, including his children, his two wives and the various women with whom he cheats on them.

Besides all the personal drama, there’s plenty of global drama as Genius is set against an era of international conflict over the course of two world wars. Faced with rising anti-Semitism in Europe, surveillance by spies and the potential for atomic annihilation, Einstein struggles as a husband and a father, not to mention as a man of principle, even as his own life is put in danger.

I talked recently with Ron Howard about directing the first episode and his love of production and post.

What was the appeal of doing this and making your scripted television directorial debut with the first episode?
I’ve become a big fan of all the great TV shows people are doing now, where you let a story unfold in a novelistic way, and I was envious of a lot of my peers getting into doing TV — and this was a great project that just really suits the TV format. Over the years, I had read various screenplays about Einstein but they just never worked as a movie, so when National Geographic wanted to reach out to their audience in a more ambitious way, suddenly there was this perfect platform to do this life justice and have the length it needed. It’s an ideal fit, and it was perfect to do it with National Geographic.

Given that you had considered making a film about him, how familiar were you with Einstein and his life? How do you find the drama in an academic’s life?
I thought I had some insight, but I was blown away by the book and Noah Pink’s screenplay, and everyone on the team brought their own research to the process, and it became more and more fascinating. There was this constant pressure on Einstein that I felt we could work with through the whole series, and that I never realized was there. And with that pressure, there’s drama. We came very close to not benefiting from his genius because of all the forces against him – sometimes from external forces, like governments and academic institutions, but often from his own foibles and flaws. He was even on a hit list. So I was really fascinated by his whole story.

What most surprised you about Einstein once you began delving deeper into his private life?
That he was such a Lothario! He had quite a complicated love life, but it was also that he had such a dogged commitment to his principles and logic and point-of-view. I was doing post on the Beatles documentary as we prepped, and it was the same thing with those young men. They often didn’t listen to outside influences and people telling them it couldn’t be done. They absolutely committed to their musical vision and principles with all their drive and focus, and it worked — and collectively I think you could say the band was genius.

Einstein also trusted his convictions, whether it was physics or math, and if the conventional answers didn’t satisfy his sense of logic, he’d just dig deeper. The same thing can be said for his personal life and relationships, and trying to find a balance between his career and life’s work, and family and friends. Look at his falling in love with his fellow physics student Mileva Maric, which causes all sorts of problems, especially when she unexpectedly gets pregnant. No one else thought she was particularly attractive, she was a bit of an outcast as the only female physics student, and yet his logic called him to her. The same thing with politics. He went his own way in everything. He was a true renaissance man, eternally curious about everything.

In terms of dealing with very complex ideas that aren’t necessarily very cinematic, it must have helped that you’d made A Beautiful Mind?
Yes, we saw a lot of similarities between the two. It really helped that both men were essentially visualists — Einstein even more so than John Nash. That gave us a big advantage and gave me the chance to show audiences some of his famous thought experiments in cinematic ways, and he described them very vividly and they’re a fantastic jumping-off point — it was his visualizations that helped him wrap his head around the physics. He began with something he could grasp physically and then went back to prove it with the math. Those principles gave him the amazing insights about the nature of the universe, and time and space, that we’ve all benefitted from.

I assume you began integrating post and all the VFX very early on?
Right away, in preproduction meetings in Prague, in the Czech Republic, where Einstein lived and taught early in his career. We had our whole team there on location, including our VFX supervisor Eric Durst and his team, DP Mathias Herndl, our production designers and art directors and so on. With all the VFX, we stayed pretty close to how Einstein described his thought experiments. The one that starts off this first episode is very vivid, whereas the first one he has as a 17-year-old boy is done in a more chalk-board kind of way, where he faints and can barely hang on mentally to the image. All the dailies and visual effects were done by UPP.

Where did you do the post?
We did all the editing and sound back in LA.

Do you like the post process?
I love it. I love the edit and slowly pulling it all together after the stress of the shoot.

It was edited by James Wilcox, who’s done CSI: Miami and Hawaii Five-O, along with Debby Germino and J. Kathleen Gibson. How early was James involved and was he on set?
[My usual editors] Dan and Mike weren’t available. It’s the first time I’d worked with James and he’s very creative and did a great job. He wasn’t on the set, but we were constantly in communication and we’d send him material back to LA and then when I got back, we sat down together.

The show constantly cuts back and forth in time.
Yes, I was fascinated by all those transitions and I worked very closely with my team to make sure we had all that down, and that it all flowed smoothly in the edit. For instance, Johnny Flynn plays violin and he trained classically, so he actually plays in all those scenes. Geoffrey doesn’t play violin, but he practiced for several months, and we had a teacher on set too. Geoffrey was so dedicated to creating this character. They both looked at tons of footage of Einstein as an older man, so Johnny could develop aspects of Einstein’s manner and behavior as the younger one, which Geoffrey could work with later, so we had a real continuity to the character. That’s a big reason why I wanted to be so hands-on with the first episode, as we were defining so many key aspects of the man and the aesthetics and the way we’d be telling the whole story.

Can you talk about working on the sound and music?
It’s always huge to me and adds so much to every scene. Lorne Balfe wrote a fantastic score and we had a great sound team: production sound mixer Peter Forejt, supervising sound editor Daniel Pagan, music editor Del Spiva and re-recording mixers Mark Hensley and Bob Bronow. For post production audio we used Smart Post Sound.

The DI must have been important?
It was very important since we were trying to do stuff with the concept of time in very subtle ways using the camera work, the palette and the lighting style. This all changed subtly depending on whether it was an Einstein memory, or a flashback to his younger, brasher self, or looking ahead to the iconic older man where it was all a little more formal. So we went for different looks to match the different energies and, of course, the editing style had to embody all of that as well. The colorist was Pankaj Bajpai, and he did a great job.

What’s next?
I plan to do more TV. Remember, I came out of TV and it’s so exciting now. I’m also developing several movie projects, including Seveneves, a sci-fi film, and Under the Banner of Heaven, which is based on the Jon Krakauer bestseller. So whatever comes together first.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Game of Thrones: VFX associate producer Adam Chazen

With excitement starting to build for the seventh season of HBO’s Game of Thrones, what better time to take a quick look back at last season’s VFX workflow? HBO associate VFX producer Adam Chazen was kind enough to spend some time answering questions after just wrapping Season 7.

Tell us about your background as a VFX associate producer and what led you to Game of Thrones.
I got my first job as a PA at VFX studio Pixomondo. I was there for a few years, working under my current boss Steve Kullback (visual effects producer on Game of Thrones). He took me with him when he moved to work on Yogi Bear, and then on Game of Thrones.

I’ve been with the show since 2011, so this is my sixth year on board. It’s become a real family at this point; lots of people have been on since the pilot.

From shooting to post, what is your role working on Game of Thrones?
As the VFX associate producer, in pre-production mode I assist with organizing our previs and concept work. I help run and manage our VFX database and I schedule reviews with producers, directors and heads of departments.

During production I make sure everyone has what they need on set in order to shoot for the various VFX requirements. Also during production, we start to post the show — I’m in charge of running review sessions with our VFX supervisor Joe Bauer. I make sure that all of his notes get across to the vendors and that the vendors have everything they need to put the shots together.

Season 7 has actually been the longest we’ve stayed on set before going back to LA for post. When in Belfast, it’s all about managing the pre-production and production process, making sure everything gets done correctly to make the later VFX adjustments as streamlined as possible. We’ll have vendors all over the world working on that next step — from Australia to Spain, Vancouver, Montreal, LA, Dublin and beyond. We like to say that the sun never sets on Game of Thrones.

What’s the process for bringing new vendors onto the show?
They could be vendors that we’ve worked with in the past. Other times, we employ vendors that come recommended by other people. We check out industry reels and have studios do testing for us. For example, when we have dragon work we ask around for vendors willing to run dragon animation tests for us. A lot of it is word of mouth. In VFX, you work with the people that you know will do great work.

What’s your biggest challenge in creating Game of Thrones?
We’re doing such complex work that we need to use multiple vendors, and that can be a big hurdle. In general, whether it’s film or TV, having multiple vendors working on the same shot is a potential issue.

Linking in with cineSync helps. We can have a vendor in Australia and a vendor in Los Angeles both working on the same shot, at exactly the same time. I first started using cineSync while at Pixomondo and found it makes the revision process a lot quicker. We send notes out to vendors, but most of the time it’s easier to get on cineSync, see the same image and draw on it.

Even the simple move of hovering a cursor over the frame can answer a million questions. We have several vendors who don’t use English as their first language, such as those in Spain. In these cases, communication is a lot easier via cineSync. By pointing to a single portion of a single frame, we completely bypass the language barrier. It definitely helps to see an image on screen versus just explaining it.

What is your favorite part of the cineSync toolkit?
We’ve seen a lot of cool updates to cineSync. Specifically, I like the notes section, where you can export a PDF that includes whichever frame each note is attached to.

Honestly, just seeing a cursor move on-screen from someone else’s computer is huge. It makes things so much easier to just point and click. If we’re talking to someone on the phone, trying to tell them about an issue in the upper left hand corner, it’s going to be hard to get our meaning across. cineSync takes away all of the guesswork.

Besides post, we also heavily use cineSync for shoot needs. We shoot the show in Northern Ireland, Iceland, Croatia, Spain and Calgary. With cineSync, we are able to review storyboards, previs, techvis and concepts with the producers, directors, HODs and others, wherever they are in the world. It’s crucial that everyone is on the same page. Being able to look at the same material together helps everyone get what they want from a day on set.

Is there a specific shot, effect or episode you’re particularly proud of?
The Battle of the Bastards — it was a huge episode. Particularly, the first half of the episode when Daenerys came in with her dragons at the battle of Meereen, showing those slavers who is boss. Meereen City itself was a large CG creation, which was unusual for Game of Thrones. We usually try to stay away from fully CG environments and like to get as much in-camera as possible.

For example, when the dragon breathes fire, we used an actual flamethrower element that we shot practically. Back in Season 5, we started to pre-animate the dragon, translate that animation to a motion control rig and attach a flamethrower to it. It moves exactly how the dragon would move, giving us a practical element to use in the shot. CG fire can be done, but it’s really tricky. Real is real, so you can’t question it.

With multiple vendors working on the sequence, we had Rodeo FX do the environment while Rhythm & Hues did the dragons. We used cineSync a lot, reviewing shots between both vendors in order to point out areas of concern. Then in the second half of the episode, which was the actual Battle of the Bastards, the work was brilliantly done by Australian VFX studio Iloura.

Jason Moss composes music for ABC’s The Toy Box

By Jennifer Walden

Children may not be the best source for deciding when bedtime should be, or deciding what’s for dinner (chicken nuggets again?), but who better to decide what toys kids want to play with? A large part of the Tom Hanks film Big was based on this premise.

ABC’s new inventor-centric series, The Toy Box, which premiered in April, features four young judges who are presented with new toy inventions. They then get to decide which toy prototypes would be popular with others in their demographic. Toy inventors competing on the show first meet with a set of “expert mentors,” a small group of adults who delve into the specifics of the toy and offer advice.

Jason Moss

If the toy makes it past that panel, it gets put into the “toy box.” The toy is then presented to the four young judges, who get to play with it, ask questions and give their critique to the toy inventor. The four young judges deliberate and make a final decision on which toy will advance to the next round. At the end of the season, the judges will choose one winning toy to be made by Mattel and sold exclusively at Toys ‘R’ Us.

The Toy Box needed a soundtrack that could both embody the essence of juvenile joviality and portray the pseudo-seriousness of its pre-teen decision makers. It’s not a job for your average reality show composer. It required askew musical sensibilities. “The music is fun and super-pop sounding with cool analog synths and video game sounds. It’s really energetic and puts a smile on your face,” says the series composer/music supervisor Jason Moss at Super Sonic Noise in Los Angeles. “Then for the decision-making cues, as the kids decide whether they like a toy and what they’re going to do, it had to be something other than what you’d expect. It couldn’t sound too dark. It still had to be quirky.”

Moss knows quirky. He was the composer on IFC’s Gigi Does It, starring David Krumholtz as an eccentric Jewish grandmother living in Florida. Moss also composed the theme music for the Seeso original series Bajillion Dollar Propertie$, a partially improvised comedy series that pokes fun at real estate reality shows.

Moss covered all of The Toy Box’s musical needs — from high-energy pop and indie rock tracks when the kids are playing with the toys to comedic cues infused with ukulele and kitschy strings, and tension tracks for moments of decision. He wrote original music as well as curated selections from the Bulletproof Bear music catalog. Bulletproof Bear offers a wide variety of licensable tracks written by Moss, plus other music catalogs they represent. “It’s a big collection with over 33,000 tracks. We can really compete with bigger music license companies because we have a huge amount of diverse music that can cover the whole production from head to toe,” he says.

The Gear
Moss composes in Apple’s Logic Pro X. He performed live guitars, bass and ukulele (using the Kala U-Bass bass ukulele). For mics, he chose Miktek Audio’s CV4 large-diaphragm tube condenser and their C5 small-diaphragm pencil condenser, each paired with Empirical Labs Mike-E preamps.

Moss combined the live sounds with virtual instruments, particularly those from Spectrasonics. XLN Audio’s Addictive Drums were his go-to for classic and modern drum sounds. For synths, he used reFX’s Nexus, libraries from Native Instruments’ Kontakt, Arturia’s Analog Lab and their VOX Continental V. He also called on the ROLI Equator sound engine via the ROLI Rise 25 key MIDI controller, which features soft squishy silicone keys much different from a traditional keyboard controller. The Akai MPK88 weighted key controller is Moss’ choice in that department. For processing and effects, he chose plug-ins by Soundtoys and PSP Audioware. He also incorporated various toy and video game sounds into the tracks.

The Score
The show’s two-minute opener combines three separate segments — the host (Modern Family‘s Eric Stonestreet), the expert mentor introductions and the judges introductions. Each has its own musical vibe. The host and the expert mentors have original music that Moss wrote specifically for the show. The judges have a dramatic pulsing-string track that is licensed from Bulletproof Bear’s catalog. In addition, a five-second tag for The Toy Box logo is licensed from the Bulletproof Bear catalog. That tag was composed by Jon LaCroix, who is one of Moss’ business partners. In regards to the dramatic strings on the kids’ entrance, Moss, who happened to write that cue, says, “The way they filmed the kids… it’s like they are little mini adults. So the theme has some seriousness to it. In context, it’s really cute.”

For the decision-making cues, Moss wanted to stay away from traditional tension strings. To give the track a more playful feel that would counterbalance the tension, he used video game sounds and 808 analog drum sounds. “I also wanted to use organic sounds that were arpeggiated and warm. They are decision-making tick-tock tracks, but I wanted to make it more fun and interesting,” says Moss.

“We were able to service the show on the underscore side with Bulletproof Bear’s music catalog in conjunction with my original music. It was a great opportunity for us to keep all the music within our company and give the client a one-stop shop, keeping the music process organized and easy,” he explains. “It was all about finding the right sound, or the right cue, for each of those segments. At the end of the day, I want to make sure that everybody is happy, acknowledge the showrunners’ musical vision and strive to capture that. It was a super-fun experience, and hopefully it will come back for a second, third and tenth season! It’s one of those shows you can watch with your kids. The kid judges are adorable and brutally honest, and with the myriad of adult programming out there, it’s refreshing to see a show like The Toy Box get green-lit.”

A chat with Emmy-winning comedy editor Sue Federman

This sitcom vet talks about cutting Man With A Plan and How I Met Your Mother.

By Dayna McCallum

The art of sitcom editing is widely enjoyed and underappreciated. While millions of people laugh out loud every day enjoying their favorite situation comedies, very few give credit to the maestro behind the scenes: the sitcom editor.

Sue Federman is one of the best in the business. Her work on the comedy How I Met Your Mother earned three Emmy wins and six nominations. Now the editor of CBS’ new series, Man With A Plan, Federman is working with comedy legends Matt LeBlanc and James Burrows to create another classic sitcom.

However, Federman’s career in entertainment didn’t start in the cutting room; it started in the orchestra pit! After working as a professional violinist with orchestras in Honolulu and San Francisco, she traded in her bow for an Avid.

We sat down to talk with Federman about the ins and outs of sitcom editing, that pesky studio audience, and her journey from musician to editor.

When did you get involved with your show, and what is your workflow like?
I came onto Man With A Plan (MWAP) after the original pilot had been picked up. They recast one of the leads, so there was a reshoot of about 75 percent of the pilot with our new Andi, Liza Snyder. My job was to integrate the new scenes with the old. It was interesting to preserve the pace and feel of the original and to be free to bring my own spin to the show.

The workflow of the show is pretty fast since there’s only one editor on a traditional audience sitcom. I usually put a show together in two to three days, then work with the producers for one to two days, and then send a pretty finished cut to the studio/network.

What are the biggest challenges you face as an editor on a traditional half-hour comedy?
One big challenge is managing two to three episodes at a time — assembling one show while doing producer or studio/network notes on another, as well as having to cut preshot playbacks for show night, which can be anywhere from three to eight minutes of material that has to be cut pretty quickly.

Another challenge is the live audience laughter. It’s definitely a unique part of this kind of show. I worked on How I Met Your Mother (HIMYM) for nine years without an audience, so I could completely control the pacing. I added fake laughs that fit the performances and things like that. When I came back to a live audience show, I realized the audience is a big part of the way the performances are shaped. I’ve learned all kinds of ways to manipulate the laughs and, hopefully, still preserve the spontaneous live energy of the show.

How would you compare cutting comedy to drama?
I haven’t done much drama, but I feel like the pace of comedy is faster in every regard, and I really enjoy working at a fast pace. Also, as opposed to a drama or anything shot single-camera, the coverage on a multi-cam show is pretty straightforward, so it’s really all about performance and pacing. There’s not a lot of music in a multi-cam, but you spend a lot of time working with the audience tracks.

What role would you say an editor has in helping to make a “bit” land in a half-hour comedy?
It’s performance, timing and camera choices — and when it works, it feels great. I’m always amazed at how changing an edit by a frame or two can make something pop. Same goes for playing something wider or closer depending on the situation.

MWAP is shot before a live studio audience. How does that affect your rhythm?
The audience definitely affects the rhythm of the show. I try to preserve the feeling of the laughs and still keep the show moving. A really long laugh is great on show night, but usually we cut it down a bit and play off more reactions. The actors on MWAP are great because they really know how to “ride” the laughs and not break character. I love watching great comedic actors, like the cast of I Love Lucy, for example, who were incredible at holding for laughs. It’s a real asset and very helpful to the editor.

Can you describe your process? And what system do you edit the show on?
I’ve always used Avid Media Composer. I’ve dabbled with Final Cut, but I prefer Avid. I assemble the whole show in one sequence and go scene by scene. I watch all of the takes of a scene and make choices for each section or sometimes for each line. Then I chunk the scene together, sometimes putting in two choices for a line or area. I then cut into the big pieces to select the cameras for each shot. After that, I go back and find the rhythm of the scene — tightening the pace, cutting into the laughs and smoothing them.

After the show is put together, I go back and watch the whole thing again, pretending that I’ve never seen it, which is a challenge. That makes me adjust it even more. I try to send out a pretty polished first cut, without cutting any dialogue to show the producers everything, which seems to make the whole process go faster. I’m lucky that the directors on MWAP are very seasoned and don’t really give me many notes. Jimmy Burrows and Pam Fryman have directed almost all of the episodes, and I don’t send out a separate cut to either of them. Particularly with Pam, as I’ve worked with her for about 11 years, so we have a nice shorthand.

How do assistant editors work into the mix?
My assistant, Dan “Steely” Esparza, is incredible! He allows me to show up to work every day and not think about anything other than cutting the show. He’s told me, even though I always ask, that he prefers not to be an editor, so I don’t push him in that direction. He is excellent at visual effects and enjoys them, so I always have him do those. On HIMYM, we had quite a lot of visual effects, so he was pretty busy there. But on MWAP, it’s mostly rough composites for blue/greenscreen scenes and painting out errant boom shadows, boom mics and parts of people.

Your work on HIMYM was highly lauded. What are some of your favorite “editing” moments from that show and what were some of the biggest challenges they threw at you?
I really loved working on that show — every episode was unique, and it really gave me opportunities to grow as an editor. Carter Bays and Craig Thomas were amazing problem solvers. They were able to look at the footage and make something completely different out of it if need be. I remember times when a scene wasn’t working or was too long, and they would write some narration, record the temp themselves, and then we’d throw some music over it and make it into a montage.

Some of the biggest editing challenges were the music videos/sequences that were incorporated into episodes. There were three complete Robin Sparkles videos and many, many other musical pieces, almost always written by Carter and Craig. In “P.S. I Love You,” they incorporated her last video into kind of a Canadian Behind the Music about the demise of Robin Sparkles, and that was pretty epic for a sitcom. The gigantic “Subway Wars” was another big challenge, in that it had 85 “scenelets.” It was a five-way race around Manhattan to see who could be first to get to a restaurant where Woody Allen was supposedly eating, with each person using a different mode of transportation. Crazy fun and also extremely challenging to fit into a sitcom schedule.

You started in the business as a classical musician. How does your experience as a professional violinist influence your work as an editor?
I think the biggest thing is having a good feeling for the rhythm of whatever I’m working on. I love to be able to change the tempo and make something really pop. And when I’m asked to change the pacing or cut sections out while doing various people’s notes, I’m able to embrace that too. Collaborating is a big part of being a musician, and I think that’s helped me a lot in working with different personalities. It’s not unlike responding to a conductor or playing chamber music. Having an understanding of phrasing and the overall structure of a piece is also valuable; it was musical phrasing and structure, but it’s not all that different.

Obviously, whenever there’s actual music involved, I feel pretty comfortable handling it or choosing the right piece for a scene. If classical music’s involved, I have a great deal of knowledge that can be helpful. For example, in HIMYM, we needed something to be a theme for Barney’s Playbook antics. I tried a few things, and we landed on the Mozart Rondo Alla Turca, which I’ve been hearing lately in the Progresso Soup commercials.

How did you make the transition from the concert hall to the editing room?
It’s a long story! I was playing in the San Francisco Ballet Orchestra and was feeling stuck. I was lucky enough to find an amazing career counseling organization that helped me open my mind to all kinds of possibilities, and they helped me to discover the perfect job for me. It was quite a journey, but the main thing was to be open to anything and identify the things about myself that I wanted to use. I learned that I loved music (but not playing the violin), puzzles, stories and organizing — so editing!

I sold a bow, took the summer off from playing and enrolled in a summer production workshop at USC. I wasn’t quite ready to move to LA, so I went back to San Francisco and began interning at a small commercial editing house. I was answering phones, emptying the dishwasher, getting coffees and watching the editing, all while continuing to play in the Ballet Orchestra. The people were great and gave me opportunities to learn whenever possible. Luckily for me, they were using the Avid before it came to TV and features. Eventually, there was a very rough documentary that one of the editors wanted to cut, but it wasn’t organized. They gave me the key to the office and said, “You want to be an editor? Organize this!” So I did, and they started offering me assistant work on commercials. But I wanted to cut features, so I started to make little trips to LA to meet anybody I could.

Bill Steinberg, an editor working in the Universal Syndication department who I met at USC, got me hooked up with an editor who was to be one of Roger Corman’s first Avid editors. The Avids didn’t arrive right away, but he helped me put my name in the hat to be an assistant the next time. It happened, and I was on my way! I took a sabbatical from the orchestra, went down to LA, and worked my tail off for $400 a week on three low-budget features. I was in heaven. I had enough hours to join the union as an assistant, but I needed money to pay the admission fee. So I went back to San Francisco and played one month of Nutcrackers to cover the fee, and then I took another year sabbatical. Bill offered me a month position in the syndication department to fill in for him, and show the film editors what I knew about the Avid.

Eventually Andy Chulack, the editor of Coach, was looking for an Avid assistant, and I was recommended because I knew it. Andy hired me and took me under his wing, and I absolutely loved it. I guess the upshot is, I was fearlessly naive and knew the Avid!

What do you love most about being an editor?
I love the variation of material and people that I get to work with, and I like being able to take time to refine things. I don’t have to play it live anymore!

Lost in Time game show embraces ‘Interactive Mixed Reality’

By Daniel Restuccio

The Future Group — which has partnered with Fremantle Media, Ross Video and Epic Games — has created a new super-agile entertainment platform that blends linear television and game technology into a hybrid format called “Interactive Mixed Reality.”

The brainchild of Bård Anders Kasin, this innovative content deployment medium generated a storm of industry buzz at NAB 2016, and their first production Lost in Time — a weekly primetime game show — is scheduled to air this month on Norwegian television.

The Idea
The idea originated more than 13 years ago in Los Angeles. In 2003, at age 22, Kasin, a self-taught multimedia artist from Notodden, Norway, sent his CV and a bunch of media projects to Warner Bros. in Burbank, California, in hopes of working on The Matrix. They liked it. His interview was on a Wednesday and by Friday he had a job as a technical director.

Kasin immersed himself in the cutting-edge movie revolution that was The Matrix franchise. The Wachowskis’ visionary production was a masterful inspiration and featured a compelling sci-fi action story, Oscar-winning editing, breakthrough visual effects (“bullet-time”) and an expanded media universe that included video games and an anime-style short, The Animatrix. The Matrix Reloaded and The Matrix Revolutions were shot at the same time, as well as more than an hour of footage specifically designed for the video game. The Matrix Online, an Internet gaming platform, was a direct sequel to The Matrix Revolutions.

L-R: Bård Anders Kasin and Jens Petter Høili.

Fast forward to 2013 and Kasin has connected with software engineer and serial entrepreneur Jens Petter Høili, founder of EasyPark and Fairchance. “There was this producer I knew in Norway,” explains Kasin, “who runs this thing called the Artists’ Gala charity. He called and said, ‘There’s this guy you should meet. I think you’ll really hit it off.’” Kasin met Høili, had lunch and discussed the projects they were each working on. “We both immediately felt there was a connection,” recalls Kasin. No persuading was necessary. “We thought that if we combined forces we were going to get something that’s truly amazing.”

That meeting of the minds led to the merging of their companies and the formation of The Future Group. The mandate of Oslo-based The Future Group is to revolutionize the television medium by combining linear TV production with cutting-edge visual effects, interactive gameplay, home viewer participation and e-commerce. Their IMR concept ditches the limiting individual virtual reality (VR) headset, but conceptually keeps the idea of creating content that is a multi-level, intricate and immersive experience.

Lost in Time
Fast forward again, this time to 2014. Through another mutual friend, The Future Group formed an alliance with Fremantle Media. Fremantle, a global media company, has produced some of the highest-rated and longest-running shows in the world, and is responsible for top international entertainment brands such as Got Talent, Idol and The X Factor.

Kasin started developing the first IMR prototype. At this point, the Lost in Time production had expanded to include Ross Video and Epic Games. Ross Video is a broadcast technology innovator and Epic Games is a video game producer and the inventor of the Unreal game engine. The Future Group, in collaboration with Ross Video, engineered the production technology and developed a broadcast-compatible version of the Unreal game engine called Frontier, shown at NAB 2016, to generate high-resolution, realtime graphics used in the production.

On January 15, 2015, the first prototype was shown. When Fremantle saw the prototype, they were amazed. They went directly to stage two, moving to the larger stages at Dagslys Studios. “Lost in Time has been the driver for the technology,” explains Kasin. “We’re a very content-driven company. We’ve used that content to drive the development of the platform and the technology, because there’s nothing better than having actual content to set the requirements for the technology rather than building technology for general purposes.”

In Lost in Time, three studio contestants are set loose on a greenscreen stage and perform timed, physical game challenges. The audience, which could be watching at home or on a mobile device, sees the contestant seamlessly blended into a virtual environment built out of realtime computer graphics. The environments are themed as western, ice age, medieval times and Jurassic period sets (among others) with interactive real props.

The audience can watch the contestants play the game or participate in the contest as players on their mobile device at home, riding the train or literally anywhere. They can play along or against contestants, performing customized versions of the scripted challenges in the TV show. The mobile content uses graphics generated from the same Unreal engine that created the television version.

“It’s a platform,” reports partner Høili, referring to the technology behind Lost in Time. A business model is a way you make money, notes tech blogger Jonathan Clarks, and a platform is something that generates business models. So while Lost in Time is a specific game show with specific rules, built on television technology, it’s really a business technology framework where multiple kinds of interactive content could be generated. Lost in Time is like the Unreal engine itself, software that can be used to create games, VR experiences and more, limited only by the imagination of the content creator. What The Future Group has done is create a high-tech kitchen from which any kind of cuisine can be cooked up.

Soundstages and Gear
Lost in Time is produced on two greenscreen soundstages at Dagslys Studios in Oslo. The main “gameplay set” takes up all of Studio 1 (5,393 square feet) and the “base station set” is on Studio 3 (1,345 square feet). Over 150 liters (40 gallons) of ProCyc greenscreen paint was used to cover both studios.

Ross Video, in collaboration with The Future Group, devised an integrated technology of hardware and software that supports the Lost in Time production platform. This platform consists of custom cameras, lenses, tracking, control, delay, chroma key, rendering, greenscreen, lighting and switcher technology. This system includes the new Frontier hardware, introduced at NAB 2016, which runs the Unreal game engine 3D graphics software.

Eight Sony HDC-2500 cameras running HZC-UG444 software are used for the production. Five are deployed on the “gameplay set.” One camera rides on a technocrane, two are on manual pedestal dollies and one is on Steadicam. For fast-action tracking shots, another camera sits on the Furio RC dolly that rides on a straight track that runs the 90-foot length of the studio. The Furio RC pedestal, controlled by SmartShell, guarantees smooth movement in virtual environments and uses absolute encoders on all axes to send complete 3D tracking data into the Unreal engine.

There is also one Sony HDC-P1 camera that is used as a static, center stage, ceiling cam flying 30 feet above the gameplay set. There are three cameras in the home base set, two on Furio Robo dollies and one on a technocrane. In the gameplay set, all cameras (except the ceiling cam) are tracked with the SolidTrack IR markerless tracking system.

All filming is done at 1080p25 and output RGB 444 via SDI. They use a custom LUT on the cameras to avoid clipping and preserve an expanded dynamic range for post work. All nine camera ISOs, the separate camera “clean feeds,” are recorded with a “flat” LUT in RGB 444. For all other video streams, including keying and compositing, they use LUT boxes to convert the signal back to Rec 709.

Barnfind provided the fiber optic network infrastructure that links all the systems. Ross Video Dashboard controls the BarnOne frames as well as the router, Carbonite switchers, Frontier graphics system and robotic cameras.

A genlock signal distributed via OpenGear syncs all the gear to a master clock. The Future Group added proprietary code to Unreal so the render engine can genlock, receive and record linear timecode (LTC) and output video via SDI in all industry standard formats. They also added additional functionality to the Unreal engine to control lights via DMX, send and receive GPI signals, communicate with custom sensors, buttons, switches and wheels used for interaction with the games and controlling motion simulation equipment.

In order for the “virtual cameras” in the graphics systems and the real cameras viewing the real elements to have the exact same perspectives, an “encoded” camera lens is required that provides the lens focal length (zoom) and focus data. In addition, the virtual lens field of view (FOV) must be properly calibrated to match the FOV of the real lens. Full-servo digital lenses with 16-bit encoders are needed for virtual productions. Lost in Time uses three Canon lenses with these specifications: Canon Hj14ex4.3B-IASE, Canon Hj22ex7.6B-IASE-A and Canon Kj17ex7.7B-IASE-A.
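As a rough illustration of that calibration step (a generic pinhole-lens formula, not Ross Video or Canon code; the 9.6mm width of a 2/3-inch broadcast sensor is an assumption), the horizontal FOV that the virtual camera must adopt can be derived from the encoded focal length:

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal field of view of an ideal pinhole lens (ignores distortion)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# At the Kj17ex7.7's widest 7.7mm setting on a ~9.6mm-wide 2/3" sensor,
# the virtual camera would be set to roughly a 64-degree horizontal FOV.
wide_fov = horizontal_fov_deg(7.7, 9.6)
```

As the lens encoder streams new zoom values each frame, the same function is re-evaluated so the virtual FOV tracks the real one.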

The Lost in Time camera feeds are routed to the Carbonite family hardware: Ultrachrome HR, Carbonite production frame and Carbonite production switcher. Carbonite Ultrachrome HR is a stand-alone multichannel chroma key processor based on the Carbonite Black processing engine. On Lost in Time, the Ultrachrome switcher accepts the Sony camera RGB 444 signal and uses high-resolution chroma keyers, each with full control of delay management, fill color temperature for scene matching, foreground key and fill, and internal storage for animated graphics.

Isolated feeds of all nine cameras are recorded, plus two quad-splits with the composited material and the program feed. Metus Ingest, a proprietary hardware solution from The Future Group, was used for all video recording. Metus Ingest can simultaneously capture and record up to six HD channels of video and audio from multiple devices on a single platform.

Post Production
While the system is capable of broadcasting live, they decided not to go live for the debut. Instead, they are doing only a modest amount of post to retain the live feel. That said, the potential of the post workflow on Lost in Time arguably sets a whole new post paradigm. “Post allows us to continue to develop the virtual worlds for a longer amount of time,” says Kasin. “This gives us more flexibility in terms of storytelling. We’re always trying to push the boundaries with the creative content. How we tell the story of the different challenges.”

All camera metadata, including position, rotation and lens data, as well as all game interaction, was recorded in the Unreal engine with a proprietary system. This allowed the graphics to be played back later as a recorded session. It also let the editors change any part of the graphics non-destructively: they could replace 3D models or textures, change the tracking or point-of-view of any of the virtual cameras in post, or add cameras for more virtual “coverage.”
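What per-frame session recording might look like can be sketched as follows. The schema (timecode, position, rotation, lens data) mirrors the metadata listed above, but the field names and values are hypothetical:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class CameraSample:
    """One frame of tracked-camera metadata (hypothetical schema)."""
    timecode: str          # LTC, e.g. "10:00:04:12"
    position: tuple        # world-space (x, y, z) in meters
    rotation: tuple        # (pan, tilt, roll) in degrees
    focal_length_mm: float
    focus_m: float

session = []  # the recorded session, one entry per frame

def record_frame(sample: CameraSample) -> None:
    session.append(asdict(sample))

record_frame(CameraSample("10:00:00:00", (0, 2.1, -5), (0, 0, 0), 7.6, 4.0))
record_frame(CameraSample("10:00:00:01", (0.01, 2.1, -5), (0.2, 0, 0), 7.6, 4.0))

# Serialize so the graphics session can be replayed (and re-rendered
# with swapped models, retimed tracking or extra cameras) later.
replay = json.loads(json.dumps(session))
assert replay[1]["rotation"] == [0.2, 0, 0]
```

Because only metadata is stored, a replayed session can be re-rendered with different assets or camera paths without touching the recorded video, which is what makes the edits non-destructive.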

Lost in Time episodes were edited as a multicam project, based on the program feed, in Adobe Premiere CC. They have a multi-terabyte storage solution from Pixit Media running Tiger Technology’s workflow manager. “The EDL from the final edit is fed through a custom system, which then builds a timeline in Unreal to output EXR sequences for a final composite.”
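As a rough illustration of the EDL step, here is a minimal Python parser for CMX3600-style cut events of the kind such a custom system would read before rebuilding the timeline in Unreal. The event-line format is standard, but the sample content below is invented:

```python
import re

# One CMX3600-style cut event: number, reel, track, "C" (cut),
# source in/out, record in/out timecodes.
EVENT = re.compile(
    r"^(\d+)\s+(\S+)\s+(\S+)\s+C\s+"
    r"(\d\d:\d\d:\d\d:\d\d)\s+(\d\d:\d\d:\d\d:\d\d)\s+"
    r"(\d\d:\d\d:\d\d:\d\d)\s+(\d\d:\d\d:\d\d:\d\d)"
)

def parse_edl(text: str):
    """Yield (reel, source_in, source_out) for each cut event,
    i.e. the ranges a render system would turn into EXR sequences."""
    for line in text.splitlines():
        m = EVENT.match(line.strip())
        if m:
            yield m.group(2), m.group(4), m.group(5)

sample = """TITLE: LOST IN TIME EP01
001  CAM1 V  C  01:00:10:00 01:00:14:00 10:00:00:00 10:00:04:00
002  CAM3 V  C  01:02:00:00 01:02:05:12 10:00:04:00 10:00:09:12
"""
events = list(parse_edl(sample))
assert events[0] == ("CAM1", "01:00:10:00", "01:00:14:00")
```

Each parsed event tells the render side which camera session to replay and over what source range, so only the frames that survive the cut need a full-quality EXR render.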

That’s it for now, but be sure to visit this space again to see part two of our coverage on The Future Group’s Lost in Time. Our next story will include the real and virtual lighting systems, the SolidTrack IR tracking system, the backend component, an interview with Epic Games’ Kim Libreri about Unreal engine development/integration and an interview with a Lost in Time episode editor.


Daniel Restuccio, who traveled to Oslo for this piece, is a writer, producer and teacher. He is currently multimedia department chairperson at California Lutheran University in Thousand Oaks.

Mozart in the Jungle

The colorful dimensions of Amazon’s Mozart in the Jungle

By Randi Altman

How do you describe Amazon’s Mozart in the Jungle? Well, in its most basic form it’s a comedy about the changing of the guard — or maestro — at the New York Philharmonic, and the musicians who make up that orchestra. When you dig deeper, you get a behind-the-scenes look at the back-biting and crazy that goes on in the lives and heads of these gifted artists.

Timothy Vincent

Based on the novel Mozart in the Jungle: Sex, Drugs, and Classical Music by oboist Blair Tindall, the series — which won the Golden Globe last year and was nominated this year — has shot in a number of locations over its three seasons, including Mexico and Italy.

Since its inception, Mozart in the Jungle has been finishing in 4K and streaming in both SDR and HDR. We recently reached out to Technicolor’s senior color timer, Timothy Vincent, who has been on the show since the pilot to find out more about the show’s color workflow.

Did Technicolor have to gear up infrastructure-wise for the show’s HDR workflow?
We were doing UHD 4K already and were just getting our HDR workflows worked out.

What is the workflow from offline to online to color?
The dailies are done in New York based on the Alexa K1S1 709 LUT. (Technicolor On-Location Services handled dailies out of Italy, and Technicolor PostWorks handled them in New York.) After the offline and online, I get the offline reference made with the dailies, so I can refer back to it if I have a question about what was intended.

If someone was unsure about watching in HDR versus SDR, what would you tell them?
The emotional feel of both the SDR and the HDR is the same. That is always the goal in the HDR pass for Mozart. One of the experiences that is enhanced in the HDR is the depth of field and the three-dimensional quality you gain in the image. This really plays nicely with the feel in the landscapes of Italy, the stage performances where you feel more like you are in the audience, and the long streets of New York just to name a few.

When I’m grading the HDR version, I’m able to retain more highlight detail than I was in the SDR pass. For someone who has not yet been able to experience HDR, I would actually recommend that they watch an episode of the show in SDR first and then in HDR so they can see the difference between them. At that point they can choose what kind of viewing experience they want. I think that Mozart looks fantastic in both versions.

What about the “look” of the show? What kind of direction were you given?
We established the look of the show based on conversations and collaboration in my bay. It has always been a filmic look with soft blacks and yellow warm tones as the main palette for the show. Then we added in a fearlessness to take the story in and out of strong shadows. We shape the look of the show to guide the viewers to exactly the story that is being told and the emotions that we want them to feel. Color has always been used as one of the storytelling tools on the show. There is a realistic beauty to the show.

What was your creative partnership like with the show’s cinematographer, Tobias Datum?
I look forward to each episode and discovering what Tobias has given me as palette and mood for each scene. For Season 3 we picked up where we left off at the end of Season 2. We had established the look and feel of the show and only had to account for a large portion of Season 3 being shot in Italy, making sure to convey the different quality of light and the warmth and beauty of Italy. We did this by playing with natural warm skin tones and the contrast of light and shadow he was creating for the different moods and locations. The same can be said for the two episodes in Mexico in Season 2. I now know what Tobias likes and can make decisions I’m confident he will like.

From a director and cinematographer’s point of view, what kind of choices does HDR open up creatively?
It depends on if they want to maintain the same feel of the SDR or if they want to create a new feel. If they choose to go in a different direction, they can accentuate the contrast and color more with HDR. You can keep more low-light detail while being dark, and you can really create a separate feel to different parts of the show… like a dream sequence or something like that.

Any workflow tricks/tips/trouble spots within the workflow or is it a well-oiled machine at this point?
I have actually changed the way I grade my shows based on the evolution of this show. My end results are the same, but I learned how to build grades that translate to HDR much more easily and consistently.

Do you have a color assistant?
I have a couple of assistants that I work with who help me with prepping the show, getting proxies generated, color tracing and some color support.

What tools do you use — monitor, software, computer, scope, etc.?
I am working on Autodesk Lustre 2017 on an HP Z840, while monitoring on both a Panasonic CZ950 and a Sony X300. I work on Omnitek scopes off the downconverter to 2K. The show is shot on both Alexa XT and Alexa Mini, framing for 16×9. All finishing is done in 4K UHD for both SDR and HDR.

Anything you would like to add?
I would only say that everyone should be open to experiencing both SDR and HDR and giving themselves that opportunity to choose which they want to watch and when.

ACE Eddie nominees include Arrival, Manchester by the Sea, Better Call Saul

The American Cinema Editors (ACE) have named the nominees for the 67th ACE Eddie Awards, which recognize editing in 10 categories of film, television and documentaries.

Winners will be announced during ACE’s annual awards ceremony on January 27 at the Beverly Hilton Hotel. In addition to the regular editing awards, J.J. Abrams will receive the ACE Golden Eddie Filmmaker of the Year award.

Check out the nominees:

BEST EDITED FEATURE FILM (DRAMATIC)
Arrival
Joe Walker, ACE

Hacksaw Ridge
John Gilbert, ACE

Hell or High Water
Jake Roberts

Manchester by the Sea
Jennifer Lame
 
Moonlight
Nat Sanders, Joi McMillon

BEST EDITED FEATURE FILM (COMEDY)
Deadpool
Julian Clarke, ACE

Hail, Caesar!
Roderick Jaynes

The Jungle Book
Mark Livolsi, ACE

La La Land
Tom Cross, ACE

The Lobster
Yorgos Mavropsaridis

BEST EDITED ANIMATED FEATURE FILM
Kubo and the Two Strings
Christopher Murrie, ACE

Moana
Jeff Draheim, ACE

Zootopia
Fabienne Rawley and Jeremy Milton

BEST EDITED DOCUMENTARY (FEATURE)

13th
Spencer Averick

Amanda Knox
Matthew Hamachek

The Beatles: Eight Days a Week — The Touring Years
Paul Crowder

OJ: Made in America
Bret Granato, Maya Mumma and Ben Sozanski

Weiner
Eli B. Despres

BEST EDITED DOCUMENTARY (TELEVISION)
The Choice 2016
Steve Audette, ACE

Everything Is Copy
Bob Eisenhardt, ACE

We Will Rise: Michelle Obama’s Mission to Educate Girls Around the World
Oliver Lief

BEST EDITED HALF-HOUR SERIES
Silicon Valley: “The Uptick”
Brian Merken, ACE

Veep: “Morning After”
Steven Rasch, ACE

Veep: “Mother”
Shawn Paper

BEST EDITED ONE-HOUR SERIES — COMMERCIAL
Better Call Saul: “Fifi”
Skip Macdonald, ACE

Better Call Saul: “Klick”
Skip Macdonald, ACE & Curtis Thurber

Better Call Saul: “Nailed”
Kelley Dixon, ACE and Chris McCaleb

Mr. Robot: “eps2.4m4ster-s1ave.aes”
Philip Harrison

This is Us: “Pilot”
David L. Bertman, ACE

BEST EDITED ONE-HOUR SERIES – NON-COMMERCIAL
The Crown: “Assassins”
Yan Miles, ACE

Game of Thrones: “Battle of the Bastards”
Tim Porter, ACE

Stranger Things: “Chapter One: The Vanishing of Will Byers”
Dean Zimmerman

Stranger Things: “Chapter Seven: The Bathtub”
Kevin D. Ross

Westworld: “The Original”
Stephen Semel, ACE and Marc Jozefowicz

BEST EDITED MINISERIES OR MOTION PICTURE (NON-THEATRICAL)
All the Way
Carol Littleton, ACE

The Night Of: “The Beach”
Jay Cassidy, ACE

The People V. OJ Simpson: American Crime Story: “Marcia, Marcia, Marcia”
Adam Penn, Stewart Schill, ACE and C. Chi-yoon Chung

BEST EDITED NON-SCRIPTED SERIES:
Anthony Bourdain: Parts Unknown: “Manila” 
Hunter Gross, ACE

Anthony Bourdain: Parts Unknown: “Senegal”
Mustafa Bhagat

Deadliest Catch: “Fire at Sea: Part 2”
Josh Earl, ACE and Alexander Rubinow, ACE

Final ballots will be mailed on January 6, and voting ends on January 17. The Blue Ribbon screenings, where judging for all television categories and the documentary categories take place, will be on January 15. Projects in the aforementioned categories are viewed and judged by committees comprised of professional editors (all ACE members). All 850-plus ACE members vote during the final balloting of the ACE Eddies, including active members, life members, affiliate members and honorary members.

Main Image: Tilt Photo

What it sounds like when Good Girls Revolt for Amazon Studios

By Jennifer Walden

“Girls do not do rewrites,” says Jim Belushi’s character, Wick McFadden, in Amazon Studios’ series Good Girls Revolt. It’s 1969, and he’s the national editor at News of the Week, a fictional news magazine based in New York City. He’s confronting the new researcher Nora Ephron (Grace Gummer), who has claimed credit for a story that Wick has just praised in front of the entire newsroom staff. The trouble is that women at the magazine aren’t writers; they’re only “researchers,” following leads and gathering facts for the male writers.

When Nora’s writer drops the ball by delivering a boring courtroom story, she rewrites it as an insightful articulation of the country’s cultural climate. “If copy is good, it’s good,” she argues to Wick, testing the old conventions of workplace gender-bias. Wick tells her not to make waves, but it’s too late. Nora’s actions set in motion an unstoppable wave of change.

While the series is set in New York City, it was shot in Los Angeles. The newsroom they constructed had an open floor plan with a bi-level design. The girls are located in “the pit” area downstairs from the male writers. The newsroom production set was hollow, which caused an issue with the actors’ footsteps that were recorded on the production tracks, explains supervising sound editor Peter Austin. “The set was not solid. It was built on a platform, so we had a lot of boomy production footsteps to work around. That was one of the big dialogue issues. We tried not to loop too much, so we did a lot of specific dialogue work to clean up all of those newsroom scenes,” he says.

The main character Patti Robinson (Genevieve Angelson) was particularly challenging because of her signature leather riding boots. “We wanted to have an interesting sound for her boots, and the production footsteps were just useless. So we did a lot of experimenting on the Foley stage,” says Austin, who worked with Foley artists Laura Macias and Sharon Michaels to find the right sound. All the post sound work — sound editorial, Foley, ADR, loop group and final mix — was handled at Westwind Media in Burbank, under the guidance of post producer Cindy Kerber.

Austin and dialog editor Sean Massey made every effort to save production dialog when possible and to keep the total ADR to a minimum. Still, the newsroom environment and several busy street scenes proved challenging, especially when the characters were engaged in confidential whispers. Fortunately, “the set mixer Joe Foglia was terrific,” says Austin. “He captured some great tracks despite all these issues, and for that we’re very thankful!”

The Newsroom
The newsroom acts as another character in Good Girls Revolt. It has its own life and energy. Austin and sound effects editor Steve Urban built rich backgrounds with tactile sounds, like typewriters clacking and dinging, the sound of rotary phones with whirring dials and bell-style ringers, the sound of papers shuffling and pencils scratching. They pulled effects from Austin’s personal sound library, from commercial sound libraries like Sound Ideas, and had the Foley artists create an array of period-appropriate sounds.

Loop group coordinator Julie Falls researched and recorded walla that contained period appropriate colloquialisms, which Austin used to add even more depth and texture to the backgrounds. The lively backgrounds helped to hide some dialogue flaws and helped to blend in the ADR. “Executive producer/series creator Dana Calvo actually worked in an environment like this and so she had very definite ideas about how it would sound, particularly the relentlessness of the newsroom,” explains Austin. “Dana had strong ideas about the newsroom being a character in itself. We followed her guide and wanted to support the scenes and communicate what the girls were going through — how they’re trying to break through this male-dominated barrier.”

Austin and Urban also used the backgrounds to reinforce the difference between the hectic state of “the pit” and the more mellow writers’ area. Austin says, “The girls’ area, the pit, sounds a little more shrill. We pitched up the phones a little bit and made it feel more chaotic. The men’s raised area feels less strident. This was subtle, but I think it helps to set the tone that these girls were ‘in the pit,’ so to speak.”

The busy backgrounds posed their own challenge too. When the characters are quiet, the room still had to feel frenetic but it couldn’t swallow up their lines. “That was a delicate balance. You have characters who are talking low and you have this energy that you try to create on the set. That’s always a dance you have to figure out,” says Austin. “The whole anarchy of the newsroom was key to the story. It creates a good contrast for some of the other scenes where the characters’ private lives were explored.”

Peter Austin

The heartbeat of the newsroom is the teletype machines that fire off stories, which in turn set the newsroom in motion. Austin reports the teletype sound they used was captured from a working teletype machine they actually had on set. “They had an authentic teletype from that period, so we recorded that and augmented it with other sounds. Since that was a key motif in the show, we actually sweetened the teletype with other sounds, like machine guns for example, to give it a boost every now and then when it was a key element in the scene.”

Austin and Urban also built rich backgrounds for the exterior city shots. In the series opener, archival footage of New York City circa 1969 paints the picture of a rumbling city, moved by diesel-powered buses and trains, and hulking cars. That footage cuts to shots of war protestors and police lining the sidewalk. Their discontented shouts break through the city’s continuous din. “We did a lot of texturing with loop group for the protestors,” says Austin. He’s worked on several period projects over the years, and has amassed a collection of old vehicle recordings that they used to build the street sounds on Good Girls Revolt. “I’ve collected a ton of NYC sounds over the years. New York in that time definitely has a different sound than it does today. It’s very distinct. We wanted to sell New York of that time.”

Sound Design
Good Girls Revolt is a dialogue-driven show but it did provide Austin with several opportunities to use subjective sound design to pull the audience into a character’s experience. The most fun scene for Austin was in Episode 5 “The Year-Ender” in which several newsroom researchers consume LSD at a party. As the scene progresses, the characters’ perspectives become warped. Austin notes they created an altered state by slowing down and pitching down sections of the loop group using Revoice Pro by Synchro Arts. They also used Avid’s D-Verb to distort and diffuse selected sounds.
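Slowing and pitching down audio by varispeed-style resampling can be sketched in a few lines of Python. This is a crude linear-interpolation stand-in to show the idea, not how Revoice Pro processes time and pitch internally:

```python
import numpy as np

def varispeed(samples: np.ndarray, rate: float = 0.5) -> np.ndarray:
    """Play samples back at `rate` (< 1 = slower and pitched down)
    by resampling with linear interpolation."""
    n_out = int(len(samples) / rate)
    src_pos = np.arange(n_out) * rate          # fractional read positions
    return np.interp(src_pos, np.arange(len(samples)), samples)

# A test tone slowed to half speed lasts twice as long (and drops an octave).
sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)
slow = varispeed(tone, 0.5)
assert len(slow) == 2 * len(tone)
```

Because pitch and duration are coupled in simple resampling, slowing the loop group this way also lowers and smears it, which suits the warped-perception effect described here.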

“We got subjective by smearing different elements at different times. The regular sound would disappear and the music would dominate for a while and then that would smear out,” describes Austin. They also used breathing sounds to draw in the viewer. “This one character, Diane (Hannah Barefoot), has a bad experience. She’s crawling along the hallway and we hear her breathing while the rest of the sound slurs out in the background. We build up to her freaking out and falling down the stairs.”

Austin and Urban did their design and preliminary sound treatments in Pro Tools 12 and then handed it off to sound effects re-recording mixer Derek Marcil, who polished the final sound. Marcil was joined by dialog/music re-recording mixer David Raines on Stage 1 at Westwind. Together they mixed the series in 5.1 on an Avid ICON D-Control console. “Everyone on the show was very supportive, and we had a lot of creative freedom to do our thing,” concludes Austin.

Creating and tracking roaches for Hulu’s 11.22.63

By Randi Altman

Looking for something fun and compelling to watch while your broadcast shows are on winter break? You might want to try Hulu’s original eight-part miniseries 11.22.63, which the streaming channel released last February.

It comes with a pretty impressive pedigree — it’s based on a Stephen King novel, it’s executive produced by J.J. Abrams, it stars Oscar-nominee James Franco (127 Hours) and it’s about JFK’s assassination and includes time travel. C’mon!

The plot involves Franco’s character traveling back to 1960 in an effort to stop JFK’s assassination, but just as he makes headway, he feels the past pushing back in some dangerous, and sometimes gross, ways.

Bruce Branit

In the series pilot, Franco’s character, Jack Epping, is being chased by Kennedy’s security after he tries to sneak into a campaign rally. He ducks in a storage room to hide, but he’s already ticked off the past, which slowly serves him up a room filled with cockroaches that swarm him. The sequence is a slow build, with roaches crawling out, covering the floor and then crawling up him.

I’m not sure if Franco has a no-roach clause in his contract (I would), but in order to have control over these pests, it was best to create them digitally. This is where Bruce Branit, owner of BranitFX in Kansas City, Missouri, came in. Yes, you read that right, Kansas City, and his resume is impressive. He is a frequent collaborator with Jay Worth, Bad Robot’s VFX supervisor.

So for this particular scene, BranitFX had one or two reference shots, which they used to create a roach brush via Photoshop. Once the exact look was determined regarding the number of attacking roaches, they animated it in 3D and composited. They then used 2D and 3D tracking tools to track Franco while the cockroaches swarmed all over him.

Let’s find out more from Bruce Branit.

How early did you get involved in that episode? How much input did you have in how it would play out?
For this show, there wasn’t a lot of lead time. I came on after shooting was done and there was a rough edit. I don’t think the edit changed a lot after we started.

What did the client want from the scene, and how did you go about accomplishing that?
VFX supervisor Jay Worth and I have worked together on a lot of shows. We’d done some roaches for an episode of Almost Human, and also I think for Fringe, so we had some similar assets and background with talking “roach.” The general description was tons of roaches crawling on James Franco.

Did you do previs?
Not really. I rendered about 10 angles of the roach we had previously worked with and made Adobe Photoshop brushes out of each frame. I used that to paint up a still of each shot to establish a baseline for size, population and general direction of the roaches in each of the 25 or so shots in the sequence.

Did you have to play with the movements a lot, or did it all just come together?
We developed a couple base roach walks and behaviors and then populated each scene with instances of that. This changed depending on whether we needed them crossing the floor, hanging on a light fixture or climbing on Franco’s suit. The roach we had used in the past was similar to what the producers on 11.22.63 had in mind. We made a few minor modifications with texture and modeling. Some of this affected the rig we’d built so a lot of the animations had to be rebuilt.

Can you talk about your process/workflow?
This sequence was shot in anamorphic and featured a constantly flashing light on the set, going from dark emergency-red lighting to brighter fluorescent lights. So I generated unsqueezed, lens-distortion-removed and light-mitigated interim plates to pull all of our 2D and 3D tracking off of. The tracking was broken into 2D tracking, 3D tracking and by-hand 3D tracking of the roaches on Franco’s body as he turns and swats at them in a panic. The production had taped large “Xs” on his jacket to help with this roto-tracking, but those had to be painted out for many shots prior to the roaches reaching Franco.
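Unsqueezing a 2:1 anamorphic plate for tracking simply doubles the horizontal sample count. A dependency-light Python/NumPy sketch using nearest-neighbor resampling (real pipelines filter properly rather than duplicating columns):

```python
import numpy as np

def unsqueeze(plate: np.ndarray, squeeze: float = 2.0) -> np.ndarray:
    """Remove an anamorphic squeeze by widening the frame.

    `plate` is (height, width[, channels]). Nearest-neighbor column
    lookup keeps the sketch dependency-free; a production tool would
    use a proper reconstruction filter."""
    h, w = plate.shape[:2]
    new_w = int(round(w * squeeze))
    cols = np.minimum((np.arange(new_w) / squeeze).astype(int), w - 1)
    return plate[:, cols]

# A 2880x1620 anamorphic scan becomes 5760 samples wide once unsqueezed.
plate = np.zeros((1620, 2880), dtype=np.uint8)
flat = unsqueeze(plate)
assert flat.shape == (1620, 5760)
```

Trackers then solve on the geometrically correct flat plates, and the finished comp is conformed back onto the original anamorphic frames, as described below.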

The shots were tracked in Fusion Studio for 2D and SynthEyes for 3D. A few shots were also tracked in PFTrack.

The 3D roach assets were animated and rendered in NewTek LightWave. Passes for the red-light and white-light conditions were rendered, as well as ambient shadow and specular passes. Although we were now using tracking plates with the 2:1 anamorphic stretch removed, a special camera was created in LightWave that was actually double the anamorphic squeeze to duplicate the vertical bokeh and DOF of an anamorphic lens. The final composite was completed in Blackmagic Fusion Studio using the original anamorphic plates.

What was the biggest challenge you faced working on this scene?
Understanding the anamorphic workflow was a new challenge. Luckily, I had just completed a short project of my own called Bully Mech that was shot with Lomo anamorphic lenses. So I had just recently developed some familiarity and techniques to deal with the unusual lens attributes of those lenses. Let’s just say they have a lot of character. I talked with a lot of cinematographer friends to try to understand how the lenses behaved and why they stretched the out-of-focus element vertically while the image was actually stretched the other way.

What are you working on now?
I‘ve wrapped up a small amount of work on Westworld and a handful of shots on Legends of Tomorrow. I’ve been directing some television commercials the last few months and just signed a development deal on the Bully Mech project I mentioned earlier.

We are making a sizzle reel of the short that expands the scope of the larger world and working with concept designers and a writer to flesh out a feature film pitch. We should be going out with the project early next year.

GenPop’s Bill Yukich directs, edits gritty open for Amazon’s Goliath 

Director/editor Bill Yukich helmed the film noir-ish opening title sequence for Amazon’s new legal drama, Goliath. Produced by LA-based content creation studio GenPop, the black and white intro starts with Goliath lead actor Billy Bob Thornton jumping into the ocean. While underwater, and smoking a cigarette and holding a briefcase, he casually strolls through rooms filled with smoke and fire. At the end of the open, he rises from the water as the Santa Monica Pier appears next to him and as the picture turns from B&W to color. The Silent Comedy’s “Bartholomew” track plays throughout.

The ominous backdrop, of a man underwater but not drowning, is a perfect visual description of Thornton’s role as disgraced lawyer Billy McBride. Yukich’s visuals, he says, are meant to strike a balance between dreamlike and menacing.

The approved concept called for a dry shoot, so Yukich came up with solutions to make it seem as though the sequence was actually filmed underwater. Shot on a Red Magnesium Weapon camera, Yukich used a variety of in-camera techniques to achieve the illusion of water, smoke and fire existing within the same world, including the ingenious use of smoke to mimic the movement of crashing waves.

After wrapping the live-action shoot with Thornton, Yukich edited and color corrected the sequence. The VFX work was mostly supplementary, used to enhance the practical effects captured on set, such as adding extra fireballs into the frame to make the pyrotechnics feel fuller. Editing was done in Adobe Premiere, while VFX and color were handled in Autodesk Flame. In the end, 80 percent was live action and only 20 percent visual effects.

Once post production was done, Yukich projected the sequence onto a screen which was submerged underwater and reshot the projected footage. Though technically challenging, Yukich says, this Inception-style method of re-shooting the footage gave the film the organic quality that he was looking for.

Yukich recently worked as lead editor for Beyoncé’s visual album Lemonade. Stepping behind the lens was a natural progression for Yukich, who began directing concerts for bands like Godsmack and The Hollywood Undead, as well as music videos for HIM, Vision of Disorder and The Foo Fighters.

SMPTE: The convergence of toolsets for television and cinema

By Mel Lambert

While the annual SMPTE Technical Conferences normally put a strong focus on things visual, there is no denying that these gatherings offer a number of interesting sessions for sound pros from the production and post communities. According to Aimée Ricca, who oversees marketing and communications for SMPTE, pre-registration included “nearly 2,500 registered attendees hailing from all over the world.” This year’s conference, held at the Loews Hollywood Hotel and Ray Dolby Ballroom from October 24-27, also attracted more than 108 exhibitors in two exhibit halls.

Setting the stage for the 2016 celebration of SMPTE’s Centenary, opening keynotes addressed the dramatic changes that have occurred within the motion picture and TV industries during the past 100 years, particularly with the advent of multichannel immersive sound. The two co-speakers — SMPTE president Robert Seidel and filmmaker/innovator Doug Trumbull — chronicled the advance in audio playback sound since, respectively, the advent of TV broadcasting after WWII and the introduction of film soundtracks in 1927 with The Jazz Singer.

Robert Seidel

ATSC 3.0
Currently VP of CBS Engineering and Advanced Technology, with responsibility for TV technologies at CBS and the CW networks, Seidel headed up the team that assisted WRAL-HD, the CBS affiliate in Raleigh, North Carolina, to become the first TV station to transmit HDTV in July 1996.  The transition included adding the ability to carry 5.1-channel sound using Advanced Television Systems Committee (ATSC) standards and Dolby AC-3 encoding.

The 45th Grammy Awards Ceremony broadcast by CBS Television in February 2004 marked the first scheduled HD broadcast with a 5.1 soundtrack. The emergent ATSC 3.0 standard reportedly will provide increased bandwidth efficiency and compression performance. The drawback is the lack of backwards compatibility with current technologies, resulting in a need for new set-top boxes and TV receivers.

As Seidel explained, the upside for ATSC 3.0 will be immersive soundtracks, using either Dolby AC-4 or MPEG-H coding, together with audio objects that can carry alternate dialog and commentary tracks, plus other consumer features to be refined with companion 4K UHD, high dynamic range and high frame rate images. In June, WRAL-HD launched an experimental ATSC 3.0 channel carrying the station’s programming in 1080p with 4K segments, while in mid-summer South Korea adopted ATSC 3.0 and plans to begin broadcasts with immersive audio and object-based capabilities next February in anticipation of hosting the 2018 Winter Olympics. The 2016 World Series games between the Cleveland Indians and the Chicago Cubs marked the first live ATSC 3.0 broadcast of a major sporting event on experimental station Channel 31, with an immersive-audio simulcast on the Tribune Media-owned Fox affiliate WJW-TV.

Immersive audio will enable enhanced spatial resolution for 3D sound-source localization and therefore provide an increased sense of envelopment throughout the home listening environment, while audio “personalization” will include level control for dialog elements, alternate audio tracks, assistive services, other-language dialog and special commentaries. ATSC 3.0 also will support loudness normalization and contouring of dynamic range.
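Loudness normalization at its simplest is a static gain toward a target level. The Python sketch below uses plain RMS as a stand-in for the real measurement; broadcast systems actually measure LUFS per ITU-R BS.1770 with K-weighting and gating:

```python
import numpy as np

def normalize_to_target(samples: np.ndarray,
                        target_db: float = -24.0) -> np.ndarray:
    """Apply a static gain so the RMS level hits a target in dBFS.

    A crude stand-in for broadcast loudness normalization; real
    systems measure LUFS (ITU-R BS.1770), not plain RMS."""
    rms = np.sqrt(np.mean(samples ** 2))
    current_db = 20 * np.log10(rms)
    gain = 10 ** ((target_db - current_db) / 20)
    return samples * gain

# A half-scale sine sits around -9 dBFS RMS; normalize it to -24.
x = 0.5 * np.sin(np.linspace(0, 200 * np.pi, 48000))
y = normalize_to_target(x)
rms_db = 20 * np.log10(np.sqrt(np.mean(y ** 2)))
```

Dynamic range contouring goes a step further than this static gain, reshaping loud and quiet passages differently for the listening environment.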

Doug Trumbull

Higher Frame Rates
With a wide range of experience within the filmmaking and entertainment technologies, including visual effects supervision on 2001: A Space Odyssey, Close Encounters of the Third Kind, Star Trek: The Motion Picture and Blade Runner, Trumbull also directed Silent Running and Brainstorm, as well as special venue offerings. He won an Academy Award for his Showscan process for high-speed 70mm cinematography, helped develop IMAX technologies and now runs Trumbull Studios, which is innovating a new MAGI process to offer 4K 3D at 120fps. High production costs and a lack of playback environments meant that Trumbull’s Showscan format never really got off the ground, which was “a crushing disappointment,” he conceded to the SMPTE audience.

But meanwhile, responding to falling box office receipts during the ‘50s and ‘60s, Hollywood added more consumer features, including large-screen presentations and surround sound, although the movie industry also began to rely on income from the TV community for broadcast rights to popular cinema releases.

As Seidel added, “The convergence of toolsets for both television and cinema — including 2K, 4K and eventually 8K — will lead to reduced costs, and help create a global market [with] a significant income stream.” He also said that “cord cutting” — replacing cable subscription services with Amazon.com, Hulu, iTunes, Netflix and the like — is bringing people back to over-the-air broadcasting.

Trumbull countered that TV will continue at 60fps “with a live texture that we like,” whereas film will retain its 24fps frame rate “that we have loved for years and which has a ‘movie texture.’ Higher frame rates for cinema, such as the 48fps used by Peter Jackson for the Hobbit films, have too much of a TV look. Showscan at 120fps and a 360-degree shutter avoided that TV look, which is considered objectionable.” (Early reviews of director Ang Lee’s upcoming 3D film Billy Lynn’s Long Halftime Walk, which was shot in 4K at 120fps, have been critical of its video look and feel.)

Next-Gen Audio for Film and TV
During a series of “Advances in Audio Reproduction” conference sessions, chaired by Chris Witham, director of digital cinema technology at Walt Disney Studios, three presentations covered key design criteria for next-generation audio for TV and film. During his discussion called “Building the World’s Most Complex TV Network — A Test Bed for Broadcasting Immersive & Interactive Audio,” Robert Bleidt, GM of Fraunhofer USA’s audio and multimedia division, provided an overview of a complete end-to-end broadcast plant that was built to test various operational features developed by Fraunhofer, Technicolor and Qualcomm. These tests were used to evaluate an immersive/object-based audio system based on MPEG-H for use in Korea during planned ATSC 3.0 broadcasting.

“At the NAB Convention we demonstrated The MPEG Network,” Bleidt stated. “It is perhaps the most complex combination of broadcast audio content ever made in a single plant, involving 13 different formats.” This includes mono, stereo, 5.1-channel and other sources. “The network was designed to handle immersive audio in both channel- and HOA-based formats, using audio objects for interactivity. Live mixes from a simulated sports remote were connected to a network operating center, with distribution to affiliates, and then sent to a consumer living room, all using the MPEG-H audio system.”

Bleidt presented an overview of system and equipment design, together with details of a critical AMAU (audio monitoring and authoring unit) that will be used to mix immersive audio signals using existing broadcast consoles limited to 5.1-channel assignment and panning.

Dr. Jan Skoglund, who leads a team at Google developing audio signal processing solutions, addressed the subject of “Open-source Spatial Audio Compression for VR Content,” including the importance of providing realistic immersive audio experiences to accompany VR presentations and 360-degree 3D video.

“Ambisonics have reemerged as an important technique in providing immersive audio experiences,” Skoglund stated. “As an alternative to channel-based 3D sound, Ambisonics represent full-sphere sound, independent of loudspeaker location.” His fascinating presentation considered the ways in which open-source compression technologies can transport audio for various species of next-generation immersive media. Skoglund compared the efficacy of several open-source codecs for first-order Ambisonics, and also the progress being made toward higher-order Ambisonics (HOA) for VR content delivered via the internet, including enhanced experience provided by HOA.
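As a rough sketch of why Ambisonics are loudspeaker-independent: a mono source is encoded into first-order B-format components (W, X, Y, Z) from its direction alone, and decoding to a particular speaker layout happens as a separate, later step. The function below is purely illustrative — it uses the traditional FuMa convention, with the omnidirectional W channel scaled by 1/√2 — and is not drawn from any codec discussed in the session.

```python
import math

def encode_first_order(sample, azimuth_deg, elevation_deg):
    """Encode one mono sample into first-order B-format (FuMa W, X, Y, Z).

    Illustrative only: real systems encode whole signals and may use
    other normalizations (e.g. AmbiX/SN3D).
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample / math.sqrt(2)                  # omnidirectional component
    x = sample * math.cos(az) * math.cos(el)   # front-back figure-eight
    y = sample * math.sin(az) * math.cos(el)   # left-right figure-eight
    z = sample * math.sin(el)                  # up-down figure-eight
    return w, x, y, z

# A source dead ahead lands entirely in W and X; a source 90 degrees
# to the side lands in W and Y.
print(encode_first_order(1.0, 0, 0))
print(encode_first_order(1.0, 90, 0))
```

Because the four channels describe the sound field to first order, the same B-format stream can later be decoded to stereo, 5.1 or any other layout.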

Finally, Paul Peace, who oversees loudspeaker development for cinema, retail and commercial applications at JBL Professional — and designed the Model 9350, 9300 and 9310 surround units — discussed “Loudspeaker Requirements in Object-Based Cinema,” including a valuable in-depth analysis of the acoustic delivery requirements in a typical movie theater that accommodates object-based formats.

Peace is proposing the use of a new metric for surround loudspeaker placement and selection when the layout relies on venue-specific immersive rendering engines for Dolby Atmos and Barco Auro-3D soundtracks, with object-based overhead and side-wall channels. “The metric is based on three foundational elements as mapped in a theater: frequency response, directionality and timing,” he explained. “Current set-up techniques are quite poor for a majority of seats in actual theaters.”

Peace also discussed new loudspeaker requirements and layout criteria needed to ensure more consistent sound coverage throughout such venues, so they can more accurately replay material re-recorded on typical dub stages, which are often smaller and of different width/length/height dimensions than most multiplex environments.


Mel Lambert, who also gets photo credit on pictures from the show, is principal of Content Creators, an LA-based copywriting and editorial service. He can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

 

Quick Chat: Efilm’s new managing director Al Cleland

Al Cleland has been promoted to managing director of Deluxe’s Efilm, which is a digital color, finishing and location services company working on feature films, episodics and trailers. For the past eight years, Cleland has been VP of trailers at Efilm.

A 30-year veteran of the post business, Cleland started his career at Editel and joined CIS, which later became Efilm, as one of the company’s original employees. He served as senior VP/GM at Technicolor Creative Services for 10 years, then worked at PostWorks, Los Angeles, before returning to Efilm as VP of trailers. We threw three questions at Cleland; let’s see what he had to say…


After working on trailers for the last eight years, you must be excited to be working in all aspects of what Efilm does.
Our trailer department started out dedicated to finishing one studio’s trailers and we’ve expanded into a dedicated hub for the marketing departments of all the studios. Our trailers department has had the advantage of connectivity and common practices with all of Deluxe’s facilities throughout the world. I’ve loved being part of that growth process and, in my new position, I’ll continue to oversee that vital part of the company.

What’s challenging about trailers that people even in the business might not think about?
The great team in that division has to pull together shots and visual effects while the film itself is still being finished, which is a unique logistical challenge. They’re also making all kinds of small changes and creating effects specific to the trailer and to MPAA requirements for trailers. It’s a unique skill set.

What do you hope to accomplish for Efilm going forward?
Efilm is expanding in terms of the amount of work and the kind of work we’re doing, and I intend to push that expansion along at an even faster rate. We’ve always had an amazing team of colorists, producers and editors that are really the heart of Efilm. We have wonderful technical and support staff. And, of course, we have access to all of those elements at our partner companies and we continue to build on that.

It’s early to talk about specifics, but we all know the industry is changing rapidly. We’ve been among the very first to introduce new technologies and workflows and that’s something the team here is going to expand on.

The A-List — Director Ed Zwick talks Jack Reacher: Never Go Back

By Iain Blair

Director, screenwriter and producer Ed Zwick got his start in television as co-creator of the Emmy Award-winning series Thirtysomething. His feature film career kicked off when he directed the Rob Lowe/Demi Moore vehicle, About Last Night. Zwick went on to direct the Academy Award-winning films Glory and Legends of the Fall. 

Zwick also produced the Oscar-nominated I Am Sam, as well as Traffic — winner of two Golden Globes and four Academy Awards — directed by Steven Soderbergh. He won an Academy Award as a producer of 1999’s Best Picture, Shakespeare in Love.

Ed Zwick

His latest film, Paramount’s Jack Reacher: Never Go Back, reunites him with The Last Samurai star Tom Cruise. It’s an action-packed follow-up to the 2012 hit Jack Reacher, which grossed over $200 million at the worldwide box office.

The set-up? Years after resigning command of an elite military police unit, the nomadic, righter-of-wrongs Reacher is drawn back into the life he left behind when his friend and successor, Major Susan Turner (Cobie Smulders), is framed for espionage. Naturally, Reacher will stop at nothing to prove her innocence and to expose the real perpetrators behind the killings of his former soldiers. Mayhem quickly ensues, helped along with plenty of crazy stunts and cutting-edge VFX.

I recently chatted with Zwick about making the film.

You’ve worked in so many genres, but this is your first crime thriller. 
I’ve always loved crime thrillers — especially films like Three Days of the Condor and Bullitt where the characters and their relationships are far more important than the action. That’s where I tried to take this.

Tom Cruise is famous for being a perfectionist and doing all his own stunts when possible. Any surprises re-teaming with him?
Yeah, I always say the most boring job on set is being Tom’s stunt double. Tom is a perfectionist and he loves to be involved in every aspect of the production, so no surprises there. He has such a great love for all the different genres, but a particular love for action films and thrillers. It was very important for him that he didn’t do something that was like all the other films out there. I think we all felt that superhero fatigue has been setting in, so the idea was to do things on a more human scale, and make it more realistic and authentic, both with the characters and with the action.

What were the main technical challenges of making this?
We shot it all in New Orleans, and it’s a road movie. So we had to shoot Washington, DC, there too, and create a cross-country journey with different airports and so on. We did all of that with some sleight of hand and extensions and VFX. I think it’s also a challenge to come up with new settings for action pieces we haven’t seen before, and that’s where the parade and rooftop sequences in New Orleans come in, along with the fight on the plane. The book it’s based on is set in LA and DC, but they’re both tough to shoot in, and with the great tax breaks in Louisiana and all the great locations, it made sense to shoot there.

Every shoot is tough, but this one was pretty straightforward, though shutting down the whole French Quarter took some doing — all the city officials were so helpful. The rooftop stuff was very challenging, and we did a lot of prep and began on post right away, on day one.

Do you like the post process?
I love it. I can sit there with a cup of coffee and my editor and rewrite the entire script as much as I can. It’s the best and most creative part of the whole process.

Where did you post?
We did it all in LA. We just set up some offices in Santa Monica where I live and did all the editorial there.

You’ve typically worked with Steve Rosenblum, but you edited this film with Billy Weber, who’s been nominated twice for Oscars (Top Gun, The Thin Red Line) and whose credits include The Warriors, Pee Wee’s Big Adventure, Beverly Hills Cop II, Midnight Run and The Tree of Life, among others. How did that relationship work?
Steve wasn’t available so I asked him, ‘Who can I hire that you’d be jealous of?’ He said, ‘There’s only one person — Billy Weber. He’s your guy.’ He was right. Billy’s legendary and has cut so many great movies for directors like Terrence Malick, Tony Scott, Walter Hill, Martin Brest and Tim Burton, and he’s a prince. I love editing, and I loved working with him. He’s a great collaborator, and I was very open to all his ideas.

He came to New Orleans and we set up a cutting room there and he did the assembly there as we shot. Then we moved back to LA. Billy lives on the other side of town, so to beat the traffic we’d start every day at 6am and wrap at 3pm. It was a great system.

Can you talk about the importance of music and sound in the film?
I love working with the audio, and Henry Jackman did a great, classic-modern score. It was crucial, not just for all the action, but for some of the quieter moments. Then we mixed the sound at Fox with Andy Nelson, who’s now done 10 of my movies.

This is obviously not a VFX-driven piece, but the VFX play a big role.
You’re right, and we didn’t want it to look like there was a ton of CG work. In the end, we had well over 200 shots, including stuff like the Capitol Dome in DC in the background and tons of bullet hits on cars and enhancements. But I didn’t want any of the VFX to be noticeable. Lola and Flash Film Works did the work, and often today, when you need bullet hits on a car, it’s far cheaper and faster to add them in post, so there was a lot of that.

How important was the DI on this and where did you do it?
We did it at Company 3 in Santa Monica with colorist Stephen Nakamura, who is brilliant. We went for a natural look but also enhanced some of the dramatic scenes [via Resolve]. It’s remarkable what you can do now in the DI, and as we shot on film I wanted to preserve some of that real film look, so I think it’s a light touch, but also a sophisticated one in the DI.

What’s next?
I don’t have anything lined up, so I’m taking a break.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Vancouver names first film commissioner, opens Film & Media Centre

The city of Vancouver and the Vancouver Economic Commission (VEC) have named David Shepheard as Vancouver’s first film commissioner. At the same time, they announced the creation of the Vancouver Film & Media Centre, which aims to bring more work to the city. Vancouver is already the third largest film production center in North America — behind New York and Los Angeles.

Shepheard, who brings over 16 years of experience to his new role, previously ran the film commission services for Film London, the capital’s media development agency. Film London supports producers shooting in London, advises them on accessing UK tax breaks and co-production opportunities, and promotes London’s film, TV, post production, animation, VFX and games sectors. Prior to that, Shepheard was CEO of Open House Films in the UK — a consultancy partnership specializing in developing strategies for film commissions and media development agencies at city, regional and state levels in Europe, Africa, North America, Asia and the Middle East.

“As one of Vancouver’s high-growth industries, film and media has been a big contributor to our economic growth and has a tremendous positive impact in our city,” reports Mayor Gregor Robertson. “David Shepheard’s expertise and experience — coupled with the new Film & Media Centre — will take our digital entertainment industry to the next level on the international stage.”

Nancy Mott and actor Ryan Reynolds during the filming of Deadpool. Deadpool 2 returns to Vancouver for production in January 2017.

In recent years, Vancouver has experienced big growth in film and TV production, and the industry has been a top contributor to British Columbia’s diversified economy, creating jobs, services and tax revenue. Pilot production filming alone increased by 67 percent from 2015 to 2016. In 2015, the city issued almost 5,000 permits for 353 productions filmed in Vancouver, and provisional data for 2016 from the city of Vancouver’s film office projects that this year’s numbers will be even higher.

Shepheard will work with executive director Nancy Mott to attract film and TV production, co-production and post production projects to Vancouver. Through the prior business development activities of its Asia-Pacific Centre, the VEC has already identified opportunities in Japan, China and Korea, all of which have been stepping up investment and production activity.

The sounds of Brooklyn play lead role in HBO’s High Maintenance

By Jennifer Walden

New Yorkers are jaded, and one of the many reasons is that just about anything they want can be delivered right to their door: Chinese food, prescriptions, craft beer, dry cleaning and weed. Yes, weed. This particular item is delivered by “The Guy,” the protagonist of HBO’s new series, High Maintenance.

The Guy (played by series co-creator Ben Sinclair) bikes around Brooklyn delivering pot to a cast of quintessentially quirky New York characters. Series creators Sinclair and Katja Blichfeld string together vignettes — using The Guy as the common thread — to paint a realistic picture of Brooklynites.

Nutmeg’s Andrew Guastella. Photo credit: Carl Vasile

“The Guy delivers weed to people, often going into their homes and becoming part of their lives,” explains sound editor/re-recording mixer Andrew Guastella at Nutmeg, a creative marketing and post studio based in New York. “I think that what a lot of viewers like about the show is how quickly you come to know complete strangers in a sort of intimate way.”

Blichfeld and Sinclair find inspiration for their stories from their own experiences, says Guastella, who follows suit in terms of sound. “We focus on the realism of the sound, and that’s what makes this show unique.” The sound of New York City is ever-present, just as it is in real life. “Audio post was essential for texturizing our universe,” says Sinclair. “There’s a loud and vibrant city outside of those apartment walls. It was important to us to feel the presence of a city where people live on top of each other.”

Big City Sounds
That edict for realism drives all sound-related decisions on High Maintenance. On a typical series, Guastella would strive to clean every noise out of the production dialogue, but for High Maintenance, the sounds of sirens, horns, traffic and even car alarms are left in the tracks, as long as they’re not drowning out the dialogue. “It’s okay to leave sounds in that aren’t obtrusive and that sell the fact that they are in New York City,” he says.

For example, a car alarm went off during a take. It wasn’t in the way of the dialogue but it did drop out on a cut, making it stand out. “Instead of trying to remove the alarm from the dialogue, I decided to let it roll and I added a chirp from a car alarm, as if the owner turned off the alarm [or locked the car], to help incorporate it into the track. A car alarm is a sound you hear all the time in New York.”

Exterior scenes are acceptably lively, and if an interior scene is feeling too quiet, Guastella can raise a neighborly ruckus. “In New York, there’s always that noisy neighbor. Some show creators might be a little hesitant to use that because it could be distracting, but for this show, as long as it’s real, Ben and Katja are cool with it,” he says. During a particularly quiet interior scene, he tried adding the sounds of cars pulling away and other light traffic to fill up the space, but it wasn’t enough, so Guastella asked the creators, “How do you feel about the neighbors next door arguing?” They said, “That’s real. That’s New York. Let’s try it out.”

Guastella crafted a commotion based on his own experience of living in an apartment in Queens. Every night he and his wife would hear the downstairs neighbors fighting. “One night they were yelling and then all we heard was this loud, enormous slam. Hopefully, it was a door,” jokes Guastella. “Ben and Katja are always pulling from their own experiences, so I tried to do that myself with the soundtrack.”

Despite the skill of production sound mixer Dimitri Kouri, and a high tolerance for the ever-present sound of New York City, Guastella still finds himself cleaning dialogue tracks using iZotope’s RX 5 Advanced. One of his favorite features is RX Connect. With this plug-in, he can select a region of dialogue in his Avid Pro Tools session and send it directly to iZotope’s standalone RX application, where he can edit, clean and process it. Once he’s satisfied, he can return the cleaned-up dialogue right back in sync to the spot on the Pro Tools timeline he sent it from.

“I no longer have to deal with exporting and importing audio files, which was not an efficient way to work,” he says. “And for me, it’s important that I work within the standalone application. There are plug-in versions of some RX tools, but for me, the standalone version offers more flexibility and the opportunity to use the highly detailed visual feedback of its audio-spectrum analyzer. The spectrogram makes using tools like Spectral Repair and De-click that much more effective and efficient. There are more ways to use and combine the tools in general.”

Guastella has been with the series since 2012, during its webisode days on Vimeo. Back then, it was a passion project, something he’d work on at home on his own time. From the beginning, he’s handled everything audio: the dialogue cleaning and editing, the ambience builds, the Foley and the final mix. “Andrew [Guastella] brought his professional ear and was always such a pleasure to work with. He always delivered and was always on time,” says Blichfeld.

The only aspect that Guastella doesn’t handle is the music. “That’s a combination of licensed music (secured by music supervisor Liz Fulton) and original composition by Chris Bear. The music is well-established by the time the episode gets to me,” he says.

On the Vimeo webisodes, Guastella would work an episode’s soundtrack into shape, and then send it to Blichfeld and Sinclair for notes. “They would email me or we would talk over the phone. The collaborative process wasn’t immediate,” he says. Now that HBO has picked up the series and renewed it for Season 2, Guastella is able to work on High Maintenance in his studio at Nutmeg, where he has access to all the amenities of a full-service post facility, such as sound effects libraries, an ADR booth, a 5.1 surround system and room to accommodate the series creators who like to hang around and work on the sound with Guastella. “They are very particular about sound and very specific. It’s great to have instant access to them. They were here more than I would’ve expected them to be and it was great spending all that time with them personally and professionally.”

In addition to being a series co-creator, co-writer and co-director with Blichfeld, Sinclair is also one of the show’s two editors. This meant the two of them were being pulled in several directions, which eventually prevented them from spending so much time in the studio with Guastella. “By the last three episodes of this season, I had absorbed all of their creative intentions. I was able to get an episode to the point of a full mix and they would come in just for a few hours to review and make tweaks.”

With a bigger budget from HBO, Guastella is also able to record ADR when necessary, record loop group and perform Foley for the show at Nutmeg. “Now that we have a budget and the space to record actual Foley, we’re faced with the question of how much Foley do we want to do? When you Foley sound for every movement and footstep, it doesn’t always sound realistic, and the creators are very aware of that,” says Guastella.

5.1 Surround Mix
In addition to a minimalist approach, another way he keeps the Foley sounding real is by recording it in the real world. In Episode 3, the story is told from a dog’s POV. Using a TASCAM DR 680 digital recorder and a Sennheiser 416 shotgun mic, Guastella recorded an “enormous amount of Foley at home with my Beagle, Bailey, and my father-in-law’s Yorkie and Doberman. I did a lot of Foley recording at the dog park, too, to capture Foley for the dog outside.”

Another difference between the Vimeo episodes and the HBO series is the final mix format. “HBO requires a surround sound 5.1 mix and that’s something that demands the infrastructure of a professional studio, not my living room,” says Guastella. He takes advantage of the surround field by working with ambiences, creating a richer environment during exterior shots which he can then contrast with a closer, confined sound for the interior shots.

“This is a very dialogue-driven show so I’m not putting too much information in the surrounds. But there is so much sound in New York City, and you are really able to play with perspective of the interior and exterior sounds,” he explains. For example, the opening of Episode 3, “Grandpa,” follows Gatsby the dog as he enters the front of his house and eventually exits out of the back. Guastella says he was “able to bring the exterior surrounds in with the characters, then gradually pan them from surround to a heavier LCR once he began approaching the back door and the backyard was in front of him.”
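A move like the one described above is, in effect, a crossfade between the surround and front (LCR) buses. A common way to keep overall loudness constant during such a move is an equal-power pan law; the sketch below is a generic illustration of that law, not Guastella’s actual Pro Tools automation.

```python
import math

def equal_power_pan(t):
    """Equal-power crossfade gains for a position t in [0.0, 1.0]:
    t=0.0 puts the sound fully in the surround bus, t=1.0 fully in
    the front (LCR) bus. Squared gains always sum to 1, so perceived
    level stays constant as the sound travels forward."""
    surround_gain = math.cos(t * math.pi / 2)
    front_gain = math.sin(t * math.pi / 2)
    return surround_gain, front_gain

# Halfway through the move, both buses carry the sound at about 0.707
# (-3 dB each), which sums back to unity power.
print(equal_power_pan(0.5))
```

The cosine/sine pair is the standard choice because cos² + sin² = 1 at every point of the travel, avoiding the level dip a simple linear crossfade would cause in the middle.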

The series may have made the jump from Vimeo to HBO but the soul of the show has changed very little, and that’s by design. “Ben, Katja, and Russell Gregory [the third executive producer] are just so loyal to the people who helped get this series off the ground with them. On top of that, they wanted to keep the show feeling how it did on the web, even though it’s now on HBO. They didn’t want to disappoint any fans that were wondering if the series was going to turn into something else… something that it wasn’t. It was really important to the show creators that the series stayed the same, for their fans and for them. Part of that was keeping on a lot of the people who helped make it what it was,” concludes Guastella.

Check out High Maintenance on HBO, Fridays at 11pm.


Jennifer Walden is a NJ-based audio engineer and writer. Follow her at @audiojeney.

The color and sound of Netflix’s The Get Down

The Get Down, Baz Luhrmann’s new series for Netflix, tells the story of the birth of hip-hop in the late 1970s in New York’s South Bronx. The show depicts a world filled with colorful characters pulsating to the rhythms of an emerging musical form.

Shot on the Red Dragon and Weapon in 6K, sound and picture finishing for the full series was completed over several months at Technicolor PostWorks New York. Re-recording mixers Martin Czembor and Eric Hirsch, working under Luhrmann’s direction and alongside supervising sound designer Ruy Garcia, put the show’s dense soundtrack into its final form.

The Get Down

Colorist John Crowley, meanwhile, collaborated with Luhrmann, cinematographer William Rexer and executive producer Catherine Martin in polishing its look. “Every episode is like a movie,” says Czembor. “And the expectations, on all levels, were set accordingly. It was complex, challenging, unique… and super fascinating.”

The Get Down’s soundtrack features original music from composer Elliott Wheeler, along with classic hip-hop tracks and flashes of disco, new wave, salsa and even opera. And the music isn’t just ambiance; it is intricately woven into the story. To illustrate the creative process, a character’s attempt to work out a song lyric might seamlessly transform into a full-blown finished song.

According to Garcia, the show’s music team began working on the project from the writing stage. “Baz uses songs as plot devices — they become part of the story. The music works together with the sound effects, which are also very musical. We tuned the trains, the phones and other sounds and synced them to the music. When a door closes, it closes on the beat.”

Ruy Garcia

The blending of story, music, dialogue and sound came together in the mix. Hirsch, who mixed Foley and effects, recalls an intensive trial-and-error process to arrive at a layering that felt right. “There was more music in this show than anything I’ve previously worked on,” he says. “It was a challenge to find enough sound effects to fill out the world without stepping on the music. We looked for places where they could breathe.”

In terms of tools, they used Avid Pro Tools 12 HD for sound and music, ADR Manager for ADR cueing and Soundminer for sound effects library management. For sound design they called on Altiverb, Speakerphone and SoundToys EchoBoy to create spaces, and iZotope Iris for sampling. “We mixed using two Avid Pro Tools HDX2 systems and a double operator Avid S6 control surface,” explains Garcia. “The mix sessions were identical to the editorial sessions, including plug-ins, to allow seamless exchange of material and elaborate conformations.”

Music plays a crucial role in the series’ numerous montage sequences, acting as a bridge as the action shifts between various interconnecting storylines. “In Episode 2, Cadillac interrogates two gang members about a nightclub shooting, as Shaolin and Zeke are trying to work out the ‘get down’ — finding the break for a hip-hop beat,” recalls Czembor. “The way those two scenes are cut together with the music is great! It has an amazing intensity.”

Czembor, who mixed dialogue and music, describes the mix as a collaborative process. During the early phases, he and Hirsch worked closely with Wheeler, Garcia and other members of the sound and picture editing teams. “We spent several days pre-mixing the dialogue, effects and music to get it into a basic shape that we all liked,” he explains. “Then Baz would come in and offer ideas on what to push and where to take it next. It was a fun process. With Baz, bigger and bolder is always better.”

The team mostly called on Garcia’s personal sound library, “plus a lot of vintage New York E train and subway recordings from some very generous fellow sound editors,” he says. “Shaolin Fantastic’s kung-fu effects come from an old British DJ’s effects record. We also recorded and edited extensive Foley, which was edited against the music reference guide.”

The Color of Hip-Hop
Bigger and bolder also applied to the picture finishing. Crowley notes that cinematographer William Rexer employed a palette of rich reddish brown, avocado and other colors popular during the ‘70s, all elevated to levels slightly above simple realism. During grading sessions with Rexer, Martin and Luhrmann, Crowley spent time enhancing the look in FilmLight’s Baselight, sharpening details and using color to complement the tone of the narrative. “Baz uses color to tell the story,” he observes. “Each scene has its own look and emotion. Sometimes, individual characters have their own presence.”

John Crowley

Crowley points to a scene where Mylene gives an electrifying performance in a church (photo above). “We made her look like a superstar,” he recalls. “We darkened the edges and did some vignetting to make her the focus of attention. We softened her image and added diffusion so that she’s poppy and glows.”

The series uses archival news clips, documentary material and stock footage as a means of framing the story in the context of contemporary events. Crowley helped blend this old material with the new through the use of digital effects. “In transitioning from stock to digital, we emulated the gritty 16mm look,” he explains. “We used grain, camera shake, diffusion and a color palette of warm tones. Then, once we got into a scene that was shot digitally, we would gradually ride the grain out, leaving just a hint.”

Crowley says it’s unusual for a television series to employ such complex, nuanced color treatments. “This was a unique project created by a passionate group of artists who had a strong vision and knew how to achieve it,” he says.

The Path‘s post path to UHD

By Randi Altman

On a recent visit to the Universal Studios lot in Los Angeles, I had the pleasure of meeting the post team behind Hulu’s The Path, which stars Aaron Paul and Hugh Dancy. The show is about a cult — or as their members refer to it, a movement — that on the outside looks like do-gooders preaching peace and love, but on the inside there are some freaky goings-on.

The first time I watched The Path, I was taken with how gorgeous the picture looked, and when I heard the show was posted and delivered in UHD, I understood why.

“At the time we began to prep season one — including the pilot — Hulu had decided they would like all of their original content shows to deliver in UHD,” explains The Path producer Devin Rich. “They were in the process of upgrading their streaming service to that format so the viewers at home, who had the capability, could view this show in its highest possible quality.”

For Rich (Parenthood, American Odyssey, Deception, Ironside), the difference that UHD made to the picture was significant. “There is a noticeable difference,” he says. “For lack of better words, the look is more crisp and the colors pop. There, of course, is a larger amount of information in a UHD file, which gives us a wider range to make it look how we want it to, or at least closer to how we want it to look.”

L-R: Tauzhan Kaiser, Craig Budrick (both standing), Jacqueline LeFranc and Joe Ralston.

While he acknowledges that as a storyteller UHD “doesn’t make much of a difference” because scripts won’t change, his personal opinion is that “most viewers like to feel as if they are living within the scene rather than being a third party to the scene.” UHD helps get them there, as does the team at NBCUniversal StudioPost, which consists of editor Jacqueline LeFranc, who focuses on finishing, adding titles, dropping in the visual effects and making the final file; colorist Craig Budrick; lead digital technical operations specialist Joe Ralston, who focuses on workflow; and post production manager Tauzhan Kaiser.

They were all kind enough to talk to us about The Path’s path to UHD.

Have you done a UHD workflow on any other shows?
Ralston: We have a lot of shows that shoot UHD or high resolution, but The Path was our first television show that finished UHD all the way through.

What is it shot on?
Ralston: They shoot Red 3840×2160, and they also shoot 4800×2700, so almost 5K. UHD is technically twice the height and twice the width of HD, so while it’s still 16×9, resolution-wise it’s double.
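Ralston’s arithmetic is worth spelling out: doubling both dimensions of the 1920×1080 HD raster gives the 3840×2160 UHD raster, which quadruples the pixel count while keeping the same 16×9 aspect. A quick illustrative check:

```python
# Illustrative check: UHD doubles HD in each dimension, which
# quadruples the total pixel count at the same 16x9 aspect ratio.
HD = (1920, 1080)
UHD = (3840, 2160)

assert UHD[0] == 2 * HD[0] and UHD[1] == 2 * HD[1]   # double width and height
assert UHD[0] / UHD[1] == HD[0] / HD[1]              # still 16x9

ratio = (UHD[0] * UHD[1]) / (HD[0] * HD[1])
print(ratio)  # 4.0 -- four times the pixels of HD
```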

From an infrastructure perspective, were you guys prepared to deal with all that data?
Ralston: Yes. At the facility here at NBCUniversal StudioPost, not only do we do TV work, but there’s remastering work — all the centennial titles, for example.

Kaiser: We’ve done Spartacus, All Quiet on the Western Front, The Birds, Buck Privates, Dracula (1931), Frankenstein, Out of Africa, Pillow Talk, The Sting, To Kill a Mockingbird, Touch of Evil, Double Indemnity, Holiday Inn and King of Jazz.

Ralston: The infrastructure, as far as storage and monitoring, was already in place here. We knew that this was coming, so slowly the facility has been preparing and gearing up for it. We had been ready, but this was really the first show that requested end-to-end UHD. Usually we do a show that’s shot UHD or 5K but finishes in HD, so when we leave the editorial room, we’re then in an HD world. In this case, we were not.

LeFranc: Joe’s group, which is digital tech ops, doesn’t really exist in other places that I know of. They develop, train and work with everybody else in the facility to develop these kinds of workflows in order to get ahead of it. So we are prepared, adequately trained and aware of all the pitfalls and any other concerns there might be. That’s a great thing for us, because it’s knowledge.

Other shows have gone UHD, but some in season two, and they were playing catch up in terms of workflow.
Ralston: We’d been thinking about it for a long time. Like I said, the difference with this show versus some of the others is that everyone else, when it got to color, went to HD. This one, when we got to color, stayed UHD all the way through from there on out.

So, that was really the big difference for a show like this. The big challenges for this one were — and Jacqueline can go into it a little bit more — when you get into things like titling or creating electronic titles, there’s not a lot of gear out there that does that.

Jacqueline, can you elaborate on that?
LeFranc: There were obstacles that I encountered when trying to establish the initial workflow. So, for example, the character generator that is used to create the titles has an option for outputting 4K, but after testing it I realized it wasn’t 4K. It looked like it was just up-rezed.

So I came up with a workflow where, in the character generator, we would make the title larger than we needed it to be and then size it down in Flame. Then we needed a new UHD monitor, the Sony BVM-X300. The broadcast monitor didn’t work anymore, because if you want to see UHD in RGB, it has to have a quad-link output.

Craig, did your color process change at all?
Budrick: No, there wasn’t really any change for me in color. The creative process is still the creative process. The color corrector supports a higher resolution file, so it wasn’t an issue of needing new equipment or anything like that.

What systems do you use?
Budrick: We are primarily an Autodesk facility, so we use Flame, Flame Premium and Lustre for color. We also have Avids.

Can you walk us through the workflow?
Ralston: We don’t do the dailies on this project here. It’s all done in New York at Bling. We receive all the camera master files. While they do use drones and a couple of other cameras, a large percentage of the show is shot on the Red Epic Dragon at 3840×2160.

We get all those camera master files and load them onto our server. Then we receive an Avid bin or sequence from the client, bring that into our Avid here, and link to those camera master files on the SAN. Once they’re linked, we then have a high-res timeline we can play through. We take the low-res offline version that they gave us and split it — our editor goes through it and makes sure that everything’s there and matched.

Once that part is complete, we transcode that out to the Avid codec DNxHR 444, which is basically 440Mb — a UHD file that the Avid is outputting. Once we get that UHD file out of the Avid, we flip that DNxHR MXF file into a UHD 3840×2160 DPX sequence. That’s where Craig would pick up on color. He would take that DPX sequence and color from there.
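To get a feel for why storage tuning matters at this stage, here is a back-of-the-envelope sketch of the data volumes a UHD DPX conform implies. The per-frame size assumes standard 10-bit RGB DPX packing (three 10-bit samples per 32-bit word, roughly 4 bytes per pixel, headers ignored); the frame rate and episode length are generic assumptions, not figures from the show:

```python
# Back-of-the-envelope DPX storage math (illustrative assumptions:
# 10-bit RGB DPX ~= 4 bytes/pixel; 24 fps; 43-minute episode).
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4                              # three 10-bit samples per 32-bit word
frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL   # ~33 MB per frame

FPS = 24
EPISODE_MINUTES = 43                             # assumed drama runtime
frames = FPS * 60 * EPISODE_MINUTES
episode_tb = frame_bytes * frames / 1e12

print(f"{frame_bytes / 1e6:.1f} MB/frame, ~{episode_tb:.1f} TB per episode")
# Sustained realtime playback needs roughly one frame's worth of data per 1/24 s:
print(f"~{frame_bytes * FPS / 1e6:.0f} MB/s for realtime playback")
```

Numbers like these are why a SAN volume “tuned for 4K,” as Ralston describes below, is more than a nicety.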

Craig, in terms of the look of the show, what direction were you given?
Budrick: They shoot in New York, so the DP Yaron Orbach is in New York. Because of that distance, I had a phone conversation with them to start the look of the show. Then I do a first-day pass, and then he receives the file. Then, he just gives me notes via email on each scene. Then he gets the second file, and hopefully I’m there.

Can you give me an example of a note that he has given?
Budrick: It just might be, you know, let’s add some saturation, or let’s bring this scene down. Maybe make it more moody. Bring down the walls.

Overall, as the show has gone along and the stories have developed, it’s gotten a little darker and more twisted; it’s leaned more toward a moody look and not a whole lot of happy.

Ralston: Because of the distance between us and the DP, we shipped a color-calibrated Sony HD monitor to New York. We wanted to make sure that what he was looking at was an exact representation of what Craig was doing.

Jacqueline, any challenges from your perspective other than the titles and stuff?
LeFranc: Just the differences that I noticed — the render time takes a little longer, obviously, because the files are a little bigger. We have to use certain SAN volumes, because some have larger bandwidths.

Ralston: We have 13 production volumes here, and for this particular show — like the feature mastering that we do — the volume is a 156TB Quantum that is tuned for 4K. So, in other words, it performs better with these larger files on it.

Did you experiment at all at the beginning?
Ralston: For the first three episodes we had a parallel workflow. Everything we did in UHD, we did in HD as well — we didn’t want the producer showing up to a screening and running into a bandwidth issue. In doing this, we realized we weren’t experiencing bandwidth issues. We kind of underestimated what our SAN could do. So, we abandoned the HD.

Do you think finishing in UHD will be the norm soon?
Ralston: We were unofficially told that this time next year we should plan on doing network shows this way.

Atomic Cartoons helps bring Beatles music to kids for Netflix’s ‘Beat Bugs’

Vancouver’s Atomic Cartoons was recently called on by Netflix to help introduce kids to the music of The Beatles via its show Beat Bugs.

Set in an overgrown suburban backyard, Beat Bugs focuses on five friends as they band together to explore their environment. Iconic Beatles songs, including Magical Mystery Tour, Come Together, Penny Lane and Lucy in the Sky With Diamonds, are woven into the narrative of each episode, in versions recorded by such artists as Eddie Vedder, Sia and Pink.

Atomic Cartoons is creating the 3D animation for Beat Bugs in Autodesk Maya, with The Foundry’s Nuke being used to composite the render passes. The studio estimates that it has worked on over 10,000 shots for the series, comprising more than 100,000 individual frames — some taking only two hours to render.

To ensure that it made optimal use of resources during such intensive work, the studio relied on PipelineFX’s Qube! render farm manager to divide rendering for the show across its 400-node farm.

Rachit Singh, Atomic’s head of technology, says a big benefit of Qube! is the option to define custom job types — essential in a studio like Atomic, which uses its own proprietary asset-management system alongside off-the-shelf production tracking and playback tools like Shotgun and RV. The studio also uses Flash, Harmony and After Effects on its 2D animated productions, all of which require their own custom job types.

“Even the built-in job types are great out of the box,” says Singh. “But as these job types are constructed as open architecture scripts, they can be customized to fit the studio’s pipeline needs.”
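In spirit, a custom job type boils down to a job description the dispatcher knows how to chunk and execute for a particular in-house tool. Purely as an illustration of the idea — the function, field names and values below are hypothetical, not PipelineFX’s actual Qube! API — a pipeline wrapper might assemble something like this:

```python
# Hypothetical sketch of a farm-job description for a custom job type.
# Real Qube! submissions differ; every name here is illustrative only.
def build_render_job(scene, frame_range, job_type="atomic_nuke", cpus=40):
    """Assemble a render-farm job as a plain dict: the custom job type
    tells the dispatcher which command template and chunking to use,
    so artists never hand-write tool flags."""
    start, end = frame_range
    return {
        "name": f"{job_type}:{scene}",
        "prototype": job_type,        # custom job type registered by the studio
        "cpus": cpus,                 # worker slots to request
        "package": {                  # payload the job type knows how to run
            "scene": scene,
            "range": f"{start}-{end}",
        },
    }

job = build_render_job("bb_ep101_sh0420.nk", (1001, 1120))
print(job["name"], job["package"]["range"])
```

The open-architecture point Singh makes is that this assembly step is scriptable, so the same submission path can serve Maya, Nuke, Harmony or a proprietary tool just by registering another job type.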

Beat Bugs debuted on Netflix in early August. You can check out the show’s trailer here.

Creating new worlds for Amazon’s The Man in the High Castle

Zoic Studios used visual effects to recreate occupied New York and San Francisco.

By Randi Altman

What if Germany and Japan had won World War II? What would the world look like? That is the premise of Philip K. Dick’s 1963 novel and Amazon’s series, The Man in the High Castle, which is currently gearing up for its second season premiere in December.

The Man in the High Castle features familiar landmarks with unfamiliar touches. For example, New York’s Times Square has its typical billboards, but sprinkled in are giant swastika banners, images of Hitler and a bizarro American flag, whose blue stripes have been replaced with yet another swastika. San Francisco, and the entire West Coast, is now under Japanese rule, complete with Japanese architecture and cultural influences. It’s actually quite chilling.

Jeff Baksinski

Helping to create these “new” worlds was Zoic Studios, whose team received one of the show’s four Emmy nods for its visual effects work. That team was led by visual effects supervisor Jeff Baksinski.

Zoic’s path to getting the VFX gig was a bit different than most. Instead of waiting to be called for a bid, they got aggressive… in the best possible way. “Both myself and another supervisor here, Todd Shifflett, had read Dick’s book, and we really wanted this project.”

They began with some concept stills and bounced ideas off each other of what a German-occupied New York would look like. One of Zoic’s producers saw what they were up to and helped secure some money for a real test. “Todd found a bunch of late-‘50s/early-‘60s airline commercials about traveling to New York, and strung it together as one piece. Then we added various Nazi banners, swastikas and statues. Our commercial features a pullback from a 1960s-era TV. Then we pull back to reveal a New York penthouse with a Nazi soldier standing at the window. The commercial’s very static-y and beat up, but as we pull back out the window, we have a very high-resolution version of Nazi New York.”

And that, my friends, is how they got the show. Let’s find out more from Baksinski…

The Man in the High Castle is an Amazon show. Does the workflow differ from traditional broadcast shows?
Yes. For example, on our network TV shows, typically you’ll get a script each week, you’ll break it down and maybe have 10 days worth of post to execute the visual effects. Amazon and Netflix shows are different. They have a good idea of where their season is going, so you can start building assets well in advance.

High Castle’s version of the Brooklyn Bridge features a Nazi/American flag.

When we did the pilot, we were already building assets while I was going out to set. We were building San Francisco’s Hirohito Airport, the airplane that featured heavily in a few episodes and the cities of New York and San Francisco — a lot of that started before we ever shot a single frame.

It’s a whole new world with the streaming channels.
Everybody does it a little bit differently. Right now when we work on Netflix shows, we are working in shooting order, episode 105, 106, 107, etc., but we have the flexibility to say, “Okay, that one’s going to push longer because it’s more effects-heavy. We’re going to need four weeks on that episode and only two on this other one.” It’s very different than normal episodic TV.

Do you have a preference?
At the moment, my preference is for the online shows. I come from a features background where we had much longer schedules. Even worse, I worked in the days where movies had a year-and-a-half worth of schedule to do their visual effects. That was a different era. When I came into television, I had never seen anything this fast in my life. TV has a super quick turnaround, and obviously audiences have gotten smarter and smarter and want better and better work; television is definitely pushing closer to a features-type look.

Assuming you get more time with the pilots?
I love pilots. You get a big chunk of the story going, and a longer post schedule — six to eight weeks. We had about six weeks on Man in the High Castle, which is a good amount of time to ask, “What does this world look like, and what do they expect?” In the case of High Castle, it was really about building a world. We were never going to create a giant robot. It was about how do we make the world interesting and support our actors and story? You need time to do that.

You were creating a world that doesn’t exist, but also a period piece that takes place in the early ‘60s. Can you talk about that?
We started with what the normal versions of New York and San Francisco looked like in the ‘60s. We did a lot of sketch work, some simple modeling and planning. The next step was what would New York look like if Germany had taken over, and how would San Francisco be different under the influence of Japan?

Zoic added a Japanese feel to San Francisco streets and buildings.

In the case of San Francisco, we found areas in other countries that have heavy Japanese populations and looked at how they influence the architecture — buildings that were initially built by somebody else and then altered for a Japanese feel. We used a lot of that for what you see in the San Francisco shots.

What about New York?
That was a little bit tougher, because if you’re going to reference back to Germany during the war, you have propaganda signs, but our story takes place in 1962, so you’ve got some 17 years there where the world has gotten used to this German and Nazi influence. So while those signs do exist, we scaled back and added normal signs with German names.

In terms of the architecture, we took some buildings down and put new ones in place. You’ll notice that in our Times Square, traffic doesn’t move as it does in real life. We altered the roads to show how traffic would move if somebody eliminated some buildings and put cross-traffic in.

You also added a little bit of German efficiency to some scenes?
Absolutely. It’s funny… in the show’s New York there are long lines of people waiting to get into various restaurants and stores, and it’s all very regimented and controlled. Compare that to San Francisco where we have people milling about everywhere and it’s overcrowded with a lot of randomness.

How much of what you guys did were matte paintings, and could those be reused?
We use a few different types of matte paintings. We have the Rocky Mountains, for example, in the Neutral Zone. Those are a series of matte paintings we did from different angles that show mountains, trees and rivers. That is reusable for the most part.

Other matte paintings are very specific. For example, in the alley outside of Frank’s apartment, you see clothes hanging out to dry, and buildings all the way down the alleyway that lead to this very crowded-looking city. Those matte paintings are shot-specific.

Then we use matte paintings to show things far off in the distance to cut off the CG. Our New York is maybe four square city blocks around in every direction. When we get down to that fourth block, we started using old film tricks — what they used to do on studio lots, where you start curving the roads, dead-ending, or pinching the roads together. There is no way we could build 30 blocks of CG in every direction. I just can’t get there, so we started curving the CG and doing little tricks so the viewer can’t tell the difference.

What was the most challenging type of effects you created for the show? Which shots are you most proud of?
We are most proud of the Hirohito Airport and the V9 rocket plane. What most people don’t realize is that there’s actually nothing there — we weren’t at a real airport and there’s no plane for the actors to interact with. The actors are literally standing on a giant set of grip pipe and crates and walking down a set of stairs. That plane looks very realistic, even super close-up. You see every bolt and hinge and everything as the actors walk out. The monorail and embassy are also cool.

What do you call on in terms of tools?
We use Maya for modeling and lighting environments and for any animation work, such as a plane flying or the cars driving. There is a plug-in for Maya called Miarmy that we used to create CG people walking around in backgrounds. Some of those shots have hundreds of extras, but it still felt a little bit thin, so we used CG people to fill in the gaps.

What about compositing?
It’s all Nuke. A lot of our environments are combinations of Photoshop and Nuke or projections onto geometry. Nuke will actually let you use geometry and projections in 3D inside of the compositing package, so some of our compositors are doing environment work as well.

Did you do a lot of greenscreen work?
We didn’t do any on the pilot, but did on the following episodes. We decided to go all roto on the pilot because the show has such a unique lighting set-up — the way the DP wanted to light that show — that green would have completely ruined it. This is abnormal for visual effects, where everyone’s always greenscreening.

Roto is such a painstaking process.
Absolutely. Our DP Jim Hawkinson was coming off of Hannibal at the time. DPs are always super wary of visual effects supervisors because when you come on the set you’re immediately the enemy; you’re about to tell them how to screw up all their lighting (smiles).

He said very clearly, “This is how I like to use light, and these are the paintings and the artwork.” This is the stuff I really enjoy. Between talking to him and director David Semel, and knowing that it was an RSA project, your brain immediately starts going to things like Blade Runner. You’re just listening to the conversations. It’s like, “Oh, this is not straightforward. They’re going to have a very contrast-y, smoky look to this show.”

We did use greenscreen on the rest of the episodes because we had less time. So out of necessity we used green.

What about rendering?
We use V-Ray, which is a global illumination renderer. We’d go out and take HDR images of the entire area for lighting and capture all of the DP’s lights — that’s what’s most important to me. The DP set up his lights for a reason. I want to capture as much of his lighting as humanly possible so when I need to add a building or car into the shot, I’m using his lighting to light it.

It’s a starting point because you usually build a little bit on top of that, but that’s typically what we do. We get our HDRs, we bring them into Maya, we light the scene inside of Maya, then we render through V-Ray, and it all gets composited together.

FuseFX moves LA headquarters to 27,000-square-foot facility

Ten-year-old FuseFX, which was founded by VFX supervisor David Altenau, has moved its headquarters to a new 27,000-square-foot facility in the Sherman Oaks area of Los Angeles.

The two-building campus can accommodate FuseFX’s 150-person onsite staff and features a 10-gigabit infrastructure capable of managing 4K media and other large data files in realtime; four screening rooms, including a 20-seat theater equipped with 4K projection and monitoring systems; and Sohonet broadband connections with FuseFX’s production facilities in New York and Vancouver, which allows artists at the three sites to share data and resources. Space is available for more technical resources including the possible addition of 4K color grading and finishing.

“FuseFX remains committed to building on the success of our Los Angeles office and growing our operation here,” says Altenau, who has an Emmy Award under his belt for American Horror Story: Freak Show. “Our new space will allow us to continue to tap into the tremendous talent and resources in the LA area and provide services to what remains the largest center for entertainment production in the world.”

L-R: FuseFX’s CTO Jason Fotter, David Altenau, LA Mayor Eric Garcetti and FuseFX EP Tim Jacobsen.

Altenau expects the company’s Los Angeles staff to eventually reach 250. “We have [developed] a highly efficient pipeline that results in better creative communication and quality control, and the faster turnaround of shots. Our artists are able to focus on the creativity in visual effects rather than the technical aspects of the process.”

FuseFX produces visual effects for a host of broadcast television shows, including Scorpion, Marvel’s Agents of S.H.I.E.L.D., Empire, The Blacklist and American Horror Story, as well as a growing number of feature films and commercials. The company recently formed a division focused on virtual reality under the banner FuseVR.

At the studio’s recent opening ceremony, LA’s mayor, Eric Garcetti, said, “FuseFX has been rooted right here in Los Angeles for the last decade because the company understands what our city has to offer: unmatched production resources, vast infrastructure and, of course, the best talent in the business. Fuse is creating opportunity and raising the next generation of animators and designers in LA, right where they belong.”

Bates Motel’s Emmy-nominated composer Chris Bacon

By Jennifer Walden

The creators of A&E’s Bates Motel series have proven that it is possible to successfully rework a classic film for the small screen. The series, returning for Season 5 in 2017, is a contemporary prequel to Alfred Hitchcock’s Psycho. It tells the story of how a young Norman Bates becomes the Norman Bates of film legend.

Understandably, when the words “contemporary” and “prequel” are combined, it may induce a cringe or two, as LA-based composer Chris Bacon admits. “When I first heard about the series, I thought, ‘That sounds like a terrible idea.’ Usually when you mess with an iconic film, the project can go south pretty quick, but then I heard who was involved — writers/producers Carlton Cuse and Kerry Ehrin. I’m a huge fan of their work on Lost and Friday Night Lights, so the idea sounded much more appealing. I went from feeling like ‘this is a terrible idea’ to ‘how do I get involved in this!’”

Chris Bacon

Bacon, who has been the Bates Motel composer since Season 1, says their goal from the start was to make a series that wasn’t a Psycho knock-off. “It was not our goal to tip our hats in obvious ways to Psycho. We weren’t trying to make it an homage. We weren’t trying to inhabit the universe that was so masterfully created by Alfred Hitchcock and composer Bernard Herrmann,” he explains.

Borrowing Some Strings
Having a long-established love of Herrmann’s music, it was hard for Bacon not to follow the composer’s lead, particularly when it came to instrumentation. Bates Motel’s score strongly features — you guessed it — strings. “One reason Herrmann stuck solely to strings was because the film was black and white,” explains Bacon. “He chose a monochromatic palette, as far as sound goes, without having woodwind and percussion. On the series, I take it farther. I use percussion and synth effects, but it is mostly string driven.”

Since the strings are the core of the score, Bacon felt the expressive qualities that live musicians add to the music would be more emotionally impactful than what he could get from virtual instruments. “I did the first three episodes using all virtual instruments. They sounded good and they did their job dramatically, but in talking further to the people involved who handle the purse strings, I was able to convince them to try an episode with a real string section.”

Once Bacon was able to A/B the virtual strings against the real strings, there was no denying the benefit of a live string section. “They could hear what real musicians can bring to the music — the kind of homogeneous imperfection that comes when you have that many people who are all outstanding but with each of them treating the music just a little bit differently. It brings new life to it. There’s a lot of depth to it. I feel very fortunate and appreciative that the account team on the show has been supportive of this.”

Tools & Workflow
For the score each week, Bacon composes in Steinberg Cubase and runs Ableton Live via ReWire by Propellerhead. His samples are hosted in Vienna Ensemble Pro on a separate computer while all audio is monitored and processed through Avid Pro Tools on a separate rig. Since his compositions start with virtual instruments, Bacon has an extensive collection of sample libraries. He uses string libraries from Cinesamples, 8Dio and Spitfire Audio. “I have lots of custom stuff,” he says. “I love the Vintage Steinway D Piano from Galaxy.”

Once he’s completed the cues, he hands his MIDI tracks over to orchestrator Robert Litton, who determines the note assignments for each member of the 18-piece string section. The group is recorded at The Bridge Recording studio in Glendale, California, owned by Greg Curtis. There, Bacon joins recording engineer James Hill in the control room. “I enjoy conducting if I can, but on this series it makes more sense for me to be in the control room because we often have only three hours to record roughly 35 minutes of music. Also, the music always sounds different in the control room. Ultimately, when you are doing this kind of work, what matters is what is coming out of the speakers because that’s what you’re actually going to hear in the soundtrack.”

After the live strings are recorded, engineer Hill mixes those against the virtual instrument stems of the woodwinds, percussion and synth elements. He creates a stem of the live strings to replace the virtual strings stem.

“We don’t do an in-depth mix like you would typically do for film,” says Bacon. “At this point, I’m able to leave Jim [Hill] the demo song as a reference and let him do a quick mix. Then, he creates a live string stem and all of those stems are sent over to the dub stage.”

In Season 4, Ep. 9 “Forever,” the moment arrived when young Norman (Freddie Highmore) did what he was destined to do — kill his mother Norma (Vera Farmiga). That’s not much of a spoiler if you’re familiar with the film Psycho, but what wasn’t known was just how Norman would do it. “This death scene was something I had been thinking about for four years. The way they did it — and I think they got it right — was that they made the death a very personal, emotional, thoughtful and, in a very twisted way, probably the most considerate way you can kill your mother,” laughs Bacon. “But really it was the only way that he and his mother, these two damaged, broken people, could find peace together… and that was in death.”

The Norma/Norman Theme
In the four-and-a-half-minute scene that reveals Norma and Norman’s lifeless bodies, Bacon portrays tension, fear and sadness by weaving a theme that he wrote for Norma with a theme he wrote for Norman and Norma. Norma’s theme in the death scene plays on a large section of violins as her new husband Alex (Nestor Carbonell) tries to resuscitate her.

“The foundation for her song has been laid over the course of two seasons, starting in Season 2, where we look at her family background, like her parents and her brother, and discover how she became her,” explains Bacon. “I didn’t really know which of her themes I was going to use for her death scene, but it seemed to feel, as we’re looking at Norma, that this is a moment about her, so I went back to that theme.”

As Norman wakes up, Bacon’s theme for Norman and Norma plays on sparse piano. In comparison to Norma’s theme on soaring strings, this theme feels small and lost, highlighting Norman’s shock and sadness that he’s survived but now she’s gone. “The score had to convey a lot of emotions,” describes Bacon. “I tried to keep it relatively simple, but as we went bigger it seemed to fit the enormity of the moment. This is a big moment that we all knew was coming and kind of dreaded. The piece feels different than the rest of the show, and rightfully so, while still being a part of the same sonic landscape.”

His original dramatic score on the “Forever” episode has been nominated for a 2016 Emmy award. Bacon says he chose this episode, above all the others in Season 4, because of the emotionally visceral death scene. It’s a big moment for the story and the score.

“During the whole death sequence there are barely any words. Alex is saying, ‘Stay with me,’ to Norma. Then you have Norman wake up and say, ‘Mother?’ at the very end. But that’s it. That section is like a silent movie in a lot of ways. It’s very reliant on the score.”

Andy Williams to head up new DNegTV in LA

Oscar-winning VFX house Double Negative (DNeg), which has its headquarters in London, is opening a studio in LA focusing on visual effects for television. Heading up DNegTV:LA is Andy Williams, former Stargate Studios head of production.

DNegTV, the television arm of Double Negative, was formed in 2013. It currently provides VFX for television shows including Emerald City (NBC), BrainDead (CBS), Fortitude (Sky) and The Young Pope (Sky/HBO).

“Since our inception, we’ve enjoyed excellent working relationships with many of the major US production companies and to ensure we can continue to offer and expand our high levels of service it’s become key for us to have an LA presence,” explains DNegTV executive producer/co-founder Louise Hussey. “The fact that we will now be able to provide local facilities, support and investment in the US is very important to us and to our future plans.”

Williams has over 20 years’ experience in television. Prior to his time at Stargate, he spent seven years as head of production and executive producer at DIVE (now Alkemy X) in New York. During that time, he also served as DI producer, VFX producer and VFX production supervisor on projects like The Leftovers, Silver Linings Playbook, The Visit, Power, The Road and How to Get Away With Murder.

“With the elevated ambitions of networks and streaming service content providers, the demands for quality are higher than ever before,” explains Williams. “DNegTV is suited to leverage the creativity and pipeline of an Oscar-winning facility, and then harness those resources in response to the budget and scheduling demands of TV clients. Opening in Los Angeles means we can make that more accessible to US-based productions and expand DNeg’s footprint in television. For me, it’s a chance to forge something new with the full support of one of the best brands in the business. It was just too attractive a collaboration to pass up.”

We asked Williams about the set-up in LA. He said this: “The physical facility presence of DNegTV:LA is still in its development phase. That said, not unlike Double Negative’s operations in Vancouver and Mumbai, any facility in Los Angeles will be modeled after, and tie into, London’s pipeline and toolset. The intent is to make sure that all studios operate with the same integration of tech, security and employee support that DNeg is known for. More to come as things develop on this front.”

Post FactoryNY adds David Feldman as SVP for features, TV

David Feldman has joined Post FactoryNY as senior VP for features and television. Feldman, a 20-year post veteran, previously worked as director of feature sales for Company 3 and Deluxe in New York City.

At Post FactoryNY, a SIM Group company, he will lead sales and customer relations for newly expanded features and television finishing services. “Like most of our team, David was a filmmaker first, which is key to understanding the needs of clients in helping achieve their goals,” says company founder Alex Halpern.

Feldman will also work with other SIM Group companies to provide film and television producers with packaged services covering multiple aspects of production and post.

The sound of sensory overload for Cinemax’s ‘Outcast’

By Jennifer Walden

As a cockroach crawls along the wall, each move is watched intensely by a boy whose white knuckles grip the headboard of his bed. His shallow breaths stop just before he head-butts the cockroach and sucks its bloody remains off the wall.

That is the fantastic opening scene of Robert Kirkman’s latest series, Outcast, airing now on Cinemax. Kirkman, writer/executive producer on The Walking Dead, sets his new horror series in the small town of Rome, West Virginia, where a plague of demonic possession is infecting the residents.

Ben Cook

Outcast supervising sound editor Benjamin Cook, of 424 Post in Culver City, says the opening of the pilot episode featured some of his favorite moments in terms of sound design. Each scrape of the cockroach’s feet, every twitch of its antenna, and the juicy crunch of its demise were carefully crafted. Then, following the cockroach consumption, the boy heads to the pantry and snags a bag of chips. He mindlessly crunches away as his mother and sister argue in the kitchen. When the mother yells at the boy for eating chips after supper, he doesn’t seem to notice. He just keeps crunching away. The mother gets closer as the boy turns toward her and she sees that it’s not chips he’s crunching on but his own finger. This is not your typical child.

“The idea is that you want it to seem like he’s eating potato chips, but somewhere in there you need a crossover between the chips and the flesh and bone of his finger,” says Cook. Ultimately, the finger crunching was a combination of Foley — provided by Jeff Wilhoit, Brett Voss, and Dylan Tuomy-Wilhoit at Happy Feet Foley — and 424 Post’s sound design, created by Cook and his sound designers Javier Bennassar and Charles Maynes. “We love doing all of those little details that hopefully make our soundtracks stand out. I try to work a lot of detail into my shows as a general rule.”

Sensory Overload
While hitting the details is Cook’s m.o. anyway — as evidenced by his Emmy-nominated sound editing on Black Sails — it serves a double purpose in Outcast. When people are possessed in the world of Outcast, we imagine that they are more in tune with the micro details of the human experience. Every touch and every movement makes a sound.

“Whenever we are with a possessed person we try to play up the sense that they are overwhelmed by what they are experiencing because their body has been taken over,” says Cook. “Wherever this entity comes from it doesn’t have a physical body and so what the entity is experiencing inside the human body is kind of a sensory overload. All of the Foley and sound effects are really heightened when in that experience.”

Cook says he’s very fortunate to find shows where he and his team have a lot of creative freedom, as they do on Outcast. “As a sound person, that is the best: when you really are a collaborator in the storytelling.”

His initial direction for sound came from Adam Wingard, the director on the pilot episode. Wingard asked for drones and distortion, for hard-edged sounds derived from organic sources. “There are definitely more processed kinds of sounds than I would typically use. We worked with the composer Atticus Ross, so there was a handoff between the music and the sound design in the show.”

Working with a stereo music track from composer Ross, Cook and his team could figure out their palette for the sound design well before they hit the dub stage. They tailored the sound design to the music so that both worked together without stepping on each other’s toes.

He explains that Outcast was similar to Black Sails in that they were building the episodes well before they mixed them. The 424 Post team had time to experiment with the design of key sounds, like the hissing, steaming sound that happens when series protagonist Kyle Barnes (Patrick Fugit) touches a possessed person, and the sound of the entity as it is ejected from a body in a jet of black, tar-like fluid, which then evaporates into thin air. For that sound, Cook reveals that they used everything from ocean waves to elephant sounds to bubbling goo. “The entity was tough because we had to find that balance between its physical presence and its spiritual presence because it dissipates back into its original plane, wherever it came from.”

Sound Design and More
When defining the sound design for possessed people, one important consideration was what to do with their voice. Or, in this case, what not to do with their voice. Series creator Kirkman, who gave Cook carte blanche on the majority of the show’s sound work, did have one specific directive: “He didn’t want any changes to happen with their voice. He didn’t want any radical pitch shifting or any weird processing. He wanted it to sound very natural,” explains Cook, who shared the ADR workload with supervising dialogue editor Erin Oakley-Sanchez.

There was no processing on the voices at all. What you hear is what the actors were able to perform, the only exception being Joshua (Gabriel Bateman), an eight-year-old boy who is possessed. For him, the showrunners wanted a slight bit of difference to drive home the fact that his body had indeed been taken over. “We have Kyle beating up this kid, so we wanted to make sure that viewers really got a sense that this wasn’t a kid he was beating up, but a monster,” explains Cook.

To pull off Joshua’s possessed voice, Oakley-Sanchez and Wingard had actor Bateman change his voice in different ways during their ADR session. Then, Cook doubled certain lines in the mix. “The approach was very minimalistic. We never layered in other animal sounds or anything like that. All of the change came from the actor’s performance,” Cook says.

Cook is a big proponent of using fresh sounds in his work. He used field recordings captured in Tennessee, Virginia and Florida to build the backgrounds, and recorded hard effects like doors, body hits and furniture crashing and breaking. Other elements, like wind and water recordings, were also used as part of the sound design. In Sound Particles, a CGI-like sound design application created by Nuno Fonseca, he was able to manipulate and warp those elements to create unique sounds.

“Sound Particles has a really great UI to it, with virtual mics you can place and move to record things in a virtual 3D environment. It lets you create multiple instances of sound very easily. You can randomize things like pitch and timing. You can also automate the movements and create little vignettes that can be rendered out as a piece of audio that you can bring into Pro Tools or Nuendo or other audio workstations. It’s a very fascinating concept and I’ve been using it a lot.”
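Sound Particles’ internal engine isn’t public, but the core idea Cook describes — many instances of one source sound, each with a randomized pitch and start time, mixed down into a single render — can be sketched in a few lines of NumPy. This is purely illustrative; none of the names below come from the application’s actual API.

```python
import numpy as np

SR = 48_000  # sample rate in Hz (assumed)

def spawn_instances(source, n=16, pitch_range=(0.8, 1.25),
                    max_delay_s=0.5, seed=0):
    """Mix n copies of `source`, each played back at a random speed
    (a crude pitch shift) and offset by a random start time."""
    rng = np.random.default_rng(seed)
    # worst case: slowest playback plus the largest possible delay
    out_len = int(len(source) / pitch_range[0] + max_delay_s * SR) + 1
    mix = np.zeros(out_len)
    for _ in range(n):
        ratio = rng.uniform(*pitch_range)            # playback-speed ratio
        delay = int(rng.uniform(0, max_delay_s) * SR)
        # naive resample: read the source at `ratio` speed
        idx = np.arange(0, len(source) - 1, ratio)
        voice = np.interp(idx, np.arange(len(source)), source)
        mix[delay:delay + len(voice)] += voice / n   # scale to avoid clipping
    return mix

# a short 220 Hz test tone stands in for a recorded sound element
t = np.arange(int(0.25 * SR)) / SR
tone = np.sin(2 * np.pi * 220 * t)
cloud = spawn_instances(tone, n=16)
```

The resulting `cloud` buffer is a denser, smeared “swarm” of the source sound, which is roughly the texture such randomized-instance tools produce; a real implementation would also pan each instance through the virtual 3D mic setup Cook mentions.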

Cook enjoys building rich backgrounds in shows, which he uses to help further the storyline. For example, in Episode 2 the police chief and his deputy take a trek through the woods and find an abandoned trailer. Cook used busier tracks with numerous layers of sounds at first, but as the chief and deputy get farther into the woods and closer to the abandoned trailer, the backgrounds become sparser and eerily quiet. Another good example happens in Episode 9, where there is a growing storm that builds throughout the whole episode. “It’s not a big player, just more of a subtext to the story. We do really simple things that hopefully translate and come across to people as little subtleties they can’t put their finger on,” says Cook.

Outcast is mixed in 5.1 by re-recording mixers Steve Pederson (dialogue/music) and Dan Leahy (effects/Foley/backgrounds) via Sony Pictures Post at Deluxe in Hollywood. Cook says, “They are super talented mixers who mostly do a lot of feature films, so they bring a theatrical vibe to the series.”

New episodes of Outcast air Fridays at 10pm on Cinemax, with the season finale on August 12. Outcast has been renewed for Season 2, and while Cook doesn’t have any inside info on where the show will go next season, he says, “At the end of Season 1, we’re not sure if the entity is alien or demonic, and they don’t really give it away one way or another. I’m really excited to see what they do in Season 2. There is lots of room to go either way. I really like the characters, like the Reverend and Kyle — both have really great back stories. They’re both so troubled and flawed and there is a lot to build on there.”

Jennifer Walden is a New Jersey-based audio engineer and writer.