

Colorist Bob Festa on Yellowstone’s modern Western look

Paramount Network’s Yellowstone, from creator, writer and director Taylor Sheridan (Sicario, Hell or High Water), is a 10-episode modern-day Western starring Kevin Costner as the patriarch of the Duttons, owners of the largest ranch in the contiguous United States.

The Dutton family is in constant conflict with owners of the property surrounding their land, including developers, an Indian reservation and a national park. The series follows Costner’s character and his dysfunctional children as they navigate their bumpy road.

Cinematographer Ben Richardson and Efilm senior colorist Mitch Paulson already had a color lock on the pilot for Yellowstone, but brought on Encore senior colorist Bob Festa to work on the episodes. “As a Deluxe sister company, it was only natural to employ Encore Hollywood’s television resources,” explains Festa. “I was keen to collaborate with both Ben and Mitch. Mitch then served as supervising colorist.”

Let’s find out more from the veteran colorist.

How did you work with the director and DP?
Honestly, my first discussions with Ben were quite involved and fully articulated. For instance, while Ben’s Beasts of the Southern Wild and Wind River are wildly different-looking projects — and shot on different formats — the fundamentals he shared with me were fully in place in both of those projects, as well as in Yellowstone.

There is always a great deal of talking that goes on beforehand, but nothing replaces collaboration in the studio. I guess I auditioned for the job by spending a full day with Ben and Mitch at Encore. Talk is a cheap abstraction, and there is nothing like the feeling you get when you dim the lights, sit in the chair and communicate with pictures.

The only way I can describe it is that it’s like improvising with another musician you’ve never played with before. There’s this buildup of ideas and concepts that happens over a few shots — grades get thrown out or refined, layers are added, apprehension gives way to creativity, and a theme takes shape. If you do this over 50 shots, you develop a language that is unique to a given project and a “look” is born.

What was your workflow for this project? What did you use tool-wise on Yellowstone?
ARRI RAW and Resolve were the foundation, but the major lifting came from using a Log Offset workflow, better known as the printer lights workflow. Although printer lights has its roots in a photochemical laboratory setting, it has tremendous real-world digital applications. Many feel this relationship to printer lights is very elementary, but the results can be scaled up very quickly to build an amazingly natural and beautiful grade.

The Resolve advanced panel can be remapped to use an additional fourth trackball as a fuel-injected printer light tool that is not only very fast and intuitive, but also exceptionally high quality. The quality angle comes from the fact that Log Offset grading works in a fashion that keeps all of the color channels moving together during a grade. All curves work in complete synchronicity, resulting in a very natural transition between the toe and the knee, and the shoulder and head of the grade.

This is all enhanced using pivot and contrast controls to establish the transfer characteristic of a scene. There is always a place for cross process, bleach bypass and other twisted aggressive grades, but this show demanded honest emotion and beauty from the outset. The Log Offset workflow delivered that.
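The appeal Festa describes is easy to see in miniature: adding a constant in log space is the same as multiplying by a gain in linear space, so all tones (and all channels, when the offsets match) move together instead of bending the curve at one point. The following is a minimal Python sketch of that idea — not Resolve’s actual implementation; the function names and offset values are illustrative assumptions:

```python
import math

# A toy printer-lights / Log Offset grade: offsets are expressed in
# stops (base-2 log). Adding an offset in log space is equivalent to
# a multiplicative gain in linear space, which is why the whole tonal
# range shifts together rather than distorting at the toe or shoulder.

def lin_to_log(x, base=2.0):
    return math.log(x, base)

def log_to_lin(y, base=2.0):
    return base ** y

def printer_light(rgb_lin, offsets_log):
    """Apply per-channel log offsets (in stops) to a linear RGB triple."""
    return tuple(
        log_to_lin(lin_to_log(c) + o) for c, o in zip(rgb_lin, offsets_log)
    )

# Equal offsets on all channels = a pure exposure shift: shadows and
# mids both double when pushed one stop, keeping the channels in sync.
print(printer_light((0.02, 0.02, 0.02), (1.0, 1.0, 1.0)))  # each value ~0.04
print(printer_light((0.18, 0.18, 0.18), (1.0, 1.0, 1.0)))  # each value ~0.36

# Unequal offsets act like classic R/G/B printer points — here a warm bias.
print(printer_light((0.18, 0.18, 0.18), (0.5, 0.0, -0.5)))
```

Per-channel offsets then behave like the red, green and blue printer points of a photochemical lab, which is the “elementary but scalable” quality Festa refers to.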

What inspired the look of Yellowstone? Are there any specific film looks it is modeled after?
As a contemporary western, you can draw many correlations to cinematic looks from the past, from Sergio Leone to Deadwood, but the reality is the look is decidedly modern western.

In the classic film world, the look is very akin to a release print, or in the DI world it emulates a show print (generationally closer to the original negative). The look demands that the curves and saturation are very high quality. Ben has refined an ARRI LUT that really enhances the skies and flesh tones to create a very printy film-laboratory look. We also add texture with LiveGrain, for the most part using a 35mm 5219 emulation for night shots and a 5207 look for day exteriors. That is the Yellowstone recipe.

How did you approach the sweeping landscape shots?
Broad, cinematic and we let the corners bleed. Vignettes were never used on the wide vistas. The elements are simple: you have Kevin Costner on a horse in Montana. The best thing I can think of is to follow the medical credo of “do no harm.”

What was the most challenging aspect of coloring Yellowstone?
Really just the time constraints. Coordinating with the DP, the VFX teams and the post crew on a weekly basis for color review sessions is hard for everyone. The show is finished week by week, generally delivering just days before air. VFX shots are dropped in daily. Throw in the 150 promos, teasers and trailers, and scheduling that is a full-time job.

Other than color, did you perform any VFX shots?
Every VFX vendor supplied external mattes with their composites. We always color composite plates using a foreground and a background grade to serve the story. This is where Resolve’s external matte node structure can be a lifesaver.

What is your favorite scene or scenes?
I have to go with the pilot episode. That opening shot sets the tone for the entire series. The first time I saw it, my jaw dropped, both for the cinematography and the storytelling. If you have seen the show, you know what I’m talking about.

Showrunner/EP Robert Carlock talks Netflix’s Unbreakable Kimmy Schmidt

By Iain Blair

When Unbreakable Kimmy Schmidt premiered back in 2015, the sitcom seemed quite shocking — and not just because NBC sold it off to Netflix so quickly. At the streaming service, it has been a big hit with audiences and critics alike, racking up dozens of industry awards and nominations, including 18 Primetime Emmy nominations.

Robert Carlock

Created by Tina Fey and Robert Carlock, the sunny comedy with a dark premise stars Ellie Kemper as the title character. She moves to New York City after being rescued from an underground bunker where she and three other women were held captive for 15 years by a doomsday cult leader (Jon Hamm).

Alone in the Big Apple, and armed only with her unbreakable sense of optimism, Kimmy soon forges a new life that includes her colorful landlady Lillian Kaushtupper (Carol Kane), her struggling actor roommate (Tituss Burgess) and her socialite employer (Jane Krakowski). The strong cast also boasts recurring talent and A-list guests, such as Tina Fey, Martin Short, Fred Armisen, Jeff Goldblum, Amy Sedaris and Lisa Kudrow.

Last year Netflix renewed the show for a final season, with the first six episodes premiering in May 2018.

I recently spoke with Carlock about making the show, the Emmys and the planned movie version.

When Kimmy Schmidt first came out, its premise seemed bizarre and shocking — a young woman who was kidnapped, abused and held captive in an underground bunker. But looking back today, it seems ahead of its time.
Unfortunately, I think you’re right. At the time we felt strongly it was a way to get people talking about things and issues they didn’t necessarily want to talk about, such as how women are really treated in this society. And with the #MeToo movement it’s more timely than ever. Tina would say, “It keeps happening, it’s in the news all the time, and at this level,” and it’s really sad that it’s true. The last two seasons we’ve been dealing more and more with issues like this, and now people really are talking about sexual harassment in the workplace. But we have the added burden of also trying to make it funny.

Is Season 4 definitely the final one?
I think so, and the second half will stream sometime early next year. In the meantime, we’re talking about the movie deal that Netflix wants and what that will entail. We kind of thought about it as, “Let’s give our characters endings since there’s still so much to talk about,” but you also have to bear in mind the topicality of it all in a year or so. So it gave us the luxury of being able to finish the show in a way that felt right, and Season 5 — the second half of Season 4 — will satisfy fans, I think. We’re also very happy that Netflix is so enthusiastic about doing it this way.

Do you like being a showrunner?
I do, and I like it a lot better than not being in charge. The beauty of TV is that, unlike in movies, and for a variety of reasons, writers get to be in charge. I love the fact that when you’re a showrunner, you get to learn so much about everything, including all the post production. You work with all these really skilled artisans and get to oversee the entire process from start to finish, including picking out what shade of blue the dress should be (laughs). It’s much better than watching other people make all the key decisions.

What are the big challenges of showrunning?
The big one is trying to think outside of the writers’ room. You have all that ambition on the page, but then you have to deal with the reality of actually shooting it and making it work. It’s a lot easier to type it than execute it. Then you have to be really objective about what’s working and what isn’t, because you fall in love with what you write. So you have to realize, “Maybe this needs a little insert, or more jokes here to get the point across,” and you have to put that producer hat on — and that can be really tricky. It’s a challenge for sure, but we’ve also been fortunate in having a great crew that’s been with us a while, so there’s that shorthand, and things move quickly on the set and we get a lot done.

Where do you shoot and post?
We do the shooting at the Broadway Stages in Brooklyn, and have all the editing set up there as well. Then we have Tina’s production offices at Columbus Circle, and we do all the sound at Sync Sound in midtown Manhattan.

Do you like the post process?
I love post and the whole process of seeing a script come alive as you edit. You find ways of telling the story that you maybe didn’t expect.

You have a big cast and a lot of stuff going on in each episode. What are the big editing challenges?
One of the big creative challenges of a single-camera show — which ultimately also gives you so many more tools in writing, shooting and editing — is that you don’t get to see rehearsals. So one of the reasons our episodes go into post, and often come out of post, so stuffed with story and jokes is that we don’t get many opportunities to see exactly what’s making a scene tick. We’re hitting the story, hitting the jokes and hitting the characters too many times, and a lot of the challenge is scraping all that away. Our episodes often come in around the mid-30s, and we think they live and play best around 26 or 27 minutes. That’s where I think the sweet spot is. So you can feel, “Oh, I love that joke,” but the hard reality is that the scene plays so much better without it.

Talk about the importance of sound and music.
I think it’s so important in comedy, and it can totally change the feel of a scene. Jeff Richmond — Tina’s husband and one of our producers — does all the music. He’s also fantastic in the edit. So if I’m not available or Tina isn’t, then he or Sam Means, another producer, can take our edit notes and interpret them. We’ll type up 15 pages on a Director’s Cut, and then we hone the show until it’s a lock for the network, and we go through it all frame by frame.

How important are the Emmys to you and a show like this?
Increasingly now, with all the noise and static out there, and so many other good shows, it’s really important. I think it helps cut through the clutter. When you’re working hard on a show like this, with your head down all the time, you don’t really know where you stand sometimes. So to be nominated by your peers means a lot. (Laughs) I wish it didn’t, but we’re small-minded people who only really care about other people’s opinions.

What’s the latest on talk about a movie? Will it be a theatrical release or just Netflix, or both?
That’s a great question. Who knows? We’re in the middle of trying to figure out the budget. I imagined it would be just streaming, but maybe it will be theatrical as well. One thing’s for sure. We won’t be one of those TV shows that gets a whole new cast for the movie version. Lightning struck with our first cast, and we’re not looking to replace anyone.

There’s been a lot of talk about lack of opportunity for women in movies. Are things better in TV?
I can only speak for us, but we like shows where there’s a lot of diversity and different voices, and sometimes we step in a bear trap we didn’t even know was there because we’re trying to write for so many different voices. For us, it just makes sense to embrace diversity, but it’s such a complicated and thorny issue. I’m just glad we’re talking about it more now. It’s what interests us. When Tina and I first sat down to write this, we didn’t want to do something salacious and exploitative. We were thinking about a really startling way to get people talking about gender and class. It’s been a fun challenge.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


Behind the Title: DigitalFilm Tree colorist Rick Dalby

NAME: Rick Dalby

COMPANY: DigitalFilm Tree

CAN YOU DESCRIBE YOUR COMPANY?
I would describe DigitalFilm Tree (DFT) as a smaller, bleeding-edge, independently owned post house that is capable of remote dailies, color and edit. I work with fellow colorists Dan Judy and Patrick Woodard.

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
What would surprise people is that the Resolve colorist has become the creative gatekeeper for the producer’s, director’s and DP’s vision. You don’t just hand off a show and add some color. We have color tools that work much the way Photoshop does, using layer mixers and alpha mattes.

On set, a DP and/or DIT using Resolve can send projects, custom LUTs or nodes that we can carry directly into the final color session. I would describe the colorist-driven post workflow as more holistic than ever before.

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS? 
Yes, we are asked to stabilize shots and add OFX plug-in looks and effects, which will evolve further with Resolve 15’s addition of Fusion. It’s really show-dependent. On a larger scale, with 4K and HDR, our colorists are redesigning workflows on a continual basis. Our online conform artists are doing most of the editing, though with Resolve, any of us might make the deliverables.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Collaborating with everyone at DFT and working with the clients that depend on us is rewarding for me. I like the art of color correction. When I can just sit down and get to work on scene looks and matching, the day passes quickly, and I can feel the creativity flow. It’s fun. When the client comes in to view and I’m in sync with their vision, there’s a great sense of accomplishment.

WHAT’S YOUR LEAST FAVORITE?
Post facilities have few windows. Sometimes, I just like to open the door and see something alive and green. Seriously though, the business end and paperwork are the things that don’t interest me.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Park ranger, running an animal rescue, Buddhist monk or one of my previous jobs, like being a broadcast news technical director. That sounds like a silly answer, but it’s not meant to be.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
That’s a long and difficult question to answer. When I was small, I used to get up very early with my dad and wait for the engineer to turn on the transmitter, so I could see the test pattern and watch some cartoons. I was a computer science major in college, but I didn’t like it. My brother worked at Compact Video and urged me to change career paths. I trained in journalism and broadcast news and worked in Sacramento television in graphics, studio and ENG camera, editing, technical directing and, finally, directing.

Next was film transfer of 35mm prints for syndication, then on to master control and transmitter operations requiring an FCC license. That was all by the time I was 24, when I moved to Los Angeles to work with my brother in post. Within a year, I was running a Rank and transferring features for most of the majors.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
The reboot of Roseanne. Also Wrecked for TBS. I’ve been doing collaborative color with Dan and Patrick on NCIS: Los Angeles, Angie Tribeca, Great News and The 100.

The 100

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I’m very satisfied to have worked on iconic long-running shows like Friends and Everybody Loves Raymond and developing looks for shows like Friday Night Lights with David Boyd and Todd McMullen. Recently, having a chance to work with the creative team on the Roseanne reboot was a great experience. DP Johnny Simmons and Sara Gilbert were a great pleasure to work with.

WHERE DO YOU FIND INSPIRATION? ART? PHOTOGRAPHY?
Inspiration comes from the people I meet and the challenges I face. I also love the changing exhibits at the Broad Museum and LACMA. I’m always looking at films and television to dissect what other people think and do. I don’t like the work when it seems copy-cat.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
DaVinci, which has been part of most of my career — with the exception of a few years on Lustre. The best-quality hero monitor that can display the great color and resolution we need to do this job. Anything Apple — my iPad is much in need of replacement.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I like Theo Miesner’s YouTube posts and the rapid-fire way he delivers. Recently, there’s a slew of YouTube posts that are helping me with Fusion. I use Facebook to follow my fellow meditators.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I don’t take it too seriously after this many decades. The stress is there sometimes. I acknowledge it, meditate and even go on long silent meditation retreats once or twice a year.

I walk to work, hike and sometimes just walk outside and breathe deeply. Ultimately, the stress is up to me, and how I choose to respond. Equanimity has become a guiding concept with the worldly winds.


Kari Skogland — Emmy-nominated director of Hulu’s The Handmaid’s Tale

By Iain Blair

From day one, the stark images of pure white bonnets and blood-red cloaks in The Handmaid’s Tale have come to symbolize one thing — the oppression of women. The Hulu hit series has also come to symbolize that rare moment in pop culture where difficult subject matter and massive artistic ambition cross over into impressive ratings.

In fact, the show — based on Margaret Atwood’s dystopian and prescient 1985 novel of the same name — just received 20 Emmy nominations, including eight acting noms and a second nod for best drama series. It reportedly doubled its audience for the Season 2 premiere (as compared to the first season), after becoming the first show from a streaming service to win best drama at the 2017 Emmys.

Many of the most searing episodes, including “Night,” the finale to Season 1, and “Other Women” in Season 2, were directed by the award-winning Kari Skogland. As CEO of Mad Rabbit, which launched in 2016, Skogland produces one-hour dramas for the international market while continuing her work as a director on The Handmaid’s Tale and the upcoming pilot for Starz’s The Rook. Her 2018 Emmy nomination recognizes her directing work on the Season 2 episode “After.”

A prolific female director of TV and film, Skogland’s television credits include episodes for the premiere season of Condor (Audience), and such shows as The Borgias and Penny Dreadful (Showtime), Boardwalk Empire (HBO), The Killing, The Walking Dead and Fear the Walking Dead (AMC), Under the Dome (CBS), Vikings (History Channel), Power, starring 50 Cent (Starz), The Americans (FX) and House of Cards and The Punisher (Netflix). Skogland also directed Sons of Liberty (History), a six-part event miniseries for which she won the Directors Guild of Canada (DGC) award for Best Director of a Television Miniseries.

As a feature film writer, director and producer, Skogland premiered her film Fifty Dead Men Walking, starring Ben Kingsley and Jim Sturgess, at the Toronto International Film Festival. The film won the Canadian Screen Award for Best Adapted Screenplay and was nominated for another six awards, including Best Film. Additionally, Skogland was recognized by the DGC as Best Director. Her previous film as director, writer and producer was The Stone Angel, starring Ellen Burstyn and Ellen Page. It was nominated for Best Director and Best Writer by the WGC, as well as Best Screenplay and Best Actress, and it won a Best Film award from the DGC.

I recently spoke with Skogland — the only female nominated in the best directing drama category at this year’s Emmys — about the show, her workflow and mentoring other women.

Why do you think the show’s caught the public’s imagination so much?
I think it’s rooted in many things, one of them being a cautionary tale. Another would be these compelling performances that engage you in the story in an emotional context and a narrative that has the possibility of actually coming true, especially given what we’re seeing on the news all the time now. It’s a weird perfect storm where today’s political climate and this show sort of merge.

I recently read something where Margaret Atwood, who wrote it over 30 years ago, says that everything has happened. It was fiction, but it has happened somewhere in the world since she wrote it, and it’s happening today. So I think the authenticity of the characters and the performances, even more than the events, is what really drives it even further into being so incredibly watchable.

Every character is so complex.
Exactly. You love to hate Serena Joy, but then there are moments where you really feel for her in ways you can’t predict. So your emotional barometer is going up and down.

Fair to say that Atwood’s book and its themes seem more timely than ever?
Definitely. Not only is it very timely now, but it was probably very timely when it first came out too, which makes it even more interesting when you think about progress. Are we really on a treadmill? Have we really moved the political needle at all? It doesn’t seem that different from when she wrote it, when Reagan and the rise of conservatism in America were making headlines.

Have you started Season 3?
Not yet. It’ll probably start filming in September. They’ve asked me to come back, but they don’t have a schedule yet.

Kari Skogland on set

What are the big challenges of directing this show?
First of all, you have to be very aware of all of it. When I did the Season 1 finale, I had to watch everything very carefully up until that point so I could continue the emotional story. It was the same thing for Season 2. They’re very challenging performance pieces for everyone, and you have to maintain that sense of continuity and trust. You have to really plan for the season’s arc for each character, and someone like Lizzie [Moss] is so collaborative. But it’s also this path of discovery, where you want to capture the inspiration of the moment.

Where do you post?
We shoot in Toronto and do all the post at Take 5 Productions there. I’ve known and worked with them for years — they’ve won so many awards for their great work. They do all the editing and finishing.

Do you like the post process?
I love it, and with a show like this it’s where you can combine the plan you went into post with, along with those happy accidents and inspired moments, and see the scene or episode come alive in ways you didn’t expect. I always think of it as a way to re-direct the episode. Post is always full of surprises.

Talk about editing. Didn’t you start off as an editor?
Yes, and I am really involved in the edit. I always want to have two options in post. I don’t want to be handcuffed by any decisions made on the set. I need to be able to re-sculpt the footage and rediscover stuff as we go.

You have a big cast, and a lot of stuff going on in each episode. What are the big editing challenges?
One of the things I really like to avoid is what I call “ping-pong” editing, and doing lazy coverage of a scene where it’s so predictable — there’s the closeup, there’s the wide shot, there’s another closeup! I always want coverage that actually eliminates edits. The goal is to not interrupt the flow by jumping all over the place. With that in mind, I try to shoot with the idea of “the elegant accident,” which means you sometimes shoot a lot of extra footage so you can find the gold and the gems as you re-sculpt in post. It’s like documentary filmmaking in that sense, and those gems happen in the oddest of moments.

This show has a great score and great sound design. Talk about the importance of sound and music.
The show’s creator, Bruce Miller, is really instrumental in all that, but we’re all involved too. For episode eight, Joe Fiennes came up with the idea of a record player, and then we built this whole storyline around the record player. The wonderful thing about Bruce’s writing and his aesthetic is that it’s so spare, so it leaves such great opportunities for performance. The actors can convey a lot without any words.

How important are the Emmys to you and a show like this?
It’s incredibly important! When your peers nominate you it’s a real nod from industry professionals, and it indicates tremendous appreciation.

There’s been a lot of talk about lack of opportunity for women in movies. Are things better in TV?
I’ve been advocating for women for years, and the truth is, nothing’s really changed that much. There’s been so much talk recently, and it was the same thing 20 years ago. One female director had a big hit with Wonder Woman, but real change will only come when half the superhero movies are directed by women.

What advice would you give young women who would like to direct and run shows like this?
Not only can you do it — just do it! Obviously, it’s hard and there are many sacrifices you have to make, but don’t take “no” for an answer.




DP Patrick Stewart’s path and workflow on Netflix’s Arrested Development

With its handheld doc-style camerawork, voiceover narration and quirky humor, Arrested Development helped revolutionize the look of TV sitcoms. Created by Mitchell Hurwitz, with Ron Howard serving as one of its executive producers, the half-hour comedy series follows the once-rich Bluth family, which continues to live beyond its means in Southern California. At the center of the family is the mostly sane Michael Bluth (Jason Bateman), who does his best to keep his dysfunctional family intact.

Patrick Stewart

The series first aired for three seasons on the Fox TV network (2003-2006) but was canceled due to low ratings. Because the series was so beloved, in 2013, Netflix brought it back to life with its original cast in place. In May 2018, the fifth season began streaming, shot by cinematographer Patrick Stewart (Curb Your Enthusiasm, The League, Flight of the Conchords). He called on VariCam LT cinema cameras.

Stewart’s path to becoming a cinematographer wasn’t traditional. Growing up in Los Angeles and graduating with a degree in finance from the University of Santa Clara, he got his start in the industry when a friend called him up and asked if he’d work on a commercial as a dolly grip. “I did it well enough where they called me for more and more jobs,” explains Stewart. “I started as a dolly grip but then I did sound, worked as a tape op and then started in the camera department. I also worked with the best gaffers in San Francisco, who showed me how to look at the light, understand it and either augment it or recreate it. It was the best practical film school I could have ever attended.”

Not wanting to stay “in a small pond with big fish,” Stewart decided to move back to LA and started working for MTV, which brought him into the low-budget handheld world. It also introduced him to “interview lighting,” where he lit celebrities like Barbra Streisand, Mick Jagger and Paul McCartney. “At that point I got to light every single amazing musician, actor, famous person you could imagine,” he says. “This practice afforded me the opportunity to understand how to light people who were getting older, and how to make them look their best on camera.”

In 1999, Stewart received an offer to shoot Mike Figgis’ film Time Code (2000), which was one of the landmark films of the DV/film revolution. “It was groundbreaking not only in the digital realm but the fact that Time Code was shot with four cameras from beginning to end, 93 minutes, without stopping, shown in a quad split with no edits — all handheld,” explains Stewart. “It was an amazingly difficult project, because having no edits meant you couldn’t make mistakes. I was very fortunate to work with a brilliant renegade director like Mike Figgis.”

Triple Coverage
When hired for Arrested Development, Stewart’s first request to Hurwitz was to add a third camera. Shooting three cameras on scenes with multiple characters can be a logistical challenge, but Stewart felt he could get through scenes more quickly and effectively, in order to get the actors out on time. “I call the C camera the center camera, and the A and the B are screen left and screen right,” Stewart explains. “C covers the center POV, while A and B cover the scene from their left and right side POV, which usually starts with overs. As we continue to shoot the scene, each camera will get tighter and tighter. If there are three or more actors in the scene, C will get tighter on whoever is in the center. After that, C camera might cover the scene following the dialogue with ‘swinging’ singles. If no swinging singles are appropriate, then the center camera can move over and help out coverage on the right or left side.

“I’m on a walkie — either adjusting the shots during a scene, tweaking their framing or exposure, or planning ahead,” he continues. “You give me three cameras and I’ll shoot a show really well for you and get it done efficiently, and with cinematic style.”

Because it is primarily a handheld show, Stewart needed lenses that would not weigh down his operators during long takes. He employed Fujinon Cabrio zooms (15-35mm, 19-90mm, and 85-300mm), which are all f/2.8 lenses.

For camera settings, Stewart captures 10-bit 4:2:2 UHD (3840×2160) AVC-Intra files at 23.98fps. He also captures in V-Log but uses the V-709 LUT. “To me, you can create all the LUTs you want,” he says, “but more than likely you get to color correction and end up changing things. I think the basic 709 LUT is really nice and gentle on all the colors.”

Light from Above
Much of Arrested Development is shot on a stage, so lighting can get complicated, especially when there are multiple characters in a scene. To make things less complicated, Stewart provided gentle soft light from softboxes covering the top of each stage set, using 4-by-8 wooden frames with tungsten-balanced Quasar tubes dimmed to 50%. His motivation for the lighting is that the unseen source could basically be a skylight. If characters are close to windows, he uses HMIs to create “natural sunlight” punching through to light the scene. “The nice thing about the VariCam is that you don’t need as many photons, and I did pretty extensive tests during pre-production on how to do it.”

On stage, Stewart sets his ISO to the 5000 base and dials down to 2500, generally shooting at f/2.8 and a half. He even uses one level of ND on top of that. “You can imagine 27 foot-candles at one level of ND at a 2.8 and 1/2 — that’s a pretty sensitive camera, and I noticed very little noise. My biggest concern was mid-tones, so I did a lot of testing — shooting at 5000, shooting at 2500, 800, 800 pushed up to 1600 and 2500.

“Sometimes with certain cameras, you can develop this mid-tone noise that you don’t really notice until you’re in post. I felt like shooting at 5000 knocked down to 2500 was giving me the benefit of lighting the stage at these beautifully low-lit levels where we would never be hot. I could also easily put 5Ks outside the windows to have enough sunlight to make it look like it’s overexposed a bit. I felt that at the 5000 base knocked down to 2500, the noise level was negligible. At native 5000 ISO, there was a little bit more mid-tone noise, even though it was still acceptable. For daytime exteriors, we usually shot at ISO 800, dialing down to 500 or below.”
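Those on-stage numbers can be roughly sanity-checked with the standard incident-light exposure equation. This is a hedged sketch: the meter calibration constant and the assumption that “one level of ND” on the VariCam equals two stops are mine, not Stewart’s:

```python
# Incident-light exposure equation: N^2 = E * S * t / C, with illuminance E
# in foot-candles, ISO S, shutter time t in seconds, and meter calibration
# constant C ~= 25 (an assumed, commonly quoted value for foot-candles).
import math

def aperture(foot_candles, iso, shutter=1/48, nd_stops=0, c=25.0):
    """Return the f-number that yields a nominal exposure."""
    n_squared = foot_candles * iso * shutter / c
    # ND cuts light, so the lens must open up: divide by 2^stops.
    return math.sqrt(n_squared / (2 ** nd_stops))

# 27 fc at ISO 2500 with an assumed 2 stops of ND:
print(round(aperture(27, 2500, nd_stops=2), 2))
```

The result lands within roughly a third of a stop of f/2.8 and a half, which is consistent with Stewart’s description of how sensitive the camera is at those light levels.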

Stewart and Arrested Development director Troy Miller have known each other for many years since working together on HBO’s Flight of the Conchords. “There was a shorthand between director and DP that really came in handy,” says Stewart. “Troy knows that I know what I’m doing, and I know on his end that he’s trying to figure out this really complicated script and have us shoot it. Hand in hand, we were really able to support Mitch.”


The score for YouTube Red’s Cobra Kai pays tribute to the original Karate Kid

By Jennifer Walden

In the YouTube Red comedy series Cobra Kai, Daniel LaRusso (Ralph Macchio), the young hero of the Karate Kid movies, has grown up to be a prosperous car salesman, while his nemesis Johnny Lawrence (William Zabka) just can’t seem to shake that loser label he earned long ago. Johnny can’t hold down his handyman job. He lives alone in a dingy apartment, and his personality hasn’t benefited from maturity at all. He lives a very sad reality until one day he finds himself sticking up for a kid being bullied, and that redeeming bit of character makes you root for him. It’s an interesting dynamic that the series writers/showrunners have crafted, and it works.

L-R: Composers Leo Birenberg and Zach Robinson

Fans of the 1980s film franchise will appreciate the soundtrack of the new Cobra Kai series. Los Angeles-based composers Leo Birenberg and Zach Robinson were tasked with capturing the essence of both composer Bill Conti’s original film scores and the popular music tracks that also defined the sound of the films.

To find that Karate Kid essence, Birenberg and Robinson listened to the original films and identified what audiences were likely latching onto sonically. “We concluded that it was mostly a color palette connection that people have. They hear a certain type of orchestral music with a Japanese flute sound, and they hear ‘80s rock,” says Birenberg. “It’s that palette of sounds that people connect with more so than any particular melody or theme from the original movies.”

Even though Conti’s themes and melodies for Karate Kid don’t provide the strongest sonic link to the films, Birenberg and Robinson did incorporate a few of them into their tracks at appropriate moments to create a feeling of continuity between the films and the series. “For example, there were a couple of specific Japanese flute phrases that we redid. And we found a recurring motif of a simple pizzicato string melody,” explains Birenberg. “It’s so simple that it was easy to find moments to insert it into our cues. We thought that was a really cool way to tie everything together and make it feel like it is all part of the same universe.”

Birenberg and Robinson needed to write a wide range of music for the show, which can be heard en masse on the Cobra Kai OST. There are the ’80s rock tracks that take over for licensed songs by bands like Poison and The Alan Parsons Project. This direction, as heard on the tracks “Strike First” and “Quiver,” covered the score for Johnny’s character.

The composers also needed to write orchestral tracks that incorporated Eastern influences, like the Japanese flutes, to cover Daniel as a karate teacher and to comment on his memories of Miyagi. A great example of this style is called, fittingly, “Miyagi Memories.”

There’s a third direction that Birenberg and Robinson covered for the new Cobra Kai students. “Their sound is a mixture of modern EDM and dance music with the heavier ‘80s rock and metal aesthetics that we used for Johnny,” explains Robinson. “So it’s like Johnny is imbuing the new students with his musical values. This style is best represented in the track ‘Slither.’”

Birenberg and Robinson typically work as separate composers, but they’ve collaborated on several projects before Cobra Kai. What makes their collaborations so successful is that their workflows and musical aesthetics are intrinsically similar. Both use Steinberg’s Cubase as their main DAW, while running Ableton Live in ReWire mode. Both like to work with MIDI notes while composing, as opposed to recording and cutting audio tracks.

Says Birenberg, “We don’t like working with audio from the get-go because TV and film are such a notes-driven process. You’re not writing music as much as you are re-writing it to specification and creative input. You want to be able to easily change every aspect of a track without having to dial in the same guitar sound or re-record the toms that you recorded yesterday.”

Virtual Instruments
For Cobra Kai, they first created demo songs using MIDI and virtual instruments. Drums and percussion sounds came from XLN Audio’s Addictive Drums. Spectrasonics Trilian was used for bass lines and Keyscape and Omnisphere 2 provided many soft-synth and keyboard sounds. Virtual guitar sounds came from MusicLab’s RealStrat and RealLPC, Orange Tree, and Ilya Efimov virtual instrument libraries. The orchestral sections were created using Native Instruments Kontakt, with samples coming from companies such as Spitfire, Cinesamples, Cinematic Strings, and Orchestral Tools.

“Both Zach and I put a high premium on virtual instruments that are very playable,” reports Birenberg. “When you’re in this line of work, you have to work superfast and you don’t want a virtual instrument that you have to spend forever tweaking. You want to be able to just play it in so that you can write quickly.”

For the final tracks, they recorded live guitar, bass and drums on every episode, as well as Japanese flute and small percussion parts. For the season finale, they recorded a live orchestra. “But,” says Birenberg, “all the orchestra and some Japanese percussion you hear earlier in the series, for the most part, are virtual instruments.”

Live Musicians
For the live orchestra, Robinson says they wrote 35 minutes of music in six days and immediately sent that to get orchestrated and recorded across the world with the Prague Radio Symphony Orchestra. The composing team didn’t even have to leave Los Angeles. “They sent us a link to a private live stream so we could listen to the session as it was going on, and we typed notes to them as we were listening. It sounds crazy but it’s pretty common. We’ve done that on numerous projects and it always turns out great.”

When it comes to dividing up the episodes — deciding who should score what scenes — the composing team likes to “go with gut and enthusiasm,” explains Birenberg. “We would leave the spotting session with the showrunners, and usually each of us would have a few ideas for particular spots.”

Since they don’t work in the same studio, the composers would split up and start work on the sections they chose. Once they had an idea down, they’d record a quick video of the track playing back to picture and share that with the other composer. Then they would trade tracks so they each got an opportunity to add in parts. Birenberg says, “We did a lot of sending iPhone videos back and forth. If it sounds good over an iPhone video, then it probably sounds pretty good!”

Both composers have different and diverse musical backgrounds, so they both feel comfortable diving right in and scoring orchestral parts or writing bass lines, for instance. “For the scope of this show, we felt at home in every aspect of the score,” says Birenberg. “That’s how we knew this show was for both of us. This score covers a lot of ground musically, and that ground happened to fit things that we understand and are excited about.” Luckily, they’re both excited about ‘80s rock (particularly Robinson) because writing music in that style effectively isn’t easy. “You can’t fake it,” he says.

Recreating ‘80s Rock
A big part of capturing the magic of ‘80s rock happened in the mix. On the track “King Cobra,” mix engineer Sean O’Brien harnessed the ‘80s hair metal style by crafting a drum sound that evoked Mötley Crüe and Bon Jovi. “I wanted to make the drums as bombastic and ‘80s as possible, with a really snappy kick drum and big reverbs on the kick and snare,” says O’Brien.

Using Massey DRT, a drum sample replacement plug-in for Avid Pro Tools, he swapped out the live drum parts with drum samples. Then on the snare, he added a gated reverb using Valhalla VintageVerb. He also used Valhalla Room to add a short plate sound to thicken up the kick and snare drums.
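The gated-reverb idea — a big reverb tail cut off abruptly — can be sketched with a little numpy. The decay and gate times here are illustrative, not O’Brien’s VintageVerb settings:

```python
# Minimal sketch of a gated reverb: build a decaying reverb tail, then
# slam it shut once past the gate point.
import numpy as np

SR = 48000  # sample rate, Hz

def gated_tail(length_s=0.5, gate_at_s=0.15, decay_s=0.2, seed=0):
    n = int(SR * length_s)
    rng = np.random.default_rng(seed)
    # Exponentially decaying noise stands in for a room's impulse response.
    tail = rng.standard_normal(n) * np.exp(-np.arange(n) / (SR * decay_s))
    gated = tail.copy()
    gated[int(SR * gate_at_s):] = 0.0  # the gate closes abruptly
    return tail, gated

tail, gated = gated_tail()
# After the gate point the gated tail is dead silent; the plain tail rings on.
print(np.abs(gated[int(SR * 0.2):]).max() == 0.0,
      np.abs(tail[int(SR * 0.2):]).max() > 0.0)
```

That abrupt cutoff, rather than a natural decay, is what gives the classic ‘80s snare its huge-but-tight character.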

To get the toms to match the cavernous punchiness of the kick and snare, O’Brien augmented the live toms with compression and EQ. “I chopped up the toms so there wasn’t any noise in between each hit and then I sent those to the nonlinear short reverbs in Valhalla Room,” he says. “Next, I did parallel compression using the Waves SSL E-Channel plug-in to really squash the tom hits so they’re big and in your face. With EQ, I added more top end than I normally would to help the toms compete with the other elements in the mix. You can make the close mics sound really crispy with those SSL EQs.”
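Parallel (“New York”) compression, the technique O’Brien describes, blends a heavily squashed copy of the signal back under the dry one. A minimal sketch; the threshold, ratio and mix values are illustrative, not his settings:

```python
# Parallel compression: the squashed copy lifts low-level detail and body,
# while the untouched dry copy keeps the natural transients.
import numpy as np

def compress(x, threshold=0.1, ratio=10.0):
    """Crude instantaneous compressor: gain-reduce anything over threshold."""
    mag = np.abs(x)
    over = mag > threshold
    gain = np.ones_like(x)
    gain[over] = (threshold + (mag[over] - threshold) / ratio) / mag[over]
    return x * gain

def parallel_compress(dry, wet_mix=0.5):
    return dry + wet_mix * compress(dry)

# A decaying "tom hit" envelope:
tom_hit = np.concatenate([np.linspace(1.0, 0.0, 1000), np.zeros(1000)])
out = parallel_compress(tom_hit)
print(round(float(out.max()), 3))
```

Note the peak barely grows, because the compressed copy is squashed hardest at the top, while quiet parts of the hit gain proportionally more.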

Next, he bussed all the drum tracks to a group aux track, which had a Neve 33609 plug-in by UAD and a Waves C4 multi-band compressor “to control the whole drum kit after the reverbs were laid in to make sure those tracks fit in with the other instruments.”

Sean O’Brien

On “Slither,” O’Brien also focused on the drums, but since this track is more ‘80s dance than ‘80s rock, O’Brien says he was careful to emphasize the composers’ ‘80s drum machine sounds (rather than the live drum kit), because that is where the character of the track was coming from. “My job on this track was to enhance the electric drum sounds; to give the drum machine focus. I used UAD’s Neve 1081 plug-in on the electronic drum elements to brighten them up.”

“Slither” also features taiko drums, which make the track feel cinematic and big. O’Brien used Soundtoys Devil-Loc to make the taiko drums feel more aggressive, and added distortion using Decapitator from Soundtoys to help them cut through the other drums in the track. “I think the drums were the big thing that Zach [Robinson] and Leo [Birenberg] were looking to me for because the guitars and synths were already recorded the way the composers wanted them to sound.”

The Mix
Mix engineer Phil McGowan, who was responsible for mixing “Strike First,” agrees. He says, “The ‘80s sound for me was really based on drum sounds, effects and tape saturation. Most of the synth and guitar sounds that came from Zach and Leo were already very stylized so there wasn’t a whole lot to do there. Although I did use a Helios 69 EQ and Fairchild compressor on the bass along with a little Neve 1081 and Kramer PIE compression on the guitars, which are all models of gear that would have been used back then. I used some Lexicon 224 and EMT 250 on the synths, but otherwise there really wasn’t a whole lot of processing from me on those elements.”

Phil McGowan’s ‘Strike First’ Pro Tools session.

To get an ‘80s gated reverb sound for the snare and toms on “Strike First,” McGowan used an AMS RMX16 nonlinear reverb plug-in in Pro Tools. For bus processing, he mainly relied on a Pultec EQ, adding a bit of punch with the classic “Pultec Low End Trick” — which involves boosting and attenuating at the same frequency — plus adding a little bump at 8k for some extra snap. Next in line, he used an SSL G-Master buss compressor before going into UAD’s Studer A800 tape plug-in set to 456 tape at 30 ips and calibrated to +3 dB.

“I did end up using some parallel compression using a Distressor plug-in by Empirical Labs, which was not around back then, but it’s my go-to parallel compressor and it sounded fine, so I left it in my template. I also used a little channel EQ from FabFilter Pro-Q2 and the Neve 88RS Channel Strip,” concludes McGowan.


Jennifer Walden is a New Jersey-based audio engineer and writer. You can follow her on Twitter at @audiojeney.


Netflix’s Lost in Space: mastering for Dolby Vision HDR, Rec.709

There is a world of difference between Netflix’s ambitious science-fiction series Lost in Space (recently renewed for another 10 episodes) and the beloved but rather low-tech, tongue-in-cheek 1960s show most fondly remembered for the repartee between persnickety Dr. Smith and the rather tinny-looking Robot. This series, starring Molly Parker, Toby Stephens and Parker Posey (in a very different take on Dr. Smith), is a very modern, VFX-intensive adventure show with more deeply wrought characters and elaborate action sequences.

Siggy Ferstl

Colorist Siggy Ferstl of Company 3 devoted a significant amount of his time and creative energy to the 10-episode release over the five-and-a-half months the episodes were in the facility. While Netflix’s approach of dropping all 10 episodes at once, rather than the traditional series schedule of an episode a week, fuels excitement and binge-watching among viewers, it also requires a different kind of workflow, with cross-boarded shoots across multiple episodes and different parts of episodes coming out of editorial for color grading throughout the story arc. “We started on episode one,” Ferstl explains, “but then we’d get three and portions of six and back to four, and so on.”

Additionally, the series was mastered for both Dolby Vision HDR and Rec.709, which added facets to the grading process beyond those of shows delivered exclusively for Rec.709.

Ferstl’s grading theater also served as a hub where the filmmakers, including co-producer Scott Schofield, executive producer Zack Estrin and VFX supervisor Jabbar Raisani could see iterations of the many effects sequences as they came in from vendors (Cinesite, Important Looking Pirates and Image Engine, among others).

Ferstl himself made use of some new tools within Resolve to create a number of effects that might once have been sent out of house or completed during the online conform. “The process was layered and very collaborative,” says Ferstl. “That is always a positive thing when it happens but it was particularly important because of this series’ complexity.”

The Look
Shot by Sam McCurdy, the show’s aesthetic was designed, “to have a richness and realness to the look,” Ferstl explains. “It’s a family show but it doesn’t have that vibrant and saturated style you might associate with that. It has a more sophisticated kind of look.”

One significant alteration to the look involves changes to the environment of the planet onto which the characters crash land. The filmmakers wanted the exteriors to look less Earthlike with foliage a bit reddish, less verdant than the actual locations. The visual effects companies handled some of the more pronounced changes, especially as the look becomes more extreme in later episodes, but for a significant amount of this work, Ferstl was able to affect the look in his grading sessions — something that until recently would likely not have been achievable.

Ferstl, who has always sought out and embraced new technology to help him do his job, made use of some features that were then brand new to Resolve 14. In the case of the planet’s foliage, he turned to the Color Compressor tool within the OpenFX tab on the color corrector. “This allowed me to take a range of colors and collapse that into a single vector of color,” he explains. “This lets you take your selected range of colors, say yellows and greens in this case, and compress them in terms of hue, saturation and luminance.” Sometimes touted as a tool to give colorists more ability to even out flesh tones, Ferstl applied the tool to the foliage and compressed the many shades of green into a narrower range prior to shifting the resulting colors to the more orange look.

“With foliage you have light greens and darker greens and many different ranges within the color green,” Ferstl explains. “If we’d just isolated those ranges and turned them orange individually, it wouldn’t give us the same feel. But by limiting the range and latitude of those greens in the Color Compressor and then changing the hue we were able to get much more desirable results.” Of course, Ferstl also used multiple keys and windows to isolate the foliage that needed to change from the elements of the scenes that didn’t.
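Conceptually, the operation Ferstl describes — compress a range of hues toward one vector, then rotate the result — can be sketched in plain HSV. This is only the idea, not Resolve’s actual Color Compressor math; the hue ranges, compression amount and shift are made up:

```python
import colorsys

def compress_and_shift(rgb, lo=60/360, hi=180/360, target=120/360,
                       amount=0.7, shift=-80/360):
    """Pull hues in [lo, hi] toward `target`, then rotate toward orange."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    if lo <= h <= hi:
        h = h + (target - h) * amount   # narrow the range of greens first
        h = (h + shift) % 1.0           # then swing the whole vector to orange
    return colorsys.hsv_to_rgb(h, s, v)

light_green = (0.4, 0.8, 0.3)
dark_green = (0.1, 0.35, 0.15)
# The two greens start about 24 degrees of hue apart and end about 7 apart:
for px in (light_green, dark_green):
    h, _, _ = colorsys.rgb_to_hsv(*compress_and_shift(px))
    print(round(h, 3))
```

Compressing before shifting is the key point: the varied greens land close together, so a single hue rotation gives a coherent orange rather than a scatter of different oranges.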

He also made use of the Camera Shake function, which was particularly useful in a scene in the second episode in which an extremely heavy storm of sharp hail-like objects hits the planet, endangering many characters. The storm itself was created at the VFX houses, but the additional effect of camera shake on top of that was introduced and fine-tuned in the grade. “I suggested that we could add the vibration, and it worked very well,” he recalls. By doing the work during color grading sessions, Ferstl and the filmmakers in the session could see that effect as it was being created, in context and on the big screen, and could fine-tune the “camera movement” right then and there.

Fortunately, the colorist notes, the production afforded the time to go back and revise color decisions as more episodes came into Company 3. “The environment of the planet changes throughout. But we weren’t coloring episodes one after the other. It was really like working on a 10-hour feature.

“If we start at episode one and jump to episode six,” Ferstl notes, “exactly how much should the environment have changed in-between? So it was a process of estimating where the look should land but knowing we could go back and refine those decisions if it proved necessary once we had the surrounding episodes for context.”

Dolby Vision Workflow
As most people reading this know, mastering in high dynamic range (Dolby Vision in this case) opens up the possibility of working within a significantly expanded contrast range and wider color gamut over the Rec.709 standard for traditional HD. Lost in Space was mastered concurrently for both, which required Ferstl to use Dolby’s workflow. And this involves making all corrections for the HDR version and then allowing the Dolby hardware/software to analyze the images to bring them into the Rec.709 space for the colorist to do a standard-dynamic-range pass.

Ferstl, who worked with two Sony X-300 monitors, one calibrated for Rec.709 and the other for HDR, explains, “Everyone is used to looking at Rec. 709. Most viewers today will see the show in Rec.709 and that’s really what the clients are most concerned with. At some point, if HDR becomes the dominant way people watch television, then that will probably change. But we had to make corrections in HDR and then wait for the analysis to show us what the revised image looked like for standard dynamic range.”

He elaborates that while the Dolby Vision spec allows the brightest whites to read at 4000 nits, he and the filmmakers preferred to limit that to 1000 nits. “If you let highlights go much further than we did,” he says, “some things can become hard to watch. They become so bright that visual fatigue sets in after too long. So we’d sometimes take the brightest portions of the frame and slightly clamp them,” he says of the technique of holding the brightest areas of the frame to levels below the maximum the spec allows.
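The "slight clamping" Ferstl describes can be sketched as a soft knee: levels below a threshold pass through untouched, while anything above rolls off toward a ceiling. The 1000-nit ceiling comes from the article; the knee point and curve shape here are assumptions:

```python
import math

def soft_clamp(nits, knee=800.0, ceiling=1000.0):
    """Leave levels below the knee alone; roll off higher levels toward the ceiling."""
    if nits <= knee:
        return nits
    span = ceiling - knee
    # Exponential rolloff: approaches `ceiling` asymptotically, never exceeds it.
    return knee + span * (1.0 - math.exp(-(nits - knee) / span))

# Mid-tones untouched, highlights compressed, extremes pinned near 1000 nits:
for level in (500.0, 1000.0, 4000.0):
    print(round(soft_clamp(level), 1))
```

A curve like this preserves highlight gradation (nothing hard-clips) while keeping helmet lights and speculars from becoming fatiguing, which matches the case-by-case feel-based approach described above.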

“Sometimes HDR can be challenging to work with and sometimes it can be amazing,” he allows. Take the vast vistas and snowcapped mountains we first see when the family starts exploring the planet. “You have so much more detail in the snow and an amazing range in the highlights than you could ever display in Rec.709,” he says.

“In HDR, the show conveys the power and majesty of these vast spaces beyond what viewers are used to seeing. There are quite a few sections that lend themselves to HDR,” he continues. But as with all such tools, it’s not always appropriate to the story to use the extremes of that dynamic range. Some highlights in HDR can pull the viewer’s attention to a portion of the frame in a way that simply can’t be replicated in Rec. 709 and, likewise, a bright highlight from a practical or a reflection in HDR can completely overpower an image that tells the story perfectly in standard dynamic range. “The tools can re-map an image mathematically,” Ferstl notes, “but it still requires artists to interpret an image’s meaning and feel from one space to the other.”

That brings up another question: How close do you want the HDR and the Rec.709 to look to each other when they can look very different? Overall, the conclusion of all involved on the series was to constrain the levels in the HDR pass a bit in order to keep the two versions in the same ballpark aesthetically. “The more you let the highlights go in HDR,” he explains, “the harder it is to compress all that information for the 100-nit version. If you look at scenes with the characters in space suits, for example, they have these small lights that are part of their helmets and if you just let those go in HDR, those lights become so distracting that it becomes hard to look at the people’s faces.”

Such decisions were made in the grading theater on a case by case basis. “It’s not like we looked at a waveform monitor and just said, ‘let’s clamp everything above this level,’” he explains, “it was ultimately about the feeling we’d get from each shot.”


Creator Justin Simien talks Netflix’s Dear White People

By Iain Blair

The TV graveyard is bursting at the seams with failed adaptations of hit movies. But there are rare exceptions, such as Netflix’s acclaimed hit comedy Dear White People, which creator Justin Simien adapted from his 2014 indie movie of the same name. The film premiered at the Sundance Film Festival, where it won the Special Jury Award for Breakthrough Talent. Simien went on to win Best First Screenplay and earn a nomination for Best First Feature at the Independent Spirit Awards.

Justin Simien (Photo by Rick Proctor).

Now a series on Netflix and enjoying its second season (it was just picked up for its third!), this college dramedy is set at Winchester University, a fictional, predominantly white Ivy League college, where racial tensions bubble just below the surface. It stars a large, charismatic ensemble cast (most of whom appeared in the film) that includes Logan Browning, Brandon P. Bell, Antoinette Robertson, DeRon Horton, John Patrick Amedori, Ashley Blaine Featherson, Marque Richardson and Giancarlo Esposito (as the narrator), dealing with such timely and timeless issues as racism, inclusion, social injustice, politics, abortion, body image, cultural bias, political correctness (or lack thereof), activism and, of course, romance in the millennial age.

Through an absurdist lens, Dear White People uses sharp, quick-fire dialogue, biting irony, self-deprecation and brutal honesty to hold up a mirror to some of the problems plaguing society today. It also makes the medicine go down easy by leading with big laughs.

The show is also a master class in how to successfully make that tricky transition from the big to small screen, and tellingly it has retained a coveted and rare 100% on Rotten Tomatoes for both seasons (take note, Emmy voters!).

I recently spoke with Simien about making the show, the changing TV landscape, the Emmys and his next movie.

The TV landscape is full of the corpses of failed movie adaptations. How did you avoid that fate when you adapted your film for TV?
(Laughs) You’re so right. Movies often don’t translate very well to TV, but I felt my film was in the great tradition of multi-protagonist ensemble films I love so much. I also felt that in the confines of 90 minutes or so, you can never really truly get into the hearts of all the characters. By the end, the audience wanted more from them, so it lent itself to the longer format. And I felt it would be much more interesting than the typical show if we [borrowed] a bit of that cinematic tradition — like films by Robert Altman and Spike Lee — where you really get a strong point-of-view and multiple stories are carefully woven together, and then apply it to TV.

It seems that in many ways, the film’s concerns and issues work even better in an extended TV series. What were the big themes you wanted to explore?
As with the film, it’s really a conversation about identity and self, and the roles that you play in society. We all do it in order to navigate society, but for people of color, those identities have been chosen for them, so it often takes us a lot longer to get to the heart of who we really are and what the self is. We’re taught from a very early age to always be aware that you’re different, and that people see you differently. We deal with all that through comedy and satire. It has a lot on its mind.

Where do you shoot?
All in LA. Most of the interiors are done at Tamarack Studios in Sun Valley, and then we shoot our exteriors at UCLA and at a former school in Alhambra.

Do you direct a lot of the episodes?
I direct some. I did three in the first season, and four in the second, but since I run the show along with Yvette Lee Bowser, I’m just too busy to direct them all. So I handpick other directors who come in, such as Barry Jenkins, Charlie McDowell, Tina Mabry and others. But they don’t come into this world to paint by numbers. It’s more a case of them riffing off of what I did, like a jazz musician. It’s a very cohesive and collaborative process, and I’m very involved in all the episodes.

Do you like being a showrunner?
I do, but to be honest I like directing and writing more. The storytelling is the part of the gig that I’m in it for. But it is satisfying to run the larger operation and work closely with all these fantastic writers, directors and actors, and create this environment where they can all do their best work.

Where do you post?
All at Tamarack, and it’s very convenient since it’s important for me to be able to bounce between the set and the edit bay on each episode. We did all the sound at Warners, and the DI at Universal with colorist Scott Gregory.

Do you like the post process?
I love post because it’s where you figure out if what you shot really works, and it’s your last chance to write the show. It’s the final rewrite, and a chance to fix the things that don’t work, so it’s scary and challenging. Post is also where you get to see the arc of the whole season and see all the episodes as like a five-hour movie. It’s where I get to apply all my final ideas. When I’m writing the show, we’re in a process of discovery, and it’s not until post that you really get a sense of how the beginning fits with the end, and that what you’re trying to say is there and working.

Justin Simien

Can you talk about the editing? You have several editors on the show, yes?
We use two editors per season. Phil Bartell, who cut the film for me, is always one of them. Steve Edwards was the other one on Season 1, and Omar Hassan-Reep was on Season 2. Post schedules are so jammed in TV that using two editors helps speed it all up. We allot a certain amount of time for each episode, so I can spend time with it. Same with the director and the editor.

You have a big cast and a lot of storylines. What are the big editing challenges?
The big one is that none of the show is turnkey. Directors don’t paint by numbers and the scripts are not written to any kind of format or formula — other than we stay with one point of view at a time. So that means that editing each episode is like editing its own mini-movie. One episode is film noir, another’s about mushrooms and hallucinations, so each one requires different styles and techniques, and different approaches work for different points of view. Each time we have to reinvent the wheel.

VFX play a big role in some episodes. Can you talk about working on them?
There’s far more than normal for a show like this, and mostly because social media is such an integral part of the characters’ lives. So we really try and use all that in a cinematic way and give you the feeling of what they’re going through instead of just cutting to the cell phone or computer every time. We really work hard to integrate all that.

Ingenuity does all the overlay VFX and it can take a while to figure it all out and get it right.

Unlike movies, sound in television has arguably always played second fiddle to the images, but this has a great score by Kris Bowers and great sound design. Please talk about the importance of sound and music to you.
Sound in movies has always gotten more attention, but TV’s changing and getting more cinematic. Music is so important to me, and I make sure the score isn’t just filler or interstitial — it has to be able to operate independently of the visuals, like it does with the movies of my favorite filmmakers, like Stanley Kubrick. It’s not just supplemental, and Kris is brilliant — just as adept at jazz as classical — and we have recurring themes and motifs and thematic hooks, and it’s very multi-layered.

How important are the Emmys to a show like this?
Very. We live in a world where there’s so much to watch now, and I don’t think there’s anything like it out there. But it can take effort to get people to watch and give the show and the characters a shot. So the Emmys can really help shine a light.

What’s next?
I’ll be directing my second film, which I wrote and is titled Bad Hair. It’s a horror satire that’s set in the late ‘80s about an ambitious young woman who wants to be a DJ but who doesn’t have the right look, so she gets a weave that may or may not have a mind of its own. I’m casting right now and hope to start shooting this summer.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


The Orville VFX supervisor on mixing practical and visual effects

By Barry Goch

What do you get when you mix Family Guy and Ted creator Seth MacFarlane with science fiction? The most dysfunctional spaceship in the galaxy, that’s what. What is the Fox series The Orville? Well, it’s more Galaxy Quest/Spaceballs than it is Star Trek/Star Wars.

Set 400 years in the future, The Orville follows the titular spaceship, captained by MacFarlane’s Ed Mercer, who has to work alongside his ex-wife as they wing their way through space on a science mission. As you might imagine with a show set in space, The Orville features a large number of visual and practical effects shots, including real and CG models of the ship.

Luke McDonald

We reached out to the show’s VFX supervisor Luke McDonald to find out more.

How did the practical model of The Orville come about?
Jon Favreau was directing the pilot, and he and Seth MacFarlane had been kidding around about doing a practical model of The Orville. I jumped at the chance. In this day and age, a visual effects supervisor shooting models is an unheard-of thing to do, but something I was absolutely thrilled about.

Favreau’s visual effects supervisor is Rob Legato. I have worked with Rob on many projects, including Martin Scorsese’s The Aviator, Shine a Light and Shutter Island, so I was very familiar with how Rob works. The only other chance that I had had to shoot models was with Rob during Shutter Island and The Aviator, so in a sense, whenever Rob Legato shows up it’s model time (he laughs). It’s so amazing because it’s just something that the industry shies away from, but given the opportunity it was absolutely fantastic.

Who built the practical model of The Orville?
Glenn Derry made it. He’s worked with Rob Legato on a few things, including The Aviator. Glenn is kind of fantastic. He basically does motion control, models and motion capture. Glenn would also look at all the camera moves and all the previz that we did to make sure the camera moves were not doing something that the motion control rig could not do.

How were you able to seamlessly blend the practical model and the CG version of The Orville?
Once we had the design for The Orville, we would then previz out the ships flying by camera, doing whatever, and work out these specific moves. Any move that was too technical for the motion control rig, we would do a CG link-up instead — meaning that it would go from model to a CG ship or vice versa — to get the exact camera move that we wanted. We basically shot all of the miniatures of The Orville at three frames a second. It was kind of like shooting in slow-mo with the motion control rig, and we did about 16 passes per shot — lights on, lights off, key lights, fill light, back light, ambient, etc. So, when we got all the passes back, we composited them just like we would any kind of full CG shot.

From the model shoot, we ended up with about 25 individual shots of The Orville. It’s a very time-consuming process, but it’s very rewarding because of how many times you’re going to have to reuse these elements to achieve completely new shots, even though it’s from the same original motion control shoot.
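The per-light passes McDonald describes combine cleanly because, in linear color space, light is additive: the final frame is roughly the sum of the isolated passes, each scaled by an artist-chosen gain. The sketch below is a minimal, hypothetical illustration of that idea — `composite_passes` and the toy pass values are not from the show’s actual pipeline.

```python
import numpy as np

def composite_passes(passes, gains):
    """Sum linear-light passes (H x W x 3 float arrays), each scaled by a gain."""
    out = np.zeros_like(passes[0], dtype=np.float64)
    for img, gain in zip(passes, gains):
        out += gain * img          # additive: each pass isolates one light source
    return np.clip(out, 0.0, 1.0)  # clamp to displayable range

# Toy 2x2 "passes" standing in for key, fill and back light plates
key  = np.full((2, 2, 3), 0.5)
fill = np.full((2, 2, 3), 0.2)
back = np.full((2, 2, 3), 0.1)

# Relighting the shot is now just a matter of adjusting the gains
frame = composite_passes([key, fill, back], gains=[1.0, 0.8, 1.2])
```

Because each pass isolates one light, a compositor can rebalance the lighting of a finished motion-control plate without ever reshooting the model — which is why the same 25 element shots could be reused across so many new shots.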

How did the shots of The Orville evolve over the length of the season?
We started to get into more dynamic things, such as big space battles and specific action patterns, where it really wasn’t feasible to continue shooting the model itself. But now we have a complete match for our CG version of The Orville that we can use for our big space battles, where the ship’s flying and whipping around. I need to emphasize that previz on this project was very crucial.

The Orville is a science vessel, but when it needs to throw down and fight, it has the capabilities to be quite maneuverable — it can barrel roll, flip and power slide around to get itself in position to get the best shot off. Seth was responding to these hybrid-type ship-to-ship shots and The Orville moving through space in a unique way when it’s in battle.
There was never a playbook. It was always, “Let’s explore, let’s figure out, and let’s see where we fit in this universe. Do we fit into the traditional Star Trek-y stuff, or do we fit into the Star Wars-type stuff?” I’m so pleased that we fit into this really unique world.

How was working with Seth MacFarlane?
Working with Seth has been absolutely amazing. He’s such a dedicated storyteller, even down to the most minute things. He’s such an encyclopedia of sci-fi knowledge, be it Star Trek, Star Wars, Battlestar Galactica or the old-school Buck Rogers and Flash Gordon. All of them are part of his creative repertoire. It’s very rare that he makes a reference that I don’t get, because I’m exactly the same way about sci-fi.

How different is creating VFX for TV versus film?
TV is not that new to me, but for the last 10 years I’ve been doing film work for Bad Robot and JJ Abrams. It was a strange awakening coming to TV, but it wasn’t horrifying. I had to approach things in a different way, especially from a budget standpoint.

Rachel Matchett brought on to lead Technicolor Visual Effects

Technicolor has hired Rachel Matchett to head the post production group’s newly formed VFX brand, Technicolor Visual Effects. Working side-by-side within the same facilities where post services are offered, Technicolor Visual Effects is expanding to a global offering with an integrated pipeline. Technicolor is growing its namesake VFX team apart from the company’s other visual effects brands: MPC, The Mill, Mr. X and Mikros.

A full-service creative VFX house with local studios in Los Angeles, Toronto and London, Technicolor Visual Effects’ recent credits include the feature films Avengers: Infinity War, Black Panther and Paddington 2, and episodic series such as This is Us, Anne With an E and Black Mirror.

Matchett joins Technicolor from her long-tenured position at MPC Film. Her background at MPC London includes nearly a decade of senior management positions at the studio. She most recently served as MPC London’s global head of production. In that role, her divisions at MPC Film oversaw and carried out visual effects on a number of films each year, including director Jon Favreau’s Academy Award-winning The Jungle Book and the critically acclaimed Blade Runner 2049.

“Technicolor Visual Effects is emerging from its position as one of the industry’s best-kept secrets. While continuing to support clients who do color finishing with us, we are excited to work with storytellers from script to screen,” says Matchett. “Having been at the heart of MPC Film’s rapid growth over the past decade, I feel that there is a great opportunity for Technicolor’s future role in VFX to forge a new path within the industry.”

Showtime’s Homeland: Producer/director Lesli Linka Glatter

By Iain Blair

Since it premiered back in 2011, the provocative, edgy and timely spy thriller Homeland has been a huge hit with audiences and critics alike. It has also racked up dozens of awards, including Primetime Emmys and Golden Globes.

The show, which features an impressive cast — namely Claire Danes and Mandy Patinkin — is Showtime’s number one drama series. It is produced by Fox 21 Television Studios and was developed for American television by Alex Gansa and Howard Gordon. Homeland is based on the Israeli series Prisoners of War from Gideon Raff.

Lesli Linka Glatter

Producer Lesli Linka Glatter is an award-winning director of film and episodic dramas. Her TV work includes The Newsroom, The Walking Dead, Justified, Ray Donovan, Masters of Sex, Nashville, True Blood, Mad Men, The Good Wife, House, The West Wing, NYPD Blue, ER and Freaks and Geeks, just to name a few. Most recently, she directed the first two episodes of Dick Wolf’s limited series Law & Order: True Crime — The Menendez Murders for NBC.

Glatter was nominated for a fifth Emmy for directing the Homeland episode “America First,” and in 2015 and 2016 she was also among the producers acknowledged when Homeland received back-to-back Emmy nominations for Best Drama. 

Glatter began her directing career through American Film Institute’s Directing Workshop for Women, and her short film Tales of the Meeting and Parting was nominated for an Academy Award. Her first series was Amazing Stories, followed by Twin Peaks, for which she received her first Directors Guild Award nomination. She made her feature film directorial debut with Now and Then, followed by The Proposition. For HBO she directed State of Emergency, Into the Homeland and The Promise.

To say her career has been prolific is an understatement. I recently spoke with Glatter about making Homeland, the Emmys, her love of post and mentoring other women.

Have you started Season 8?
Not yet. We’re not even prepping yet since we just finished Season 7. The first thing that happens is the writers, myself, Claire, Mandy and the DP go to DC to meet with the intelligence community, and what we find out from talking to these people then becomes the next season.

Is it definitely the final one?
I think that’s still unclear. It might go on.

Do you like being a showrunner?
I love it. As a producing director I love being involved with the whole novel, the whole big picture of the season, as well as the individual chapters. There’s an overall look and feel and tone to each season, and I also get to direct four of the 12 episodes. We have other amazing directors who come in, and that creates energy and brings in a different point of view, yet it fits into the whole, overall storyline and feel of the season. We have this wonderful working environment on the show where the best idea wins, so it’s very creative. Then every year we reinvent the wheel, with a new look and feel for the show.

What are the big challenges of showrunning?
A complex show like this is filled with all sorts of challenges and joys, in equal parts. Obviously, everything starts with the material and the script, then I have my partners in crime — Claire and Mandy — who’re so creative and collaborative. The big challenge is that we try to make each season new and fresh. People might look at one of Season 7’s shows and think we have it all dialed in with the same sets, the same crew in place and so on, but we’re always going to a new place with a new crew and new sets, and we shoot for 11 days, but nine of those are usually on location, so we have very few on stage. In terms of logistics, that is really challenging. Every episode’s different, but that’s generally how it works. Then we’re exploring very relevant and timely issues. We just dealt with “a nation divided” and Russian meddling, and these are things that everyone’s talking about right now.

As mentioned, you direct a lot of shows. Do you prefer doing that?
It’s more that I see myself as a director first and foremost, although I love showrunning and producing as well. I want to be the producer that every director would love to have, since I try to give them whatever they need to tell their best stories. I have a great line producer/partner named Michael Klick. He’s the magic man who makes it all happen. The key in TV is to have great partners, and our core creative team — DPs David Klein and Giorgio Scali, our editors, production designers, costume designers — are all so talented. You want the smartest team you can get, and then let the best idea win, and we always aim for a very cinematic look.

Where do you post?
We did all the editing on the Fox lot and all the sound mixing at Universal. Encore does the VFX.

Do you like the post process?
I love it. It’s where it all comes together, and you get to look at everything you’ve done and re-shape it and make it the best it can be. Along with everyone else, I have my idea of what each episode will be, and then we have our editing team and they bring all their ideas to it, so it’s very exciting to watch it evolve.

Talk about editing. You have several editors, I assume because of the time factor. How does that work?
We have three editors — Jordan Goldman, Harvey Rosenstock and Philip Carr Neel — because of the tight schedule, and they each handle different episodes and focus solely on those… unless we run into a problem.

You have a big cast and a lot of stuff going on in each episode. What are the big editing challenges?
Telling the best possible story and staying true to the theme and subtext and intent of that story. The show really lives in shades of gray with a lot of ambiguity. A classic Homeland scene will feature two characters on completely opposing sides of an issue, and they’re both right and both wrong. So maybe that makes you think more about that issue and question your beliefs, and I love that about the show.

This show has a great score by Sean Callery, as well as great sound design. Can you talk about the importance of sound and music?
Sean’s an amazing storyteller and brilliant at what he does, as the show has a huge amount of anxiety in it, and he captures that and helps amplify it — but without making it obvious. He’s been on the show since the start, and we’ve also worked with the same sound team for a long time, and sound design’s such a key element in our show. We spend a lot of time on all the little details that you may not notice in a scene.

How important are the Emmys to you and a show like this?
You can’t ever think about awards while you’re working. You just focus on trying to tell the best possible story, but in this golden age of TV it’s great to be recognized by your peers. It’s huge!

There’s been a lot of talk about lack of opportunity for women in movies. Are things better in TV?
Things are changing and improving. I’ve been involved with mentoring women directors for many, many years, and I hope we soon get to a point where gender is no longer an issue. If you’d asked me back when I began directing over 20 years ago if we’d still be discussing all this today, I’d have said, “Absolutely not!” But here we still are. The truth is, showrunning and directing are hard and challenging jobs, but women should have the same opportunities as men. Simple as that.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Luke Scott to run newly created Ridley Scott Creative Group

Filmmaker Ridley Scott has brought together all of his RSA Films-affiliated companies together in a multi-business restructure to form the Ridley Scott Creative Group. The Ridley Scott Creative Group aims to strengthen the network across the related companies to take advantage of emerging opportunities across all entertainment genres as well as their existing work in film, television, branded entertainment, commercials, VR, short films, documentaries, music video, design and animation, and photography.

Ridley Scott

Luke Scott will assume the role of global CEO, working with founder Ridley Scott and partners Jake and Jordan Scott to oversee the future strategic direction of the newly formed group.

“We are in a new golden age of entertainment,” says Ridley Scott. “The world’s greatest brands, platforms, agencies, new entertainment players and studios are investing hugely in entertainment. We have brought together our talent, capabilities and creative resources under the Ridley Scott Creative Group, and I look forward to maximizing the creative opportunities we now see unfolding with our executive team.”

The companies that make up the RSCG will continue to operate autonomously but will now offer clients synergy under the group offering.

The group includes commercial production company RSA Films, which produced such ads as Apple’s 1984, Budweiser’s Super Bowl favorite Lost Dog and, more recently, Adidas Originals’ Original is Never Finished campaign, as well as branded content for Johnnie Walker, HBO, Jaguar, Ford, Nike and the BMW Films series; the music video production company founded by Jake Scott, Black Dog Films (Justin Timberlake, Maroon 5, Nicki Minaj, Beyoncé, Coldplay, Björk and Radiohead); the entertainment marketing company 3AM; commercial production company Hey Wonderful, founded by Michael Di Girolamo; newly founded UK commercial production company Darling Films; and film and television production company Scott Free (Gladiator, Taboo, The Martian, The Good Wife), which continues to be led by David W. Zucker, president, US television; Kevin J. Walsh, president, US film; and Ed Rubin, managing director, UK television/film.

“Our Scott Free Films and Television divisions have an unprecedented number of movies and shows in production,” reports Luke Scott. “We are also seeing a huge appetite for branded entertainment from our brand and agency partners to run alongside high-quality commercials. Our entertainment marketing division 3AM is extending its capabilities to all our partners, while Black Dog is moving into short films and breaking new, world-class talent. It is a very exciting time to be working in entertainment.”

Netflix’s Lost in Space: New sounds for a classic series

By Jennifer Walden

Netflix’s Lost in Space series, a remake of the 1965 television show, is a playground for sound. In the first two episodes alone, the series introduces at least five unique environments, including an alien planet, a whole world of new tech — from wristband communication systems to medical analysis devices — new modes of transportation, an organic-based robot lifeform and its correlating technologies, a massive explosion in space and so much more.

It was a mission not easily undertaken, but if anyone could manage it, it was four-time Emmy Award-winning supervising sound editor Benjamin Cook of 424 Post in Culver City. He’s led the sound teams on series like Starz’s Black Sails, Counterpart and Magic City, as well as HBO’s The Pacific, Rome and Deadwood, to name a few.

Benjamin Cook

Lost in Space was a reunion of sorts for members of the Black Sails post sound team. Making the jump from pirate ships to spaceships were sound effects editors Jeffrey Pitts, Shaughnessy Hare, Charles Maynes, Hector Gika and Trevor Metz; Foley artists Jeffrey Wilhoit and Dylan Tuomy-Wilhoit; Foley mixer Brett Voss; and re-recording mixers Onnalee Blank and Mathew Waters.

“I really enjoyed the crew on Lost in Space. I had great editors and mixers — really super-creative, top-notch people,” says Cook, who also had help from co-supervising sound editor Branden Spencer. “Sound effects-wise there was an enormous amount of elements to create and record. Everyone involved contributed. You’re establishing a lot of sounds in those first two episodes that are carried on throughout the rest of the season.”

Soundscapes
So where does one begin on such a sound-intensive show? The initial focus was on the soundscapes, such as the sound of the alien planet’s different biomes, and the sound of different areas on the ships. “Before I saw any visuals, the showrunners wanted me to send them some ‘alien planet sounds,’ but there is a huge difference between Mars and Dagobah,” explains Cook. “After talking with them for a bit, we narrowed down some areas to focus on, like the glacier, the badlands and the forest area.”

For the forest area, Cook began by finding interesting snippets of animal, bird and insect recordings, like a single chirp or little song phrase that he could treat with pitching or other processing to create something new. Then he took those new sounds and positioned them in the sound field to build up beds of creatures to populate the alien forest. In that initial creation phase, Cook designed several tracks, which he could use for the rest of the season. “The show itself was shot in Canada, so that was one of the things they were fighting against — the showrunners were pretty conscious of not making the crash planet sound too Earthly. They really wanted it to sound alien.”
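The pitch treatment Cook describes — taking a single chirp and transposing it until it no longer reads as an Earth animal — can be approximated by simple resampling, which shifts pitch and duration together. This is a hypothetical sketch of that one step, not the show’s actual toolchain (Cook’s processing was far more involved); `pitch_shift_resample` and the synthetic chirp are illustrative stand-ins for a real field recording.

```python
import numpy as np

def pitch_shift_resample(snippet, semitones):
    """Crude pitch shift by resampling: raising pitch also shortens the clip."""
    ratio = 2 ** (semitones / 12.0)            # frequency ratio for the interval
    n_out = int(round(len(snippet) / ratio))   # shifted clip is shorter when pitched up
    idx = np.arange(n_out) * ratio             # fractional read positions
    return np.interp(idx, np.arange(len(snippet)), snippet)

sr = 8000
t = np.arange(sr) / sr
chirp = np.sin(2 * np.pi * 440 * t)            # 1-second 440 Hz "bird" tone
alien = pitch_shift_resample(chirp, 7)         # up a fifth, to roughly 659 Hz
```

Layering many such transposed, detuned snippets across the sound field is one common way to build the kind of creature beds described above without any recognizable source animal surviving the process.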

Another huge aspect of the series’ sound is the communication systems. The characters talk to each other through the headsets in their spacesuit helmets, and through wristband communications. Each family has their own personal ship, called a Jupiter, which can contact other Jupiter ships through shortwave radios. They use the same radios to communicate with their all-terrain vehicles, called rovers. Cook notes these ham radios had an intentional retro feel. The Jupiters can send/receive long-distance transmissions from the planet’s surface to the main ship, called Resolute, in space. The families can also communicate with their Jupiter’s onboard systems.

Each mode of communication sounds different and was handled differently in post. Some processing was handled by the re-recording mixers, and some was created by the sound editorial team. For example, in Episode 1 Judy Robinson (Taylor Russell) is frozen underwater in a glacial lake. Whenever the shot cuts to Judy’s face inside her helmet, the sound is very close and claustrophobic.

Judy’s voice bounces off the helmet’s face-shield. She hears her sister through the headset and it’s a small, slightly futzed speaker sound. The processing on both Judy’s voice and her sister’s voice sounds very distinct, yet natural. “That was all Onnalee Blank and Mathew Waters,” says Cook. “They mixed this show, and they both bring so much to the table creatively. They’ll do additional futzing and treatments, like on the helmets. That was something that Onna wanted to do, to make it really sound like an ‘inside a helmet’ sound. It has that special quality to it.”

On the flipside, the ship’s voice was a process that Cook created. Co-supervisor Spencer recorded the voice actor’s lines in ADR and then Cook added vocoding, EQ futz and reverb to sell the idea that the voice was coming through the ship’s speakers. “Sometimes we worldized the lines by playing them through a speaker and recording them. I really tried to avoid too much reverb or heavy futzing knowing that on the stage the mixers may do additional processing,” he says.

In Episode 1, Will Robinson (Maxwell Jenkins) finds himself alone in the forest. He tries to call his father, John Robinson (Toby Stephens — a Black Sails alumnus as well), via his wristband comm system, but the transmission is interrupted by a strange, undulating, vocal-like sound. It’s interference from an alien ship that had crashed nearby. Cook notes that the interference sound required thorough experimentation. “That was a difficult one. The showrunners wanted something organic and very eerie, but it also needed to be jarring. We did quite a few versions of that.”

For the main element in that sound, Cook chose whale sounds for their innate pitchy quality. He manipulated and processed the whale recordings using Symbolic Sound’s Kyma sound design workstation.

The Robot
Another challenging set of sounds were those created for Will Robinson’s Robot (Brian Steele). The Robot makes dying sounds, movement sounds and face-light sounds when it’s processing information. It can transform its body to look more human. It can use its hands to fire energy blasts or as a tool to create heat. It says, “Danger, Will Robinson,” and “Danger, Dr. Smith.” The Robot is sometimes a good guy and sometimes a bad guy, and the sound needed to cover all of that. “The Robot was a job in itself,” says Cook. “One thing we had to do was to sell emotion, especially for his dying sounds and his interactions with Will and the family.”

One of Cook’s trickiest feats was to create the proper sense of weight and movement for the Robot, and to portray the idea that the Robot was alive and organic but still metallic. “It couldn’t be earthly technology. Traditionally for robot movement you will hear people use servo sounds, but I didn’t want to use any kind of servos. So, we had to create a sound with a similar aesthetic to a servo,” says Cook. He turned to the Robot’s Foley sounds, and devised a processing chain to heavily treat those movement tracks. “That generated the basic body movement for the Robot and then we sweetened its feet with heavier sound effects, like heavy metal clanking and deeper impact booms. We had a lot of textures for the different surfaces like rock and foliage that we used for its feet.”

The Robot’s face lights change color to let everyone know if it’s in good-mode or bad-mode. But there isn’t any overt sound to emphasize the lights as they move and change. If the camera is extremely close-up on the lights, then there’s a faint chiming or tinkling sound that accentuates their movement. Overall though, there is a “presence” sound for the Robot, an undulating tone that’s reminiscent of purring when it’s in good-mode. “The showrunners wanted a kind of purring sound, so I used my cat purring as one of the building block elements for that,” says Cook. When the Robot is in bad-mode, the sound is anxious, like a pulsing heartbeat, to set the audience on edge.

It wouldn’t be Lost in Space without the Robot’s iconic line, “Danger, Will Robinson.” Initially, the showrunners wanted that line to sound as close to the original 1960’s delivery as possible. “But then they wanted it to sound unique too,” says Cook. “One comment was that they wanted it to sound like the Robot had metallic vocal cords. So we had to figure out ways to incorporate that into the treatment.” The vocal processing chain used several tools, from EQ, pitching and filtering to modulation plug-ins like Waves Morphoder and Dehumaniser by Krotos. “It was an extensive chain. It wasn’t just one particular tool; there were several of them,” he notes.

There are other sound elements that tie into the original 1960’s series. For example, when Maureen Robinson (Molly Parker) and husband John are exploring the wreckage of the alien ship they discover a virtual map room that lets them see into the solar system where they’ve crashed and into the galaxy beyond. The sound design during that sequence features sound material from the original show. “We treated and processed those original elements until they’re virtually unrecognizable, but they’re in there. We tried to pay tribute to the original when we could, when it was possible,” says Cook.

Other sound highlights include the Resolute exploding in space, which caused massive sections of the ship to break apart and collide. For that, Cook says contact microphones were used to capture the sound of tin cans being ripped apart. “There were so many fun things in the show for sound. From the first episode with the ship crash and it sinking into the glacier to the black hole sequence and the Robot fight in the season finale. The show had a lot of different challenges and a lot of opportunities for sound.”

Lost in Space was mixed in the Anthony Quinn Theater at Sony Pictures in 7.1 surround. Interestingly, the show was delivered in Dolby’s Home Atmos format. Cook explains, “When they booked the stage, the producers weren’t sure if we were going to do the show in Atmos or not. That was something they decided to do later, so we had to figure out a way to do it.”

They mixed the show in Atmos while referencing the 7.1 mix and then played those mixes back in a Dolby Home Atmos room to check them, making any necessary adjustments and creating the Atmos deliverables. “Between updates for visual effects and music as well as the Atmos mixes, we spent roughly 80 days on the dub stage for the 10 episodes,” concludes Cook.


Color and audio post for Hulu’s The Looming Tower

Hulu’s limited series, The Looming Tower, explores the rivalries and missed opportunities that beset US law enforcement and intelligence communities in the lead-up to the 9/11 attacks. Based on the Pulitzer Prize-winning book by Lawrence Wright, who also shares credit as executive producer with Dan Futterman and Alex Gibney, the show’s 10 episodes paint an absorbing, if troubling, portrait of the rise of Osama bin Laden and al-Qaida, and offer fresh insight into the complex people who were at the center of the fight against terrorism.

For The Looming Tower’s sound and picture post team, the show’s sensitive subject matter and blend of dramatizations and archival media posed significant technical and creative challenges. Colorist Jack Lewars and online editor Jeff Cornell of Technicolor PostWorks New York were tasked with integrating grainy, run-and-gun news footage dating back to 1998 with crisply shot, high-resolution original cinematography. Supervising sound designer/effects mixer Ruy García and re-recording mixer Martin Czembor from PostWorks, along with a Foley team from Alchemy Post Sound, were charged with helping to bring disparate environments and action to life, but without sensationalizing or straying from historical accuracy.

L-R: colorist Jack Lewars and editor Jeff Cornell

Lewars and Cornell mastered the series in Dolby Vision HDR, working from the production’s camera original 2K and 3.4K ArriRaw files. Most of the color grading and conforming work was done with a light touch, according to Lewars, as the objective was to adhere to a look that appeared real and unadulterated. The goal was for viewers to feel they are behind the scenes, watching events as they happened.

Where more specific grades were applied, it was done to support the narrative. “We developed different look sets for the FBI and CIA headquarters, so people weren’t confused about where we were,” Lewars explains. “The CIA was working out of the basement floors of a building, so it’s dark and cool — the light is generated by fluorescent fixtures in the room. The FBI is in an older office building — its drop ceiling also has fluorescent lighting, but there is a lot of exterior light, so it’s greener, warmer.”

The show adds to the sense of realism by mixing actual news footage and other archival media with dramatic recreations of those same events. Lewars and Cornell help to cement the effect by manipulating imagery to cut together seamlessly. “In one episode, we matched an interview with Osama bin Laden from the late ‘90s with new material shot with an Arri Alexa,” recalls Lewars. “We used color correction and editorial effects to blend the two worlds.”

Cornell degraded some scenes to make them match older, real-world media. “I took the Alexa material and ‘muddied’ it up by exporting it to compressed SD files and then cutting it back into the master timeline,” he notes. “We also added little digital hits to make it feel like the archival footage.”

While the color grade was subtle and adhered closely to reality, it still packed an emotional punch. That is most apparent in a later episode that includes the attack on the Twin Towers. “The episode starts off in New York early in the morning,” says Lewars. “We have a series of beauty shots of the city and it’s a glorious day. It’s a big contrast to what follows — archival footage after the towers have fallen where everything is a white haze of dust and debris.”

Audio Post
The sound team also strove to remain faithful to real events. García recalls his first conversations about the show’s sound needs during pre-production spotting sessions with executive producer Futterman and editor Daniel A. Valverde. “It was clear that we didn’t want to glamorize anything,” he says. “Still, we wanted to create an impact. We wanted people to feel like they were right in the middle of it, experiencing things as they happened.”

García says that his sound team approached the project as if it were a documentary, protecting the performances and relying on sound effects that were authentic in terms of time and place. “With the news footage, we stuck with archival sounds matching the original production footage and accentuating whatever sounds were in there that would connect emotionally to the characters,” he explains. “When we moved to the narrative side with the actors, we’d take more creative liberties and add detail and texture to draw you into the space and focus on the story.”

He notes that the drive for authenticity extended to crowd scenes, where native speakers were used as voice actors. Crowd sounds set in the Middle East, for example, were from original recordings from those regions to ensure local accents were correct.

Much like Lewars’ approach to color, García and his crew used sound to underscore environmental and psychological differences between CIA and FBI headquarters. “We did subtle things,” he notes. “The CIA has more advanced technology, so everything there sounds sharper and newer versus the FBI, where you hear older phones and computers.”

The Foley provided by artists and mixers from Alchemy Post Sound further enhanced differences between the two environments. “It’s all about the story, and sound played a very important role in adding tension between characters,” says Leslie Bloome, Alchemy’s lead Foley artist. “A good example is the scene where CIA station chief Diane Marsh is berating an FBI agent while casually applying her makeup. Her vicious attitude toward the FBI agent combined with the subtle sounds of her makeup created a very interesting juxtaposition that added to the story.”

In addition to footsteps, the Foley team created incidental sounds used to enhance or add dimension to explosions, action and environments. For a scene where FBI agents are inspecting a warehouse filled with debris from the embassy bombings in Africa, artists recorded brick and metal sounds on a Foley stage designed to capture natural ambience. “Normally, a post mixer will apply reverb to place Foley in an environment,” says Foley artist Joanna Fang. “But we recorded the effects in our live room to get the perspective just right as people are walking around the warehouse. You can hear the mayhem as the FBI agents are documenting evidence.”

“Much of the story is about what went wrong, about the miscommunication between the CIA and FBI,” adds Foley mixer Ryan Collison, “and we wanted to help get that point across.”

The soundtrack to the series assumed its final form on a mix stage at PostWorks. Czembor spent weeks mixing dialogue, sound and music elements into what he described as a cinematic soundtrack.

L-R: Martin Czembor and Ruy García

Czembor notes that the sound team provided a wealth of material, but for certain emotionally charged scenes, such as the attack on the USS Cole, the producers felt that less was more. “Danny Futterman’s conceptual approach was to go with almost no sound and let the music and the story speak for themselves,” he says. “That was super challenging, because while you want to build tension, you are stripping it down so there’s less and less and less.”

Czembor adds that music, from composer Will Bates, is used to great effect throughout the series, even though it might go unnoticed by viewers. “There is actually a lot more music in the series than you might realize,” he says. “That’s because it’s not so ‘musical;’ there aren’t a lot of melodies or harmonies. It’s more textural…soundscapes in a way. It blends in.”

Czembor says that as a longtime New Yorker, working on the show held special resonance for him, and he was impressed with the powerful, yet measured way it brings history back to life. “The performances by the cast are so strong,” he says. “That made it a pleasure to work on. It inspires you to add to the texture and do your job really well.”

Showrunner Dan Pyne — Amazon Studios’ Bosch

By Iain Blair

How popular is Amazon’s Emmy-nominated detective show Bosch? So popular that the streaming service ordered up Season 5 before Season 4 even debuted in April. This critically acclaimed hour-long series is Amazon’s longest-running Prime Original.

Based on the best-selling novels by Michael Connelly, the show stars Titus Welliver (Lost) as LAPD homicide detective Harry Bosch, alongside a large ensemble cast that includes Jamie Hector (The Wire), Amy Aquino (Being Human), Madison Lintz (The Walking Dead) and Lance Reddick (The Wire).

Dan Pyne

Season 4 kicked off with the murder of a high-profile attorney on the eve of his civil rights trial against the LAPD. Bosch is assigned to lead a task force to solve the crime — a case in which fellow cops are suspects — before the city erupts in a riot. Bosch must pursue every lead, even if it turns the spotlight back on his own department. One murder intertwines with another, and Bosch must reconcile his not-so-simple past to find a justice that has long eluded him.

Bosch was developed for television by Eric Overmyer (Treme, The Wire, Homicide: Life on the Street) and is executive produced by Dan Pyne, whose film credits include The Manchurian Candidate, Pacific Heights, The Sum of All Fears and Fracture. He also co-created and co-produced The Street, a syndicated police procedural starring Stanley Tucci.

I recently spoke with Pyne about making the show, the Emmys, production and post.

Eric Overmyer, who took a break to work on another Amazon production, The Man in the High Castle, is coming back to act as co-showrunner with you on Season 5. How will you split duties?
Good question! We’re making it up as we go along. I’d never worked with him before, but I did have a longtime partner before. Basically, we talk a lot and come to an agreement about any issues. The great thing about this show is that every season is its own entity, with its own rhythm and voice.

Have you started on Season 5?
We have, and we have almost six episodes plotted out and we start shooting in early August.

Where do you shoot?
We’re based at Red Studios in Hollywood, which isn’t far from the local police station. We recreated that station’s interior on a set, and it’s so uncannily similar — apart from a few details — that it’s hard to tell the two apart. We shoot a lot in Hollywood, and then at locations all over the city and further afield.

Bosch has a very cinematic feel and look.
Yes, and that’s in part due to our producer, Pieter Jan Brugge, who comes from film and has worked a lot with Michael Mann on movies like Heat and Miami Vice — this is his first TV show. I come from a film background too, so we take more of a film approach and discuss things like the sound and visuals, what they should be like, and how a scene should play before we even start shooting.

Do you like being a showrunner?
I do, and I was well-trained. I came up intending to write movies and then fell into TV and got lucky with the show Hard Copy. I became a specialist in 10-episode arcs (like Bosch). I got to work with several legendary showrunners, including Richard Levinson and William Link, who created such classic TV shows as Columbo and Murder, She Wrote, and William Sackheim who did Gidget and The Flying Nun. I haven’t done showrunning that much, but I always ran my own shows. And it’s the closest thing to being a film director because you have control and get to collaborate with everyone else, including post, which is so much fun.

Where do you post?
At Red Studios. We have a great team, including post producer Mark Douglas and post supervisor Tayah Geist, and we do color correction at Warner Bros. with colorist Scott Klein. He works closely with Pieter and the DPs, and sometimes we’ll make an entire season cooler or warmer. We’ll get inspiration from movies — maybe Japanese films or Blade Runner and so on — to help us with the color palette.

Do you like the post process?
Very much. It’s so true when people say, like with movies, there are three TV shows — the one you write, the one you shoot and the one you make in post. It’s in post where you start all over again each time, see what you’ve got, what works and doesn’t. I’ve always enjoyed collaborating with editors and all the sound people. I love what they bring to storytelling: the way they can help say things and elevate the material and help make stuff clear for the audience, and show or hide emotionality.

Let’s talk about editing. You have several editors, I assume because of the time factor. How does that work?
Yes, it’s the tight schedule, and in order to get it done we rotate three editors. One does four shows, and the other two do three, and we alternate. That way they have enough time to cut a show and pretty much finish it before moving to the next one. If you only have two editors, the workload’s overwhelming, so we use three — Steve Cohen, Jacque Toberen and Kevin Casey. There are three assistant editors — Rafael Nur, Judy Guerrero and Knar Kitabjian.

You have a big cast, and a lot of stuff going on in each episode. What are the big editing challenges?
We have two big ones. As it’s an investigative show it can be very detailed, so clarity is always a priority — making sure that all the story and plot points get pulled in the right way and land correctly. Mike’s books are very twisty, and they have a lot of tiny clues, so as detectives walk through scenes they’ll see things. There’s a lot of visual foreshadowing of things that come back later. The other one is making sure the episodes have pace. Police procedurals tend to fall into a pattern of walk-and-talk, and we try and avoid that.

Who does the VFX, and what’s involved?
Moving Target, and their Alan Munro is our VFX supervisor. We use a lot more VFX than you would imagine. One of the show’s hallmarks is making it all as real as possible, so when we recently did a show with scenes in tunnels we used a lot of masking and CGI as we were limited in the way we could light and shoot the actual tunnels. I find that visual effects really make TV a lot easier. We do some plates depending on the situation, but often it’s really small stuff and cleanup to make it all even more realistic.

This show has a great score by Jesse Voccia and great sound design. Can you talk about the importance of sound and music?
Sound is always difficult because it’s TV. It’s the nature of the beast. TV often shoots so fast that the production sound can be problematic, what with traffic noise and so on, and you have to fix all that. But I love playing with sound and working with the sound designers and ADR guys. We do all the mixing at Technicolor at Paramount, and we have a great crew. There’s not a lot of music in the show, but we try to make it all count and not use short little stingers like TV usually does, or score a chase. We’ll use sound design or natural sound instead.

How important are the Emmys to a show like this, which seems a little underrated?
It would be great to be nominated, although maybe the fans don’t care that much. We do fly under the radar a bit, I think, so more recognition would be very welcome.

Thanks to the ongoing source material, the show could easily run for many more years. Will you do more seasons of the show?
I’d love to. What’s so great is that every season is brand new and different, with its own beginning, middle and end. It plays like a book, so we can really work on the tone and feel.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Netflix’s Godless offers big skies and big sounds

By Jennifer Walden

One of the great storytelling advantages of non-commercial television is that content creators are not restricted by program lengths or episode counts. A season can run 13 episodes, or 10, or fewer. An episode can run 75 minutes or 33 minutes. This certainly was the case for writer/director/producer Scott Frank when creating his series Godless for Netflix.

Award-winning sound designer Wylie Stateman of Twenty Four Seven Sound explains why this worked to their advantage. “Godless at its core is a story-driven ‘big-sky’ Western. The American Western is often as environmentally beautiful as it is emotionally brutal. Scott Frank’s goal for Godless was to create a conflict between good and evil set around a town of mostly female disaster survivors and their complex and intertwined pasts. The Godless series is built like a seven and a half hour feature film.”

Without the constraints of having to squeeze everything into a two-hour film, Frank could make the most of his ensemble of characters and still include the ride-up/ride-away beauty shots that show off the landscape. “That’s where Carlos Rafael Rivera’s terrific orchestral music and elements of atmospheric sound design really came together,” explains Stateman.

Stateman has created sound for several Westerns in his prodigious career. His first was The Long Riders back in 1980. Most recently, he designed and supervised the sound on writer/director Quentin Tarantino’s Django Unchained (which earned a 2013 Oscar nom for sound, an MPSE nom and a BAFTA film nom for sound) and The Hateful Eight (nominated for a 2016 Association of Motion Picture Sound Award).

For Godless, Stateman, co-supervisor/re-recording mixer Eric Hoehn and their sound team have already won a 2018 MPSE Award for Sound Editing for their effects and Foley work, as well as a nomination for editing the dialogue and ADR. And don’t be surprised if you see them acknowledged with an Emmy nom this fall.

Capturing authentic sounds: (L-R) Jackie Zhou, Wylie Stateman and Eric Hoehn.

Capturing Sounds On Set
Since program length wasn’t a major consideration, Godless takes time to explore the story’s setting and allows the audience to live with the characters in this space that Frank had purpose-built for the show. In New Mexico, Frank had practical sets constructed for the town of La Belle and for Alice Fletcher’s ranch. Stateman, Hoehn and sound team members Jackie Zhou and Leo Marcil camped out at the set locations for a couple weeks, capturing recordings of everything from environmental ambience to gunfire echoes to horse hooves on dirt.

To avoid the craziness that is inherent to a production, the sound team would set up camp in a location where the camera crew was not. This allowed them to capture clean, high-quality recordings at various times of the day. “We would record at sunrise, sunset and the middle of the night — each recording geared toward capturing a range of authentic and ambient sounds,” says Stateman. “Essentially, our goal was to sonically map each location. Our field recordings were wide in terms of channel count, and broad in terms of how we captured the sound of each particular environment. We had multiple independent recording setups, each capable of recording up to eight channels of high bandwidth audio.”

Near the end of the season, there is a big shootout in the town of La Belle, so Stateman and Hoehn wanted to capture the sounds of gunfire and the resulting echoes at that location. They used live rounds, shooting the same caliber of guns used in the show. “We used live rounds to achieve the projectile sounds. A live round sounds very different than a blank round. Blanks just go pop-pop. With live rounds you can literally feel the bullet slicing through the air,” says Stateman.

Eric Hoehn

Recording on location not only supplied the team with a wealth of material to draw from back in the studio, it also gave them an intensive working knowledge of the actual environments. Says Hoehn, “It was helpful to have real-world references when building the textures of the sound design for these various locations and to know firsthand what was happening acoustically, like how the wind was interacting with those structures.”

Stateman notes how quiet and lifeless the location was, particularly at Alice’s ranch. “Part of the sound design’s purpose was to support the desolate dust bowl backdrop. Living there, eating breakfast in the quiet without anybody from the production around was really a wonderful opportunity. In fact, Scott Frank encouraged us to look deep and listen for that feel.”

From Big Skies to Big City
Sound editorial for Godless took place at Light Iron in New York, which is also where the show got its picture editing — by Michelle Tesoro, who was assisted by Hilary Peabody and Charlie Greene. There, Hoehn had a Pro Tools HDX 3 system connected to the picture department’s Avid Media Composer via the Avid Nexis. They could quickly pull in the picture editorial mix, balance out the dialog and add properly leveled sound design, sending that mix back to Tesoro.

“Because there were so many scenes and so much material to get through, we really developed a creative process that centered around rapid prototype mixing,” says Hoehn. “We wanted to get scenes from Michelle and her team as soon as possible and rapidly prototype dialogue mixing and that first layer of sound design. Through the prototyping process, we could start to understand what the really important sounds were for those scenes.”

Using this prototyping audio workflow allowed the sound team to very quickly share concepts with the other creative departments, including the music and VFX teams. This workflow was enhanced through a cloud-based film management/collaboration tool called Pix. Pix let the showrunners, VFX supervisor, composer, sound team and picture team share content and share notes.

“The notes feature in Pix was so important,” explains Hoehn. “Sometimes there were conversations between the director and editor that we could intuitively glean information from, like notes on aesthetic or pace or performance. That created a breadcrumb trail for us to follow while we were prototyping. It was important for us to get as much information as we could so we could be on the same page and have our compass pointed in the right direction when we were doing our first pass prototype.”

Often their first pass prototype was simply refined throughout the post process to become the final sound. “Rarely were we faced with the situation of having to re-cut a whole scene,” he continues. “It was very much in the spirit of the rolling mix and the rolling sound design process.”

Stateman shares an example of how the process worked. “When Michelle first cut a scene, she might cut to a beauty shot that would benefit from wind gusts and/or enhanced VFX and maybe additional dust blowing. We could then rapidly prototype that scene with leveled dialog and sound design before it went to composer Carlos Rafael Rivera. Carlos could hear where/when we were possibly leveraging high-density sound. This insight could influence his musical thinking — if he needed to come in before, on or after the sound effects. Early prototyping informed what became a highly collaborative creative process.”

The Shootout
Another example of the usefulness of Pix was the shootout in La Belle in Episode 7. The people of the town position themselves in the windows and doorways of the buildings lining the street, essentially surrounding Frank Griffin (Jeff Daniels) and his gang. There is a lot of gunfire, much of it bridging action on and off camera, and that needed to be represented well through sound.

Hoehn says they found it best to approach the gun battle like a piece of music, playing with repeated rhythms; breaking the anticipated rhythm helped catch the audience off-guard. They built a sound prototype for the scene and shared it via Pix, which gave the VFX department access to it.

“A lot of what we did with sound helped the visual effects team by allowing them to understand the density of what we were doing with the ambient sounds,” says Hoehn. “If we found that rhythmically it was interesting to have a wind gust go by, we would eventually see a visual effect for that wind going by.”

It was a back-and-forth collaboration. “There are visual rhythms and sound rhythms and the fact that we could prototype scenes early led us to a very efficient way of doing long-form,” says Stateman. “It’s funny that features used to be considered long-form but now ‘long-form’ is this new, time-unrestrained storytelling. It’s like we were making a long-form feature, but one that was seven and a half hours. That’s really the beauty of Netflix. Because the shows aren’t tethered to a theatrical release timeframe, we can make stories that linger a little bit and explore the wider eccentricities of character and the time period. It’s really a wonderful time for this particular type of filmmaking.”

While program length may be less of an issue, production schedule lengths still need to be kept in line. With the help of Pix, editorial was able to post the entire show with one team. “Everyone on our small team understood and could participate in the mission,” says Stateman. Additionally, the sound design rapid prototype mixing process allowed everyone in editorial to carry all their work forward, from day one until the last day. The Pro Tools session that they started with on day one was the same Pro Tools session that they used for print mastering seven months later.

“Our sound design process was built around convenient creative approval and continuous refinement of the complete soundtrack. At the end of the day, the thing that we heard most often was that this was a wonderful and fantastic way to work, and why would we ever do it any other way,” Stateman says.

Creating a long-form feature like Godless in an efficient manner required a fluid, collaborative process. “We enjoyed a great team effort,” says Stateman. “It’s always people over devices. What we’ve come to say is, ‘It’s not the devices. It’s people left to their own devices who will discover really novel ways to solve creative problems.’”


Jennifer Walden is a New Jersey-based audio engineer and writer. Follow her on Twitter at @audiojeney.

The Duffer Brothers: Showrunners on Netflix’s Stranger Things

By Iain Blair

Kids in jeopardy! The Demogorgon! The Hawkins Lab! The Upside Down! Thrills and chills! Since they first pitched their idea for Stranger Things, a love letter to 1980s genre films set in 1983 Indiana, twin brothers Matt and Ross Duffer have quickly established themselves as masters of suspense in the science-fiction and horror genres.

The series was picked up by Netflix, premiered in the summer of 2016, and went on to become a global phenomenon, with the brothers at the helm as writers, directors and executive producers.

The Duffer Brothers

The atmospheric drama, about a group of nerdy misfits and strange events in an outwardly average small town, nailed its early ’80s vibe and overt homages to that decade’s master pop storytellers: Steven Spielberg and Stephen King. It quickly made stars out of its young ensemble cast — Millie Bobby Brown, Natalia Dyer, Charlie Heaton, Joe Keery, Gaten Matarazzo, Caleb McLaughlin, Noah Schnapp, Sadie Sink and Finn Wolfhard.

It also quickly attracted a huge, dedicated fan base, critical plaudits and has won a ton of awards, including Emmys, a SAG Award for Best Ensemble in a Drama Series and two Critics Choice Awards for Best Drama Series and Best Supporting Actor in a Drama Series. The show has also been nominated for a number of Golden Globes.

I recently talked with the Duffers, who are already hard at work on the highly anticipated third season (which will premiere on Netflix in 2019) about making the ambitious hit series, their love of post and editing, and VFX.

How’s the new season going?
Matt Duffer: We’re two weeks into shooting, and it’s going great. We’re very excited about it as there are some new tones and it’s good to be back on the ground with everyone. We know all the actors better and better, the kids are getting older and are becoming these amazing performers — and they were great before. So we’re having a lot of fun.

Are you shooting in Atlanta again?
Ross Duffer: We are, and we love it there. It’s really our home base now, and we love all these pockets of neighborhoods that have not changed at all since the ’80s, and there is an incredible variety of locations. We’re also spreading out a lot more this season and not spending so much time on stages. We have more locations to play with.

Will all the episodes be released together next year, like last time? That would make binge-watchers very happy.
Matt: Yes, but we like to think of it more as like a big movie release. To release one episode per week feels so antiquated now.

The show has a very cinematic look and feel, so how do you balance that with the demands of TV?
Ross: It’s interesting, because we started out wanting to make movies and we love genre, but with a horror film they want big scares every few minutes. That leaves less room for character development. But with TV, it’s always more about character, as you just can’t sustain hours and hours of a show if you don’t care about the people. So ‘Stranger Things’ was a world where we could tell a genre story, complete with the monster, but also explore character in far more depth than we could in a movie.

Matt: Movies and TV are almost opposites in that way. In movies, it’s all plot and no character, and in TV it’s about character and you have to fight for plot. We wanted this to have pace and feel more like a movie, but still have all the character arcs. So it’s a constant balancing act, and we always try and favor character.

Where do you post the show?
Matt: All in Hollywood, and the editors start working while we’re shooting. After we shoot in Atlanta, we come back to our offices and do all the post and VFX work right there. We do all the sound mix and all the color timing at Technicolor down the road. We love post. You never have enough time on the set, and there’s all this pressure if you want to redo a shot or scene, but in post if a scene isn’t working we can take time to figure it out.

Tell us about the editing. I assume you’re very involved?
Ross: Very. We have two editors this season. We brought back one of our original editors, Dean Zimmerman, from season one. We are also using Nat Fuller, who was on season two. He was Dean’s assistant originally and then moved up, so they’ve been with us since the start. Editing’s our favorite part of the whole process, and we’re right there with them because we love editing. We’re very hands on and don’t just give notes and walk away. We’re there the whole time.

Aren’t you self-taught in terms of editing?
Matt: (Laughs) I suppose. We were taught the fundamentals of Avid at film school, but you’re right. We basically taught ourselves to edit as kids, and we started off just editing in-camera, stopping and starting, and playing the music from a tape recorder. They weren’t very good, but we got better.

When iMovie came out we learned how to put scenes together, so in college the transition to Avid wasn’t that hard. We fell in love with editing and just how much you can elevate your material in post. It’s magical what you can do with the pace, performances, music and sound design, and then you add all the visual effects and see it all come together in post. We love seeing the power of post as you work to make your story better and better.

How early on do you integrate post and VFX with the production?
Ross: On day one now. The biggest change from season one to two was that we integrated post far earlier in the second season — even in the writing stage. We had concept artists and the VFX guys with us the whole time on set, and they were all super-involved. So now it all kind of happens together.

The VFX are a much bigger deal now. Last season we had a lot more VFX than the first year — about 1,400 shots, which is a huge amount, like a big movie. The first season it wasn’t a big deal. It was a very old-school approach, with mainly practical effects, and then in the middle we realized we were being a bit naïve, so we brought in Paul Graff as our VFX supervisor on season two, and he’s very experienced. He’s worked on big movies like The Wolf of Wall Street as well as Game of Thrones and Boardwalk Empire, and he’s doing this season too. He’s in Atlanta with us on the shoot.

We have two main VFX houses on the show — Atomic Fiction and Rodeo — they’re both incredible, and I think all the VFX are really cinematic now.

But isn’t it a big challenge in terms of a TV show’s schedule?
Ross: You’re right, and it’s always a big time crunch. Last year we had to meet that Halloween worldwide release date and we were cutting it so close trying to finish all the shots in time.

Matt: Everyone expects movie-quality VFX — just in a quarter of the time, or less. So it’s all accelerated.

The show has a very distinct, eerie, synth-heavy score by Kyle Dixon and Michael Stein, the Grammy-nominated duo. How important are the music and sound, which won several Emmys last year?
Ross: It’s huge. We use it so much for transitions, and we have great sound designers — including Brad North and Craig Henighan — and great mixers, and we pay a lot of attention to all of it. I think TV has always put less emphasis on great sound compared to film, and again, you’re always up against the scheduling, so it’s always this balancing act.

You can’t mix it for a movie theater as very few people have that set up at home, so you have to design it for most people who’re watching on iPhones, iPads and so on, and optimize it for that, so we mostly mix in stereo. We want the big movie sound, but it’s a compromise.

The DI must be vital?
Matt: Yes, and we work very closely with colorist Skip Kimball (who recently joined Efilm), who’s been with us since the start. He was very influential in terms of how the show ended up looking. We’d discussed the kind of aesthetic we wanted and things we wanted to reference, and then he played around with the look and palette. We’ve developed a look we’re all really happy with. We have three different LUTs on set designed by Skip, and DP Tim Ives will choose the best one for each location.

Everyone’s calling this the golden age of TV. Do you like being showrunners?
Ross: We do, and I feel we’re very lucky to have the chance to do this show — it feels like a big family. Yes, we originally wanted to be movie directors, but we didn’t come into this industry at the right time, and Netflix has been so great and given us so much creative freedom. I think we’ll do a few more seasons of this, and then maybe wrap it up. We don’t want to repeat ourselves.



Capturing, creating historical sounds for AMC’s The Terror

By Jennifer Walden

It’s September 1846. Two British ships — the HMS Erebus and HMS Terror — are on an exploration to find the Northwest Passage to the Pacific Ocean. The expedition’s leader, British Royal Navy Captain Sir John Franklin, leaves the Erebus to dine with Captain Francis Crozier aboard the Terror. A small crew rows Franklin across the frigid, ice-choked Arctic Ocean that lies north of Canada’s mainland to the other vessel.

The opening overhead shot of the two ships in AMC’s new series The Terror (Mondays 9/8c) gives the audience an idea of just how large those ice chunks are in comparison with the ships. It’s a stunning view of the harsh environment, a view that was completely achieved with CGI and visual effects because this series was actually shot on a soundstage at Stern Film Studio, north of Budapest, Hungary.

Photo Credit: Aidan Monaghan/AMC

Emmy- and BAFTA-award-winning supervising sound editor Lee Walpole of Boom Post in London, says the first cut he got of that scene lacked the VFX, and therefore required a bit of imagination. “You have this shot above the ships looking down, and you see this massive green floor of the studio and someone dressed in a green suit pushing this boat across the floor. Then we got the incredible CGI, and you’d never know how it looked in that first cut. Ultimately, mostly everything in The Terror had to be imagined, recorded, treated and designed specifically for the show,” he says.

Sound plays a huge role in the show. Literally everything you hear (except dialogue) was created in post — the constant Arctic winds, the footsteps out on the packed ice and walking around on the ship, the persistent all-male murmur of 70 crew members living in a 300-foot space, the boat creaks, the ice groans and, of course, the creature sounds. The pervasive environmental sounds sell the harsh reality of the expedition.

Thanks to the sound and the CGI, you’d never know this show was shot on a soundstage. “It’s not often that we get a chance to ‘world-create’ to that extent and in that fashion,” explains Walpole. “The sound isn’t just there in the background supporting the story. Sound becomes a principal character of the show.”

Bringing the past to life through sound is one of Walpole’s specialties. He’s created sound for The Crown, Peaky Blinders, Klondike, War & Peace, The Imitation Game, The King’s Speech and more. He takes a hands-on approach to historical sounds, like recording location footsteps in Lancaster House for the Buckingham Palace scenes in The Crown, and recording the sounds on-board the Cutty Sark for the ships in To the Ends of the Earth (2005). For The Terror, his team spent time on-board the Golden Hind, which is a replica of Sir Francis Drake’s ship of the same name.

During a 5am recording session, the team — equipped with a Sound Devices 744T recorder and a Schoeps CMIT 5U mic — captured footsteps in all of the rooms on-board, pick-ups and put-downs of glasses and cups, drops of various objects on different surfaces, gun sounds and a selection of rigging, pulleys and rope moves. They even recorded hammering. “We took along a wooden plank and several hammers,” describes Walpole. “We laid the plank across various surfaces on the boat so we could record the sound of hammering resonating around the hull without causing any damage to the boat itself.”

They also recorded footsteps in the ice and snow and reached out to other sound recordists for snow and ice footsteps. “We wanted to get an authentic snow creak and crunch, to have the character of the snow marry up with the depth and freshness of the snow we see at specific points in the story. Any movement from our characters out on the pack ice was track-laid, step-by-step, with live recordings in snow. No studio Foley feet were recorded at all,” says Walpole.

In The Terror, the ocean freezes around the two ships, immobilizing them in pack ice that extends for miles. As the water continues to freeze, the ice grows and it slowly crushes the ships. In the distance, there’s the sound of the ice growing and shifting (almost like tectonic plates), which Walpole created from sourced hydrophone recordings from a frozen lake in Canada. The recordings had ice pings and cracking that, when slowed and pitched down, sounded like massive sheets of ice rubbing against each other.
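The “slowed and pitched down” treatment Walpole describes is, at its simplest, a varispeed move: stretching a waveform in time lowers its pitch by the same factor, like slowing a tape. A rough illustrative sketch of that idea (not his actual toolchain; the function name and factor are invented):

```python
import numpy as np

def slow_and_pitch_down(audio: np.ndarray, factor: int = 4) -> np.ndarray:
    """Tape-style varispeed: interpolate the waveform onto a grid
    `factor` times as long. Played back at the original sample rate,
    the result is `factor`x slower and `factor`x lower in pitch."""
    n = len(audio)
    new_idx = np.linspace(0, n - 1, n * factor)
    return np.interp(new_idx, np.arange(n), audio)
```

Slowed this way by a factor of two, a 440Hz ice ping comes back at 220Hz and twice the duration, which is why small cracks and pings start to sound like massive sheets grinding together.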

Effects editor Saoirse Christopherson capturing sounds on board a kayak in the Thames River.

The sounds of the ice rubbing against the ships were captured by one of the show’s sound effects editors, Saoirse Christopherson, who, along with an assistant, boarded a kayak and paddled out onto the frozen Thames River. Using a Røde NT2 and a Roland R26 recorder with several contact mics strapped to the kayak’s hull, they spent the day grinding through, over and against the ice. “The NT2 was used to directionally record both the internal impact sounds of the ice on the hull and also any external ice creaking sounds they could generate with the kayak,” says Walpole.

He slowed those recordings down significantly and used EQ and filters to bring out the low-mid to low-end frequencies. “I also fed them through custom settings on my TC Electronic reverbs to bring them to life and to expand their scale,” he says.
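Bringing out the low-mid to low-end frequencies, as Walpole describes, is conceptually a low-pass/EQ move. A minimal sketch of that step with SciPy (the cutoff frequency and filter order are invented placeholders, not his settings):

```python
import numpy as np
from scipy import signal

def emphasize_low_end(audio: np.ndarray, sr: int,
                      cutoff_hz: float = 500.0, order: int = 4) -> np.ndarray:
    """Butterworth low-pass: keeps the low-mid rumble of the ice
    recordings and rolls off the brittle high frequencies."""
    sos = signal.butter(order, cutoff_hz, btype="low", fs=sr, output="sos")
    return signal.sosfilt(sos, audio)
```

A fourth-order filter like this attenuates content a decade above the cutoff by roughly 80dB, which is what makes the treated recordings feel heavy and dense rather than icy and sharp.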

The pressure of the ice is slowly crushing the ships, and as the season progresses the situation escalates to the point where the crew can’t imagine staying there another winter. To tell that story through sound, Walpole began with recordings of windmill creaks and groans. “As the situation gets more dire, the sound becomes shorter and sharper, with close, squealing creaks that sound as though the cabins themselves are warping and being pulled apart.”

In the first episode, the Erebus runs aground on the ice and the crew tries to hack and saw the ice away from the ship. Those sounds were recorded by Walpole attacking the frozen pond in his backyard with axes and a saw. “That’s my saw cutting through my pond, and the axe material is used throughout the show as they are chipping away around the boat to keep the pack ice from engulfing it.”

Whether the crew is on the boat or on the ice, the sound of the Arctic is ever-present. Around the ships, the wind rips over the hulls and howls through the rigging on deck. It gusts and moans outside the cabin windows. Out on the ice, the wind constantly groans or shrieks. “Outside, I wanted it to feel almost like an alien planet. I constructed a palette of designed wind beds for that purpose,” says Walpole.

He treated recordings of wind howling through various cracks to create a sense of blizzard winds outside the hull. He also sourced recordings of wind at a disused Navy bunker. “It’s essentially these heavy stone cells along the coast. I slowed these recordings down a little and softened all of them with EQ. They became the ‘holding airs’ within the boat. They felt heavy and dense.”

Below Deck
In addition to the heavy-air atmospheres, another important sound below deck was that of the crew. The ships were entirely occupied by men, so Walpole needed a wide and varied palette of male-only walla to sustain a sense of life on-board. “There’s not much available in sound libraries, or in my own library — and certainly not enough to sustain a 10-hour show,” he says.

So they organized a live crowd recording session with a group of men from CADS — an amateur dramatics society from Churt, just outside of London. “We gave them scenarios and described scenes from the show and they would act it out live in the open air for us. This gave us a really varied palette of worldized effects beds of male-only crowds that we could sit the loop group on top of. It was absolutely invaluable material in bringing this world to life.”

Visually, the rooms and cabins are sometimes quite similar, so Walpole uses sound to help the audience understand where they are on the ship. In his cutting room, he had the floor plans of both ships taped to the walls so he could see their layouts. Life on the ship is mainly concentrated on the lower deck — the level directly below the upper deck. Here is where the men sleep. It also has the canteen area, various cabins and the officers’ mess.

Below that is the Orlop deck, where there are workrooms and storerooms. Then below that is the hold, which is permanently below the waterline. “I wanted to be very meticulous about what you would hear at the various levels on the boat and indeed the relative sound level of what you are hearing in these locations,” explains Walpole. “When we are on the lower two decks, you hear very little of the sound of the men above. The soundscapes there are instead focused on the creaks and the warping of the hull and the grinding of the ice as it crushes against the boat.”

One of Walpole’s favorite scenes is the beginning of Episode 4. Capt. Francis Crozier (Jared Harris) is sitting in his cabin listening to the sound of the pack ice outside, and the room sharply tilts as the ice shifts the ship. The scene offers an opportunity to tell a cause-and-effect story through sound. “You hear the cracks and pings of the ice pack in the distance and then that becomes localized with the kayak recordings of the ice grinding against the boat, and then we hear the boat and Crozier’s cabin creak and pop as it shifts. This ultimately causes his bottle to go flying across the table. I really enjoyed having this tale of varying scales. You have this massive movement out on the ice and the ultimate conclusion of it is this bottle sliding across the table. It’s very much a sound moment because Crozier is not really saying anything. He’s just sitting there listening, so that offered us a lot of space to play with the sound.”

The Tuunbaq
The crew in The Terror isn’t just battling the elements, scurvy, starvation and mutiny. They’re also being killed off by a polar bear-like creature called the Tuunbaq. It’s part animal, part mythical creature that is tied to the land and spirits around it. The creature is largely unseen for the first part of the season, so Walpole created sonic hints as to the creature’s make-up.

Walpole worked with showrunner David Kajganich to find the creature’s voice. Kajganich wanted the creature to convey a human intelligence, and he shared recordings of human exorcisms as reference material. They hired voice artist Atli Gunnarsson to perform parts to picture, which Walpole then fed into the Dehumaniser plug-in by Krotos. “Some of the recordings we used raw as well,” says Walpole. “This guy could make these crazy sounds. His voice could go so deep.”

Those performances were layered into the track alongside recordings of real bears, which gave the sound the correct diaphragm, weight, and scale. “After that, I turned to dry ice screeches and worked those into the voice to bring a supernatural flavor and to tie the creature into the icy landscape that it comes from.”

Lee Walpole

In Episode 3, an Inuit character named Lady Silence (Nive Nielsen) is sitting in her igloo and the Tuunbaq arrives snuffling and snorting on the other side of the door flap. Then the Tuunbaq begins to “sing” at her. To create that singing, Walpole reveals that he pulled Lady Silence’s performance of The Summoning Song (the song her people use to summon the Tuunbaq to them) from a later episode and fed that into Dehumaniser. “This gave me the creature’s version. So it sounds like the creature is singing the song back to her. That’s one for the diehards who will pick up on it and recognize the tune,” he says.

Since the series is shot on a soundstage, there’s no usable bed of production sound to act as a jumping off point for the post sound team. But instead of that being a challenge, Walpole finds it liberating. “In terms of sound design, it really meant we had to create everything from scratch. Sound plays such a huge role in creating the atmosphere and the feel of the show. When the crew is stuck below decks, it’s the sound that tells you about the Arctic world outside. And the sound ultimately conveys the perils of the ship slowly being crushed by the pack ice. It’s not often in your career that you get such a blank canvas of creation.”


Jennifer Walden is a New Jersey-based audio engineer and writer. You can follow her on Twitter at @audiojeney.

Netflix’s Altered Carbon: the look, the feel, the post

By Randi Altman

Netflix’s Altered Carbon is a new sci-fi series set in a dystopian future where people are immortal thanks to something called “stacks,” which contain their entire essence — their personalities, their memories, everything. The one setback is that unless you are a Meth (one of the rich and powerful), you need to buy a “sleeve” (a body) for your stack, and it might not have any resemblance to your former self. It could be a different color, a different sex, a different age, a different everything. You have to take what you can get.

Based on Richard K. Morgan’s 2002 novel of the same name, the series stars Swedish actor Joel Kinnaman.

Jill Bogdanowicz

We reached out to the show’s colorist, Jill Bogdanowicz, as well as post producer Allen Marshall Palmer to find out more about the show’s varied and distinctive looks.

The look has a very Blade Runner-type feel. Was that in homage to the films?
Bogdanowicz: The creators wanted a film noir look. Blade Runner is the same genre, but the show isn’t specifically an homage to Blade Runner.

Palmer: I’ll leave that for fans to dissect.

Jill, can you talk about your process? What tools did you use?
Bogdanowicz: I designed a LUT to create that film noir look before shooting. I actually provided a few options, and they chose my favorite one and used it throughout. After they shot everything and I had all 10 episodes in my bay, I got familiar with the content, wrapped my head around the story and came up with ideas to tell that story with color.

The show covers many different times and places so scenes needed to be treated visually to show audiences where the story is and what’s happened. I colored both HDR (Dolby Vision) and SDR passes using DaVinci Resolve.

I worked very closely with both DPs — Martin Ahlgren and Neville Kidd — in pre-timing the show, and they gave me a nice idea of what they were looking for so I had a great starting point. They were very close knit. The entire team on this project was an absolute pleasure, and it was a great creative collaboration, which comes through in the final product of the show.
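For readers curious what “designing a LUT” means mechanically: a LUT is just a lookup table that remaps input code values to output values. A heavily simplified 1D sketch of a crushed-shadow, contrast-forward curve follows — every constant is invented for illustration and bears no relation to the production LUT (which would be a 3D LUT remapping R, G and B jointly):

```python
import numpy as np

def noir_curve(size: int = 1024) -> np.ndarray:
    """A toy 1D 'film noir' tone curve: crushed shadows plus a
    gentle contrast gamma. All constants are illustrative."""
    x = np.linspace(0.0, 1.0, size)
    return np.clip((x - 0.06) / 0.94, 0.0, 1.0) ** 1.2

def apply_lut(image: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Apply a 1D LUT to a float image in [0, 1] by nearest lookup."""
    idx = np.clip((image * (len(lut) - 1)).round().astype(int),
                  0, len(lut) - 1)
    return lut[idx]
```

Because a LUT is a self-contained table, the same file can be loaded on set, in editorial and at the VFX vendors, so everyone previews footage through the same look long before the final grade.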

The show is shot and posted like a feature and has a feature feel. Was that part of your marching orders?
Bogdanowicz: I’m primarily a features colorist, so I’m very familiar with the film noir look and heavy VFX, and that’s one reason I was included on this project. It was right up my alley.

Palmer: We approached Altered Carbon as a 10-part feature rather than a television series. I coined the term “feature episodic entertainment,” which describes what we were aspiring to — destination viewing instead of something merely disposable. In a world with so many viewing options, we wanted to command the viewer’s full attention, and fans are rewarded for that attention.

We were very concerned about how images, especially VFX, were going to look in HDR so we had weekly VFX approval sessions with Jill, our mastering colorist, in her color timing bay.

The executive producers and the studio, along with the VFX and post teams, were able to sit together, adjusting color corrections if needed before giving final approval on shots. This gave us really good technical and creative quality control. Despite our initial concerns about VFX shots in HDR, we found that with vendors like Double Negative and Milk with their robust 16-bit EXR pipelines we weren’t “breaking” VFX shots when color correcting for HDR.

How did the VFX affect the workflow?
Bogdanowicz: Because I was brought on so early, the LUT I created was shared with the VFX vendors so they had a good estimation of the show’s contrast. That really helped them visualize the look of the show so that the look of the shots was pretty darn close by the time I got them in my bay.

Was there a favorite scene or scenes?
Bogdanowicz: There are so many spectacular moments, but the emotional core for me is in episode 104 when we see the beginning of the Kovacs and Quell love story in the past and how that love gives Kovacs the strength to survive in the present day.

Palmer: That’s a tough question! There are so many, it’s hard to choose. I think the episode that really jumps out is the one in which Joel Kinnaman’s character is being tortured and the content skips back and forth in time, changes and alternates between VR and reality. It was fun to create a different visual language for each space.

Can you talk about challenges in the process and how you overcame them?
Bogdanowicz: The show features a lot of VFX and they all need to look as real as possible, so I had to make sure they felt part of the worlds. Fortunately, VFX supervisor Everett Burrell and his team are amazing and the VFX is top notch. Coming up with different ideas and collaborating with producers James Middleton and Laeta Kalogridis on those ideas was a really fun creative challenge. I used the Sapphire VFX plugin for Resolve to heavily treat and texture VR looks in different ways.

Palmer: In addition to the data management challenges on the picture side, we were dealing with mixing in Dolby Atmos. It was very easy to get distracted with how great the Atmos mix sounds — the downmixes generally translated very well, but monitoring in 5.1 and 2.0 did reveal some small details that we wanted to adjust. Generally, we’re very happy with how both the picture and sound are translating into viewers’ homes.

Dolby Vision HDR is great at taking what’s in the color bay into the home viewing environment, but there are still so many variables in viewing set-ups that you can still end up chasing your own tail. It was great to see behind the scenes how dedicated Netflix is to providing the best picture and sound quality through the service.

The look of the AI hotel was so warm. I wanted to live there. Can you talk about that look?
Bogdanowicz: The AI hotel look was mostly done in design and lighting. I saw the warm practical lights and rich details in the architecture and throughout the hotel and ran with it. I just aimed to keep the look filmic and inviting.

What about the look of where the wealthy people lived?
Bogdanowicz: The Meth houses are above the clouds, so we kept the look very clean and cool with a lot of true whites and elegant color separation.

Seems like there were a few different looks within the show?
Bogdanowicz: The same LUT for the film noir look is used throughout the show, but the VR looks are very different. I used Sapphire to come up with different concepts and textures for the different VR looks, from the rich quality of high-end VR to the cheap VR found underneath a noodle bar.

Allen, can you walk us through the workflow from production to post?
Palmer: With the exception of specialty shots, the show was photographed on Alexa 65 — mostly in 5K mode, but occasionally in 6.5K and 4K for certain lenses. The camera is beautiful and a large part of the show’s cinematic look, but it generates a lot of data (about 1.9TB/hour for 5K) so this was the first challenge. The camera dictates using the Codex Vault system, and Encore Vancouver was up to the task for handling this material. We wanted to get the amount of data down for post, so we generated 4096×2304 ProRes 4444XQ “mezzanine” files, which we used for almost all of the show assembly and VFX pulls.
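The quoted ~1.9TB/hour figure is easy to sanity-check with back-of-the-envelope arithmetic. The frame dimensions and bit depth below are illustrative assumptions for a roughly-5K raw frame, not published camera specs:

```python
def tb_per_hour(width: int, height: int, bit_depth: int, fps: float) -> float:
    """Uncompressed raw data rate in (decimal) terabytes per hour."""
    bytes_per_frame = width * height * bit_depth / 8
    return bytes_per_frame * fps * 3600 / 1e12

# An assumed 5120x2880 frame at 12 bits and 24fps lands right
# around the quoted figure:
rate = tb_per_hour(5120, 2880, 12, 24)  # ≈ 1.9 TB/hour
```

That volume is exactly why a compressed ProRes 4444XQ mezzanine tier makes sense: editorial and VFX pulls work from far smaller files while the camera originals stay archived.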

During production and post, all of our 4K files were kept online at Efilm using their portal system. This allowed us fast, automated access to the material so we could quickly do VFX pulls, manage color, generate 16-bit EXR frames and send those off to VFX vendors. We knew that time saved there was going to give us more time on the back end to work creatively on the shots so the Portal was a very valuable tool.

How many VFX shots did you average per episode? Seems like a ton, especially with the AI characters. Who provided those and what were those turnarounds like?
Palmer: There were around 2,300 visual effects shots during this season — probably fewer than most people would think because we built a large Bay City street inside a former newspaper printing facility outside of Vancouver. The shot turnaround varied depending on the complexity and where we were in the schedule. We were lucky that something like episode 1’s “limo ride” sequence was started very early on because it gave us a lot of time to refine our first grand views of Bay City. Our VFX supervisor Everett Burrell and VFX producer Tony Meagher were able to get us out in front of a lot of challenges like the amount of 3D work in the last two episodes by starting that work early on since we knew we would need those shots from the script and prep phase.

The 16th annual VES Award winners

The Visual Effects Society (VES) celebrated artists and their work at the 16th Annual VES Awards, which recognize outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues.

Seven-time host, comedian Patton Oswalt, presided over more than 1,000 guests at the Beverly Hilton. War for the Planet of the Apes was named photoreal feature film winner, earning four awards. Coco was named top animated film, also earning four awards. Game of Thrones was named best photoreal episode and garnered five awards — the most wins of the night. Samsung’s Do What You Can’t: Ostrich won top honors in the commercial field, scoring three awards. These top four contenders collectively garnered 16 of the 24 awards for outstanding visual effects.

President of Marvel Studios Kevin Feige presented the VES Lifetime Achievement Award to producer/writer/director Jon Favreau. Academy Award-winning producer Jon Landau presented the Georges Méliès Award to Academy Award-winning visual effects master Joe Letteri, VES. Awards presenters included fan-favorite Mark Hamill, Coco director Lee Unkrich, War for the Planet of the Apes director Matt Reeves, Academy Award-nominee Diane Warren, Jaime Camil, Dan Stevens, Elizabeth Henstridge, Sydelle Noel, Katy Mixon and Gabriel “Fluffy” Iglesias.

Here is a list of the winners:

Outstanding Visual Effects in a Photoreal Feature

War for the Planet of the Apes

Joe Letteri

Ryan Stafford

Daniel Barrett

Dan Lemmon

Joel Whist

 

Outstanding Supporting Visual Effects in a Photoreal Feature

Dunkirk

Andrew Jackson

Mike Chambers

Andrew Lockley

Alison Wortman

Scott Fisher

 

Outstanding Visual Effects in an Animated Feature

Coco

Lee Unkrich

Darla K. Anderson

David Ryu

Michael K. O’Brien

 

Outstanding Visual Effects in a Photoreal Episode

Game of Thrones: Beyond the Wall

Joe Bauer

Steve Kullback

Chris Baird

David Ramos

Sam Conway

 

Outstanding Supporting Visual Effects in a Photoreal Episode

Black Sails: XXIX

Erik Henry

Terron Pratt

Yafei Wu

David Wahlberg

Paul Dimmer

 

Outstanding Visual Effects in a Real-Time Project

Assassin’s Creed Origins

Raphael Lacoste

Patrick Limoges

Jean-Sebastien Guay

Ulrich Haar

 

Outstanding Visual Effects in a Commercial

Samsung Do What You Can’t: Ostrich

Diarmid Harrison-Murray

Tomek Zietkiewicz

Amir Bazazi

Martino Madeddu

 

 

Outstanding Visual Effects in a Special Venue Project

Avatar: Flight of Passage

Richard Baneham

Amy Jupiter

David Lester

Thrain Shadbolt

 

Outstanding Animated Character in a Photoreal Feature

War for the Planet of the Apes: Caesar

Dennis Yoo

Ludovic Chailloleau

Douglas McHale

Tim Forbes

 

Outstanding Animated Character in an Animated Feature

Coco: Héctor

Emron Grover

Jonathan Hoffman

Michael Honsel

Guilherme Sauerbronn Jacinto

 

Outstanding Animated Character in an Episode or Real-Time Project

Game of Thrones: The Spoils of War; Drogon Loot Train Attack

Murray Stevenson

Jason Snyman

Jenn Taylor

Florian Friedmann

 

Outstanding Animated Character in a Commercial

Samsung Do What You Can’t: Ostrich

David Bryan

Maximilian Mallmann

Tim Van Hussen

Brendan Fagan

 

Outstanding Created Environment in a Photoreal Feature

Blade Runner 2049: Los Angeles

Chris McLaughlin

Rhys Salcombe

Seungjin Woo

Francesco Dell’Anna

 

Outstanding Created Environment in an Animated Feature

Coco: City of the Dead

Michael Frederickson

Jamie Hecker

Jonathan Pytko

Dave Strick

 

Outstanding Created Environment in an Episode, Commercial, or Real-Time Project

Game of Thrones: Beyond the Wall; Frozen Lake

Daniel Villalba

Antonio Lado

José Luis Barreiro

Isaac de la Pompa

 

Outstanding Virtual Cinematography in a Photoreal Project

Guardians of the Galaxy Vol. 2: Groot Dance/Opening Fight

James Baker

Steven Lo

Alvise Avati

Robert Stipp

 

Outstanding Model in a Photoreal or Animated Project

Blade Runner 2049: LAPD Headquarters

Alex Funke

Steven Saunders

Joaquin Loyzaga

Chris Menges

 

Outstanding Effects Simulations in a Photoreal Feature

War for the Planet of the Apes

David Caeiro Cebrián

Johnathan Nixon

Chet Leavai

Gary Boyle

 

Outstanding Effects Simulations in an Animated Feature

Coco

Kristopher Campbell

Stephen Gustafson

Dave Hale

Keith Klohn

 

Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project 

Game of Thrones: The Dragon and the Wolf; Wall Destruction

Thomas Hullin

Dominik Kirouac

Sylvain Nouveau

Nathan Arbuckle

  

Outstanding Compositing in a Photoreal Feature

War for the Planet of the Apes

Christoph Salzmann

Robin Hollander

Ben Warner

Beck Veitch

 

Outstanding Compositing in a Photoreal Episode

Game of Thrones: The Spoils of War; Loot Train Attack

Dom Hellier

Thijs Noij

Edwin Holdsworth

Giacomo Matteucci

 

Outstanding Compositing in a Photoreal Commercial

Samsung Do What You Can’t: Ostrich

Michael Gregory

Andrew Roberts

Gustavo Bellon

Rashabh Ramesh Butani

 

Outstanding Visual Effects in a Student Project

Hybrids

Florian Brauch

Romain Thirion

Matthieu Pujol

Kim Tailhades 

 

 

 

Capturing Foley for Epix’s Berlin Station

Now in its second season on Epix, the drama series Berlin Station centers on undercover agents, diplomats and whistleblowers inhabiting a shadow world inside the German capital.

Leslie Bloome

Working under the direction of series supervising sound editor Ruy Garcia, Westchester, New York-based Foley studio Alchemy Post Sound is providing Berlin Station with cinematic sound. Practical effects, like the clatter of weapons and clinking glass, are recorded on the facility’s main Foley stage. Certain environmental effects are captured on location at sites whose ambience resembles the show’s settings. Interior footsteps, meanwhile, are recorded in the facility’s new “live” room, a 1,300-square-foot space with natural reverb that’s used to replicate the environment of rooms with concrete, linoleum and tile floors.

“Garcia wants a soundtrack with a lot of detail and depth of field,” explains lead Foley artist and Alchemy Post founder Leslie Bloome. “So, it’s important to perform sounds in the proper perspective. Our entire team of editors, engineers and Foley artists need to be on point regarding the location and depth of field of sounds we’re recording. Our aim is to make every setting feel like a real place.”

A frequent task for the Foley team is to come up with sounds for high-tech cameras, surveillance equipment and other spy gadgetry. Foley artist Joanna Fang notes that sophisticated wall safes appear in several episodes, each one featuring differing combinations of electronic, latch and door sounds. She adds that in one episode a character has a microchip concealed in his suit jacket and the Foley team needed to invent the muffled crunch the chip makes when the man is frisked. “It’s one of those little ‘non-sounds’ that Foley specializes in,” she says. “Most people take it for granted, but it helps tell the story.”

The team is also called on to create Foley effects associated with specific exterior and interior locations. This can include everything from seedy safe houses and bars to modern office suites and upscale hotel rooms. When possible, Alchemy prefers to record such effects on location at sites closely resembling those pictured on-screen. Bloome says that recording things like creaky wood floors on location results in effects that sound more real. “The natural ambiance allows us to grab the essence of the moment,” he explains, “and keep viewers engaged with the scene.”

Footsteps are another regular Foley task. Fang points out that there is a lot of cat-and-mouse action with one character following another or being pursued, and the patter of footsteps adds to the tension. “The footsteps are kind of tough,” she says. “Many of the characters are either diplomats or spies and they all wear hard-soled shoes. It’s hard to build contrast, so we end up creating a hierarchy: dark, powerful heels for strong characters, lighter shoes for secondary roles.”

For interior footsteps, large theatrical curtains are used to adjust the ambiance in the live stage to fit the scene. “If it’s an office or a small room in a house, we draw the curtains to cut the room in half; if it’s a hotel lobby, we open them up,” Fang explains. “It’s amazing. We’re not only creating depth and contrast by using different types of shoes and walking surfaces, we’re doing it by adjusting the size of the recording space.”

Alchemy edits their Foley in-house and delivers pre-mixed and synced Foley that can be dropped right into the final mix seamlessly. “The things we’re doing with location Foley and perspective mixing are really cool,” says Foley editor and mixer Nicholas Seaman. “But it also means the responsibility for getting the sound right falls squarely on our shoulders. There is no ‘fix in the mix.’ From our point of view, the Foley should be able to stand on its own. You should be able to watch a scene and understand what’s going on without hearing a single line of dialogue.”

The studio used Neumann U87 and KMR81 microphones, a Millennia mic-pre and Apogee converter, all recorded into Avid Pro Tools on a C24 console. In addition to recording a lot of guns, Alchemy also borrowed a Doomsday prep kit for some of the sounds.

The challenge to deliver sound effects that can stand up to that level of scrutiny keeps the Foley team on its toes. “It’s a fascinating show,” says Fang. “One moment, we’re inside the station with the usual office sounds and in the next edit, we’re in the field in the middle of a machine gun battle. From one episode to the next, we never know what’s going to be thrown at us.”

House of Moves adds Selma Gladney-Edelman, Alastair Macleod

Animation and motion capture studio House of Moves (HOM) has strengthened its team with two new hires — Selma Gladney-Edelman was brought on as executive producer and Alastair Macleod as head of production technology. The two industry vets are coming on board as the studio shifts to offer more custom short- and long-form content, and expands its motion capture technology workflows to its television, feature film, video game and corporate clients.

Selma Gladney-Edelman was most recently VP of Marvel Television for their primetime and animated series. She has worked in film production, animation and visual effects, and was a producer on multiple episodic series at Walt Disney Television Animation, Cartoon Network and Universal Animation. As director of production management across all of the Discovery Channels, she oversaw thousands of hours of television and film programming including TLC projects Say Yes To the Dress, Little People, Big World and Toddlers and Tiaras, while working on the team that garnered an Oscar nom for Werner Herzog’s Encounters at the End of the World and two Emmy wins for Best Children’s Animated Series for Tutenstein.

Scotland native Alastair Macleod is a motion capture expert who has worked in production, technology development and as an animation educator. His production experience includes work on films such as Lord of the Rings: The Two Towers, The Matrix Reloaded, The Matrix Revolutions, 2012, The Twilight Saga: Breaking Dawn — Part 2 and Kubo and the Two Strings for facilities that include Laika, Image Engine, Weta Digital and others.

Macleod pioneered full body motion capture and virtual reality at the research department of Emily Carr University in Vancouver. He was also the head of animation at Vancouver Film School and an instructor at Capilano University in Vancouver. Additionally, he developed PeelSolve, a motion capture solver plug-in for Autodesk Maya.

The sound of Netflix’s The Defenders

By Jennifer Walden

Netflix’s The Defenders combines the stories of four different Marvel shows already on the streaming service: Daredevil, Iron Fist, Luke Cage and Jessica Jones. In the new show, the previously independent superheroes find themselves all wanting to battle the same foe —a cultish organization called The Hand, which plans to destroy New York City. Putting their differences aside, the superheroes band together to protect their beloved city.

Supervising sound editor Lauren Stephens, who works at Technicolor at Paramount, has earned two Emmy nominations for her sound editing work on Daredevil. And she supervised the sound for each of the aforementioned Marvel series, with the exception of Jessica Jones. So when it came to designing The Defenders she was very conscious of maintaining the specific sonic characteristics they had already established.

“We were dedicated to preserving the palette of each of the previous Marvel characters’ neighborhoods and sound effects,” she explains. “In The Defenders, we wanted viewers of the individual series to recognize the sound of Luke’s Harlem and Daredevil’s Hell’s Kitchen, for example. In addition, we kept continuity for all of the fight material and design work established in the previous four series. I can’t think of another series besides Better Call Saul that borrows directly from its predecessors’ sound work.”

But it wasn’t all borrowed material. Eventually, Luke Cage (Mike Colter), Daredevil (Charlie Cox), Jessica Jones (Krysten Ritter), Iron Fist (Finn Jones) and Elektra Natchios (Elodie Yung) come together to fight The Hand’s leader Alexandra Reid (Sigourney Weaver). “We experience new locations, and new fighting techniques and styles,” says Stephens. “Not to mention that half the city gets destroyed by The Hand. We haven’t had that happen in the previous series.”

Even though these Netflix/Marvel series are based on superheroes, the sound isn’t overly sci-fi. It’s as though the superheroes have more practical superhuman abilities. Stephens says their fight sounds are all real punches and impacts, with some design elements added only when needed, such as when Iron Fist’s iron fist is activated. “At the heart of our punches, for instance, is the sound of a real fist striking a side of beef,” she says. “It sounds like you’d expect, and then we amp it up when we mix. We record a ton of cloth movement and bodies scraping and sliding and tumbling in Foley. Those elements connect us to the humans on-screen.”

Since most of the violence plays out in hand-to-hand combat, it takes a lot of editing to make those fight scenes, and it involves contributions from several sound departments. Stephens has her hard effects team, led by sound designer Jordon Wilby (who has worked on all the Netflix/Marvel series), cut sound effects for every single punch, grab, flip, throw and land. In addition, they cut metal shings and whooshes, impacts and drops for weapons, crashes and bumps into walls and furniture, and all the gunshot material.

Stephens then has the Technicolor Foley team — Foley artists Zane Bruce and Lindsay Pepper and mixer Antony Zeller — cover all the footsteps, cloth “scuffle,” wall bumps, body falls and grabs. Additionally, she has dialogue editor Christian Buenaventura clean up any dialogue that occurs within or around the fight scenes. With group ADR, they replace every grunt and effort for each individual in the fight so that they have ultimate control over every element during the mix.

Stephens finds Gallery’s SpotStudio to be very helpful for cueing all the group ADR. “I shoot a lot of group ADR for the fights and to help create the right populated feel for NYC. SpotStudio is a slick program that interfaces well with Avid’s Pro Tools. It grabs timecode location of ADR cues and can then output that to many word processing programs. Personally, I use FileMaker Pro. I can make great cuesheets that are easy to format and use for engineers and talent.”

All that effort results in fight scenes that feel “relentless and painful,” says Stephens. “I want them to have movement, tons of detail and a wide range of dynamics. I want the fights to sound great wherever our fans are listening.”

The most challenging fight in The Defenders happens in the season finale, when the superheroes fight The Hand in the sublevels of a building. “That underground fight was the toughest simply because it was endless and shot with a 360-degree turn. I focused on what was on-screen and continued those sounds just until the action passed out of frame. This kept our tracks from getting too cluttered but still gives us the right idea that 60 people are going at it,” concludes Stephens.

Fear the Walking Dead: Encore colorist teams with DPs for parched look

The action of AMC’s zombie-infused Fear the Walking Dead this season is set in a brittle, drought-plagued environment, which becomes progressively more parched as the story unfolds. So when production was about to commence, the show’s principals were concerned that the normally arid shoot locations in Mexico had undergone record rainfall and were suffused with verdant vegetation as far as the eye could see.

Pankaj Bajpai of Encore, who has color graded the series from the start, and the two new cinematographers hired for this season — Christopher LaVasseur and Scott Peck — conferred early on about how best to handle this surprising development.

It wouldn’t have been practical to move locations or try to “dress” the scenes to look as described on the page, nor would the budget allow for addressing the issue through VFX. Bajpai, who, in addition to his colorist work also oversees new workflows for Encore, realized that although he could produce the desired effect in his FilmLight Baselight toolset through multiple keys and windows, that too would be less than practical.

Instead, he proposed using a technique that’s far from standard operating procedure for a series. “We got ‘under the hood’ of the Baselight,” he says, “and set up color suppression matrices,” which essentially use mathematical equations to define the degree to which each of the primary colors — red, green and blue — can be represented in an image. The technique, he explains, “allows you to be much more specific about suppressing certain hues without affecting everything else as much as you would by keying a color or manipulating saturation.”
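Bajpai doesn’t spell out the matrices themselves, but the general idea can be sketched in a few lines of NumPy. Everything below is an illustrative assumption (the function names, the 0.5 coefficient, the row layout), not Baselight’s actual implementation: a 3x3 RGB matrix whose rows each sum to 1 leaves neutral greys untouched, while a dialable strength bleeds the green channel toward red and blue.

```python
import numpy as np

def green_suppression_matrix(strength):
    """Build a 3x3 RGB matrix that bleeds part of the green channel
    into red and blue. strength=0 is the identity; at strength=1 half
    of green's own contribution is redistributed. Every row sums to 1,
    so neutral greys (R = G = B) pass through unchanged, which is what
    keeps skin tones and other near-neutral colors largely intact."""
    s = 0.5 * strength
    return np.array([
        [1.0,     0.0,     0.0],
        [s / 2, 1.0 - s, s / 2],   # green output row: the suppressed channel
        [0.0,     0.0,     1.0],
    ])

def apply_matrix(image, matrix):
    """image: float RGB array of shape (..., 3), values in [0, 1]."""
    return np.clip(image @ matrix.T, 0.0, 1.0)

# A saturated green (foliage-like) pixel loses green energy;
# a near-neutral (skin-like) pixel barely moves.
m = green_suppression_matrix(0.8)
foliage = apply_matrix(np.array([0.20, 0.70, 0.15]), m)
skin = apply_matrix(np.array([0.60, 0.55, 0.50]), m)
```

Because only the deviation from neutral is redistributed, dialing `strength` up or down per episode is a plausible way such a suppression could be ramped as the season’s drought progresses.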

Once designed, these restrictions on the green within the imagery could be dialed up or down, primarily affecting just the colors in the foliage that the filmmakers wanted to re-define, without collateral damage to skin tones and other elements that they didn’t want affected. “I knew that the cinematographers could shoot in the location and we could alter the environment as necessary in the grade,” Bajpai says. He showed the DPs how effective the technique was, and they quickly got on board. Peck, who was able to sit in on the grading for the first episode, recalls, “One of the things I was concerned with was this whole question about the green [foliage] because I knew in the story as the season progresses, water becomes less available. So this idea of changing the greens had to be a gradual process up to around episode nine. There was still a lot of discussion about how we were going to do this. But I knew, just from working with Pankaj at Encore for a day, that we could do it in the color grade.”

Of course, there was more to work out between Bajpai and the cinematographers, who’d been charged by the producers with taking the look in a somewhat new direction. “Wherever possible I wanted to plan as much with the cinematographers early on so that we’re all working toward a common goal,” he says.

Prior to this season’s start of production, Bajpai and the two DPs developed a shooting LUT to use in conjunction with the specific combination of lenses and the Arri sensors they would use to shoot the season. “Scott recommended using the Hawk T1 prime lenses,” says LaVasseur, “and I suggested going with a fairly low-contrast LUT.” Borrowing language from the photochemical days, he explains, “We wanted to start with a soft image and then ‘print’ really hard,” to yield the show’s edgy, contrasty type of look.

Bajpai calibrated the DPs’ laptops so that they’d be able to get the most out of sample-graded images that he would send them as he started coloring episodes. “We would provide notes when Pankaj had completed a pass,” says LaVasseur, “but it was usually just a few very small tweaks I was asking for. We were all working toward the same goal, so there weren’t surprises in the way he graded anything.”

“Pankaj had it done very quickly, especially the handling of the green,” Peck adds. “The show needed that look to build to a certain point and then stay there, but the actual locations weren’t cooperative. We were able to just shoot and we all knew what it needed to look like after Pankaj worked on it.”

“Communication is so important,” LaVasseur stresses. “You need to have the DPs, production designer and costume designer working together on the look. You need to know that your colorist is part of the discussion so they’re not taking the images in some other direction than intended. I come from the film days and we would always say, ‘Plan your shoot. Shoot your plan.’ That’s how we approached this season, and I think it paid off.”

The challenges of dialogue and ice in Game of Thrones ‘Beyond the Wall’

By Jennifer Walden

Fire-breathing dragons and hordes of battle-ready White Walkers are big attention grabbers on HBO’s Game of Thrones, but they’re not the sole draw for audiences. The stunning visual effects and sound design are just the gravy on the meat and potatoes of a story that has audiences asking for more.

Every line of dialogue is essential for following the tangled web of storylines. It’s also important to take in the emotional nuances of the actors’ performances. Striking the balance between clarity and dynamic delivery isn’t an easy feat. When a character speaks in a gruff whisper because, emotionally, it’s right for the scene, it’s the job of the production sound crew and the post sound crew to make that delivery work.

At Formosa Group’s Hollywood location, an Emmy-winning post sound team works together to put as much of the on-set performances on the screen as possible. They are supervising sound editor Tim Kimmel, supervising dialogue editor Paul Bercovitch and dialogue/music re-recording mixer Onnalee Blank.

Blank and the show’s mixing team picked up a 2018 Emmy for Outstanding Sound Mixing For a Comedy or Drama Series (One Hour) for their work on Season 7’s sixth episode, “Beyond the Wall.”

Tim Kimmel and Onnalee Blank

“The production sound crew does such a phenomenal job on the show,” says Kimmel. “They have to face so many issues on set, between the elements and the costumes. Even though we have to do some ADR, it would be a whole lot more if we didn’t have such a great sound crew on-set.”

On “Beyond the Wall,” the sound team faced a number of challenges. At the start of the episode, Jon Snow (Kit Harington) and his band of fighters trek beyond the wall to capture a White Walker. As they walk across a frozen, windy landscape, they pass the time by getting to know one another. Here the threads of their individual stories from past seasons start to weave together. Important connections are made in each line of dialogue.

Those snowy scenes were shot in Iceland and the actors wore metal spikes on their shoes to help them navigate the icy ground. Unfortunately, the spikes also made their footsteps sound loud and crunchy, and that got recorded onto the production tracks.

Another challenge came from their costumes. They wore thick coats of leather and fur, which muffled their dialogue at times or pressed against the mic and created a scratchy sound. Wind was also a factor, sometimes buffeting across the mic and causing a low rumble on the tracks.

“What’s funny is that parts of the scene would be really tough to get cleaned up because the wind is blowing and you hear the spikes on their shoes — you hear costume movements. Then all of a sudden they stop and talk for a minute and the wind stops and it’s the most pristine, quiet, perfect recording you can think of,” explains Kimmel. “It almost sounded like it was shot on a soundstage. In Iceland, when the wind isn’t blowing and the actors aren’t moving, it’s completely quiet and still. So it was tough to get those two to match.”

As supervising sound editor, Kimmel is the first to assess the production dialogue tracks. He goes through an episode and marks priority sections for supervising dialogue editor Bercovitch to tackle first. Bercovitch says, “That helps Tim [Kimmel] put together his ADR plan. He wants to try to pare down that list as much as possible. For ‘Beyond the Wall,’ he wanted me to start with the brotherhood’s walk-and-talk north of the wall.”

Bercovitch began his edit by trying to clean up the existing dialogue. For that opening sequence, he used iZotope RX 6’s Spectral Repair to clean up the crunchy footsteps and the rumble of heavy winds. Next, he searched for usable alt takes from the lav and boom tracks, looking for a clean syllable or a full line to cut in as needed. Once Bercovitch was done editing, Kimmel could determine what still needed to be covered in ADR. “For the walk-and-talk beyond the wall, the production sound crew really did a phenomenal job. We didn’t have to loop that scene in its entirety. How they got as good of recordings as they did is honestly beyond me.”

Since most of the principal actors are UK- and Ireland-based, the ADR is shot in London at Boom Post with ADR supervisor Tim Hands. “Tim [Hands] records 90% of the ADR for each season. Occasionally, we’ll shoot it here if the actor is in LA,” notes Kimmel.

Hands had more lines than usual to cover on “Beyond the Wall” because of the battle sequence between the brotherhood and the army of the dead. The principal actors came in to record grunts, efforts and breaths, which were then cut to picture. The battle also included Bercovitch’s selects of usable production sound from that sequence.

Re-recording mixer Blank went through all of those elements on dub Stage 1 at Formosa Hollywood using an Avid S6 console to control the Pro Tools 12 session. She chose vocalizations that weren’t “too breathy, or sound like it’s too much effort because it just sounds like a whole bunch of grunts happening,” she says. “I try to make the ADR sound the same as the production dialogue choices by using EQ, and I only play sounds for whoever is on screen because otherwise it just creates too much confusion.”

One scene that required extensive ADR was for Arya (Maisie Williams) and Sansa (Sophie Turner) on the catwalk at Winterfell. In the seemingly peaceful scene, the sisters share an intimate conversation about their father as snow lightly falls from the sky. Only it wasn’t so peaceful. The snow was created by a loud snow machine that permeated the production sound, which meant the dialogue on the entire scene needed to be replaced. “That is the only dialogue scene that I had no hand in and I’ve been working on the show for three seasons now,” says Bercovitch.

For Bercovitch, his most challenging scenes to edit were ones that might seem like they’d be fairly straightforward. On Dragonstone, Daenerys (Emilia Clarke) and Tyrion (Peter Dinklage) are in the map room having a pointed discussion on succession for the Iron Throne. It’s a talk between two people in an interior environment, but Bercovitch points out that the change of camera perspective can change the sound of the mics. “On this particular scene and on a lot of scenes in the show, you have the characters moving around within the scene. You get a lot of switching between close-ups and longer shots, so you’re going between angles with a usable boom to angles where the boom is not usable.”

There’s a similar setup with Sansa and Brienne (Gwendoline Christie) at Winterfell. The two characters discuss Brienne’s journey to parley with Cersei (Lena Headey) in Sansa’s stead. Here, Bercovitch faced the same challenge of matching mic perspectives, and also had the added challenge of working around sounds from the fireplace. “I have to fish around in the alt takes — and there were a lot of alts — to try to get those scenes sounding a little more consistent. I always try to keep the mic angles sounding consistent even before the dialogue gets to Onnalee (Blank). A big part of her job is dealing with those disparate sound sources and trying to make them sound the same. But my job, as I see it, is to make those sound sources a little less disparate before they get to her.”

One tool that’s helped Bercovitch achieve great dialogue edits is iZotope’s RX 6. “It doesn’t necessarily make cleaning dialogue faster,” he says. “It doesn’t save me a ton of time, but it allows me to do so much more with my time. There is so much more that you can do with iZotope RX 6 that you couldn’t previously do. It still takes nitpicking and detailed work to get the dialogue to where you want it, but iZotope is such an incredibly powerful tool that you can get the result that you want.”

On the dub stage, Blank says one of her most challenging scenes was the opening walk-and-talk sequence beyond the wall. “Half of that was ADR, half was production, and to make it all sound the same was really challenging. Those scenes took me four days to mix.”

Her other challenge was the ADR scene with Arya and Sansa in Winterfell, since every line there was looped. To help the ADR sound natural, as if it’s coming from the scene, Blank processes and renders multiple tracks of fill and backgrounds with the ADR lines and then re-records that back into Avid Pro Tools. “That really helps it sit back into the screen a little more. Playing the Foley like it’s another character helps too. That really makes the scene come alive.”

Bercovitch explains that the final dialogue you hear in a series doesn’t start out that way. It takes a lot of work to get the dialogue to sound like it would in reality. “That’s the thing about dialogue. People hear dialogue all day, every day. We talk to other people and it doesn’t take any work for us to understand when other people speak. Since it doesn’t take any work in one’s life, why would it require a lot of work when putting a film together? There’s a big difference between the sound you hear in the world and recorded sound. Once it has been recorded, you have to take a lot of care to get those recordings back to a place where your brain reads them as intelligible. And when you’re switching from angle to angle and changing mic placement and perspective, all those recordings sound different. You have to stitch those together and make them sound consistent so it sounds like dialogue you’d hear in reality.”

Achieving great sounding dialogue is a team effort — from production through post. “Our post work on the dialogue is definitely a team effort, from Paul’s editing and Tim Hands’ shooting the ADR so well to Onnalee getting the ADR to match with the production,” explains Kimmel. “We figure out what production we can use and what we have to go to ADR for. It’s definitely a team effort and I am blessed to be working with such an amazing group of people.”


Jennifer Walden is a New Jersey-based audio engineer and writer.

DP David Tattersall on shooting Netflix’s Death Note

Based on the manga series of the same name by Tsugumi Ohba and Takeshi Obata, Death Note stars Nat Wolff as Light Turner, a man who obtains a supernatural notebook that gives him the power to exterminate any living person by writing his or her name in the notebook. Willem Dafoe plays Ryuk, a demonic god of death and the creator of the Death Note. The stylized Netflix feature film was directed by Adam Wingard (V/H/S, You’re Next) and shot by cinematographer David Tattersall (The Green Mile, Star Wars: Episodes I, II and III) with VariCam 35s in 4K RAW with Codex VRAW recorders.

Tattersall had previously worked with Wingard on the horror television series Outcast. He says he wasn’t aware of the manga series before the project, but during pre-production he was able to go through a visual treasure trove of manga material that the art department compiled.

Instead of creating a “cartoony” look, Tattersall and Wingard were more influenced by classic horror films, as well as well-crafted movies by David Fincher and Stanley Kubrick. “Adam is a maestro of the horror genre, and he is very familiar with constructing scenes around scary moments and keeping tension,” explains Tattersall. “It wasn’t necessarily whole movies that influenced us — it was more about taking odd sequences that we thought might be relevant to what we were doing. We had a very cool extended foot chase for which we referenced The French Connection and Se7en, both of which have a mix of handheld, extreme wides and long-lens shots. Also, because of Adam’s love of Kubrick movies, we had compositions with composure and symmetry that are reminiscent of The Shining, or crazy wide-angle stuff from A Clockwork Orange. It sounds like a mish-mash, but we did have rules.”

Dialogue scenes were covered in a realistic, non-flashy way. For Tattersall, one of the biggest challenges was dealing with the demon character, Ryuk, both physically and photographically. The team started with a huge puppet operated by puppeteers, but that wasn’t a practical approach since many of the scenes were shot in small spaces such as Light’s bedroom.

“Eventually, the practical issue led to us using a mime artist in full costume with the intention of doing face replacement later,” explains Tattersall. “From our testing, the approach of ‘less is more’ became a thing — less light, more shadow and mystery, less visible, more effective. It worked well for this character who is mostly seen hiding in the shadows. It’s similar to the first Jaws movie. The shark is strangely more scary and ominous when you only get a few glimpses in the frame here and there — a suggestion. And that was our approach for the first 75% of the film. You might get a brief lean out of the shadows and a quick lean back in. Often, we would just shoot him out of focus. We’d keep the focus in the foreground for the Light character and Ryuk would be an out-of-focus blob in the background. It’s not until the very end — the final murder sequence — that you get to see him in full head-to-toe clarity.”

Tattersall shot the film with two VariCam 35s as his A and B cameras and had a VariCam LT for backup. He shot in 4K DCI (4096 x 2160), capturing VRAW files to Codex VRAW recorders. For lensing, he shot with Zeiss Master Primes with a 2.39:1 extraction. “This set has become a favorite of mine for the past few years and I’ve grown to love them,” says Tattersall. “They are a bit big and heavy, but they open to a T1.3 and they’re so velvety smooth. With this show having so much night work, that extra speed was very useful.”

In terms of RAW capture, Tattersall tried to keep it simple, using Fotokem’s nextLAB for the on-set workflow. “It was almost like using a one-light printing process,” he explains. “We had three basic looks: a fairly cool, dingy look; one that falls back on the saturation; and one that leans in the cold direction. I have a set of rules, but I occasionally break them. We tried as much as possible to shoot only in the shade — bringing in butterfly nets or shooting on the shady side of buildings during the day. It was Adam’s wish to keep this heavy, moody atmosphere.”

Tattersall used a few tools to capture unique visuals. To capture low angle shots, he used a P+S Skater Scope that lets you shoot low to the ground. “You can also incorporate floating Dutch angles with its motorized internal prism, so this was something we did throughout,” he says. “The horizon line would lean over to one side or the other.” He also used a remote rollover rig, which allowed the camera to roll 180-degrees when on a crane, giving Tattersall a dizzying visual.

“We also shot with a Phantom Flex to shoot 500fps,” continues Tattersall. “We would have low Dutch angles, an 8mm fish eye look and a Lensbaby to degrade the focus even more. The image could get quite wonky on occasion, which is counterpoint to the more classic coverage of the calmer dialogue moments.”

Although he did a lot of night work, Tattersall did not use the native 5,000 ISO. “I have warmed to a new range of LED lights — the Cineo Maverick, Matchbox and Matchstix. They’re all color balanced and can all be varied between daylight and tungsten, so it’s quick and easy to change the color temperature without the use of gels. We also made use of Arri SkyPanels. Outside, we used tried-and-tested old-school HMIs or 9-light or 12-light MaxiBrutes. There’s nothing quite like them in terms of powerful source lights.”

Death Note was finished at Technicolor by colorist Skip Kimball on Blackmagic Resolve. “The grade was mostly about smoothing out the bumps and tweaking the contrast,” explains Tattersall. “Since it’s a dark feature, there was an emphasis on a heavy mood — keeping the blacks, with good contrast and saturated colors. But in the end, the photographic stylization came from the camera placement and lens choices working together with the action choreography.”

Emmy Awards: OJ: Made in America composer Gary Lionelli

By Jennifer Walden

The aftermath of a tragic event plays out in front of the eyes of the nation. OJ Simpson, wanted for the gruesome murder of his ex-wife and her friend, fails to turn himself in to the authorities. News helicopters follow the police pursuit that trails Simpson back to his Rockingham residence, where police plan to take him into custody. Decades later, three-time Emmy-winning composer Gary Lionelli is presented with the opportunity to score that iconic Bronco chase.

Here, Lionelli talks about his approach to scoring ESPN’s massive documentary OJ: Made in America. His score on Part 3 is currently up for Emmy consideration for Outstanding Music Composition for a Limited Series. The entire OJ: Made in America score is available digitally through Lakeshore Records.

Gary Lionelli

Scoring OJ: Made in America seems like such a huge undertaking. It’s a five-part series, and each part is over 90 minutes long. How did you tackle this beast?
I’d never scored anything that long within such a short timeframe. Because each part was so long, it wasn’t like doing a TV series but more like scoring five 90-minute films back-to-back. I just focused on one cue at a time, putting one foot in front of the other so I wouldn’t feel overwhelmed by the full scope of the work and could relax enough to write the score! I knew I’d get to the finish line at some point, but it seemed so far away most of the time that I just didn’t want to dwell on that.

When you got this project, did they deliver it as one crazy, long piece? Or did they give it to you in its separate parts?
I got everything at once, which was totally mind-boggling. When you get any project, you need to watch it before you start working on it. For this one, it meant watching a seven-and-a-half-hour film, which was a feat in and of itself. The scale was just huge on this. Looking back, my eyelids still twitch.

It was a pretty nerve-racking time because the schedule was really tight. That was one of the most challenging parts of doing this project. I could have used a year to write this music, because five films are ordinarily what I’d do in a year, not six months. But all of us who write music for film know that you have to work within extreme deadlines as a matter of course. So you say yes, and you find a way to do it.

So you basically locked yourself up for 14 hours a day, and just plugged away at it?
Right, except it was actually about 15 hours a day, seven days a week, with no breaks. I finished the score 11 days before its theatrical release, which is insane. But, hey, that part is all in the past now, and it’s great to see the film out there getting such attention. One thing that made it worthwhile to me in the end was the quality of the filmmaking — I was riveted by the film the whole time I was working on it.

When composing, you worked only on one part at a time and not with an overall story arc in mind?
I watched all five parts over the course of four days. Once I’d watched the first two parts, I couldn’t wait to start writing so I did that for a bit and then went back to watch the rest.

The director Ezra Edelman wanted me to first score the infamous Bronco chase, which is in Part 3. It’s a 30-minute segment of that particular episode. It was a long sequence of events, all having to do with the chase itself, the events leading up to it and the aftermath of it. So that is what I scored first. It’s kind of strange to dive into a film by first scoring such a pivotal, iconic event. But it worked out — what I wrote for that segment stuck.

It was strange to be writing music for something I had seen on television 20 years before – just to think that there I was, watching the Bronco chase on TV along with everyone else, not having the remotest idea that 20 years down the line I was going to be writing music for this real-life event. It’s just a very odd thing.

The Bronco chase wasn’t a high-speed chase. It was a long police escort back to OJ’s house. The music you wrote for this segment was so brooding and it fit perfectly…
I loved when Zoe Tur, the helicopter pilot, said they were giving OJ a police motorcade. That’s exactly what he got. So I didn’t want to score the sequence by commenting literally on what was happening — what people were doing, or the fact that this was a “chase.” What I tried to do was focus on the subtext, which was the tragedy of the circumstances, and have that direct the course of the music, supplying an overarching musical commentary.

For your instrumentation, did the director let you be carried away by your own muse? Or did he request specific instruments?
He was specific about two things: one, that there would be a trumpet in the score, and two, he wanted an oboe. Other than those two instruments, it was up to me. I have a trumpet player, Jeff Bunnell, who I’ve worked with before. It’s a great partnership because he’s a gifted improviser, and sometimes he knows what I want even when I don’t. He did a fantastic job on the score.

I also had a 40-piece string section recorded at the Eastwood Scoring Stage at Warner Bros. Studios. We used players here in town and they added a lot, really bringing the score to life.

Were you conducting the orchestra? Or did you stay close to the engineer in the booth?
I wanted to be next to the recording engineer so I could hear everything as it was being recorded. I had a conductor instead. Besides, I’m a terrible conductor.

What instruments did you choose for the Bronco chase score?
For one of the scenes, I used layers of distorted electric guitars. Another element of the score was musical sound manipulation of acoustic instruments through electronics. It’s a time-consuming way to conjure up sounds, with all the trial and error involved, but the results can sometimes give a film an identity beyond what you can do with an orchestra alone.

So you recorded real instruments and then processed them? Can you share an example of your processing chain?
Sometimes I will get my guitar out and play a phrase. I’ll take that phrase and play it backwards, drop it two octaves, put it through a ring modulator, and then I’ll chop it up into short segments and use that to create a rhythmic pattern. The result is nothing like a real guitar. I didn’t necessarily know what I was going for at the start, but then I’d end up with this cool beat. Then I’d build a cue around that.

The original sound could be anything. I could tap a pencil on a desk and then drop that three octaves, time compress it and do all sorts of other processing. The result is a weird drum sound that no one’s ever heard before. It’s all sorts of experimentation, with the end result being a sound that has some originality and that piques the interest of the person watching the film.
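As a rough illustration of the chain Lionelli describes (reverse, drop the pitch, ring-modulate, then chop into a rhythmic pattern), here is a NumPy sketch. The sample rate, the 30 Hz carrier, the slice length and the pattern are arbitrary stand-ins, and the pitch drop is a crude resample rather than anything studio-grade:

```python
import numpy as np

SR = 48000  # assumed sample rate

def drop_octaves(x, octaves):
    """Crude pitch drop: playing a clip slower by 2**octaves lowers it
    that many octaves (and stretches it by the same factor).
    Uses linear interpolation to resample."""
    stretch = 2 ** octaves
    pos = np.arange(int(len(x) * stretch)) / stretch
    return np.interp(pos, np.arange(len(x)), x)

def ring_mod(x, carrier_hz, sr=SR):
    """Multiply by a sine carrier: the classic ring-modulator effect."""
    t = np.arange(len(x)) / sr
    return x * np.sin(2 * np.pi * carrier_hz * t)

def chop_to_pattern(x, slice_len, pattern):
    """Cut the clip into equal slices and re-sequence the slices named
    by pattern (e.g. [0, 0, 2, 1]) to build a rhythmic figure."""
    slices = [x[i * slice_len:(i + 1) * slice_len]
              for i in range(len(x) // slice_len)]
    return np.concatenate([slices[i % len(slices)] for i in pattern])

# Stand-in for a recorded guitar phrase: one second of a 220 Hz tone.
phrase = np.sin(2 * np.pi * 220 * np.arange(SR) / SR)

# Reverse, drop two octaves, ring-modulate at 30 Hz, chop into a beat.
beat = chop_to_pattern(ring_mod(drop_octaves(phrase[::-1], 2), 30),
                       SR // 8, [0, 0, 2, 1])
```

The same three building blocks cover the pencil-tap example too: drop the tap three octaves, time-compress by resampling the other way, and re-sequence the result.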

To break that down a little further, what program do you work in?
I work in Pro Tools. I went from Digital Performer to Logic — I think most film composers use Logic or Cubase, but there are a growing number who actually use Pro Tools. I don’t need MIDI to jump through a lot of hoops. I just need to record basic lines because most of that stuff gets replaced by real players anyhow.

When you work in Pro Tools, it’s already the delivery format for the orchestra, so you eliminate a conversion step. I’ve been using Pro Tools for the past four years, and so far it’s been working out great. It has some limitations in MIDI, but not that many and nothing that I can’t work around.

What are some of your favorite processing plug-ins?
For pitching, I use Melodyne by Celemony and Serato’s Pitch ‘n’ Time. There’s a new pitch shifter in Pro Tools called X-Form that’s also good. I also use Waves SoundShifter — whatever seems to do a better job for what I’m working on. I always experiment to see which one works the best to give me the sound I’m looking for.

Besides pitch shifters, I use GRM Tools by Ina-GRM. They make weird plug-ins, like one called Shift, that really convolute sound to the point where you can take a drum or rhythmic guitar and turn it into a high-hat sound. It sounds like a weird high-hat, not a real one. You never know what you’re going to get from this plug-in, and that’s why I like it so much.

I also use a lot of Soundtoys plug-ins, like Crystallizer, which can really change sounds in unexpected ways. Soundtoys has great distortion plug-ins too. I’m always on the hunt for something new.

A lot of times I use hardware, like guitar pedals. It’s great to turn real knobs and get results and ideas from that. Sometimes the hardware will have a punchier sound, and maybe you can do more extreme things with it. It’s all about experimentation.

You’ve talked before about using a Guitarviol. Was that how you created the long, suspended bass notes in the Bronco chase score?
Yes, I did use the Guitarviol in that and in other places in the score, too. It’s a very weird instrument, because it looks like a cello but doesn’t sound like one, and it definitely doesn’t sound like a guitar. It has a weird, almost Middle Eastern sound to it, and that makes you want to play in that scale sometimes. Sometimes I’ll use it to write an idea, and then I’ll have my cellist play the same thing on cello.

The Guitarviol is built by Jonathan Wilson, who lives in Los Angeles. He had no idea when he invented this thing that it was going to get adopted by the film composer community here in town. But it has, and he can’t make them fast enough.

Do you end up layering the Guitarviol and the cello in the mix? Or do you just go with straight cello?
It’s usually just straight cello. There are a couple of cellists I use who are great. I don’t want to dilute their performance by having mine in the background. The Guitarviol is an inspiration to write something for the cellists to hear, and then I’ll just have them take over from there.

The overall sound of Part 3 is very brooding, and the percussion choices have complementary deep tones. Can you tell me about some of the choices you made there?
Those are all real drums. I don’t use any samples. I love playing real drums. I have a real timpani, a big Brazilian Surdo drum, a gigantic steel bass drum that sounds like a Caribbean steel drum but only two octaves lower (it has a really odd sound), and I have a classic Ludwig Beatles drum kit. I have a marimba and a collection of small percussion instruments. I use them all.

Sometimes I will pitch the recordings down to make them sound bigger. The Surdo by itself sounds huge, and when you pitch that down half an octave it’s even bigger. So I used all of those instruments and I played them. I don’t think I used a single drum sample on the entire score.
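As an aside (this math is mine, not the composer's), the relationship between a pitch shift and playback speed is simple to compute. A varispeed-style pitch drop of n semitones corresponds to a playback-rate factor of 2^(n/12):

```python
# A minimal sketch (not from the interview) of varispeed pitching math:
# shifting by n semitones corresponds to a playback-rate factor of
# 2 ** (n / 12), so half an octave down (-6 semitones) plays the
# recording at about 0.707x speed, lower and "bigger" sounding.

def rate_factor(semitones: float) -> float:
    """Playback-rate multiplier for a pitch shift of `semitones`."""
    return 2.0 ** (semitones / 12.0)

print(round(rate_factor(-6), 3))   # half an octave down
print(rate_factor(-12))            # a full octave down: 0.5
```

Note that this speed/pitch coupling applies to varispeed-style processing; dedicated pitch shifters such as Melodyne can change pitch while preserving duration.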

When you use percussion samples, you have to hunt around your entire hard drive for a great tom-tom or a Taiko drum. It's so much easier to run over to one in your studio and just play it. You never know how it's going to sound, depending on how you mic it that day, and it's more inspiring to play the real thing. You get great variation. Every time you hit the drum it sounds different, but a sample pretty much sounds the same every time you trigger it.

For striking, did you choose mallets, brushes, sticks, your hands, or other objects?
For the Surdo, I used my hands. I use marimba mallets and timpani mallets for the other instruments. For example, I’ll use timpani mallets for the big steel bass drum. Sometimes I’ll use timpani mallets on my drum kit’s bass drum, because it gives a different sound. It has a more orchestral sound, not like a kick drum from a rock band.

I’m always experimenting. I use brushes a lot on cymbals, and I use the brushes on the steel drum because it gives it a weird sound. You can even use brushes on the timpani, and that creates a strange sound. There are definitely no rules. Whatever you think or can imagine having an effect on the drum, you just try it out. You never know what you’ll get — it’s always good to give it a chance.

In addition to the Bronco chase scene, are there any other tracks that stood out for you in Part 3?
When you score something this long, at a certain point everything starts to run together in your mind. You don't remember what cue belongs to what scene. But there are many that I can remember. During the jury section of that episode, I used an oboe for Johnnie Cochran speaking to the jury. That was an interesting pairing, the oboe and Johnnie Cochran. In a way, the oboe became an extension of his voice during his closing argument. I can't really explain why it worked, but somehow it was the right match.

For the beginning of Part 3, when the police arrive because there was a phone call from Nicole Brown Simpson saying she was afraid of OJ, the cue there was very understated. It had a lot of strange, low sounds to it. That one comes to mind.

At the end of Part 3, they go to OJ’s Rockingham residence, and his lawyers had staged the setting. I did a cue there that was sort of quizzical in a way, just to show the ridiculousness of the whole thing. It was like a farce, the way they set up his residence. So I made the score take a right turn into a different area for that part. It gets away from the dark, brooding undercurrent that the rest of Part 3’s score had.

Of all the parts you could have submitted for Emmy consideration, why did you choose Part 3?
It was a toss-up between Part 2 and Part 3. Part 2 had some of the more major trumpet themes, more of the signature sound with the trumpet and the orchestra. But there were a few examples of that in Part 3, too.

I just felt the Bronco chase, score-wise, had a lot of variation to it, and that it moved in a way that was unpredictable. I ultimately thought that was the way to go, though it was a close race between Part 2 and Part 3.

I found out later that ESPN had submitted Part 3 for Emmy consideration in other categories, so there is a bit of synergy there.

—————-

Jennifer Walden is a New Jersey-based audio engineer and writer.

Barry Sonnenfeld on Netflix’s A Series of Unfortunate Events

By Iain Blair

Director/producer/showrunner Barry Sonnenfeld has a gift for combining killer visuals with off-kilter, broad and often dark comedy, as showcased in such monster hits as the Men in Black and The Addams Family franchises.

He learned from the modern masters of black comedy, the Coen brothers, beginning his prolific career as their DP on their first feature film, Blood Simple, and then shooting such classics as Raising Arizona and Miller's Crossing. He continued his comedy training as the DP on such films as Penny Marshall's Big, Danny DeVito's Throw Momma from the Train and Rob Reiner's When Harry Met Sally.

So maybe it was just a matter of time before Sonnenfeld — whose directing credits include Get Shorty, Wild Wild West, RV and Nine Lives — gravitated toward helming the acclaimed new Netflix show A Series of Unfortunate Events, based on the beloved and best-selling “Lemony Snicket” children’s series by Daniel Handler. After all, with the series’ rat-a-tat dialogue, bizarre humor and dark comedy, it’s a perfect fit for the director’s own strengths and sensibilities.

I spoke with Sonnenfeld, who won a 2007 Primetime Emmy and a DGA Award for his directorial achievement on Pushing Daisies, about making the series, the new golden age of TV, his love of post — and the real story behind why he never directed the film version of A Series of Unfortunate Events.

Weren’t you originally set to direct the 2004 film, and you even hired Handler to write the screenplay?
That’s true. I was working with producer Scott Rudin, who had done the Addams Family films with me, and Paramount decided they needed more money, so they brought in another studio, DreamWorks. But the DreamWorks producer — who had done the Men in Black films with me — and I don’t really get along. So when they came on board, Daniel and I were let go. I’d been very involved with it for a long time. I’d already hired a crew, sets were all designed, and it was very disappointing as I loved the books.

But there’s a happy ending. You are doing Netflix TV series, which seems much closer to the original books than the movie version. How important was finding the right tone?
The single most important job of a director is both finding and maintaining the right tone. Luckily, the tone of the books is exactly in my wheelhouse — creating worlds that are real, but also with some artifice in them, like the Men in Black and Addams Family movies, and Pushing Daisies. I tend to like things that are a bit dark, slightly quirky.

What did you think of the film version?
I thought it was slightly too big and loud, and I wanted to do something more like a children’s book, for adults.

The film version had to stuff Handler’s first three novels into a single movie, but the TV format, with its added length, must work far better for the books?
Far better, and the other great thing is that once Netflix hired me — and it was a long auditioning process — they totally committed. They take a long time finding the right material and pairing it with the right filmmaker, but once they do, they really trust their judgment.

I really wanted to shoot it all on stages, so I could control everything. I didn’t want sun or rain. I wanted gloomy overhead. So we shot it all in Vancouver, and Netflix totally bought into that vision. I have an amazing team — the great production designer Bo Welch, who did Men in Black and other films with me, and DP Bernard Couture.

Patrick Warburton’s deadpan delivery as Lemony Snicket, the books’ unreliable narrator, is a great move compared with having just the film’s voiceover. How early on did you make that change?
When I first met with Netflix, I told them that Lemony should be an on-screen character. That was my goal. Patrick’s just perfect for the role. He’s the sort of Rod Serling/Twilight Zone presence — only more so, as he’s involved in the actual visual style of the show.

How early on do you deal with post and CG for each episode?
Even before we’re shooting. You don’t want to wait until you lock picture to start all that work, or you’ll never finish in time. I’m directing most of it — half the first season and over a third of the second. Bo’s doing some episodes, and we bring in the directors at least a month before the shoot, which is long for TV, to do a shot list. These shows, both creatively and in terms of budget, are made in prep. There should be very few decisions being made in the shoot or surprises in post because basically every two episodes equal one book, and they’re like feature films but on one-tenth of the budget and a quarter of the schedule.

We only have 24 days to do two hours worth of feature film. Our goal is to make it look as good as any feature, and I think we’ve done that. So once we have sequences we’re happy with, we show them to Netflix and start post, as we have a lot of greenscreen. We do some CGI, but not as much as we expected.

Do you also post in Vancouver?
No. We began doing post there for the first season, but we discovered that with our TV budget and my feature film demands and standards, it wasn’t working out. So now we work with several post vendors in LA and San Francisco. All the editorial is in LA.

Do you like the post process?
I’ve always loved it. As Truffaut said, the day you finish filming is the worst it’ll ever be, and then in post you get to make it great again, separating the wheat from the chaff, adding all the VFX and sound. I love prep and post — especially post as it’s the least stress and you have the most time to just think. Production is really tough. Things go wrong constantly.

You used two editors?
Yes, Stuart Bass and Skip MacDonald, and each edits two episodes/one book as we go. I’m very involved, but in TV the director gets a very short time to do their cut, and I like to give notes and then leave. My problem is I’m a micro-manager, so it’s best if I leave because I drive everyone crazy! Then the showrunner — which is also me — takes over. I’m very comfortable in post, with all the editing and VFX, and I represent the whole team and end up making all the post decisions.

Where did you mix the sound?
We did all the mixing on the Sony lot with the great Paul Ottosson who won Oscars for Zero Dark Thirty and The Hurt Locker. We go way back, as he did Men in Black 3 and other shows for me, and what’s so great about him is that he both designs the sound and then also mixes.

The show uses a lot of VFX. Who did them?
We used three main houses — Shade and Digital Sandbox in LA and Tippett in San Francisco. We also used EDI, an Italian company, who came in late to do some wire removal and cleanup.

How important was the DI on this and where did you do it?
We did it all at Encore LA, and the colorist on the first season was Laura Jans Fazio, who was fantastic. It's the equivalent of a movie DI, where you do all the final color timing, and getting the right look was crucial. The DP created very good LUTs, so our rough cut was very close to where we wanted it, and then the DP and I piggy-backed sessions with the colorist. It's a painful experience for me as it's so slow, and like editing, I micro-manage. So I set looks for scenes and then leave.

Barry Sonnenfeld directs Joan Cusack.

Is it a golden age for TV?
Very much so. The writing's a very high standard, and now that everyone has wide-screen TVs there's no more protecting the 4:3 image, which is almost square. When I began doing TV, there was no such thing as a wide shot. Executives would look at my cut, and the first thing they'd always say was, "Do you have a close-up of so and so?" Now it's all changed. But TV is so different from movies. I look back fondly at movie schedules!

How important are the Emmys and other awards?
They’re very important for Netflix and all the new platforms. If you have critical success, then they get more subscribers, more money and then they develop more projects. And it’s great to be acknowledged by your peers.

What’s next?
I’ll finish season two and we’re hopeful about season three, which would keep us busy through fall 2018. And Vancouver’s a perfect place to be as long as you’re shooting on stage and don’t have to deal with the weather.

Will there be a fourth Men in Black?
If there is, I don’t think Will or I will be involved. I suspect there won’t be one, as it might be just too expensive to make now, with all the back-end deals for Spielberg and Amblin and so on. But I hope there’s one.

Images: Joe Lederer/Netflix


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Is television versioning about to go IMF?

By Andy Wilson

If you’ve worked in the post production industry for the last 20 years, you’ll have seen the exponential growth of feature film versioning. What was once a language track dub, subtitled version or country specific compliance edit has grown into a versioning industry that has to feed a voracious number of territories, devices, platforms and formats — from airplane entertainment systems to iTunes deliveries.

Of course, this rise in movie versioning has been helped by the shift over the last 10 years to digital cinema and file-based working. In 2013, SMPTE ratified ST 2067-2, which created the Interoperable Master Format (IMF). IMF was designed to manage the complexity of storing high-quality master rushes in a file structure flexible enough to generate multiple variants of a film, by constraining both what is included in each output and the formats it is delivered in.

Like any workflow and format change, IMF has taken time to be adopted, but it is now becoming the preferred way to share high-quality file masters between media organizations. These masters are all delivered in the JPEG 2000 (J2K) codec to support cinema resolutions and playback technologies.

Technologists in the broadcast community have been monitoring the growth in popularity and flexibility of IMF, with its distinctive solution to the challenge of multiple versioning. Most broadcasters have moved away from tape-based playout and are instead using air-ready playout files. These are medium-bitrate files (50-100Mb/s), derived from high-quality rushes, that can be used on playout servers to create broadcast streams. The most widespread of these is the native XDCAM file format, but it is fast being overtaken by the AS-11 format. AS-11 has proved very popular in the United Kingdom, where all major broadcasters switched to AS-11 UK DPP in 2014, and it is currently rolling out in the US via the AS-11 X8 and X9 variants. However, these remain air-ready playout files, output from the 600+Mb/s ProRes and RAW files used in high-end productions. AS-11 brings some uniformity, but it doesn't solve the versioning challenge.

Versioning is rapidly becoming as big an issue for high-end broadcast content as for feature films. Broadcasters are now seeing the sales lifecycle of some of their programs running for more than 10 years. The BBC’s Planet Earth is a great example of this, with dozens of versions being made over several years. So the need to keep high-quality files for re-versioning for new broadcast and online deliveries has become increasingly important. It is crucial for long-tail sales revenue, and productions are starting to invest in higher-resolution recordings for exactly this reason.

So, as the international high-end television market continues to grow, producers are having to look at ways to share much higher-quality assets than air-ready files. This is where IMF offers significant opportunities for efficiency in the broadcast and wider media market, and why it has the attention of broadcasters such as the BBC and Sky. Major broadcasters like these have been working with global partners through the Digital Production Partnership (DPP) to help develop a new specification of IMF, designed specifically for television and online mastering.

The DPP, in partnership with the North American Broadcasters Association (NABA) and the European Broadcasting Union (EBU), has been exploring the business requirements for a mastering format for broadcasting. The outcome of this work was published in June 2017.

The work explored three different user requirements: Program Acquisitions (incoming), Program Sales (outgoing) and Archive. The sales and acquisition of content can be significantly transformed by the ability to build new versions on the fly, via the Composition Playlist (CPL) and an Output Profile List (OPL). The ability to archive master rushes in a suitably high-quality package will be extremely valuable to broadcast archives. The ability to store ProRes as part of an IMF package is also being welcomed, as many broadcaster archives are already full of ProRes material.
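To make the CPL idea concrete, here is a purely illustrative sketch — invented names, not the SMPTE ST 2067 schema — of how a playlist can describe a territory version by referencing shared track files rather than duplicating media:

```python
# Illustrative only: a CPL-style playlist references segments of shared
# master track files, so a new version is just a new playlist, not a
# new copy of the media. All names here are invented for illustration.

from dataclasses import dataclass

@dataclass
class Segment:
    track_file: str   # reference to a shared master asset
    start: float      # entry point within the asset, in seconds
    duration: float   # seconds

def assemble(playlist):
    """Resolve a CPL-style playlist into (asset, in, out) decisions."""
    return [(s.track_file, s.start, s.start + s.duration) for s in playlist]

# A hypothetical UK version swaps 30 seconds for a compliance edit;
# every other segment points at the same master track file.
uk_version = [
    Segment("video_main.mxf", 0.0, 600.0),
    Segment("video_uk_compliance.mxf", 0.0, 30.0),
    Segment("video_main.mxf", 630.0, 570.0),
]
for asset, t_in, t_out in assemble(uk_version):
    print(asset, t_in, t_out)
```

In an actual IMF package the CPL is XML, the assets are MXF track files, and the OPL constrains how the composition is rendered out, but the design principle — versions as lightweight playlists over shared masters — is the same.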

The EBU-QC group has already started to look at how to manage program quality from a broadcast IMF package, and how technical assessments can be carried out during the outputting of materials, as well as on the component assets. This work paves the way for some innovative solutions to future QC checks, whether carried out locally in the post suite or in the cloud.

The DPP will be working with SMPTE and its partners to fast track a constrained version of IMF ready for use in the broadcast and online delivery market in the first half of 2018.

As OTT video services rely heavily on the ability to output multiple different versions of the source content, this new variant of IMF could play a particularly important role in automatic content versioning and automated processes for file creation and delivery to distribution platforms — not to mention in advertising, where commercials are often re-versioned for multiple territories and states.

The DPP’s work will include the ability to add ProRes- and H.264-derived materials into the IMF package, as well as the inclusion of delivery-specific metadata. The DPP is working to deliver some proof-of-concept presentations for IBC 2017 and will host manufacturer and supplier briefing days and plugfests as work progresses on the draft version of the IMF specification. It is hoped that the work will be completed in time for the IMF specification for broadcast and online to be integrated into products by NAB 2018.

It’s exciting to think about how IMF and Internet-enabled production and distribution tools will work together as part of the architecture of the future content supply chain. This supply chain will enable media companies to respond more quickly and effectively to the ever-growing and changing demands of the consumer. The DPP sees this shift to more responsive operational design as the key to success for media suppliers in the years ahead.


Andy Wilson is head of business development at DPP.

Dailies and post for IFC’s Brockmire

By Randi Altman

When the name Brockmire first entered my vocabulary, it was thanks to a very naughty and extremely funny short video that I saw on YouTube, starring Hank Azaria. It made me laugh-cry.

Fast forward about seven years and the tale of the plaid-jacket-wearing, old-school baseball play-by-play man — who discovers his beloved wife’s infidelity and melts down in an incredibly dirty and curse-fueled way on air — is picked up by IFC, in the series aptly named Brockmire. It stars Azaria, Amanda Peet and features cameos from sportscasters like Joe Buck and Tim Kurkjian.

The Sim Group was called on to provide multiple services for Brockmire: Sim provided camera rentals, Bling Digital provided dailies and workflow services, and Chainsaw provided offline editorial facilities, post finishing services, and deliverables.

We reached out to Chainsaw’s VP of business development, Michael Levy, and Bling Digital’s workflow producer, James Koon, with some questions about workflow. First up is Levy.

Michael Levy

How early did you get involved on Brockmire?
Our role with Brockmire started from the very beginning stages of the project. This was through a working relationship I had with Elizabeth Baquet, who is a production executive at Funny or Die (which produces the show).

What challenges did you have to overcome?
One of the biggest challenges was related to scaling a short to a multi-episode series and having multiple episodes in both production and in post at the same time. However, all the companies that make up Sim Group have worked on many episodic series over the years, so we were in a really good position to offer advice in terms of how to plan a workflow strategy, how to document things properly and how to coordinate getting their camera and dailies offline media from Atlanta to Post Editorial in Los Angeles.

What tools did they need for post and how involved was Chainsaw?
Chainsaw worked very hard with our Sim Group colleagues in Atlanta to provide a level of coordination that I believe made life much simpler for the Brockmire production/editorial team.

Offline editing for the series was done on our Avid Media Composer systems in cutting rooms here in the Chainsaw/Sim Group studio in Los Angeles at the Las Palmas Building.

The Avid dailies media created by Bling-Atlanta, our partner company in the Sim Group, was piped over the Internet each day to Chainsaw. When the Brockmire editorial crew walked into their cutting rooms, their offline dailies media was ready to edit with on their Avid ISIS server workspace. Whenever needed, they were also able to access their Arri Alexa full-res dailies media that had been shipped on Bling drives from Atlanta.

Bling-Atlanta’s workflow supervisor for Brockmire, James Koon, remained fully involved and was able to supervise the pulling of any clips needed for VFX, or respond to any other dailies-related needs.

Deb Wolfe, Funny or Die’s post producer for Brockmire, also had an office here at Chainsaw. She consulted regularly with Annalise Kurinsky (Chainsaw’s in-house producer for Brockmire) and me as they moved along locking cuts and getting ready for post finishing.

In preparation for the finishing work, we were able to set up color tests with Chainsaw senior colorist Andy Lichtstein, who handled final color for the series in one of our FilmLight Baselight color suites. I should note that all of our Chainsaw finishing rooms were right downstairs on the second floor of the same Sim Group Las Palmas Building.

How closely did you work with Deb Wolfe?
Very closely, especially in dealing with an unexpected production problem. Co-star Amanda Peet was accidentally hit in the head by a thrown beer can (how Brockmire!, as they would say in the series). We quickly called in Boyd Stepan, Chainsaw’s senior VFX artist, and came up with a game plan to do Flame paint fixes on all of the affected Amanda Peet shots. We also provided additional VFX compositing for other planned VFX shots in several episodes.

What about the HD online finish?
That was done on Avid Symphony and Baselight by staff online editor Jon Pehlke, making full use of Chainsaw’s Avid/Baselight clip-based AAF workflow.

The last stop in the post process was the Chainsaw Deliverables Department, which took care of QC, any requested videotape dubs, and the creation and digital upload of the specified delivery files.

James Koon

Now for James Koon…

James, what challenges did you have to overcome if any?
I would say that the biggest challenge overall with Brockmire was the timeframe. Twenty-four days to shoot eight episodes is ambitious. While in general this doesn’t pose a specific problem for dailies, the tight shooting schedule meant that certain elements of the workflow were going to need more attention. The color workflow, in particular, created a fair amount of discussion. With the tight schedule on set, the DP (Jeffrey Waldron) wanted to get his look, but wasn’t going to have much time, if any, for on-set coloring. So before they started shooting, we worked with him to set up looks that could be stored in the camera and monitored on set, then applied and tweaked as needed back at the dailies lab using his notes.

Episode information from set to editorial was also an important consideration, as they were shooting material from all eight episodes at once. Making sure to cross-reference and double-check which episode a shot belonged to was important so that editorial could quickly find what they needed.

Can you walk us through the workflow, and how you worked with the producers?
They shot with Arri’s Amira and Alexa Mini, monitoring with the LUTs created before production. This material was offloaded to an on-set backup and a shuttle drive (we generally use a 4TB G-Tech G-RAID with Thunderbolt or USB 3; for local storage, a Promise Pegasus drive, plus a backup on our Facilis Terrablock SAN) that was sent to the lab along with camera notes and any notes from the DP and/or the DIT regarding the look of the material. Once the footage was received at the lab, we would offload it to our local storage and process it in the dailies software, syncing the material to the audio mixer’s recordings and logging the episode, scene and take information for every take, using camera notes, script notes and audio logs to make sure the information was correct and consistent.

We also applied the correct LUT based on the camera reports and tweaked color as needed to match cameras and make any adjustments called for in the DP’s notes. Once all of that was completed, we would render Avid media for editorial and create Internet streaming files for IFC’s Box service, as well as DVDs.

We would bring in the Avid files and organize them into bins per the editorial specs, then upload the files and bins to the editorial location in LA. These files were delivered directly to a dailies partition on their ISIS, so once editorial arrived in the morning, everything was waiting for them.

Once dailies were completed, LTO backups of the media and dailies were written as well as additional temporary backups of the source material as a safety. These final backups were completed and verified by the following morning, and editorial and production were both notified, allowing production to clear cards from the previous day if needed.
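The order of operations Koon describes can be summarized as a checklist. This is a toy sketch of my own, not Bling's actual tooling; the one hard rule the interview stresses is that cards are cleared only after backups are verified:

```python
# Toy sketch of the dailies order of operations described above; the
# step names are mine, not Bling's actual tooling.

DAILIES_STEPS = [
    "offload shuttle drive to local storage",
    "sync picture to the audio mixer's recordings",
    "log episode/scene/take from camera, script and audio notes",
    "apply the camera LUT and tweak color per the DP's notes",
    "render Avid media, streaming files and DVDs",
    "upload bins to editorial's dailies partition",
    "write and verify LTO backups",
    "notify production that cards may be cleared",
]

def card_can_be_cleared(completed_steps):
    """Cards are safe to clear only once LTO backups are verified."""
    return "write and verify LTO backups" in completed_steps

print(card_can_be_cleared(DAILIES_STEPS))       # True
print(card_can_be_cleared(DAILIES_STEPS[:5]))   # False
```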

What tools did you use for dailies?
We used DaVinci Resolve to set original looks with the DP before the show began shooting, Colorfront Express Dailies for dailies processing, Media Composer for Avid editorial prep and bin organization and Imagine’s PreRoll Post for LTO writing and verification.

Harbor’s Bobby Johanson discusses ADR for TV and film

By Jennifer Walden

A lot of work comes in and out of the ADR department at New York City’s Harbor Picture Company. A lot.

Over the past year alone, ADR mixer Bobby Johanson has been cranking out ADR and loop group for films such as Beauty and the Beast, The Light Between Oceans, Patriots Day, The Girl on the Train, Triple 9, Hail, Caesar! and more.

His expertise goes beyond film though. Johanson also does ADR for series, for shows like Amazon’s Red Oaks and their upcoming series The Marvelous Mrs. Maisel, and Netflix’s Master of None, which we will touch on lightly in a bit. First, let’s talk the art of ADR.

According to Johanson, “Last week, I did full days on three different films. Some weeks we record full days, nights and weekends, depending on the season, film festivals, what’s in post, actor availability and everything else that goes on with scheduling. Some sessions will book for two hours out of a day, while another client will want eight hours because of actor availability.”

With so many projects passing through his studio, efficiency is essential, but not at the cost of a job well done. “You have an actor on the stage and the director in the room, and you have to make things efficient,” says Johanson. “You have to play lines back as they are going to be in the show. You want to play the line and hear, ‘Was that ADR?’ Instantly, it’s a whole new world. People have been burned by not so good ADR in the past, and I feel like that compromises the performance. It’s very important for the talent to feel like they’re in good hands, so they forget about the technical side and just focus on their acting.”

Johanson got his start in ADR at New York’s Sound One facility, first as a messenger running reels around, and then moving up to the machine room when there was an opening for Sound One’s new ADR stage. “We didn’t really have anyone teaching us. The job was shown to us once; then we just had to figure out how to thread the dubbers and the projector. Once we got those hung, we would sit in the ADR studio and watch. I picked up a lot of my skills old-school. I’ve learned to incorporate those techniques into current technology and that works well for us.”

Tools
Gear-wise, one staple of his ADR career has been the Soundmaster ADR control system. Johanson calls it an “old-school tool,” probably 25 years old at this point, but he hasn’t found anything faster for recording ADR. “I used it at Sound One, and I used it at Digital Cinema, and now I use it here at Harbor. Until someone can invent another ADR synchronizer, this is the best for me.”

Johanson integrates the Soundmaster system with Avid Pro Tools 12 and works as a two-man team with ADR recordist Mike Rivera. “You can’t beat the efficiency and the attention to detail that you can get with the two-man team.”

Rivera tags the takes and makes minor edits while Johanson focuses on the director and the talent. “Because we are working on a synchronizer, the ADR recordist can do things that you couldn’t do if you were just shooting straight to Pro Tools,” explains Johanson. “We can actually edit on the fly and instantly play back the line in sync. I have the time to get the reverb on it and sweeten it. I can mix the line in because I’m not cutting it or pulling it into the track. That is all being done while the system is moving through the pre-roll for a playback.”

For reverb, Johanson chooses an outboard Lexicon PCM80. This puts the controls easily within reach, and he can quickly add or change the reverb on the fly, helping the clean ADR line to sync into the scene. “The reverb unit is pretty old, but it is single-handedly the easiest reverb unit that you can use. There are four room sizes, and then you can adjust the delay of the reverb four times. I have been using this reverb for so many years now that I can match any reverb from any movie or TV show because I know this unit so well.”

Another key piece of gear in his set-up is an outboard Eventide H3000 SE sampler, which Johanson uses to sample the dialogue line they need to replace and play it back over and over for the actor to re-perform. “We offer a variety of ways to do ADR, like using beeps and having the actor perform to picture, but many actors prefer an older method that goes back to ‘looping.’ Back in the day, you would just run a line over and over again and the actor would emulate it. Then we put the select take of that line to picture. It’s a method that 60 percent of our actors who come in here love to do, and I can do that using the sampler.”

He also uses the sampler for playback. By sampling background noise from the scene, he can play that under the ADR line during playback and it helps the ADR to sit in the scene. “I keep the sampler and reverb as outboard gear because I can control them quickly. I’m doing things freestyle and we don’t have to stop the session. We don’t have to stop the system and wait for a playback or wait to do a record pass. Because we are a two-man operation, I can focus on these pieces of gear while Mike is tagging the takes with their cue numbers and managing them in the Pro Tools session for delivery. I can’t find an easier or quicker way to do what I do.”

While Johanson’s set-up may lack the luster of newly minted audio tools, it’s hard to argue with results. It’s not a case of “if it’s not broke then don’t fix it,” but rather a case of “don’t mess with perfection.”

Master of None
The set-up served them well while recording ADR and loop group for Netflix’s Emmy-winning comedy series Master of None. “Kudos to production sound mixer Michael Barosky because there wasn’t too much dialogue that we needed to replace with ADR for Season 2,” says Johanson. “But we did do a lot of loop group — sweetening backgrounds and walla, and things like that.”

For the Italian episodes, they brought in bilingual actors to record Italian language loop group. One scene that stood out for Johanson was the wedding scene in Italy, where the guests start jumping into the swimming pool. “We have a nice-sized ADR stage and so that frees us up to do a lot of movement. We were directing the actors to jump in front of the mic and run by the mic, to give us the effect of people jumping into the pool. That worked quite nicely in the track.”

Netflix’s The Last Kingdom puts Foley to good use

By Jennifer Walden

What is it about long-haired dudes strapped with leather, wielding swords and riding horses alongside equally fierce female warriors charging into bloody battles? There is a magic to this bygone era that has transfixed TV audiences, as evidenced by the success of HBO’s Game of Thrones, History Channel’s Vikings series and one of my favorites, The Last Kingdom, now on Netflix.

The Last Kingdom, based on a series of historical fiction novels by Bernard Cornwell, is set in late 9th century England. It tells the tale of Saxon-born Uhtred of Bebbanburg who is captured as a child by Danish invaders and raised as one of their own. Uhtred gets tangled up in King Alfred of Wessex’s vision to unite the three separate kingdoms (Wessex, Northumbria and East Anglia) into one country called England. He helps King Alfred battle the invading Danish, but Uhtred’s real desire is to reclaim his rightful home of Bebbanburg from his duplicitous uncle.

Mahoney Audio Post
The sound of the series is gritty and rich with leather, iron and wood elements. The soundtrack’s tactile quality is the result of extensive Foley work by Mahoney Audio Post, which has been with the series since the first season. “That’s great for us because we were able to establish all the sound for each character, village, environment and more, right from the first episode,” says Foley recordist/editor/sound designer Arran Mahoney.

Mahoney Audio Post is a family-operated audio facility in Sawbridgeworth, Hertfordshire, UK. Arran Mahoney explains the studio’s family ties. “Clare Mahoney (mum) and Jason Swanscott (cousin) are our Foley artists, with over 30 years of experience working on high-end TV shows and feature films. My brother Billy Mahoney and I are the Foley recordists and editors/sound designers. Billy Mahoney, Sr. (dad) is the founder of the company and has been a dubbing mixer for over 40 years.”

Their facility, built in 2012, houses a mixing suite and two separate audio editing suites, each with Avid Pro Tools HD Native systems, Avid Artist mixing consoles and Genelec monitors. The facility also has a purpose-built soundproof Foley stage featuring 20 different surfaces including grass, gravel, marble, concrete, sand, pebbles and multiple variations of wood.

Foley artists Clare Mahoney and Jason Swanscott.

Their mic collection includes a Røde NT1-A cardioid condenser microphone and a Røde NTG3 supercardioid shotgun microphone, which they use individually for close-miking or in combination to create more distant perspectives when necessary. They also have two other studio staples: a Neumann U87 large-diaphragm condenser mic and a Sennheiser MKH-416 short shotgun mic.

Going Medieval
Over the years, the Mahoney Foley team has collected thousands of props. For The Last Kingdom specifically, they visited a medieval weapons maker and bought a whole armory of items: swords, shields, axes, daggers, spears, helmets, chainmail, armor, bridles and more. And it’s all put to good use on the series. Mahoney notes, “We cover every single thing that you see on-screen as well as everything you hear off of it.” That includes all the feet (human and horses), cloth, and practical effects like grabs, pick-ups/put downs, and touches. They also cover the battle sequences.

Mahoney says they use 20 to 30 tracks of Foley just to create the layers of detail that the battle scenes need. Starting with the cloth pass, they cover the Saxon chainmail and the Vikings’ leather and fur armor. Then they do basic cloth and leather movements to cover non-warrior characters and villagers. They record a general weapons track, played at low volume, to provide a base layer of sound.

Next they cover the horses from head to hoof, with bridles and saddles, and Foley for the horses’ feet. When asked what’s the best way to Foley horse hooves, Mahoney asserts that it is indeed with coconuts. “We’ve also purchased horseshoes to add to the stable atmospheres and spot FX when required,” he explains. “We record any abnormal horse movements, e.g., crossing a drawbridge or moving across multiple surfaces, and sound designers take care of the rest. Whenever muck or gravel is needed, we buy fresh material from the local DIY stores and work it into our grids/pits on the Foley stage.”

The battle scenes also require Foley for all the grabs, hits and bodyfalls. For the blood and gore, they use a variety of fruit and animal flesh.

Then there’s a multitude of feet to cover the storm of warriors rushing at each other. All the boots they used were wrapped in leather to create an authentic sound that’s true to the time. Mahoney notes that they didn’t want to capture “too much heel in the footsteps, while also trying to get a close match to the sync sound in the event of ADR.”

Surfaces include stone and marble for the Saxon castles of King Alfred and the other noble lords. For the wooden palisades and fort walls, Mahoney says they used a large wooden base accompanied by wooden crates, plinths, boxes and an added layer of controlled creaks to give an aged effect to everything. On each series, they used 20 rolls of fresh grass, lots of hay for the stables, leaves for the forest, and water for all the sea and river scenes. “There were many nights cleaning the studio after battle sequences,” he says.

In addition to the aforementioned props of medieval weapons, grass, mud, bridles and leather, Mahoney says they used an unexpected prop: “The Viking cloth tracks were actually done with samurai suits. They gave us the weight needed to distinguish the larger size of a Danish man compared to a Saxon.”

Their favorite scenes to Foley, and by far the most challenging, were the battle scenes. “Those need so much detail and attention. It gives us a chance to shine on the soundtrack. The way that they are shot/edited can be very fast paced, which lends itself well to micro details. It’s all action, very precise and in your face,” he says. But if they had to pick one favorite scene, Mahoney says it would be “Uhtred and Ragnar storming Kjartan’s stronghold.”

Another challenging-yet-rewarding opportunity for Foley was during the slave ship scenes. Uhtred and his friend are sold into slavery as rowers on a Viking ship, which holds a crew of nearly 30 men. The Mahoney team brought the slave ship to life by building up layers of detail. “There were small wood creaks with small variations of wood and big creaks with larger variations of wood. For the big creaks, we used leather and a broomstick to work into the wood, creating a deep creak sound by twisting the three elements against each other. Then we would pitch shift or EQ to create size and weight. When you put the two together it gives detail and depth. Throw in a few tracks of rigging and pulleys for good measure and you’re halfway there,” says Mahoney.
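The pitch-shift trick Mahoney mentions — dropping a recorded creak to add size and weight — can be approximated by simple resampling. The sketch below is purely illustrative (a stand-in for whatever plug-in the team actually used): reading the samples back more slowly lowers the pitch and lengthens the sound, both of which help a small prop read as a big ship timber.

```python
import numpy as np

def pitch_down(x, semitones):
    """Lower a one-shot effect's pitch by resampling.

    Reading the samples more slowly drops the pitch and stretches the
    sound -- for Foley creaks the added length reads as extra weight.
    """
    rate = 2.0 ** (-semitones / 12.0)         # playback speed < 1
    idx = np.arange(0, len(x) - 1, rate)      # fractional read positions
    lo = idx.astype(int)
    frac = idx - lo
    # Linear interpolation between neighboring samples
    return (1.0 - frac) * x[lo] + frac * x[lo + 1]
```

An octave down (`semitones=12`) halves every frequency and doubles the duration; a real session would follow this with EQ to shape the low end, as Mahoney describes.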

For the sails, they used a two-mic setup to record huge canvas sheets to create a stereo wrap-around feel. For the rowing effects, they used sticks, brooms and wood rubbing, bouncing, or knocking against large wooden floors and solid boxes. They also covered all the characters’ shackles and chains.

Foley is a very effective way to draw the audience in close to a character or to help the audience feel closer to the action on-screen. For example, near the end of Season 2’s finale, a loyal subject of King Alfred has fallen out of favor. He’s eventually imprisoned and prepares to take his own life. The sound of his fingers running down the blade and the handling of his knife make the gravity of his decision palpable.

Mahoney shares another example of using Foley to draw the audience in — during the scene when Sven is eaten by Thyra’s wolves (following Uhtred and Ragnar storming Kjartan’s stronghold). “We used oranges and melons for Sven’s flesh being eaten and for the blood squirts. Then we created some tracks of cloth and leather being ripped. Specially manufactured claw props were used for the frantic, ravenous wolf feet,” he says. “All the action was off-screen so it was important for the audience to hear in detail what was going on, to give them a sense of what it would be like without actually seeing it. Also, Thyra’s reaction needed to reflect what was going on. Hopefully, we achieved that.”

The long, strange trip of Amir Bar-Lev’s new Dead doc

Deadheads take note — Long Strange Trip, director Amir Bar-Lev’s four-hour documentary on rock’s original jam band, the Grateful Dead, is now available for viewing. While the film had a theatrical release in New York and Los Angeles on May 26, the doc was made available on Amazon Video as a six-episode series.

L-R: Jack Lewars and Keith Jenson.

Encompassing the band’s rise and decades-long career, the film, executive produced by Martin Scorsese, was itself 14 years in the making. That included three months of final post at Technicolor PostWorks New York, where colorist Jack Lewars and online editor Keith Jenson worked with Bar-Lev to finalize the film’s form and look.

The documentary features scores of interviews conducted by Bar-Lev with band members and their associates, as well as a mountain of concert footage and other archival media. All that made editorial conforming complex as Jenson (using Autodesk Flame) had to keep the diverse source material organized and make it fit properly into a single timeline. “We had conversions that were made from old analog tapes, archival band footage, DPX scans from film and everything in between,” he recalls. “There was a lot of cool stuff, which was great, but it required attention to detail to ensure it came out nice and smooth.”

The process was further complicated as creative editorial was ongoing throughout post. New material was arriving constantly. “We do a lot of documentary work here, so that’s something we’re used to,” Jenson says. “We have workflows and failsafes in place for all formats and know how to translate them for the Lustre platform Jack uses. Other than the sheer amount, nothing took us by surprise.”

Lewars faced a similar challenge during grading as he was tasked with bringing consistency to material produced over a long period of time by varying means. The overall visual style, he says, recalls the band’s origins in the psychedelic culture of the 1960s. “It’s a Grateful Dead movie, so there are a lot of references to their experiments with drugs,” he explains. “Some sections have a trippy feel where the visuals go in and out of different formats. It almost gives the viewer the sense of being on acid.”

The color palette, too, has a psychedelic feel, reflecting the free-spirited essence of the band and its co-founder. “Jerry Garcia’s life, his intention and his outlook, was to have fun,” Lewars observes. “And that’s the look we embraced. It’s very saturated, very colorful and very bright. We tried to make the movie as fun as possible.”

The narrative is frequently punctuated by animated sequences where still photographs, archival media and other elements are blended together in kaleidoscopic patterns. Finalizing those sequences required a few extra steps. “For the animation sequences, we had to cut in the plates and get them to Jack to grade,” explains Jenson. “We’d then send the color-corrected plates to the VFX and animation department for treatment. They’d come back as completed elements that we’d cut into the conform.”

The documentary climaxes with the death of Garcia and its aftermath. The guitarist suffered a heart attack in 1995 after years of struggling with diabetes and drug addiction. As those events unfold, the story undergoes a mood change that is mirrored in shifts in the color treatment. “There is a four-minute animated sequence in the last reel where Jerry has just passed and they are recapping the film,” Lewars says. “Images are overlaid on top of images. We colored those plates in hyper saturation, pushing it almost to the breaking point.

“It’s a very emotional moment,” he adds. “The earlier animated sequences introduced characters and were funny. But it’s tied together at the end in a way that’s sad. It’s a whiplash effect.”

Despite the length of the project and the complexity of its parts, it came together with few bumps. “Supervising producer Stuart Macphee and his team were amazing,” says Jenson. “They were very well organized, incredibly so. With so many formats and conversions coming from various sources, it could have snowballed quickly, but with this team it was a breeze.”

Lewars concurs. Long Strange Trip is an unusual documentary both in its narrative style and its looks, and that’s what makes it fascinating for Deadheads and non-fans alike. “It’s not a typical history doc,” Lewars notes. “A lot of documentaries go with a cold, bleach-bypass look and gritty feel. This was the opposite. We were bumping the saturation in parts where it felt unnatural, but, in the end, it was completely the right thing to do. It’s like candy.”

You can binge it now on Amazon Video.

Pixelogic acquires Sony DADC NMS’ creative services unit

Pixelogic, a provider of localization and distribution services, has completed the acquisition of the creative services business unit of Sony DADC New Media Solutions, which specializes in 4K, UHD, HDR and IMF workflows for features and episodics. The move expands Pixelogic’s services to the media and entertainment industry and adds capabilities, including experienced staff, proprietary technology and an extended geographic footprint.

According to John Suh, co-president of Pixelogic, the acquisition “expands our team of expert media engineers and creative talent, extends our geographic reach by providing a fully established London operation and further adds to our capacity and capability within an expansive list of tools, technologies, formats and distribution solutions.”

Seth Hallen

Founded less than a year ago, Pixelogic currently employs more than 240 people worldwide and is led by industry veterans Suh and Rob Seidel. While the company is headquartered in Burbank, California, it has additional operations in Culver City, California, London and Cairo.

Sony DADC NMS Creative Services was under the direction of Seth Hallen, who joins Pixelogic as senior VP of business development and strategy. All Sony DADC NMS Creative Services staff, technology and operations are now part of Pixelogic. “Our business model is focused on the deep integration of localization and distribution services for movies and television products,” says Hallen. “This supply chain will require significant change in order to deliver global day and date releases with collapsed distribution windows, and by partnering closely with our customers we are setting out to innovate and help lead this change.”

FX’s Fargo features sounds as distinctive as its characters

By Jennifer Walden

In Fargo, North Dakota, in the dead of winter, there’s been a murder. You might think you’ve heard this story before, but Noah Hawley keeps coming up with a fresh, new version of it for each season of his Fargo series on FX. Sure, his inspiration was the Coen brothers’ Oscar-winning Fargo film, but with Season 3 now underway it’s obvious that Hawley’s series isn’t simply a spin-off.

Martin Lee and Kirk Lynds.

Every season of the Emmy-winning Fargo series follows a different story, with its own distinct cast of characters, set in its own specified point in time. Even the location isn’t always the same — Season 3 takes place in Minnesota. What does link the seasons together is Hawley’s distinct black humor, which oozes from these disparate small-town homicides. He’s a writer and director on the series, in addition to being the showrunner and an executive producer. “Noah is very hands-on,” confirms re-recording mixer Martin Lee at Tattersall Sound & Picture in Toronto, part of the SIM Group family of companies. Lee has been mixing the show with re-recording mixer Kirk Lynds since Season 2.

“Fargo has a very distinct look, feel and sound that you have to maintain,” explains Lee. “The editors, producers and Noah put a lot of work into the sound design and sound ideas while they are cutting the picture. The music is very heavily worked while they are editing the show. By the time the soundtrack gets to us there is a pretty clear path as to what they are looking for. It’s up to us to take that and flesh it out, to make it fill the 5.1 environment. That’s one of the most unique parts of the process for us.”

Season 3 follows rival brothers, Emmit and Ray Stussy (both played by Ewan McGregor). Their feud over a rare postage stamp leads to a botched robbery attempt that ultimately ends in murder (don’t worry, neither McGregor character meets his demise… yet?).

One of the most challenging episodes to mix this season, so far, was Episode 3, “The Law of Non-Contradiction.” The story plays out across four different settings, each with unique soundscapes: Minnesota, Los Angeles in 2010, Los Angeles in 1975 and an animated sci-fi realm. As police officer Gloria Burgle (Carrie Coon) unravels the homicide in Eden Valley, Minnesota, her journey leads her to Los Angeles. There the story dives into the past, to 1975, to reveal the life story of science fiction writer Thaddeus Mobley (Thomas Mann). The episode side-trips into animation land when Gloria reads Mobley’s book titled The Planet Wyh.

One sonic distinction between Los Angeles in 2010 and Los Angeles of 1975 was the density of traffic. Lee, who mixed the dialogue and music, says, “All of the scenes that were taking place in 2010 were very thick with traffic and cars. That was a technical challenge, because the recordings were very heavy with traffic.”

Another distinction is the pervasiveness of technology in social situations, like the bar scene where Gloria meets up with a local Los Angeles cop to talk about her stolen luggage. The patrons are all glued to their cell phones. As the camera pans down the bar, you hear different sounds of texting playing over a contemporary, techno dance track. “They wanted to have those sounds playing, but not become intrusive. They wanted to establish with sound that people are always tapping away on their phones. It was important to get those sounds to play through subtly,” explains Lynds.

In the animated sequences, Gloria’s voice narrates the story of a small android named MNSKY whose spaceman companion dies just before they reach Earth. The robot carries on the mission and records an eon’s worth of data on Earth. The robot is eventually reunited with members of The Federation of United Planets, who cull the android’s data and then order it to shut down. “Because it was this animated sci-fi story, we wanted to really fill the room with the environment much more so than we can when we are dealing with production sound,” says Lee. “As this little robotic character is moving through time on Earth, you see something like the history of man. There’s voiceover, sound effects and music through all of it. It required a lot of finesse to maintain all of those elements with the right kind of energy.”

The animation begins with a spaceship crashing into the moon. MNSKY wakes and approaches the injured spaceman who tells the android he’s going to die. Lee needed to create a vocal process for the spaceman, to make it sound as though his voice is coming through his helmet. With Audio Ease’s Altiverb, Lee tweaked the settings on a “long plastic tube” convolution reverb. Then he layered that processed vocal with the clean vocal. “It was just enough to create that sense of a helmet,” he says.
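Altiverb’s convolution approach, and the wet-under-dry layering Lee describes, reduce to a few lines of DSP. This is a minimal sketch; the impulse response and mix level are made-up placeholders, not the actual “long plastic tube” preset.

```python
import numpy as np

def convolve_ir(dry, ir):
    """Convolution reverb: filter the dry signal with an impulse response.

    Multiplying FFTs (with enough zero-padding) is equivalent to
    time-domain convolution, which is how convolution reverbs work.
    """
    n = len(dry) + len(ir) - 1
    return np.fft.irfft(np.fft.rfft(dry, n) * np.fft.rfft(ir, n), n)

def helmet_voice(dry, ir, wet_gain=0.4):
    """Layer the processed vocal under the clean take, as Lee describes.

    wet_gain is an assumed mix level, not a documented setting.
    """
    out = convolve_ir(dry, ir) * wet_gain
    out[:len(dry)] += dry            # clean vocal stays on top
    return out
```

Feeding a short “plastic tube” impulse response into `convolve_ir` and blending the result quietly under the dry line is enough to suggest a helmet without obscuring the performance.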

At the end, when MNSKY rejoins the members of the Federation on their spaceship it’s a very different environment from Earth. The large, ethereal space is awash in long, warm reverbs, which Lynds applied using plug-ins like PhoenixVerb 5.1 and Altiverb. Lee also applied a long reverb treatment to the dialogue. “The reverbs have quite a significant pre-delay, so you almost have that sense of a repeat of the voice afterwards. This gives it a very distinctive, environmental feel.”
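Pre-delay is simply a gap inserted before the reverb return: push it long enough and the wet tail separates from the dry voice as the “repeat” Lee mentions. A minimal sketch, using an assumed pre-rendered wet signal rather than a real reverb tail:

```python
import numpy as np

def mix_with_predelay(dry, wet, predelay):
    """Sum a reverb return behind the dry signal after `predelay` samples.

    A large pre-delay lets the dry voice finish before the tail arrives,
    so the reverb reads almost as an echo of the line.
    """
    n = max(len(dry), predelay + len(wet))
    out = np.zeros(n)
    out[:len(dry)] += dry
    out[predelay:predelay + len(wet)] += wet
    return out
```

At 48kHz, a pre-delay of a few thousand samples (tens of milliseconds and up) is where the tail starts to detach audibly from the voice.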

Lynds and Lee spend two days premixing their material on separate dub stages. For the premix, Lynds typically has all the necessary tracks from supervising sound editor Nick Forshager while Lee’s dialogue and music tracks come in more piecemeal. “I get about half the production dialogue on day one and then I get the other half on day two,” says Lee. “ADR dribbles in the whole time, including well into the mixing process. ADR comes in even after we have had several playbacks already.”

Fortunately, the show doesn’t rely heavily on ADR. Lee notes that they put a lot of effort into preserving the production. “We use a combination of techniques. The editors find the cleanest lines and takes (while still keeping the performance), then I spend a lot of time cleaning that up,” he says.

This season Lee relies more on Cedar’s DNS One plug-in for noise reduction and less on the iZotope RX5 (Connect version). “I’m finding with Fargo that the showrunners are uniquely sensitive to the effects of the iZotope processing. This year it took more work to find the right sound. It ends up being a combination of both the Cedar and the RX5,” reports Lee.

After premixing, Lee and Lynds bring their tracks together on Tattersall’s Stage 1. They have three days for the 5.1 final mix. They spend one (very) long day building the episode in 5.1 and then send their mix to Los Angeles for Forshager and co-producer Gregg Tilson to review. Then Lee and Lynds address the first round of notes the next morning and send the mix back to Los Angeles for another playback. Each consecutive playback is played for more people. The last playback is for Hawley on the third day.

“One of the big challenges with the workflow is mixing an episode in one day. It’s a long mix day. At least the different time zones help. We send them a mix to listen to typically around 6-7pm PST, so it’s not super late for them. We start at 8am EST the next morning, which is three hours ahead of their time. By the time they’re in the studio and ready to listen, it is 10am their time and we’ve already spent three or four hours handling the revisions. That really works to our advantage,” says Lee.

Sound in the Fargo series is not an afterthought. It’s used to build tension, like a desk bell that rings for an uncomfortably long time, or to set the mood of a space, like an overly noisy fish tank in a cheap apartment. By the time the tracks have made it to the mixers, there’s been “a lot of time and effort spent thinking about what the show was going to sound like,” says Lynds. “From that sense, the entire mix for us is a creative opportunity. It’s our chance to re-create that in a 5.1 environment, and to make that bigger and better.”

You can catch new episodes of Fargo on FX Networks, Wednesdays at 10pm EST.


Jennifer Walden is a New Jersey-based audio engineer and writer.

Assistant Editor’s Bootcamp coming to Burbank in June

The new Assistant Editors’ Bootcamp, founded by assistant/lead editors Noah Chamow (The Voice) and Conor Burke (America’s Got Talent), is a place for assistant editors and aspiring assistants to learn and collaborate with one another in a low-stakes environment. The next Assistant Editors’ Bootcamp classes will be held on June 10-11, along with a Lead Assistant Editors’ class geared toward understanding troubleshooting and system performance on June 24-25. All classes, sponsored by AlphaDogs’ Editor’s Lounge, will be held at Skye Rentals in Burbank.

The classes will cover such topics as The Fundamentals of Video, Media Management, Understanding I/O and Drive Speed, Prepping Footage for Edit, What’s New in Media Composer, Understanding System Performance Bottlenecks and more. Cost is $199 for two days for the Assistant Editor class, and $299 for two days for the Lead Assistant Editor class. Space is on a first-come, first-served basis and is limited to 25 participants per course. You can register here.

A system with Media Composer 8.6 or later (a 30-day Avid trial is available) and an external hard drive are required to take the class. 8GB of system memory and Windows 7/OS X 10.9 or later are needed to run Media Composer 8.6. Computer rentals are available for as little as $54 a week from Hi-Tech Computer Rental in Burbank.

Chamow and Burke came up with the idea for Assistant Editors’ Bootcamp when they realized how challenging it is to gain any real on-the-job experience in today’s workplace. With today’s focus being primarily on doing things faster and more efficiently, it’s almost impossible to find the time to figure out why one method of doing something is faster than another. Having worked extensively in reality television and created “The Super Grouper,” a multi-grouping macro for Avid that is now widely used in reality post workflows, Chamow understands first-hand the landscape of the assistant editor’s world. “One of the most difficult things about working in the entertainment industry, especially in a technical position, is that there is never time to learn,” he says. “I’m very passionate about education and hope by hosting these classes, I can help other assistants hone their skills as well as helping those who are new to the business get the experience they need.”

Having worked as both an assistant editor and lead assistant editor, Burke has created workflows and overseen post for up to 10 projects at a time, before moving into his current position at NBC’s America’s Got Talent. “In my years of experience and working on grueling deadlines, I completely understand how difficult the job of an assistant editor can be, having little or no time to learn anything other than what’s right in front of you,” he says. “In teaching this class, I hope to make peers feel more confident and have a better understanding in their work, taking them to the next level in their careers.”

Main Image (L-R): Noah Chamow and Conor Burke.

The A-List: Director Ron Howard discusses National Geo’s Genius

By Iain Blair

Ron Howard has done it all in Hollywood. The former child star of The Andy Griffith Show and Happy Days not only successfully made the tricky transition to adult actor (at 22 he starred opposite John Wayne in The Shootist and was nominated for a Best Supporting Actor Oscar), but went on to establish himself as an Oscar-winning director and producer (A Beautiful Mind). He is also one of Hollywood’s most beloved, commercially successful and versatile helmers.

Since making his directorial debut in 1977 with Grand Theft Auto (when he was still on Happy Days), he’s made an eclectic group of films about boxers (Cinderella Man), astronauts (Apollo 13), mermaids (Splash), symbologists (The Da Vinci Code franchise), politicians (Frost/Nixon), firefighters (Backdraft), mathematicians (A Beautiful Mind), Formula One racing (Rush), whalers (In the Heart of the Sea) and the Fab Four (his first documentary, The Beatles: Eight Days a Week).

Born in Oklahoma with showbiz in his DNA — his parents were both actors — Howard “always wanted to direct” and notes that “producing gives you control.” In 1986, he co-founded Imagine Entertainment with Brian Grazer, a powerhouse in film and TV (Empire, Arrested Development) production. His latest project is the new Genius series for National Geographic.

The 10-part global event series — the network’s first scripted series — is based on Walter Isaacson’s book “Einstein: His Life and Universe” and tracks Albert Einstein’s rise from humble origins as an imaginative and rebellious thinker through his struggles to be recognized by the establishment, to his global celebrity status as the man who unlocked the mysteries of the cosmos with his theory of relativity.

But if you’re expecting a dry, intellectual by-the-numbers look at his life and career, you’re in for a big surprise.

With an impressive cast that includes Geoffrey Rush as the celebrated scientist in his later years, Johnny Flynn as Einstein in the years before he rose to international acclaim and Emily Watson as his second wife — and first cousin — Elsa Einstein, the show is full of sex, drugs and rock ‘n’ roll.

We’re mostly joking, but the series does balance the hard-to-grasp scientific theories with an entertaining exploration of a man with an often very messy private life as it follows Einstein’s alternately exhilarating emotions and heartlessness in dealing with his closest personal relationships, including his children, his two wives and the various women with whom he cheats on them.

Besides all the personal drama, there’s plenty of global drama as Genius is set against an era of international conflict over the course of two world wars. Faced with rising anti-Semitism in Europe, surveillance by spies and the potential for atomic annihilation, Einstein struggles as a husband and a father, not to mention as a man of principle, even as his own life is put in danger.

I talked recently with Ron Howard about directing the first episode and his love of production and post.

What was the appeal of doing this and making your scripted television directorial debut with the first episode?
I’ve become a big fan of all the great TV shows people are doing now, where you let a story unfold in a novelistic way, and I was envious of a lot of my peers getting into doing TV — and this was a great project that just really suits the TV format. Over the years, I had read various screenplays about Einstein but they just never worked as a movie, so when National Geographic wanted to reach out to their audience in a more ambitious way, suddenly there was this perfect platform to do this life justice and have the length it needed. It’s an ideal fit, and it was perfect to do it with National Geographic.

Given that you had considered making a film about him, how familiar were you with Einstein and his life? How do you find the drama in an academic’s life?
I thought I had some insight, but I was blown away by the book and Noah Pink’s screenplay, and everyone on the team brought their own research to the process, and it became more and more fascinating. There was this constant pressure on Einstein that I felt we could work with through the whole series, and that I never realized was there. And with that pressure, there’s drama. We came very close to not benefiting from his genius because of all the forces against him – sometimes from external forces, like governments and academic institutions, but often from his own foibles and flaws. He was even on a hit list. So I was really fascinated by his whole story.

What most surprised you about Einstein once you began delving deeper into his private life?
That he was such a Lothario! He had quite a complicated love life, but it was also that he had such a dogged commitment to his principles and logic and point-of-view. I was doing post on the Beatles documentary as we prepped, and it was the same thing with those young men. They often didn’t listen to outside influences and people telling them it couldn’t be done. They absolutely committed to their musical vision and principles with all their drive and focus, and it worked — and collectively I think you could say the band was genius.

Einstein also trusted his convictions, whether it was physics or math, and if the conventional answers didn't satisfy his sense of logic, he'd just dig deeper. The same thing can be said for his personal life and relationships, and trying to find a balance between his career and life's work, and family and friends. Look at his falling in love with his fellow physics student Mileva Maric, which causes all sorts of problems, especially when she unexpectedly gets pregnant. No one else thought she was particularly attractive, she was a bit of an outcast as the only female physics student, and yet his logic called him to her. The same thing with politics. He went his own way in everything. He was a true Renaissance man, eternally curious about everything.

In terms of dealing with very complex ideas that aren’t necessarily very cinematic, it must have helped that you’d made A Beautiful Mind?
Yes, we saw a lot of similarities between the two. It really helped that both men were essentially visualists — Einstein even more so than John Nash. That gave us a big advantage, and it gave me the chance to show audiences some of his famous thought experiments in cinematic ways. He described them very vividly, and they're a fantastic jumping-off point — it was his visualizations that helped him wrap his head around the physics. He began with something he could grasp physically and then went back to prove it with the math. Those principles gave him the amazing insights about the nature of the universe, and time and space, that we've all benefitted from.

I assume you began integrating post and all the VFX very early on?
Right away, in preproduction meetings in Prague, in the Czech Republic, where Einstein lived and taught early in his career. We had our whole team there on location, including our VFX supervisor Eric Durst and his team, DP Mathias Herndl, our production designers and art directors and so on. With all the VFX, we stayed pretty close to how Einstein described his thought experiments. The one that starts off this first episode is very vivid, whereas the first one he has as a 17-year-old boy is done in a more chalkboard kind of way, where he faints and can barely hang on mentally to the image. All the dailies and visual effects were done by UPP.

Where did you do the post?
We did all the editing and sound back in LA.

Do you like the post process?
I love it. I love the edit and slowly pulling it all together after the stress of the shoot.

It was edited by James Wilcox, who’s done CSI: Miami and Hawaii Five-O, along with Debby Germino and J. Kathleen Gibson. How early was James involved and was he on set?
My usual editors, Dan and Mike, weren't available. It's the first time I'd worked with James and he's very creative and did a great job. He wasn't on the set, but we were constantly in communication and we'd send him material back to LA and then when I got back, we sat down together.

The show constantly cuts back and forth in time.
Yes, I was fascinated by all those transitions, and I worked very closely with my team to make sure we had all that down and that it all flowed smoothly in the edit. For instance, Johnny Flynn plays violin and trained classically, so he actually plays in all those scenes. Geoffrey doesn't play violin, but he practiced for several months, and we had a teacher on set too. Geoffrey was so dedicated to creating this character. They both looked at tons of footage of Einstein as an older man, so Johnny could develop aspects of Einstein's manner and behavior as the younger one, which Geoffrey could work with later, so we had a real continuity to the character. That's a big reason why I wanted to be so hands-on with the first episode, as we were defining so many key aspects of the man and the aesthetics and the way we'd be telling the whole story.

Can you talk about working on the sound and music?
It's always huge to me and adds so much to every scene. Lorne Balfe wrote a fantastic score and we had a great sound team: production sound mixer Peter Forejt, supervising sound editor Daniel Pagan, music editor Del Spiva and re-recording mixers Mark Hensley and Bob Bronow. For post-production audio, we used Smart Post Sound.

The DI must have been important?
It was very important since we were trying to do stuff with the concept of time in very subtle ways using the camera work, the palette and the lighting style. This all changed subtly depending on whether it was an Einstein memory, or a flashback to his younger, brasher self, or looking ahead to the iconic older man where it was all a little more formal. So we went for different looks to match the different energies and, of course, the editing style had to embody all of that as well. The colorist was Pankaj Bajpai, and he did a great job.

What’s next?
I plan to do more TV. Remember, I came out of TV and it's so exciting now. I'm also developing several movie projects, including Seveneves, a sci-fi film, and Under the Banner of Heaven, which is based on the Jon Krakauer bestseller. So whatever comes together first.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Game of Thrones: VFX associate producer Adam Chazen

With excitement starting to build for the seventh season of HBO’s Game of Thrones, what better time to take a quick look back at last season’s VFX workflow. HBO associate VFX producer Adam Chazen was kind enough to spend some time answering questions after just wrapping Season 7.

Tell us about your background as a VFX associate producer and what led you to Game of Thrones.
I got my first job as a PA at VFX studio Pixomondo. I was there for a few years, working under my current boss Steve Kullback (visual effects producer on Game of Thrones). He took me with him when he moved to work on Yogi Bear, and then on Game of Thrones.

I’ve been with the show since 2011, so this is my sixth year on board. It’s become a real family at this point; lots of people have been on since the pilot.

From shooting to post, what is your role working on Game of Thrones?
As the VFX associate producer, in pre-production mode I assist with organizing our previs and concept work. I help run and manage our VFX database and I schedule reviews with producers, directors and heads of departments.

During production I make sure everyone has what they need on set in order to shoot for the various VFX requirements. Also during production, we start to post the show — I’m in charge of running review sessions with our VFX supervisor Joe Bauer. I make sure that all of his notes get across to the vendors and that the vendors have everything they need to put the shots together.

Season 7 has actually been the longest we’ve stayed on set before going back to LA for post. When in Belfast, it’s all about managing the pre-production and production process, making sure everything gets done correctly to make the later VFX adjustments as streamlined as possible. We’ll have vendors all over the world working on that next step — from Australia to Spain, Vancouver, Montreal, LA, Dublin and beyond. We like to say that the sun never sets on Game of Thrones.

What’s the process for bringing new vendors onto the show?
They could be vendors that we’ve worked with in the past. Other times, we employ vendors that come recommended by other people. We check out industry reels and have studios do testing for us. For example, when we have dragon work we ask around for vendors willing to run dragon animation tests for us. A lot of it is word of mouth. In VFX, you work with the people that you know will do great work.

What’s your biggest challenge in creating Game of Thrones?
We’re doing such complex work that we need to use multiple vendors. This can be a big hurdle. In general, whether it be film or TV, when you have multiple vendors working on the same shot, it becomes a potential issue.

Linking in with cineSync helps. We can have a vendor in Australia and a vendor in Los Angeles both working on the same shot, at exactly the same time. I first started using cineSync while at Pixomondo and found it makes the revision process a lot quicker. We send notes out to vendors, but most of the time it’s easier to get on cineSync, see the same image and draw on it.

Even the simple move of hovering a cursor over the frame can answer a million questions. We have several vendors who don’t use English as their first language, such as those in Spain. In these cases, communication is a lot easier via cineSync. By pointing to a single portion of a single frame, we completely bypass the language barrier. It definitely helps to see an image on screen versus just explaining it.

What is your favorite part of the cineSync toolkit?
We’ve seen a lot of cool updates to cineSync. Specifically, I like the notes section, where you can export a PDF to include whichever frame that note is attributed to.

Honestly, just seeing a cursor move on-screen from someone else’s computer is huge. It makes things so much easier to just point and click. If we’re talking to someone on the phone, trying to tell them about an issue in the upper left hand corner, it’s going to be hard to get our meaning across. cineSync takes away all of the guesswork.

Besides post, we also heavily use cineSync for shoot needs. We shoot the show in Northern Ireland, Iceland, Croatia, Spain and Calgary. With cineSync, we are able to review storyboards, previs, techvis and concepts with the producers, directors, HODs and others, wherever they are in the world. It’s crucial that everyone is on the same page. Being able to look at the same material together helps everyone get what they want from a day on set.

Is there a specific shot, effect or episode you’re particularly proud of?
The Battle of the Bastards — it was a huge episode. Particularly, the first half of the episode when Daenerys came in with her dragons at the battle of Meereen, showing those slavers who is boss. Meereen City itself was a large CG creation, which was unusual for Game of Thrones. We usually try to stay away from fully CG environments and like to get as much in-camera as possible.

For example, when the dragon breathes fire, we used footage of an actual flamethrower that we shot. Back in Season 5, we started to pre-animate the dragon, translate it to a motion control rig, and attach a flamethrower to it. It moves exactly how the dragon would move, giving us a practical element to use in the shot. CG fire can be done, but it's really tricky. Real is real, so you can't question it.

With multiple vendors working on the sequence, we had Rodeo FX do the environment while Rhythm & Hues did the dragons. We used cineSync a lot, reviewing shots between both vendors in order to point out areas of concern. Then in the second half of the episode, which was the actual Battle of the Bastards, the work was brilliantly done by Australian VFX studio Iloura.

Jason Moss composes music for ABC’s The Toy Box

By Jennifer Walden

Children may not be the best source for deciding when bedtime should be, or deciding what’s for dinner (chicken nuggets again?), but who better to decide what toys kids want to play with? A large part of the Tom Hanks film Big was based on this premise.

ABC’s new inventor-centric series, The Toy Box, which premiered in April, features four young judges who are presented with new toy inventions. They then get to decide which toy prototypes would be popular with others in their demographic. Toy inventors competing on the show first meet with a set of “expert mentors,” a small group of adults who delve into the specifics of the toy and offer advice.

Jason Moss

If the toy makes it past that panel, it gets put into the "toy box." The toy is then presented to the four young judges, who get to play with it, ask questions and give their critique to the toy inventor. The four young judges deliberate and make a final decision on which toy will advance to the next round. At the end of the season, the judges will choose one winning toy to be made by Mattel and sold exclusively at Toys 'R' Us.

The Toy Box needed a soundtrack that could both embody the essence of juvenile joviality and portray the pseudo-seriousness of its pre-teen decision makers. It's not a job for your average reality show composer. It required askew musical sensibilities. "The music is fun and super-pop sounding with cool analog synths and video game sounds. It's really energetic and puts a smile on your face," says the series composer/music supervisor Jason Moss at Super Sonic Noise in Los Angeles. "Then for the decision-making cues, as the kids decide whether they like a toy and what they're going to do, it had to be something other than what you'd expect. It couldn't sound too dark. It still had to be quirky."

Moss knows quirky. He was the composer on IFC’s Gigi Does It, starring David Krumholtz as an eccentric Jewish grandmother living in Florida. Moss also composed the theme music for the Seeso original series Bajillion Dollar Propertie$, a partially improvised comedy series that pokes fun at real estate reality shows.

Moss covered all of The Toy Box’s musical needs — from high-energy pop and indie rock tracks when the kids are playing with the toys to comedic cues infused with ukulele and kitschy strings, and tension tracks for moments of decision. He wrote original music as well as curated selections from the Bulletproof Bear music catalog. Bulletproof Bear offers a wide variety of licensable tracks written by Moss, plus other music catalogs they represent. “It’s a big collection with over 33,000 tracks. We can really compete with bigger music license companies because we have a huge amount of diverse music that can cover the whole production from head to toe,” he says.

The Gear
Moss composes in Apple's Logic Pro X. He performed live guitars, bass and ukulele (using the Kala U-Bass bass ukulele). For mics, he chose Miktek Audio's CV4 large-diaphragm tube condenser and their C5 small-diaphragm pencil condenser, each paired with Empirical Labs Mike-E preamps.

Moss combined the live sounds with virtual instruments, particularly those from Spectrasonics. XLN Audio's Addictive Drums were his go-to for classic and modern drum sounds. For synths, he used reFX's Nexus, libraries from Native Instruments' Kontakt, Arturia's Analog Lab and their VOX Continental V. He also called on the ROLI Equator sound engine via the ROLI Rise 25 key MIDI controller, which features soft, squishy silicone keys very different from a traditional keyboard controller. The Akai MPK88 weighted key controller is Moss' choice in that department. For processing and effects, he chose plug-ins by Soundtoys and PSP Audioware. He also incorporated various toy and video game sounds into the tracks.

The Score
The show's two-minute opener combines three separate segments — the host (Modern Family's Eric Stonestreet), the expert mentor introductions and the judges' introductions. Each has its own musical vibe. The host and the expert mentors have original music that Moss wrote specifically for the show. The judges have a dramatic pulsing-string track that is licensed from Bulletproof Bear's catalog. In addition, a five-second tag for The Toy Box logo is licensed from the Bulletproof Bear catalog. That tag was composed by Jon LaCroix, who is one of Moss' business partners. Regarding the dramatic strings on the kids' entrance, Moss, who happened to write that cue, says, "The way they filmed the kids… it's like they are little mini adults. So the theme has some seriousness to it. In context, it's really cute."

For the decision-making cues, Moss wanted to stay away from traditional tension strings. To give the track a more playful feel that would counterbalance the tension, he used video game sounds and 808 analog drum sounds. “I also wanted to use organic sounds that were arpeggiated and warm. They are decision-making tick-tock tracks, but I wanted to make it more fun and interesting,” says Moss.

“We were able to service the show on the underscore side with Bulletproof Bear’s music catalog in conjunction with my original music. It was a great opportunity for us to keep all the music within our company and give the client a one-stop shop, keeping the music process organized and easy,” he explains. “It was all about finding the right sound, or the right cue, for each of those segments. At the end of the day, I want to make sure that everybody is happy, acknowledge the showrunners’ musical vision and strive to capture that. It was a super-fun experience, and hopefully it will come back for a second, third and tenth season! It’s one of those shows you can watch with your kids. The kid judges are adorable and brutally honest, and with the myriad of adult programming out there, it’s refreshing to see a show like The Toy Box get green-lit.”

A chat with Emmy-winning comedy editor Sue Federman

This sitcom vet talks about cutting Man With A Plan and How I Met Your Mother.

By Dayna McCallum

The art of sitcom editing is widely enjoyed but underappreciated. While millions of people laugh out loud every day enjoying their favorite situation comedies, very few give credit to the maestro behind the scenes: the sitcom editor.

Sue Federman is one of the best in the business. Her work on the comedy How I Met Your Mother earned three Emmy wins and six nominations. Now the editor of CBS’ new series, Man With A Plan, Federman is working with comedy legends Matt LeBlanc and James Burrows to create another classic sitcom.

However, Federman’s career in entertainment didn’t start in the cutting room; it started in the orchestra pit! After working as a professional violinist with orchestras in Honolulu and San Francisco, she traded in her bow for an Avid.

We sat down to talk with Federman about the ins and outs of sitcom editing, that pesky studio audience, and her journey from musician to editor.

When did you get involved with your show, and what is your workflow like?
I came onto Man With A Plan (MWAP) after the original pilot had been picked up. They recast one of the leads, so there was a reshoot of about 75 percent of the pilot with our new Andi, Liza Snyder. My job was to integrate the new scenes with the old. It was interesting to preserve the pace and feel of the original and to be free to bring my own spin to the show.

The workflow of the show is pretty fast since there’s only one editor on a traditional audience sitcom. I usually put a show together in two to three days, then work with the producers for one to two days, and then send a pretty finished cut to the studio/network.

What are the biggest challenges you face as an editor on a traditional half-hour comedy?
One big challenge is managing two to three episodes at a time — assembling one show while doing producer or studio/network notes on another, as well as having to cut preshot playbacks for show night, which can be anywhere from three to eight minutes of material that has to be cut pretty quickly.

Another challenge is the live audience laughter. It’s definitely a unique part of this kind of show. I worked on How I Met Your Mother (HIMYM) for nine years without an audience, so I could completely control the pacing. I added fake laughs that fit the performances and things like that. When I came back to a live audience show, I realized the audience is a big part of the way the performances are shaped. I’ve learned all kinds of ways to manipulate the laughs and, hopefully, still preserve the spontaneous live energy of the show.

How would you compare cutting comedy to drama?
I haven’t done much drama, but I feel like the pace of comedy is faster in every regard, and I really enjoy working at a fast pace. Also, as opposed to a drama or anything shot single-camera, the coverage on a multi-cam show is pretty straightforward, so it’s really all about performance and pacing. There’s not a lot of music in a multi-cam, but you spend a lot of time working with the audience tracks.

What role would you say an editor has in helping to make a “bit” land in a half-hour comedy?
It’s performance, timing and camera choices — and when it works, it feels great. I’m always amazed at how changing an edit by a frame or two can make something pop. Same goes for playing something wider or closer depending on the situation.

MWAP is shot before a live studio audience. How does that affect your rhythm?
The audience definitely affects the rhythm of the show. I try to preserve the feeling of the laughs and still keep the show moving. A really long laugh is great on show night, but usually we cut it down a bit and play off more reactions. The actors on MWAP are great because they really know how to “ride” the laughs and not break character. I love watching great comedic actors, like the cast of I Love Lucy, for example, who were incredible at holding for laughs. It’s a real asset and very helpful to the editor.

Can you describe your process? And what system do you edit the show on?
I've always used Avid Media Composer. I've dabbled with Final Cut, but prefer Avid. I assemble the whole show in one sequence and go scene by scene. I watch all of the takes of a scene and make choices for each section or sometimes for each line. Then I chunk the scene together, sometimes putting in two choices for a line or area. I then cut into the big pieces to select the cameras for each shot. After that, I go back and find the rhythm of the scene — tightening the pace, cutting into the laughs and smoothing them.

After the show is put together, I go back and watch the whole thing again, pretending that I’ve never seen it, which is a challenge. That makes me adjust it even more. I try to send out a pretty polished first cut, without cutting any dialogue to show the producers everything, which seems to make the whole process go faster. I’m lucky that the directors on MWAP are very seasoned and don’t really give me many notes. Jimmy Burrows and Pam Fryman have directed almost all of the episodes, and I don’t send out a separate cut to either of them. Particularly with Pam, as I’ve worked with her for about 11 years, so we have a nice shorthand.

How do assistant editors work into the mix?
My assistant, Dan “Steely” Esparza, is incredible! He allows me to show up to work every day and not think about anything other than cutting the show. He’s told me, even though I always ask, that he prefers not to be an editor, so I don’t push him in that direction. He is excellent at visual effects and enjoys them, so I always have him do those. On HIMYM, we had quite a lot of visual effects, so he was pretty busy there. But on MWAP, it’s mostly rough composites for blue/greenscreen scenes and painting out errant boom shadows, boom mics and parts of people.

Your work on HIMYM was highly lauded. What are some of your favorite “editing” moments from that show and what were some of the biggest challenges they threw at you?
I really loved working on that show — every episode was unique, and it really gave me opportunities to grow as an editor. Carter Bays and Craig Thomas were amazing problem solvers. They were able to look at the footage and make something completely different out of it if need be. I remember times when a scene wasn’t working or was too long, and they would write some narration, record the temp themselves, and then we’d throw some music over it and make it into a montage.

Some of the biggest editing challenges were the music videos/sequences that were incorporated into episodes. There were three complete Robin Sparkles videos and many, many other musical pieces, almost always written by Carter and Craig. In "P.S. I Love You," they incorporated her last video into kind of a Canadian Behind the Music about the demise of Robin Sparkles, and that was pretty epic for a sitcom. The gigantic "Subway Wars" was another big challenge, in that it had 85 "scenelets." It was a five-way race around Manhattan to see who could be first to get to a restaurant where Woody Allen was supposedly eating, with each person using a different mode of transportation. Crazy fun and also extremely challenging to fit into a sitcom schedule.

You started in the business as a classical musician. How does your experience as a professional violinist influence your work as an editor?
I think the biggest thing is having a good feeling for the rhythm of whatever I'm working on. I love to be able to change the tempo and to make something really pop. And when I'm asked to change the pacing or cut sections out while doing various people's notes, I'm able to embrace that too. Collaborating is a big part of being a musician, and I think that's helped me a lot in working with different personalities. It's not unlike responding to a conductor or playing chamber music. Having an understanding of phrasing and the overall structure of a piece is also valuable; even though that was musical phrasing and structure, it's not all that different.

Obviously, whenever there's actual music involved, I feel pretty comfortable handling it or choosing the right piece for a scene. If classical music is involved, I have a great deal of knowledge that can be helpful. For example, on HIMYM, we needed something to be a theme for Barney's Playbook antics. I tried a few things, and we landed on Mozart's Rondo alla Turca, which I've been hearing lately in the Progresso Soup commercials.

How did you make the transition from the concert hall to the editing room?
It’s a long story! I was playing in the San Francisco Ballet Orchestra and was feeling stuck. I was lucky enough to find an amazing career counseling organization that helped me open my mind to all kinds of possibilities, and they helped me to discover the perfect job for me. It was quite a journey, but the main thing was to be open to anything and identify the things about myself that I wanted to use. I learned that I loved music (but not playing the violin), puzzles, stories and organizing — so editing!

I sold a bow, took the summer off from playing and enrolled in a summer production workshop at USC. I wasn’t quite ready to move to LA, so I went back to San Francisco and began interning at a small commercial editing house. I was answering phones, emptying the dishwasher, getting coffees and watching the editing, all while continuing to play in the Ballet Orchestra. The people were great and gave me opportunities to learn whenever possible. Luckily for me, they were using the Avid before it came to TV and features. Eventually, there was a very rough documentary that one of the editors wanted to cut, but it wasn’t organized. They gave me the key to the office and said, “You want to be an editor? Organize this!” So I did, and they started offering me assistant work on commercials. But I wanted to cut features, so I started to make little trips to LA to meet anybody I could.

Bill Steinberg, an editor working in the Universal Syndication department who I met at USC, got me hooked up with an editor who was to be one of Roger Corman’s first Avid editors. The Avids didn’t arrive right away, but he helped me put my name in the hat to be an assistant the next time. It happened, and I was on my way! I took a sabbatical from the orchestra, went down to LA, and worked my tail off for $400 a week on three low-budget features. I was in heaven. I had enough hours to join the union as an assistant, but I needed money to pay the admission fee. So I went back to San Francisco and played one month of Nutcrackers to cover the fee, and then I took another year sabbatical. Bill offered me a month position in the syndication department to fill in for him, and show the film editors what I knew about the Avid.

Eventually Andy Chulack, the editor of Coach, was looking for an Avid assistant, and I was recommended because I knew it. Andy hired me and took me under his wing, and I absolutely loved it. I guess the upshot is, I was fearlessly naive and knew the Avid!

What do you love most about being an editor?
I love the variation of material and people that I get to work with, and I like being able to take time to refine things. I don’t have to play it live anymore!

Lost in Time game show embraces ‘Interactive Mixed Reality’

By Daniel Restuccio

The Future Group — which has partnered with Fremantle Media, Ross Video and Epic Games — has created a new super-agile entertainment platform that blends linear television and game technology into a hybrid format called "Interactive Mixed Reality."

The brainchild of Bård Anders Kasin, this innovative content deployment medium generated a storm of industry buzz at NAB 2016, and their first production Lost in Time — a weekly primetime game show — is scheduled to air this month on Norwegian television.

The Idea
The idea originated more than 13 years ago in Los Angeles. In 2003, at age 22, Kasin, a self-taught multimedia artist from Notodden, Norway, sent his CV and a bunch of media projects to Warner Bros. in Burbank, California, in hopes of working on The Matrix. They liked it. His interview was on a Wednesday and by Friday he had a job as a technical director.

Kasin immersed himself in the cutting-edge movie revolution that was The Matrix franchise. The Wachowskis' visionary production was a masterful inspiration, featuring a compelling sci-fi action story, Oscar-winning editing, breakthrough visual effects ("bullet-time") and an expanded media universe that included video games and an anime-style short, The Animatrix. The Matrix Reloaded and The Matrix Revolutions were shot at the same time, along with more than an hour of footage designed specifically for the video game. The Matrix Online, an Internet gaming platform, was a direct sequel to The Matrix Revolutions.

L-R: Bård Anders Kasin and Jens Petter Høili.

Fast forward to 2013 and Kasin has connected with software engineer and serial entrepreneur Jens Petter Høili, founder of EasyPark and Fairchance. "There was this producer I knew in Norway," explains Kasin, "who runs this thing called the Artists' Gala charity. He called and said, 'There's this guy you should meet. I think you'll really hit it off.'" Kasin met Høili, and over lunch they discussed the projects each was working on. "We both immediately felt there was a connection," recalls Kasin. No persuading was necessary. "We thought that if we combined forces we were going to get something that's truly amazing."

That meeting of the minds led to the merging of their companies and the formation of The Future Group. The mandate of the Oslo-based company is to revolutionize the television medium by combining linear TV production with cutting-edge visual effects, interactive gameplay, home viewer participation and e-commerce. Their IMR concept ditches the limiting individual virtual reality (VR) headset, but conceptually keeps the idea of creating content that is a multi-level, intricate and immersive experience.

Lost in Time
Fast forward again, this time to 2014. Through another mutual friend, The Future Group formed an alliance with Fremantle Media. Fremantle, a global media company, has produced some of the highest-rated and longest-running shows in the world, and is responsible for top international entertainment brands such as Got Talent, Idol and The X Factor.

Kasin started developing the first IMR prototype. At this point, the Lost in Time production had expanded to include Ross Video and Epic Games. Ross Video is a broadcast technology innovator and Epic Games is a video game producer and the inventor of the Unreal game engine. The Future Group, in collaboration with Ross Video, engineered the production technology and developed a broadcast-compatible version of the Unreal game engine called Frontier, shown at NAB 2016, to generate high-resolution, realtime graphics used in the production.

On January 15, 2015, the first prototype was shown. When Fremantle saw the prototype, they were amazed. They went directly to stage two, moving to the larger stages at Dagslys Studios. “Lost in Time has been the driver for the technology,” explains Kasin. “We’re a very content-driven company. We’ve used that content to drive the development of the platform and the technology, because there’s nothing better than having actual content to set the requirements for the technology rather than building technology for general purposes.”

In Lost in Time, three studio contestants are set loose on a greenscreen stage and perform timed, physical game challenges. The audience, which could be watching at home or on a mobile device, sees the contestant seamlessly blended into a virtual environment built out of realtime computer graphics. The environments are themed as western, ice age, medieval times and Jurassic period sets (among others) with interactive real props.

The audience can watch the contestants play the game or participate in the contest as players on their mobile device at home, riding the train or literally anywhere. They can play along or against contestants, performing customized versions of the scripted challenges in the TV show. The mobile content uses graphics generated from the same Unreal engine that created the television version.

“It’s a platform,” reports partner Høili, referring to the technology behind Lost in Time. A business model is a way you make money, notes tech blogger Jonathan Clarks, and a platform is something that generates business models. So while Lost in Time is a specific game show with specific rules, built on television technology, it’s really a business technology framework where multiple kinds of interactive content could be generated. Lost in Time is like the Unreal engine itself, software that can be used to create games, VR experiences and more, limited only by the imagination of the content creator. What The Future Group has done is create a high-tech kitchen from which any kind of cuisine can be cooked up.

Soundstages and Gear
Lost in Time is produced on two greenscreen soundstages at Dagslys Studios in Oslo. The main “gameplay set” takes up all of Studio 1 (5,393 square feet) and the “base station set” is on Studio 3 (1,345 square feet). Over 150 liters (40 gallons) of ProCyc greenscreen paint was used to cover both studios.

Ross Video, in collaboration with The Future Group, devised an integrated technology of hardware and software that supports the Lost in Time production platform. This platform consists of custom cameras, lenses, tracking, control, delay, chroma key, rendering, greenscreen, lighting and switcher technology. This system includes the new Frontier hardware, introduced at NAB 2016, which runs the Unreal game engine 3D graphics software.

Eight Sony HDC-2500 cameras running HZC-UG444 software are used for the production. Five are deployed on the “gameplay set.” One camera rides on a technocrane, two are on manual pedestal dollies and one is on Steadicam. For fast-action tracking shots, another camera sits on the Furio RC dolly that rides on a straight track that runs the 90-foot length of the studio. The Furio RC pedestal, controlled by SmartShell, guarantees smooth movement in virtual environments and uses absolute encoders on all axes to send complete 3D tracking data into the Unreal engine.

There is also one Sony HDC-P1 camera used as a static, center-stage ceiling cam mounted 30 feet above the gameplay set. There are three cameras in the home base set, two on Furio Robo dollies and one on a technocrane. In the gameplay set, all cameras (except the ceiling cam) are tracked with the SolidTrack IR markerless tracking system.

All filming is done at 1080p25 and output as RGB 444 via SDI. They use a custom LUT on the cameras to avoid clipping and preserve an expanded dynamic range for post work. All nine camera ISOs (the separate camera “clean feeds”) are recorded with the “flat” LUT in RGB 444. For all other video streams, including keying and compositing, they use LUT boxes to invert the signal back to Rec 709.
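The flat-capture-then-invert round trip can be sketched with a generic log curve. This is an illustration only: the actual transform on the show is Sony's HZC-UG444 LUT, and the `gain` parameter here is hypothetical.

```python
import math

def flat_encode(linear, gain=5.0):
    """Hypothetical 'flat' log encode: compresses highlights so the SDI
    signal avoids clipping while retaining dynamic range for post."""
    return math.log1p(gain * linear) / math.log1p(gain)

def flat_decode(encoded, gain=5.0):
    """Inverse curve, as a LUT box would apply to restore normal levels
    for keying and monitoring."""
    return math.expm1(encoded * math.log1p(gain)) / gain

# Round trip: mid-gray in, mid-gray out.
mid_gray = 0.18
restored = flat_decode(flat_encode(mid_gray))
```

The key property is that the two LUTs are exact inverses, so the keyer and monitors downstream see conventional levels while the recorded ISOs keep the flat encoding.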

Barnfind provided the fiber optic network infrastructure that links all the systems. Ross Video Dashboard controls the BarnOne frames as well as the router, Carbonite switchers, Frontier graphics system and robotic cameras.

A genlock signal distributed via openGear syncs all the gear to a master clock. The Future Group added proprietary code to Unreal so the render engine can genlock, receive and record linear timecode (LTC) and output video via SDI in all industry-standard formats. They also extended the Unreal engine to control lights via DMX, send and receive GPI signals, communicate with custom sensors, buttons, switches and wheels used for interaction with the games, and control motion simulation equipment.

In order for the “virtual cameras” in the graphics systems and the real cameras viewing the real elements to have the exact same perspectives, an “encoded” camera lens is required that provides the lens focal length (zoom) and focus data. In addition, the virtual lens field of view (FOV) must be properly calibrated to match the FOV of the real lens. Full-servo digital lenses with 16-bit encoders are needed for virtual productions. Lost in Time uses three Canon lenses with these specifications: Canon HJ14ex4.3B-IASE, Canon HJ22ex7.6B-IASE-A and Canon KJ17ex7.7B-IASE-A.
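The FOV matching described above reduces to a simple pinhole relation: given the focal length reported by the lens encoders and the sensor's active width, the virtual camera's horizontal FOV is 2·atan(w / 2f). A minimal sketch, assuming the roughly 9.59 mm active width of a 2/3-inch broadcast sensor (the real calibration also has to account for lens distortion and breathing, which this ignores):

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm=9.59):
    """Horizontal field of view of a pinhole camera model.

    The encoded zoom position of the lens maps to a focal length, which
    sets the virtual camera's FOV so the rendered perspective matches
    the real one.
    """
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Wide end of a 4.3 mm broadcast lens vs. zoomed in to 60 mm:
wide = horizontal_fov_deg(4.3)   # roughly 96 degrees
tele = horizontal_fov_deg(60.0)  # much narrower
```

As the lens zooms in, the virtual FOV must narrow in lockstep; any mismatch shows up immediately as the graphics "swimming" against the real elements.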

The Lost in Time camera feeds are routed to the Carbonite family hardware: Ultrachrome HR, Carbonite production frame and Carbonite production switcher. Carbonite Ultrachrome HR is a stand-alone multichannel chroma key processor based on the Carbonite Black processing engine. On Lost in Time, the Ultrachrome switcher accepts the Sony camera RGB 444 signal and uses high-resolution chroma keyers, each with full control of delay management, fill color temperature for scene matching, foreground key and fill, and internal storage for animated graphics.

Isolated feeds of all nine cameras are recorded, plus two quad-splits with the composited material and the program feed. Metus Ingest, a proprietary hardware solution from The Future Group, was used for all video recording. Metus Ingest can simultaneously capture and record up to six HD channels of video and audio from multiple devices on a single platform.

Post Production
While the system is capable of broadcasting live, they decided not to go live for the debut; instead, they are doing only a modest amount of post to retain the live feel. That said, the potential of the post workflow on Lost in Time arguably sets a whole new post paradigm. “Post allows us to continue to develop the virtual worlds for a longer amount of time,” says Kasin. “This gives us more flexibility in terms of storytelling. We’re always trying to push the boundaries with the creative content, how we tell the story of the different challenges.”

All camera metadata, including position, rotation and lens data, plus all game interaction, was recorded in the Unreal engine with a proprietary system. This allowed the graphics to be played back later as a recorded session. It also let the editors change any part of the graphics non-destructively: they could replace 3D models or textures, change the tracking or point of view of any of the virtual cameras in post, or add cameras for more virtual “coverage.”
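A recorded session of this kind boils down to logging camera state once per timecoded frame and replaying it through the renderer against whatever scene assets exist at replay time. A minimal sketch of the idea (the field names are illustrative, not The Future Group's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class CameraSample:
    timecode: str            # LTC, e.g. "10:00:00:00"
    position: tuple          # (x, y, z) in world units
    rotation: tuple          # (pan, tilt, roll) in degrees
    focal_length_mm: float
    focus_distance_m: float

@dataclass
class SessionRecorder:
    """Log one sample per frame during the shoot; replay later."""
    samples: list = field(default_factory=list)

    def record(self, sample):
        self.samples.append(sample)

    def playback(self, render_frame):
        # Non-destructive re-render: render_frame can use different 3D
        # models, textures, or even an altered camera path per sample.
        return [render_frame(s) for s in self.samples]

rec = SessionRecorder()
rec.record(CameraSample("10:00:00:00", (0, 0, 0), (0, 0, 0), 14.0, 3.5))
frames = rec.playback(lambda s: s.timecode)
```

Because only metadata is stored, swapping a texture or moving a virtual camera in post just means replaying the same session through the updated scene.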

Lost in Time episodes were edited as a multicam project, based on the program feed, in Adobe Premiere CC. They have a multi-terabyte storage solution from Pixit Media running Tiger Technology’s workflow manager. “The EDL from the final edit is fed through a custom system, which then builds a timeline in Unreal to output EXR sequences for a final composite.”

That’s it for now, but be sure to visit this space again to see part two of our coverage on The Future Group’s Lost in Time. Our next story will cover the real and virtual lighting systems, the SolidTrack IR tracking system and the backend component, and will include interviews with Epic Games’ Kim Libreri about Unreal engine development/integration and with a Lost in Time episode editor.


Daniel Restuccio, who traveled to Oslo for this piece, is a writer, producer and teacher. He is currently multimedia department chairperson at California Lutheran University in Thousand Oaks.


The colorful dimensions of Amazon’s Mozart in the Jungle

By Randi Altman

How do you describe Amazon’s Mozart in the Jungle? Well, in its most basic form it’s a comedy about the changing of the guard — or maestro — at the New York Philharmonic, and the musicians who make up that orchestra. When you dig deeper, you get a behind-the-scenes look at the back-biting and craziness that goes on in the lives and heads of these gifted artists.

Timothy Vincent


Based on the novel Mozart in the Jungle: Sex, Drugs, and Classical Music by oboist Blair Tindall, the series — which won the Golden Globe last year and was nominated this year — has shot in a number of locations over its three seasons, including Mexico and Italy.

Since its inception, Mozart in the Jungle has been finishing in 4K and streaming in both SDR and HDR. We recently reached out to Technicolor’s senior color timer, Timothy Vincent, who has been on the show since the pilot, to find out more about the show’s color workflow.

Did Technicolor have to gear up infrastructure-wise for the show’s HDR workflow?
We were doing UHD 4K already and were just getting our HDR workflows worked out.

What is the workflow from offline to online to color?
The dailies are done in New York based on the Alexa K1S1 709 LUT. (Technicolor On-Location Services handled dailies out of Italy, and Technicolor PostWorks in New York.) After the offline and online, I get the offline reference made with the dailies, so I can check it if I have a question about what was intended.

If someone was unsure about watching in HDR versus SDR, what would you tell them?
The emotional feel of both the SDR and the HDR is the same. That is always the goal in the HDR pass for Mozart. One of the experiences that is enhanced in the HDR is the depth of field and the three-dimensional quality you gain in the image. This really plays nicely with the feel in the landscapes of Italy, the stage performances where you feel more like you are in the audience, and the long streets of New York just to name a few.

When I’m grading the HDR version, I’m able to retain more highlight detail than I was in the SDR pass. For someone who has not yet been able to experience HDR, I would actually recommend that they watch an episode of the show in SDR first and then in HDR so they can see the difference between them. At that point they can choose what kind of viewing experience they want. I think that Mozart looks fantastic in both versions.

What about the “look” of the show? What kind of direction were you given?
We established the look of the show based on conversations and collaboration in my bay. It has always been a filmic look with soft blacks and yellow warm tones as the main palette for the show. Then we added in a fearlessness to take the story in and out of strong shadows. We shape the look of the show to guide the viewers to exactly the story that is being told and the emotions that we want them to feel. Color has always been used as one of the storytelling tools on the show. There is a realistic beauty to the show.

What was your creative partnership like with the show’s cinematographer, Tobias Datum?
I look forward to each episode and discovering what Tobias has given me as palette and mood for each scene. For Season 3, we picked up where we left off at the end of Season 2. We had established the look and feel of the show and only had to account for a large portion of Season 3 being shot in Italy, making sure to convey the different quality of light and the warmth and beauty of Italy. We did this by playing with natural warm skin tones and the contrast of light and shadow he was creating for the different moods and locations. The same can be said for the two episodes in Mexico in Season 2. I now know what Tobias likes and can make decisions I’m confident he will like.

From a director and cinematographer’s point of view, what kind of choices does HDR open up creatively?
It depends on if they want to maintain the same feel of the SDR or if they want to create a new feel. If they choose to go in a different direction, they can accentuate the contrast and color more with HDR. You can keep more low-light detail while being dark, and you can really create a separate feel to different parts of the show… like a dream sequence or something like that.

Any workflow tricks/tips/trouble spots within the workflow or is it a well-oiled machine at this point?
I have actually changed the way I grade my shows based on the evolution of this show. My end results are the same, but I learned how to build grades that translate to HDR much more easily and consistently.

Do you have a color assistant?
I have a couple of assistants that I work with who help me with prepping the show, getting proxies generated, color tracing and some color support.

What tools do you use — monitor, software, computer, scope, etc.?
I am working on Autodesk Lustre 2017 on an HP Z840, while monitoring on both a Panasonic CZ950 and a Sony X300. I work on Omnitek scopes off the downconverter to 2K. The show is shot on both Alexa XT and Alexa Mini, framing for 16×9. All finishing is done in 4K UHD for both SDR and HDR.

Anything you would like to add?
I would only say that everyone should be open to experiencing both SDR and HDR and giving themselves that opportunity to choose which they want to watch and when.

ACE Eddie nominees include Arrival, Manchester by the Sea, Better Call Saul

The American Cinema Editors (ACE) have named the nominees for the 67th ACE Eddie Awards, which recognize editing in 10 categories of film, television and documentaries.

Winners will be announced during ACE’s annual awards ceremony on January 27 at the Beverly Hilton Hotel. In addition to the regular editing awards, J.J. Abrams will receive the ACE Golden Eddie Filmmaker of the Year award.

Check out the nominees:

BEST EDITED FEATURE FILM (DRAMATIC)
Arrival
Joe Walker, ACE

Hacksaw Ridge
John Gilbert, ACE

Hell or High Water
Jake Roberts

Manchester by the Sea
Jennifer Lame
 
Moonlight
Nat Sanders, Joi McMillon

BEST EDITED FEATURE FILM (COMEDY)
Deadpool
Julian Clarke, ACE

Hail, Caesar!
Roderick Jaynes

The Jungle Book
Mark Livolsi, ACE

La La Land
Tom Cross, ACE

The Lobster
Yorgos Mavropsaridis

BEST EDITED ANIMATED FEATURE FILM
Kubo and the Two Strings
Christopher Murrie, ACE

Moana
Jeff Draheim, ACE

Zootopia
Fabienne Rawley and Jeremy Milton

BEST EDITED DOCUMENTARY (FEATURE)

13th
Spencer Averick

Amanda Knox
Matthew Hamachek

The Beatles: Eight Days a Week — The Touring Years
Paul Crowder

OJ: Made in America
Bret Granato, Maya Mumma and Ben Sozanski

Weiner
Eli B. Despres

BEST EDITED DOCUMENTARY (TELEVISION)
The Choice 2016
Steve Audette, ACE

Everything Is Copy
Bob Eisenhardt, ACE

We Will Rise: Michelle Obama’s Mission to Educate Girls Around the World
Oliver Lief

BEST EDITED HALF-HOUR SERIES
Silicon Valley: “The Uptick”
Brian Merken, ACE

Veep: “Morning After”
Steven Rasch, ACE

Veep: “Mother”
Shawn Paper

BEST EDITED ONE-HOUR SERIES — COMMERCIAL
Better Call Saul: “Fifi”
Skip Macdonald, ACE

Better Call Saul: “Klick”
Skip Macdonald, ACE & Curtis Thurber

Better Call Saul: “Nailed”
Kelley Dixon, ACE and Chris McCaleb

Mr. Robot: “eps2.4m4ster-s1ave.aes”
Philip Harrison

This is Us: “Pilot”
David L. Bertman, ACE

BEST EDITED ONE-HOUR SERIES – NON-COMMERCIAL
The Crown: “Assassins”
Yan Miles, ACE

Game of Thrones: “Battle of the Bastards”
Tim Porter, ACE

Stranger Things: “Chapter One: The Vanishing of Will Byers”
Dean Zimmerman

Stranger Things: “Chapter Seven: The Bathtub”
Kevin D. Ross

Westworld: “The Original”
Stephen Semel, ACE and Marc Jozefowicz

BEST EDITED MINISERIES OR MOTION PICTURE (NON-THEATRICAL)
All the Way
Carol Littleton, ACE

The Night Of: “The Beach”
Jay Cassidy, ACE

The People V. OJ Simpson: American Crime Story: “Marcia, Marcia, Marcia”
Adam Penn, Stewart Schill, ACE and C. Chi-yoon Chung

BEST EDITED NON-SCRIPTED SERIES:
Anthony Bourdain: Parts Unknown: “Manila” 
Hunter Gross, ACE

Anthony Bourdain: Parts Unknown: “Senegal”
Mustafa Bhagat

Deadliest Catch: “Fire at Sea: Part 2”
Josh Earl, ACE and Alexander Rubinow, ACE

Final ballots will be mailed on January 6, and voting ends on January 17. The Blue Ribbon screenings, where judging for all television categories and the documentary categories takes place, will be held on January 15. Projects in the aforementioned categories are viewed and judged by committees composed of professional editors (all ACE members). All 850-plus ACE members vote during the final balloting of the ACE Eddies, including active members, life members, affiliate members and honorary members.

Main Image: Tilt Photo

What it sounds like when Good Girls Revolt for Amazon Studios

By Jennifer Walden

“Girls do not do rewrites,” says Jim Belushi’s character, Wick McFadden, in Amazon Studios’ series Good Girls Revolt. It’s 1969, and he’s the national editor at News of the Week, a fictional news magazine based in New York City. He’s confronting new researcher Nora Ephron (Grace Gummer), who claims credit for a story that Wick has just praised in front of the entire newsroom staff. The trouble is that women aren’t writers in 1969; they’re only “researchers,” following leads and gathering facts for the male writers.

When Nora’s writer drops the ball by delivering a boring courtroom story, she rewrites it as an insightful articulation of the country’s cultural climate. “If copy is good, it’s good,” she argues to Wick, testing the old conventions of workplace gender-bias. Wick tells her not to make waves, but it’s too late. Nora’s actions set in motion an unstoppable wave of change.

While the series is set in New York City, it was shot in Los Angeles. The newsroom they constructed had an open floor plan with a bi-level design. The girls are located in “the pit” area downstairs from the male writers. The newsroom production set was hollow, which caused an issue with the actors’ footsteps that were recorded on the production tracks, explains supervising sound editor Peter Austin. “The set was not solid. It was built on a platform, so we had a lot of boomy production footsteps to work around. That was one of the big dialogue issues. We tried not to loop too much, so we did a lot of specific dialogue work to clean up all of those newsroom scenes,” he says.

The main character Patti Robinson (Genevieve Angelson) was particularly challenging because of her signature leather riding boots. “We wanted to have an interesting sound for her boots, and the production footsteps were just useless. So we did a lot of experimenting on the Foley stage,” says Austin, who worked with Foley artists Laura Macias and Sharon Michaels to find the right sound. All the post sound work (sound editorial, Foley, ADR, loop group and final mix) was handled at Westwind Media in Burbank under the guidance of post producer Cindy Kerber.

Austin and dialog editor Sean Massey made every effort to save production dialog when possible and to keep the total ADR to a minimum. Still, the newsroom environment and several busy street scenes proved challenging, especially when the characters were engaged in confidential whispers. Fortunately, “the set mixer Joe Foglia was terrific,” says Austin. “He captured some great tracks despite all these issues, and for that we’re very thankful!”

The Newsroom
The newsroom acts as another character in Good Girls Revolt. It has its own life and energy. Austin and sound effects editor Steve Urban built rich backgrounds with tactile sounds, like typewriters clacking and dinging, the sound of rotary phones with whirring dials and bell-style ringers, the sound of papers shuffling and pencils scratching. They pulled effects from Austin’s personal sound library, from commercial sound libraries like Sound Ideas, and had the Foley artists create an array of period-appropriate sounds.

Loop group coordinator Julie Falls researched and recorded walla that contained period appropriate colloquialisms, which Austin used to add even more depth and texture to the backgrounds. The lively backgrounds helped to hide some dialogue flaws and helped to blend in the ADR. “Executive producer/series creator Dana Calvo actually worked in an environment like this and so she had very definite ideas about how it would sound, particularly the relentlessness of the newsroom,” explains Austin. “Dana had strong ideas about the newsroom being a character in itself. We followed her guide and wanted to support the scenes and communicate what the girls were going through — how they’re trying to break through this male-dominated barrier.”

Austin and Urban also used the backgrounds to reinforce the difference between the hectic state of “the pit” and the more mellow writers’ area. Austin says, “The girls’ area, the pit, sounds a little more shrill. We pitched up the phones a little bit and made it feel more chaotic. The men’s raised area feels less strident. This was subtle, but I think it helps to set the tone that these girls were ‘in the pit,’ so to speak.”

The busy backgrounds posed their own challenge too. When the characters are quiet, the room still had to feel frenetic but it couldn’t swallow up their lines. “That was a delicate balance. You have characters who are talking low and you have this energy that you try to create on the set. That’s always a dance you have to figure out,” says Austin. “The whole anarchy of the newsroom was key to the story. It creates a good contrast for some of the other scenes where the characters’ private lives were explored.”

Peter Austin

The heartbeat of the newsroom is the teletype machines that fire off stories, which in turn set the newsroom in motion. Austin reports the teletype sound they used was captured from a working teletype machine they had on set. “They had an authentic teletype from that period, so we recorded that and augmented it with other sounds. Since that was a key motif in the show, we sweetened the teletype with other sounds, like machine guns for example, to give it a boost every now and then when it was a key element in the scene.”

Austin and Urban also built rich backgrounds for the exterior city shots. In the series opener, archival footage of New York City circa 1969 paints the picture of a rumbling city, moved by diesel-powered buses and trains, and hulking cars. That footage cuts to shots of war protestors and police lining the sidewalk. Their discontented shouts break through the city’s continuous din. “We did a lot of texturing with loop group for the protestors,” says Austin. He’s worked on several period projects over the years and has amassed a collection of old vehicle recordings that they used to build the street sounds on Good Girls Revolt. “I’ve collected a ton of NYC sounds over the years. New York in that time definitely has a different sound than it does today. It’s very distinct. We wanted to sell New York of that time.”

Sound Design
Good Girls Revolt is a dialogue-driven show but it did provide Austin with several opportunities to use subjective sound design to pull the audience into a character’s experience. The most fun scene for Austin was in Episode 5 “The Year-Ender” in which several newsroom researchers consume LSD at a party. As the scene progresses, the characters’ perspectives become warped. Austin notes they created an altered state by slowing down and pitching down sections of the loop group using Revoice Pro by Synchro Arts. They also used Avid’s D-Verb to distort and diffuse selected sounds.

“We got subjective by smearing different elements at different times. The regular sound would disappear and the music would dominate for a while and then that would smear out,” describes Austin. They also used breathing sounds to draw in the viewer. “This one character, Diane (Hannah Barefoot), has a bad experience. She’s crawling along the hallway and we hear her breathing while the rest of the sound slurs out in the background. We build up to her freaking out and falling down the stairs.”
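The slowed, pitched-down treatment of the loop group can be sketched as naive resampling: stretching the waveform lowers pitch and slows time together. This is only a toy illustration of the effect; a tool like Revoice Pro manipulates pitch and timing independently, which this sketch deliberately does not.

```python
def pitch_down(samples, factor=1.5):
    """Naive pitch/speed drop via linear-interpolation resampling.

    Reading through the input at 1/factor speed stretches the waveform,
    lowering its pitch and slowing it down in one coupled move.
    """
    out = []
    pos = 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        # Linear interpolation between adjacent input samples.
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += 1.0 / factor
    return out

# A short repeating tone comes out longer (and, on playback, lower).
tone = [0.0, 1.0, 0.0, -1.0] * 100
slowed = pitch_down(tone, 1.5)
```

The same idea applied selectively to backgrounds and walla, mixed against unprocessed elements, produces the "smearing" Austin describes.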

Austin and Urban did their design and preliminary sound treatments in Pro Tools 12 and then handed it off to sound effects re-recording mixer Derek Marcil, who polished the final sound. Marcil was joined by dialog/music re-recording mixer David Raines on Stage 1 at Westwind. Together they mixed the series in 5.1 on an Avid ICON D-Control console. “Everyone on the show was very supportive, and we had a lot of creative freedom to do our thing,” concludes Austin.

Creating and tracking roaches for Hulu’s 11.22.63

By Randi Altman

Looking for something fun and compelling to watch while your broadcast shows are on winter break? You might want to try Hulu’s original eight-part miniseries 11.22.63, which the streaming channel released last February.

It comes with a pretty impressive pedigree — it’s based on a Stephen King novel, it’s executive produced by J.J. Abrams, it stars Oscar-nominee James Franco (127 Hours) and it’s about JFK’s assassination and includes time travel. C’mon!

The plot involves Franco’s character traveling back to 1960 in an effort to stop JFK’s assassination, but just as he makes headway, he feels the past pushing back in some dangerous, and sometimes gross, ways.

Bruce Branit

In the series pilot, Franco’s character, Jake Epping, is being chased by Kennedy’s security after he tries to sneak into a campaign rally. He ducks into a storage room to hide, but he’s already ticked off the past, which slowly serves him up a room filled with cockroaches that swarm him. The sequence is a slow build, with roaches crawling out, covering the floor and then crawling up him.

I’m not sure if Franco has a no-roach clause in his contract (I would), but in order to have control over these pests, it was best to create them digitally. This is where Bruce Branit, owner of BranitFX in Kansas City, Missouri came in. Yes, you read that right, Kansas City, and his resume is impressive. He is a frequent collaborator with Jay Worth, Bad Robot’s VFX supervisor.

So for this particular scene, BranitFX had one or two reference shots, which they used to create a roach brush via Photoshop. Once the exact look was determined regarding the number of attacking roaches, they animated it in 3D and composited it. They then used 2D and 3D tracking tools to track Franco while the cockroaches swarmed all over him.

Let’s find out more from Bruce Branit.

How early did you get involved in that episode? How much input did you have in how it would play out?
For this show, there wasn’t a lot of lead time. I came on after shooting was done and there was a rough edit. I don’t think the edit changed a lot after we started.

What did the client want from the scene, and how did you go about accomplishing that?
VFX supervisor Jay Worth and I have worked together on a lot of shows. We’d done some roaches for an episode of Almost Human, and also I think for Fringe, so we had some similar assets and background with talking “roach.” The general description was tons of roaches crawling on James Franco.

Did you do previs?
Not really. I rendered about 10 angles of the roach we had previously worked with and made Adobe Photoshop brushes out of each frame. I used that to paint up a still of each shot to establish a baseline for size, population and general direction of the roaches in each of the 25 or so shots in the sequence.

Did you have to play with the movements a lot, or did it all just come together?
We developed a couple base roach walks and behaviors and then populated each scene with instances of that. This changed depending on whether we needed them crossing the floor, hanging on a light fixture or climbing on Franco’s suit. The roach we had used in the past was similar to what the producers on 11.22.63 had in mind. We made a few minor modifications with texture and modeling. Some of this affected the rig we’d built so a lot of the animations had to be rebuilt.

Can you talk about your process/workflow?
This sequence was shot in anamorphic and featured a constantly flashing light on the set, going from dark emergency red lighting to brighter fluorescent lights. So I generated unsqueezed, lens-distortion-removed and light-mitigated interim plates to pull all of our 2D and 3D tracking from. The tracking was broken into 2D, 3D and 3D tracking by hand, involving roaches on Franco’s body as he turns and swats at them in a panic. The production had taped large “Xs” on his jacket to help with this roto-tracking, but those, too, had to be painted out for many shots prior to the roaches reaching Franco.

The shots were tracked in Fusion Studio for 2D and SynthEyes for 3D. A few shots were also tracked in PFTrack.

The 3D roach assets were animated and rendered in NewTek LightWave. Passes for the red-light and white-light conditions were rendered, as well as ambient shadow and specular passes. Although we were now using tracking plates with the 2:1 anamorphic stretch removed, a special camera was created in LightWave that doubled the anamorphic squeeze to duplicate the vertical bokeh and depth of field of an anamorphic lens. The final composite was completed in Blackmagic Fusion Studio using the original anamorphic plates.
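The 2:1 desqueeze underlying those interim plates is just a horizontal scale. A toy sketch of the arithmetic (the resolutions below are illustrative, not the show's actual plate sizes):

```python
def desqueeze_size(width, height, squeeze=2.0):
    """An anamorphic plate stores a horizontally compressed image.

    Desqueezing for tracking or display scales the width by the squeeze
    factor while the height is untouched, restoring correct geometry.
    """
    return (int(round(width * squeeze)), height)

# A hypothetical 2880x2160 anamorphic capture unsqueezes to 5760x2160,
# i.e. a ~2.67:1 display aspect from a 4:3 sensor area.
plate = desqueeze_size(2880, 2160)
```

Tracking on the desqueezed plate keeps the solvers' camera models simple; the CG is then re-squeezed (here, rendered through a double-squeeze camera) so it marries back into the original anamorphic frames.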

What was the biggest challenge you faced working on this scene?
Understanding the anamorphic workflow was a new challenge. Luckily, I had just completed a short project of my own called Bully Mech that was shot with Lomo anamorphic lenses. So I had just recently developed some familiarity and techniques to deal with the unusual lens attributes of those lenses. Let’s just say they have a lot of character. I talked with a lot of cinematographer friends to try to understand how the lenses behaved and why they stretched the out-of-focus element vertically while the image was actually stretched the other way.

What are you working on now?
I‘ve wrapped up a small amount of work on Westworld and a handful of shots on Legends of Tomorrow. I’ve been directing some television commercials the last few months and just signed a development deal on the Bully Mech project I mentioned earlier.

We are making a sizzle reel of the short that expands the scope of the larger world and working with concept designers and a writer to flesh out a feature film pitch. We should be going out with the project early next year.

GenPop’s Bill Yukich directs, edits gritty open for Amazon’s Goliath 

Director/editor Bill Yukich helmed the film noir-ish opening title sequence for Amazon’s new legal drama, Goliath. Produced by LA-based content creation studio GenPop, the black-and-white intro starts with Goliath lead actor Billy Bob Thornton jumping into the ocean. While underwater, smoking a cigarette and holding a briefcase, he casually strolls through rooms filled with smoke and fire. At the end of the open, he rises from the water as the Santa Monica Pier appears next to him and the picture turns from B&W to color. The Silent Comedy’s “Bartholomew” track plays throughout.

The ominous backdrop of a man underwater but not drowning is a perfect visual description of Thornton’s role as disgraced lawyer Billy McBride. Yukich’s visuals, he says, are meant to strike a balance between dreamlike and menacing.

The approved concept called for a dry shoot, so Yukich came up with solutions to make it seem as though the sequence was actually filmed underwater. Shot on a Red Magnesium Weapon camera, Yukich used a variety of in-camera techniques to achieve the illusion of water, smoke and fire existing within the same world, including the ingenious use of smoke to mimic the movement of crashing waves.

After wrapping the live-action shoot with Thornton, Yukich edited and color corrected the sequence. The VFX work was mostly supplementary, used to enhance the practical effects captured on set, such as adding extra fireballs into the frame to make the pyrotechnics feel fuller. Editing was done in Adobe Premiere, while VFX and color were handled in Autodesk Flame. In the end, the piece was 80 percent live action and only 20 percent visual effects.

Once post production was done, Yukich projected the sequence onto a screen which was submerged underwater and reshot the projected footage. Though technically challenging, Yukich says, this Inception-style method of re-shooting the footage gave the film the organic quality that he was looking for.

Yukich recently worked as lead editor for Beyoncé’s visual album Lemonade. Stepping behind the lens was a natural progression for Yukich, who began directing concerts for bands like Godsmack and The Hollywood Undead, as well as music videos for HIM, Vision of Disorder and The Foo Fighters.

SMPTE: The convergence of toolsets for television and cinema

By Mel Lambert

While the annual SMPTE Technical Conferences normally put a strong focus on things visual, there is no denying that these gatherings offer a number of interesting sessions for sound pros from the production and post communities. According to Aimée Ricca, who oversees marketing and communications for SMPTE, pre-registration included “nearly 2,500 registered attendees hailing from all over the world.” This year’s conference, held at the Loews Hollywood Hotel and Ray Dolby Ballroom from October 24-27, also attracted more than 108 exhibitors in two exhibit halls.

Setting the stage for the 2016 celebration of SMPTE’s Centenary, opening keynotes addressed the dramatic changes that have occurred within the motion picture and TV industries during the past 100 years, particularly with the advent of multichannel immersive sound. The two co-speakers — SMPTE president Robert Seidel and filmmaker/innovator Doug Trumbull — chronicled advances in audio playback since, respectively, the advent of TV broadcasting after WWII and the introduction of film soundtracks in 1927 with The Jazz Singer.

Robert Seidel

ATSC 3.0
Currently VP of CBS Engineering and Advanced Technology, with responsibility for TV technologies at CBS and the CW networks, Seidel headed up the team that assisted WRAL-HD, the CBS affiliate in Raleigh, North Carolina, to become the first TV station to transmit HDTV in July 1996.  The transition included adding the ability to carry 5.1-channel sound using Advanced Television Systems Committee (ATSC) standards and Dolby AC-3 encoding.

The 45th Grammy Awards Ceremony broadcast by CBS Television in February 2004 marked the first scheduled HD broadcast with a 5.1 soundtrack. The emergent ATSC 3.0 standard reportedly will provide increased bandwidth efficiency and compression performance. The drawback is the lack of backwards compatibility with current technologies, resulting in a need for new set-top boxes and TV receivers.

As Seidel explained, the upside of ATSC 3.0 will be immersive soundtracks, using either Dolby AC-4 or MPEG-H coding, together with audio objects that can carry alternate dialog and commentary tracks, plus other consumer features to be refined with companion 4K UHD, high dynamic range and high frame rate images. In June, WRAL-HD launched an experimental ATSC 3.0 channel carrying the station’s programming in 1080p with 4K segments. In mid-summer, South Korea adopted ATSC 3.0 and plans to begin broadcasts with immersive audio and object-based capabilities next February, in anticipation of hosting the 2018 Winter Olympics. The 2016 World Series games between the Cleveland Indians and the Chicago Cubs marked the first live ATSC 3.0 broadcast of a major sporting event, on experimental station Channel 31, with an immersive-audio simulcast on the Tribune Media-owned Fox affiliate WJW-TV.

Immersive audio will enable enhanced spatial resolution for 3D sound-source localization and therefore provide an increased sense of envelopment throughout the home listening environment, while audio “personalization” will include level control for dialog elements, alternate audio tracks, assistive services, other-language dialog and special commentaries. ATSC 3.0 also will support loudness normalization and contouring of dynamic range.
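As a back-of-the-envelope illustration of the loudness normalization Seidel mentions, the idea is to measure program loudness and apply whatever gain brings it to a broadcast target. The sketch below uses plain RMS as a stand-in for the K-weighted, gated measurement (ITU-R BS.1770) a real ATSC 3.0 chain would use, and the target value is illustrative:

```python
import math

def rms_dbfs(samples):
    """Crude loudness proxy: RMS level in dBFS. Real broadcast chains
    use ITU-R BS.1770 K-weighted, gated loudness, not plain RMS."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

def normalize(samples, target_dbfs=-24.0):
    """Scale samples so their measured level hits the target."""
    gain_db = target_dbfs - rms_dbfs(samples)
    gain = 10 ** (gain_db / 20)
    return [s * gain for s in samples]

tone = [0.5] * 1000  # constant 0.5 "signal": RMS is about -6 dBFS
leveled = normalize(tone, target_dbfs=-24.0)
print(round(rms_dbfs(leveled), 1))  # -24.0
```

Object-based dialog "personalization" is the same mechanism applied per-object: because dialog travels as its own object, the receiver can offset its gain independently before rendering.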

Doug Trumbull

Higher Frame Rates
Trumbull has a wide range of experience in filmmaking and entertainment technologies, including visual effects supervision on 2001: A Space Odyssey, Close Encounters of the Third Kind, Star Trek: The Motion Picture and Blade Runner; he also directed Silent Running and Brainstorm, as well as special venue offerings. He won an Academy Award for his Showscan process for high-speed 70mm cinematography, helped develop IMAX technologies and now runs Trumbull Studios, which is innovating a new MAGI process to offer 4K 3D at 120fps. High production costs and a lack of playback environments meant that Trumbull’s Showscan format never really got off the ground, which was “a crushing disappointment,” he conceded to the SMPTE audience.

But meanwhile, responding to falling box office receipts during the ‘50s and ‘60s, Hollywood added more consumer features, including large-screen presentations and surround sound, although the movie industry also began to rely on income from the TV community for broadcast rights to popular cinema releases.

As Seidel added, “The convergence of toolsets for both television and cinema — including 2K, 4K and eventually 8K — will lead to reduced costs, and help create a global market around the world [with] a significant income stream.” He also said that “cord cutting” — replacing cable subscriptions with Amazon.com, Hulu, iTunes, Netflix and the like — is bringing people back to over-the-air broadcasting.

Trumbull countered that TV will continue at 60fps “with a live texture that we like,” whereas film will retain its 24fps frame rate “that we have loved for years and which has a ‘movie texture.’ Higher frame rates for cinema, such as the 48fps Peter Jackson used for The Hobbit films, have too much of a TV look. Showscan at 120fps and a 360-degree shutter avoided that TV look, which is considered objectionable.” (Early reviews of director Ang Lee’s upcoming 3D film Billy Lynn’s Long Halftime Walk, which was shot in 4K at 120fps, have been critical of its video look and feel.)

Next-Gen Audio for Film and TV
During a series of “Advances in Audio Reproduction” conference sessions, chaired by Chris Witham, director of digital cinema technology at Walt Disney Studios, three presentations covered key design criteria for next-generation audio for TV and film. During his discussion called “Building the World’s Most Complex TV Network — A Test Bed for Broadcasting Immersive & Interactive Audio,” Robert Bleidt, GM of Fraunhofer USA’s audio and multimedia division, provided an overview of a complete end-to-end broadcast plant that was built to test various operational features developed by Fraunhofer, Technicolor and Qualcomm. These tests were used to evaluate an immersive/object-based audio system based on MPEG-H for use in Korea during planned ATSC 3.0 broadcasting.

“At the NAB Convention we demonstrated The MPEG Network,” Bleidt stated. “It is perhaps the most complex combination of broadcast audio content ever made in a single plant, involving 13 different formats.” This includes mono, stereo, 5.1-channel and other sources. “The network was designed to handle immersive audio in both channel- and HOA-based formats, using audio objects for interactivity. Live mixes from a simulated sports remote were connected to a network operating center, with distribution to affiliates, and then sent to a consumer living room, all using the MPEG-H audio system.”

Bleidt presented an overview of system and equipment design, together with details of a critical AMAU (audio monitoring and authoring unit) that will be used to mix immersive audio signals using existing broadcast consoles limited to 5.1-channel assignment and panning.

Dr. Jan Skoglund, who leads a team at Google developing audio signal processing solutions, addressed the subject of “Open-source Spatial Audio Compression for VR Content,” including the importance of providing realistic immersive audio experiences to accompany VR presentations and 360-degree 3D video.

“Ambisonics have reemerged as an important technique in providing immersive audio experiences,” Skoglund stated. “As an alternative to channel-based 3D sound, Ambisonics represent full-sphere sound, independent of loudspeaker location.” His fascinating presentation considered the ways in which open-source compression technologies can transport audio for various species of next-generation immersive media. Skoglund compared the efficacy of several open-source codecs for first-order Ambisonics, as well as the progress being made toward higher-order Ambisonics (HOA) for VR content delivered via the internet.
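The loudspeaker independence Skoglund describes comes from the encoding itself: first-order Ambisonics represents a mono source as four signals (W, X, Y, Z) derived only from the source’s direction, with speaker layout handled later by a decoder. A minimal sketch using the traditional B-format equations (conventions vary; modern AmbiX channel ordering and normalization differ):

```python
import math

def encode_first_order(sample, azimuth, elevation):
    """Encode a mono sample into traditional B-format (W, X, Y, Z)
    for a source at the given direction, angles in radians."""
    w = sample / math.sqrt(2)                             # omnidirectional
    x = sample * math.cos(azimuth) * math.cos(elevation)  # front-back
    y = sample * math.sin(azimuth) * math.cos(elevation)  # left-right
    z = sample * math.sin(elevation)                      # up-down
    return w, x, y, z

# A source directly in front (azimuth 0, elevation 0):
print([round(c, 3) for c in encode_first_order(1.0, 0.0, 0.0)])
# [0.707, 1.0, 0.0, 0.0]
```

Because the channels encode direction rather than speakers, the same four signals can be decoded to any loudspeaker array, or rotated with the listener’s head for VR, which is exactly why the format suits 360-degree video.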

Finally, Paul Peace, who oversees loudspeaker development for cinema, retail and commercial applications at JBL Professional — and designed the Model 9350, 9300 and 9310 surround units — discussed “Loudspeaker Requirements in Object-Based Cinema,” including a valuable in-depth analysis of the acoustic delivery requirements in a typical movie theater that accommodates object-based formats.

Peace is proposing the use of a new metric for surround loudspeaker placement and selection when the layout relies on venue-specific immersive rendering engines for Dolby Atmos and Barco Auro-3D soundtracks, with object-based overhead and side-wall channels. “The metric is based on three foundational elements as mapped in a theater: frequency response, directionality and timing,” he explained. “Current set-up techniques are quite poor for a majority of seats in actual theaters.”

Peace also discussed new loudspeaker requirements and layout criteria necessary to ensure more consistent sound coverage throughout such venues, so they can more accurately replay material re-recorded on typical dub stages, which are often smaller and of different width/length/height dimensions than most multiplex environments.


Mel Lambert, who also gets photo credit on pictures from the show, is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

 

Quick Chat: Efilm’s new managing director Al Cleland

Al Cleland has been promoted to managing director of Deluxe’s Efilm, which is a digital color, finishing and location services company working on feature films, episodics and trailers. For the past eight years, Cleland has been VP of trailers at Efilm.

A 30-year veteran of the post business, Cleland started his career at Editel and joined CIS, which later became Efilm, as one of the company’s original employees. He served as senior VP/GM at Technicolor Creative Services for 10 years and at Postworks, Los Angeles, before returning to Efilm as VP of trailers. We threw three questions at Cleland; let’s see what he had to say…


After working on trailers for the last eight years, you must be excited to be working in all aspects of what Efilm does.
Our trailer department started out dedicated to finishing one studio’s trailers and we’ve expanded into a dedicated hub for the marketing departments of all the studios. Our trailers department has had the advantage of connectivity and common practices with all of Deluxe’s facilities throughout the world. I’ve loved being part of that growth process and, in my new position, I’ll continue to oversee that vital part of the company.

What’s challenging about trailers that people even in the business might not think about?
The great team in that division has to pull together shots and visual effects while the film itself is still being finished, which is a unique logistical challenge. And they’re doing all kinds of small changes and creating effects specific to the trailer and to the MPAA requirements for trailers. It’s a unique skill set.

What do you hope to accomplish for Efilm going forward?
Efilm is expanding in terms of the amount of work and the kind of work we’re doing, and I intend to push that expansion along at an even faster rate. We’ve always had an amazing team of colorists, producers and editors that are really the heart of Efilm. We have wonderful technical and support staff. And, of course, we have access to all of those elements at our partner companies and we continue to build on that.

It’s early to talk about specifics, but we all know the industry is changing rapidly. We’ve been among the very first to introduce new technologies and workflows and that’s something the team here is going to expand on.

The A-List — Director Ed Zwick talks Jack Reacher: Never Go Back

By Iain Blair

Director, screenwriter and producer Ed Zwick got his start in television as co-creator of the Emmy Award-winning series Thirtysomething. His feature film career kicked off when he directed the Rob Lowe/Demi Moore vehicle, About Last Night. Zwick went on to direct the Academy Award-winning films Glory and Legends of the Fall. 

Zwick also produced the Oscar-nominated I Am Sam, as well as Traffic — winner of two Golden Globes and four Academy Awards — directed by Steven Soderbergh. He won an Academy Award as a producer of 1999’s Best Picture, Shakespeare in Love.

Ed Zwick

His latest film, Paramount’s Jack Reacher: Never Go Back, reunites him with his The Last Samurai star Tom Cruise. It’s an action-packed follow-up to the 2012 hit Jack Reacher, which grossed over $200 million at the worldwide box office.

The set-up? Years after resigning command of an elite military police unit, the nomadic, righter-of-wrongs Reacher is drawn back into the life he left behind when his friend and successor, Major Susan Turner (Cobie Smulders), is framed for espionage. Naturally, Reacher will stop at nothing to prove her innocence and to expose the real perpetrators behind the killings of his former soldiers. Mayhem quickly ensues, helped along with plenty of crazy stunts and cutting-edge VFX.

I recently chatted with Zwick about making the film.

You’ve worked in so many genres, but this is your first crime thriller. 
I’ve always loved crime thrillers — especially films like Three Days of the Condor and Bullitt where the characters and their relationships are far more important than the action. That’s where I tried to take this.

Tom Cruise is famous for being a perfectionist and doing all his own stunts when possible. Any surprises re-teaming with him?
Yeah, I always say the most boring job on set is being Tom’s stunt double. Tom is a perfectionist and he loves to be involved in every aspect of the production, so no surprises there. He has such a great love for all the different genres, but a particular love for action films and thrillers. It was very important for him that he didn’t do something that was like all the other films out there. I think we all felt that superhero fatigue has been setting in, so the idea was to do things on a more human scale, and make it more realistic and authentic, both with the characters and with the action.

What were the main technical challenges of making this?
We shot it all in New Orleans, and it’s a road movie. So we had to shoot Washington, DC, there too, and create a cross-country journey with different airports and so on. We did all of that with some sleight of hand and extensions and VFX. I think it’s also a challenge to come up with new settings for action pieces we haven’t seen before, and that’s where the parade and rooftop sequences in New Orleans come in, along with the fight on the plane. The book it’s based on is set in LA and DC, but they’re both tough to shoot in, and with the great tax breaks in Louisiana and all the great locations, it made sense to shoot there.

Every shoot is tough, but it was pretty straightforward on this one, though shutting down the whole French Quarter took some doing — but all the city officials were so helpful. The rooftop stuff was very challenging to do, and we did a lot of prep and began post right away, on day one.

Do you like the post process?
I love it. I can sit there with a cup of coffee and my editor and rewrite the entire script as much as I can. It’s the best and most creative part of the whole process.

Where did you post?
We did it all in LA. We just set up some offices in Santa Monica where I live and did all the editorial there.

You’ve typically worked with Steve Rosenblum, but you edited this film with Billy Weber, who’s been nominated twice for Oscars (Top Gun, The Thin Red Line) and whose credits include The Warriors, Pee Wee’s Big Adventure, Beverly Hills Cop II, Midnight Run and The Tree of Life, among others. How did that relationship work?
Steve wasn’t available so I asked him, ‘Who can I hire that you’d be jealous of?’ He said, ‘There’s only one person — Billy Weber. He’s your guy.’ He was right. Billy’s legendary and has cut so many great movies for directors like Terrence Malick, Tony Scott, Walter Hill, Martin Brest and Tim Burton, and he’s a prince. I love editing, and I loved working with him. He’s a great collaborator, and I was very open to all his ideas.

He came to New Orleans and we set up a cutting room there and he did the assembly there as we shot. Then we moved back to LA. Billy lives on the other side of town, so to beat the traffic we’d start every day at 6am and wrap at 3pm. It was a great system.

Can you talk about the importance of music and sound in the film?
I love working with the audio, and Henry Jackman did a great, classic-modern score. It was crucial, not just for all the action, but for some of the quieter moments. Then we mixed the sound at Fox, with Andy Nelson who’s now done 10 of my movies.

This is obviously not a VFX-driven piece, but the VFX play a big role.
You’re right, and we didn’t want it to look like there was a ton of CG work. In the end, we had well over 200 shots, including stuff like the Capitol Dome in DC in the background and tons of bullet hits on cars and enhancements. But I didn’t want any of the VFX to be noticeable. Lola and Flash Film Works did the work, and often today where you need bullet hits on a car, it’s far cheaper and more time-effective to add them in post, so there was a lot of that.

How important was the DI on this and where did you do it?
We did it at Company 3 in Santa Monica with colorist Stephen Nakamura, who is brilliant. We went for a natural look but also enhanced some of the dramatic scenes [via Resolve]. It’s remarkable what you can do now in the DI, and as we shot on film I wanted to preserve some of that real film look, so I think it’s a light touch, but also a sophisticated one in the DI.

What’s next?
I don’t have anything lined up, so I’m taking a break.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Vancouver names first film commissioner, opens Film & Media Centre

The city of Vancouver and the Vancouver Economic Commission (VEC) have named David Shepheard as Vancouver’s first film commissioner. At the same time, they announced the creation of the Vancouver Film & Media Centre, which aims to bring more work to the city. Vancouver is already the third largest film production center in North America — behind New York and Los Angeles.

Shepheard, who brings over 16 years of experience to his new role, previously ran the film commission services for Film London, the capital’s Media Development Agency. Film London supports producers shooting in London, advises them on accessing UK tax breaks and co-production opportunities, and promotes London’s film, TV, post production, animation, VFX and games sectors. Prior to that, Shepheard was CEO of Open House Films in the UK — a consultancy partnership specializing in developing strategies, film commissions and media development agencies at city, regional and state levels in Europe, Africa, North America, Asia and the Middle East.

“As one of Vancouver’s high-growth industries, film and media has been a big contributor to our economic growth and has a tremendous positive impact in our city,” reports Mayor Gregor Robertson. “David Shepheard’s expertise and experience — coupled with the new Film & Media Centre — will take our digital entertainment industry to the next level on the international stage.”

Nancy Mott and actor Ryan Reynolds during the filming of Deadpool. Deadpool 2 returns to Vancouver for production in January 2017.

In recent years, Vancouver has experienced big growth in film and TV production, and the industry has been a top contributor to British Columbia’s diversified economy, creating jobs, services and tax revenue. Pilot production filming alone increased by 67 percent from 2015 to 2016. In 2015, the city issued almost 5,000 permits for 353 productions filmed in Vancouver; provisional data from the city of Vancouver’s film office suggests that 2016’s numbers will be even higher.

Shepheard will work with executive director Nancy Mott to attract film and TV production, co-production and post production projects to Vancouver. Through the prior business development activities of its Asia-Pacific Centre, the VEC has already identified opportunities in Japan, China and Korea, all of which have been stepping up investment and production activity.