Author Archives: Randi Altman

Point.360 adds senior colorist Patrick Woodard

Senior colorist Patrick Woodard has joined the creative team at Point.360 in Burbank. He was most recently at Hollywood’s DigitalFilm Tree, where he colored dozens of television shows, including ABC’s American Housewife, CBS’ NCIS: Los Angeles, NBC’s Great News and TBS’ Angie Tribeca. Over the years, he also worked on Weeds, Everybody Hates Chris, Cougar Town and Sarah Silverman: We Are Miracles.

Woodard joins Point.360 senior colorist Charlie Tucker, whose recent credits include the final season of Netflix’s Orange Is the New Black, The CW’s Legacies and Roswell, New Mexico, YouTube’s Cobra Kai, as well as the Netflix comedy Medical Police.

“Patrick is an exceptional artist with an extensive background in photography,” says Point.360’s SVP of episodic Jason Kavner. “His ability to combine his vast depth of technical expertise and his creative vision to quickly create a highly developed aesthetic has won the loyalty of many DPs and creatives alike.”

Point.360 has four color suites at its Burbank facility. “Although we have the feel of a boutique episodic facility, we are able to offer a robust end-to-end pipeline thanks to our long history as a premier mastering company,” reports Kavner. “We are currently servicing 4K Dolby Vision projects for Netflix such as the upcoming Jenji Kohan series currently being called Untitled Vigilante Project, as well as the UHD SDR Sony-produced YouTube series Cobra Kai. We also continue to offer the same end-to-end service to our traditional studio and network clients on series such as Legacies for The CW, Fresh Off The Boat, Family Guy and American Dad for 20th Century Fox, and Drunk History and Robbie for Comedy Central.”

Woodard, who will be working on Resolve at Point.360, was also a recent subject of our Behind the Title series. You can read that here.

Brittany Howard music video sets mood with color and VFX

The latest collaboration between Framestore and director Kim Gehrig is for Brittany Howard’s debut solo music video for Stay High, which features a color grade and subtle VFX by the studio. A tribute to the Alabama Shakes’ lead singer’s late father, the stylized music video stars actor Terry Crews (Brooklyn Nine-Nine, The Expendables) as a man finishing a day’s work and returning home to his family.

Produced by production company Somesuch, the aim of Stay High is to present a natural and emotionally driven story that honors the singer’s father, K.J. Howard. Shot in her hometown of Nashville, the music video features Howard’s family and friends while the singer pops up in several scenes throughout the video as different characters.

The video begins with Howard’s father getting off of work at his factory job. The camera follows him on his drive home, all the while he’s singing “Stay High.” As he drives home, we see images of people and locations where Howard grew up. The video ends when her dad pulls into his driveway and is met by his daughters and wife.

“Kim wanted to really highlight the innocence of the video’s story, something I kept in mind while grading the film,” says Simon Bourne, Framestore’s head of creative color, who’s graded several films for the director. “The focus needed to always be on Terry with nothing in his surroundings distracting from that and the grade needed to reflect that idea.”

Framestore’s creative director Ben Cronin, who was also a compositor on the project along with Nuke compositor Christian Baker, adds, “From a VFX point of view, our job was all about invisible effects that highlighted the beautiful job that Ryley Brown, the film’s DP, did and to complement Kim’s unique vision.”

“We’ve worked with Kim on several commercials and music video projects, and we love collaborating because her films are always visually interesting and she knows we’ll always help achieve the ground-breaking and effortlessly cool work that she does.”

Creating and mixing authentic sounds for HBO’s Deadwood movie

By Jennifer Walden

HBO’s award-winning series Deadwood might have aired its final episode 13 years ago, but it’s recently found new life as a movie. Set in 1889 — a decade after the series finale — Deadwood: The Movie picks up the threads of many of the main characters’ stories and weaves them together as the town of Deadwood celebrates the statehood of South Dakota.

Deadwood: The Movie

The Deadwood: The Movie sound team.

The film, which aired on HBO and is available on Amazon, picked up eight 2019 Emmy nominations, including nods for sound editing, sound mixing and best television movie.

Series creator David Milch has returned as writer on the film. So has director Daniel Minahan, who helmed several episodes of the series. The film’s cast is populated by returning members, as is much of the crew. On the sound side, there are freelance production sound mixer Geoffrey Patterson; 424 Post’s sound designer, Benjamin Cook; NBCUniversal StudioPost’s re-recording mixer, William Freesh; and Mind Meld Arts’ music editor, Micha Liberman. “Series composers Reinhold Heil and Johnny Klimek — who haven’t been a composing team in many years — have reunited just to do this film. A lot of people came back for this opportunity. Who wouldn’t want to go back to Deadwood?” says Liberman.

Freelance supervising sound editor Mandell Winter adds, “The loop group used on the series was also used on the film. It was like a reunion. People came out of retirement to do this. The richness of voices they brought to the stage was amazing. We shot two days of group for the film, covering a lot of material in that limited time to populate Deadwood.”

Deadwood (the film and series) was shot on a dedicated film ranch called Melody Ranch Motion Picture Studio in Newhall, California. The streets, buildings and “districts” are consistently laid out the same way. This allowed the sound team to use a map of the town to orient sounds to match each specific location and direction that the camera is facing.

For example, there’s a scene in which the town bell is ringing. As the picture cuts to different locations, the ringing sound is panned to show where the bell is in relation to that location on screen. “We did that for everything,” says co-supervising sound editor Daniel Colman, who along with Freesh and re-recording mixer John Cook, works at NBCUniversal StudioPost. “You hear the sounds of the blacksmith’s place coming from where it would be.”
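The map-based approach Colman describes can be sketched as simple geometry: given the camera’s position and heading on the town map and a sound source’s position, derive left/right gains. The sketch below is purely illustrative; the function name and the use of a constant-power pan law are my assumptions, not a description of the team’s actual mixing workflow.

```python
import math

def stereo_pan_gains(camera_pos, camera_heading_deg, source_pos):
    """Return (left, right) gains for a sound source on the town map.

    camera_pos and source_pos are (x, y) map coordinates; camera_heading_deg
    is the direction the camera faces (0 = +y axis, increasing clockwise).
    Uses a constant-power pan law so overall loudness stays even.
    """
    dx = source_pos[0] - camera_pos[0]
    dy = source_pos[1] - camera_pos[1]
    # Bearing of the source, then its angle relative to the camera's heading.
    bearing = math.degrees(math.atan2(dx, dy))
    rel = (bearing - camera_heading_deg + 180) % 360 - 180  # wrap to [-180, 180)
    # Fold sounds behind the camera into the front arc.
    if rel > 90:
        rel = 180 - rel
    elif rel < -90:
        rel = -180 - rel
    # Map [-90, 90] degrees to a pan position in [0, 1], then apply the law.
    theta = ((rel + 90) / 180) * (math.pi / 2)
    return math.cos(theta), math.sin(theta)
```

With this sketch, a bell due east of a north-facing camera sits almost entirely in the right channel; cut to a camera facing east and the same bell snaps to center, which is the repositioning effect Colman describes as the picture cuts between locations.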

“Or, if you’re close to the Chinese section of the town, then you hear that. If you were near the saloons, that’s what you hear. They all had different sounds that were pulled forward from the series into the film,” adds re-recording mixer Freesh.

Many of the exterior and interior sounds on set were captured by Benjamin Cook, who was sound effects editor on the original Deadwood series. Since it’s a practical location, they had real horses and carriages that Cook recorded. He captured every door and many of the props. Colman says, “We weren’t guessing at what something sounded like; we were putting in the actual sounds.”

The street sounds were an active part of the ambience in the series, both day and night. There were numerous extras playing vendors plying their wares and practicing their crafts. Inside the saloons and out in front of them, patrons talked and laughed. Their voices — performed by the loop group in post — helped to bring Deadwood alive. “The loop group we had was more than just sound effects. We had to populate the town with people,” says Winter, who scripted lines for the loopers because they were played more prominently in the mix than what you’d typically hear. “Having the group play so far forward in a show is very rare. It had to make sense and feel timely and not modern.”

In the movie, the street ambience isn’t as strong a sonic component. “The town had calmed down a little bit as it’s going about its business. It’s not quite as bustling as it was in the series. So that left room for a different approach,” says Freesh.

The attenuation of street ambience was conducive to the cinematic approach that director Minahan wanted to take on Deadwood: The Movie. He used music to help the film feel bigger and more dramatic than the series, notes Liberman. Re-recording mixer John Cook adds, “We experimented a lot with music cues. We saw scenes take on different qualities, depending on whether the music was in or out. We worked hard with Dan [Minahan] to end up with the appropriate amount of music in the film.”

Minahan even introduced music on set by way of a piano player inside the Gem Saloon. Production sound mixer Patterson says, “Dan was very active on the set in creating a mood with that music for everyone that was there. It was part and parcel of the place at that time.”

Authenticity was a major driving force behind Deadwood’s aesthetics. Each location on set was carefully dressed with era-specific props, and the characters were dressed with equal care, right down to their accessories, tools and weapons. “The sound of Seth Bullock’s gun is an actual 1889 Remington revolver, and Calamity Jane’s gun is an 1860s Colt Army cavalry gun. We’ve made every detail as real and authentic as possible, including the train whistle that opens the film. I wasn’t going to just put in any train whistle. It’s the 1880s Black Hills steam engine that actually went through Deadwood,” reports Colman.

The set’s wooden structures and elevated boardwalk that runs in front of the establishments in the heart of town lent an authentic character to the production sound. The creaky wooden doors and thumpiness of footsteps across the raised wooden floors are natural sounds the audience would expect to hear from that environment. “The set for Deadwood was practical and beautiful and amazing. You want to make sure that you preserve that realness and let the 1800s noises come through. You don’t want to over-sterilize the tracks. You want them to feel organic,” says Patterson.

Freesh adds, “These places were creaky and noisy. Wind whistled through the windows. You just embrace it. You enhance it. That was part of the original series sound, and it followed through in the movie as well.”

The location was challenging due to its proximity to real-world civilization and all of our modern-day sonic intrusions, like traffic, airplanes and landscaping equipment from a nearby neighborhood. Those sounds have no place in the 1880s world of Deadwood, but “if we always waited for the moment to be perfect, we would never make a day’s work,” says Patterson. “My mantra was always to protect every precious word of David Milch’s script and to preserve the performances of that incredible cast.”

In the end, the modern-day noises at the location weren’t enough to require excessive ADR. John Cook says, “Geoffrey [Patterson] did a great job of capturing the dialogue. Then, between the choices the picture editors made for different takes and the work that Mandell [Winter] did, there were only one or two scenes in the whole movie that required extra attention for dialogue.”

Winter adds, “Even denoising the tracks, I didn’t take much out. The tracks sounded really good when they got to us. I just used iZotope RX 7 and did our normal pass with it.”

Any fan of Deadwood knows just how important dialogue clarity is since the show’s writing is like Shakespeare for the American West — with prolific profanity, of course. The word choices and their flow aren’t standard TV script fare. To help each word come through clearly, Winter notes they often cut in both the boom and lav mic tracks. This created nice, rich dialogue for John Cook to mix.

On the stage, John Cook used the FabFilter Pro-Q 2 to work each syllable, making sure the dialogue sounded bright and punchy and not too muddy or tubby. “I wanted the audience to hear every word without losing the dynamics of a given monologue or delivery. I wanted to maintain the dynamics, but make sure that the quieter moments were just as intelligible as the louder moments,” he says.

In the film, several main characters experience flashback moments in which they remember events from the series. For example, Al Swearengen (Ian McShane) recalls the death of Jen (Jennifer Lutheran) from the Season 3 finale. These flashbacks — or hauntings, as the post team refers to them — went through several iterations before the team decided on the most effective way to play each one. “We experimented with how to treat them. Do we go into the actor’s head and become completely immersed in the past? Or, do we stay in the present — wherever we are — and give it a slight treatment? Or, should there not be any sounds in the haunting? In the end, we decided they weren’t all going to be handled the same,” says Freesh.

Before coming together for the final mix on Mix 6 at NBCUniversal StudioPost on the Universal Studios Lot in Los Angeles, John Cook and Freesh pre-dubbed Deadwood: The Movie in separate rooms as they’d do on a typical film — with Freesh pre-dubbing the backgrounds, effects, and Foley while Cook pre-dubbed the dialogue and music.

The pre-dubbing process gave Freesh and John Cook time to get the tracks into great shape before meeting up for the final mix. Freesh concludes, “We were able to, with all the people involved, listen to the film in real good condition from the first pass down and make intelligent decisions based on what we were hearing. It really made a big difference in making this feel like Deadwood.”


Jennifer Walden is a New Jersey-based audio engineer and writer. Follow her on Twitter @audiojeney.

Jody Madden upped to CEO at Foundry

Jody Madden, who joined Foundry in 2013 and has held positions as chief operating officer and, most recently, chief customer officer and chief product officer, has been promoted to chief executive officer. She takes over the role from Craig Rodgerson.

Madden, who has a rich background in VFX, has been with Foundry for six years. Prior to joining the company, she spent more than a decade in technology management and studio leadership roles at Industrial Light & Magic, Lucasfilm and Digital Domain after graduating from Stanford University.

“During a time of rapid change in creative industries, Foundry is committed to delivering innovations in workflow and future-looking research,” says Madden. “As the company continues to grow, delivering further improvements in speed, quality and user experience remains a core focus to enable our customers to meet the demands of their markets.”

“Jody is well known for her collaborative leadership style, and this has been crucial in enabling our engineering, product and research teams to achieve results for our customers and build the foundation for the future,” says Simon Robinson, co-founder/chief scientist. “I have worked closely with Jody and have seen the difference she has made to the business, so I am extremely excited to see where she will lead Foundry in her new role and look forward to continuing to work with her.”

Review: FXhome’s HitFilm Pro 12 for editing, compositing, VFX

By Brady Betzel

If you have ever worked in Adobe Premiere Pro, Apple FCP X or Avid Media Composer and wished you could just flip a tab and be inside After Effects, with access to 3D objects directly in your timeline, you are going to want to take a look at FXhome’s HitFilm Pro 12.

Similar to how Blackmagic brought Fusion inside of its most recent versions of DaVinci Resolve, HitFilm Pro offers a nonlinear editor, a composite/VFX suite and a finishing suite combined into one piece of software. Haven’t heard about HitFilm yet? Let me help fill in some blanks.

Editing and 3D model Import


What is HitFilm Pro 12?
Technically, HitFilm Pro 12 is a non-subscription-based nonlinear editor, compositor and VFX suite that costs $299. Not only does that price include 12 months of updates and tech support, but one license can be used on up to three computers simultaneously. In my eyes, HitFilm Pro is a great tool set for independent filmmakers, social media content generators and any editor who goes beyond editing and dives into topics like 3D modeling, tracking, keying, etc., without necessarily having to fork over money for a bunch of expensive third-party plugins. That doesn’t mean you won’t want to buy third-party plugins, but you are less likely to need them with HitFilm’s expansive list of native features and tools.

At my day job, I use Premiere, After Effects, Media Composer and Resolve. I often come home and want to work in something that has everything inside, and that is where HitFilm Pro 12 lives. Not only does it have the professional functionality that I am used to, such as trimming, color scopes and more, but it also has BorisFX’s Mocha planar tracking plugin built in for no extra cost. This is something I use constantly and love.

One of the most interesting and recent updates to HitFilm Pro 12 is the ability to use After Effects plugins. Not all plugins will work since there are so many, but in a video released after NAB 2019, HitFilm said plugins like Andrew Kramer’s Video Copilot Element 3D and ones from Red Giant are on the horizon. If you are within your support window, or you continue to purchase HitFilm, FXhome will work with you to get your favorite After Effects plugins working directly inside of HitFilm.

Timeline and 3D model editor

Some additional updates to HitFilm Pro 12 include a completely redesigned user interface that resembles Premiere Pro… kind of. Threaded rendering has also been added, so Windows users with Intel and Nvidia hardware will see increased GPU speeds. There’s also the ability to add titles directly in the editor, and more.

The Review
So how does HitFilm Pro 12 compare to today’s modern software packages? That is an interesting question. I have become more and more of a Resolve convert over the past two years, so I am constantly comparing everything to that. In addition, being an Avid user for over 15 years, I am used to a rock-solid NLE with only a few hiccups here and there. In my opinion, HitFilm Pro 12 lands itself right where Premiere and FCP X live.

It feels prosumer-y, in a YouTuber or content-generator capacity. Would it stand up to 10 hours of abuse with content over 45 minutes? It probably would, but much like with Premiere, I would probably split my edits in scenes or acts to avoid slowdowns, especially when importing things like OBJ files or composites.

The nonlinear editor portion feels like Premiere and FCP X had a baby, but left out FCP X’s Magnetic Timeline feature. The trimming in the timeline feels smooth, and after about 20 minutes of getting comfortable with it I felt like it was what I am generally used to. Cutting in footage feels good using three-point edits or simply dragging and dropping. Using effects feels very similar to the Adobe world, where you can stack them on top of clips and they each affect each other from the top down.

Mocha within HitFilm Pro

Where HitFilm Pro 12 shines is in the inclusion of typically third-party plugins directly in the timeline. From the ability to create a scene with 3D cameras and particle generators to being able to track using BorisFX’s Mocha, HitFilm Pro 12 has many features that will help take your project to the next level. With HitFilm 12 Pro’s true 3D cameras, you can take flat text and enhance it with raytraced lighting, shadows and even textures. You can even use the included BorisFX Continuum 3D Objects to make great titles relatively easily. To take it a step further, you can even track them and animate them.

Color Tools
By day, I am an online editor/colorist who deals with the finishing aspect of media creation. Throughout the process, from color correction to exporting files, I need tools that are not only efficient but accurate. When I started to dig into the color correction side of HitFilm Pro 12, things slowed down for me. The color correction tools are very close to what you’ll find in other NLEs, like Premiere and FCP X, but they don’t quite rise to the level of Resolve. HitFilm Pro 12 does operate inside of a 32-bit color pipeline, which really helps avoid banding and other errors when color correcting. However, I didn’t feel that the toolset was making me more efficient; in fact, it was the opposite. I felt like I had to learn FXhome’s way of doing it. It wasn’t that it totally slowed me down, but I felt it could be better.
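The banding benefit of a 32-bit pipeline is easy to demonstrate with a toy example. If each grading operation snaps its result back to 8-bit codes, a darken-then-brighten round trip collapses a smooth gradient into roughly half as many distinct levels, which shows up on screen as banding; a float pipeline that quantizes only once at output keeps them all. This is my own minimal sketch of the general principle, not a description of HitFilm’s internals.

```python
def quantize8(x):
    """Snap a 0.0-1.0 value to the nearest 8-bit code, returned as a float."""
    return round(min(max(x, 0.0), 1.0) * 255) / 255

# A smooth gradient, starting from an 8-bit source.
gradient = [quantize8(i / 1000) for i in range(1001)]

# 8-bit pipeline: re-quantize after every operation (darken by half, then double).
eight_bit = [quantize8(quantize8(v * 0.5) * 2.0) for v in gradient]

# Float pipeline: carry full precision through, quantize once on output.
float_pipe = [quantize8(v * 0.5 * 2.0) for v in gradient]

# The 8-bit pipeline throws away roughly every other level; float keeps them all.
print(len(set(eight_bit)), len(set(float_pipe)))
```

Roughly 129 of the original 256 levels survive the 8-bit round trip, while the float path preserves all 256; with more operations stacked up, the losses compound.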

Color


Summing Up
In the end, HitFilm Pro 12 will fill a lot of holes for individual content creators. If you love learning new things (like I do), then HitFilm Pro 12 will be a good investment of your time. In fact, FXhome posts tons of video tutorials on all sorts of good and topical stuff, like how to create a Stranger Things intro title.

If you are a little more inclined to work with a layer-based workflow, like in After Effects, then HitFilm Pro 12 is the app you’ll want to learn. Check out HitFilm Pro 12 on FXhome’s website and definitely watch some of the company’s informative tutorials.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

a52 Color adds colorist Gregory Reese

Colorist Gregory Reese has joined LA-based grading and finishing studio a52 Color, which is led by executive producer Thatcher Peterson and includes colorists Paul Yacono and Daniel de Vue.

Reese comes to a52 Color after eight years at The Mill. While there he colored a spectrum of commercials for athletic brands, including Nike and Reebok, as well as campaigns for Audi, Apple, Covergirl, GMC, Progressive and Samsung. He worked with such directors as AG Rojas, Matt Lambert and Harold Einstein while developing the ability to grade for any style.

Reese contributed to several projects for Apple, including the History of Sound spot, which sonically chronicles the decades from the late 1800s to 2015. The spot earned Reese an HPA Award nomination for Outstanding Color Grading in a Commercial.

“Color is at the center of how audiences engage with a picture in motion,” explains Reese. “Some of its technical components may not always be instantly recognized by the audience, but when it’s done right, it can make for an emotional experience.”

Merging his love for music with the passion for his craft, Reese has collaborated with artists like Jack Ü, Major Lazer, Arctic Monkeys, Run The Jewels, Jack White, Pharrell Williams and many more. Peterson and Reese previously worked together at The Mill in LA. “Having had the fortunate experience of working with Gregory at The Mill, I knew he was the real deal when it came to a seasoned colorist,” says Peterson.

The all-new facility was yet another perk that sealed the deal for Reese, as he explains: “One of the biggest barriers to entry for color is not having access to theaters. a52 Color solves that problem by offering the ability to grade both broadcast and theatrical formats, as well as giving us a high level of creative freedom. It left me immediately impressed by how invested they are in making it the absolute best place to go for color grading.”

He will be working on FilmLight Baselight.

Biff Butler joins Work as editor, creative partner

Work Editorial has added Biff Butler to its roster as editor and creative partner. Currently based in Los Angeles, Butler will be available to work in all of the company’s locations, which also include New York and London.

Originally from the UK, Butler moved to Los Angeles in 1999 to pursue a career as a musician, releasing albums and touring the country. Inspired by the title sequence for the movie Se7en, cut by Angus Wall at Rock Paper Scissors (RPS), he found himself intrigued by the craft of editing. Following the breakup of his band in 2005, Butler got a job at RPS and, as he puts it, RPS was his film school. There he found his editorial voice and satiated another interest — advertising.

(Check out an interview we did with Butler recently while he was still at RPS.)

Within a couple of years, he was cutting spots for Nike, Microsoft, Lexus and Adidas, and in 2008 he made a breakthrough with the Emmy Award-winning Yes We Can video by will.i.am, featuring then-presidential candidate Barack Obama. By 2012, his clientele spanned both coasts, and after moving to New York, he went on to collaborate on some of the era-defining work coming out of the US at the time, with Wieden+Kennedy NY, Anomaly and BBDO, among others. Perhaps most notable was his involvement in the Derek Cianfrance/Dick’s Sporting Goods campaign that defined a style in sports commercials.

“I’ve always had an interest in advertising and the process,” says Butler. “I love watching a campaign roll out, seeing it permeate the culture. I still get such a kick out of coming out of the subway and seeing a huge poster from something I’ve been involved with.”

Butler has been recognized for his work, winning numerous AICE, Clio and Cannes Lion awards, as well as receiving an Emmy for the six-part documentary Long Live Benjamin, which he edited and co-directed with creative director Jimm Lasser.

Work founding partner Jane Dilworth says, “I have always been aware of Biff and the great work he does. He is an editor with great feeling and instinct who works not only for the director or creative, but for what is right for the job.”

Colorist Chat: Refinery’s Kyle Stroebel

This Cape Town, South Africa-based artist says that “working creatively with a director and DP to create art is a privilege.”

NAME: Colorist Kyle Stroebel

COMPANY: Refinery in Cape Town, South Africa

CAN YOU DESCRIBE YOUR COMPANY?
We are a full-service post company in the heart of Cape Town. We specialize in front-end dailies and data solutions, and have a full finishing department with a VFX arm and audio division.

Our work varies from long-form feature and television programming to commercials and music video content. We are a relatively young team that loves what we do.

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
We are by far the most important members of the team and the creative success of a movie is largely based around our skills! Okay, honestly? I have a shot on my timeline that is currently on version 54, and my client still needs an additional eyelash painted out.

I think the surprising thing to the uninformed is the minute elements that we focus on in detail. It’s not all large brush strokes and emotional gesturing; the images you see have more often than not gone through painstaking hours of crafting and creative processing. For us the beauty is in the detail.

Flatland

WHAT SYSTEM DO YOU WORK ON?
FilmLight’s Baselight

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
We are a small team handling multiple projects simultaneously, and our Baselight suites perform multiple functions as a result. My fellow colorist David Grant and I will get involved in our respective projects early on. We handle conform, VFX pulls and versioning and follow the pipe through until the film or project has cleared QC.

With Baselight’s enhanced toolset and paint functionality, we are now saving our clients both time and money by handling a variety of cleanups and corrections without farming the shots out to VFX or Flame.

Plus, the DI is pretty much the last element in the production process. We’re counselors, confidants and financial advisors. People skills come in really handy. (And a Spotify playlist for most tastes and moods is a prerequisite.)

WHAT’S YOUR FAVORITE PART OF THE JOB?
Making something amazing happen with a client’s footage. When they didn’t realize their own footage could look the way the final product does… and sharing in that excitement when it happens.

WHAT’S YOUR LEAST FAVORITE?
Insane deadlines. As our tools have improved, the expectation for lightning-fast turnarounds has increased. I’m a perfectionist with my work and would love to spend days molding certain shots and trying new things. Walking away from a grade and coming back to it is often very fruitful, because looking at a complex shot with fresh eyes frequently produces new outlooks and better results. But with hard delivery dates, that luxury is seldom afforded.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Scuba diving with manta rays in Bali; it’s a testament to how much I love what I do that I’m not doing that every day of my life.

WHY DID YOU CHOOSE THIS PROFESSION?
I sometimes wonder that myself when it’s 3am and I’m in a room with no windows for the 17th consecutive hour. Truthfully, I chose it because changing something from the banal to the magnificent gives me joy. Working creatively with a director and DP to create art is a privilege, and the fact that they must sweat and literally bleed to capture the images while I fiddle with the aircon in my catered suite doesn’t hurt.

I was in my third year of film school and brought one of my 16mm projects in to grade with a colorist in telecine. I couldn’t believe what I was seeing. I knew I wanted to do that.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
There have been a load of amazing projects recently. Our local industry has been very busy, and we have benefited greatly from that. I recently finished a remake of the cult classic Critters for Warner Bros.

Flatland

Before that I completed a movie called Flatland that premiered at Berlinale and then went to Cannes. There are a few other movies that I can’t chat too much about right now. I also did a short piece by one of South Africa’s biggest directors, Kim Geldenhuys, for the largest blue diamond found in recent history.

The changing of the seasons has also meant a couple of amazing fashion pieces for different fashion houses’ new collections.

HOW DO YOU PREFER TO WORK WITH THE DP/DIRECTOR?
Depends on the project. Depends on the director and DP too, actually. With long-form work, I love to spend a day or two together with them in the beginning, and then I take a day or two to go over and play with a couple of scenes on my own. From there, we should have reached a pretty cohesive vision as to what the director wants and how I see the footage. Once that vision is aligned, I like to work on my own while listening to loud music and giving everything a more concrete look. Then, ideally, the director returns for a few days at the end, and we get stuck into the minutiae.

With commercials, I like working with the director from early in the morning so that we know where we want to go before the agency has input and makes alterations! It’s a fine balancing act.

ANY SUGGESTIONS FOR GETTING THE MOST OUT OF A PROJECT FROM A COLOR PERSPECTIVE?
Have the colorist involved early on. When you begin shooting, have the colorist and DP develop a relationship so that the common vision develops during principal photography. That way, when the edit is locked, you have already experimented with ideas and the DP is shooting for a more precise look.

CAN YOU TALK ABOUT YOUR WORK ON THE WARNER BROS. FILM? EXPLAIN YOUR PROCESS ON THAT? ANY PARTICULARLY CHALLENGING SCENES?
Critters is a cult horror franchise from the late ’80s and early ’90s. The challenge was to be really dark and moody but still stay true to the original, and to fit in with modern viewing devices without losing drastic detail. It centers on a lot of practical on-set special effects, something increasingly in decline with advancements in CGI. Giving the puppets a lifelike, believable appearance came with quite a few challenges.

HOW DO YOU PREFER THE DP OR DIRECTOR TO DESCRIBE THE LOOK THEY WANT? PHYSICAL EXAMPLES, FILMS TO EMULATE, ETC.?
Practical examples or references are very helpful. Matching something is easy, developing beyond that to give it a unique quality is what keeps it interesting. Certain directors find it easier to work with non-specifics and let me interpret the vibe and mood from more emotional explanations rather than technical jargon. While sometimes harder to initially interpret, that approach has benefits because it’s a bit more open-ended.

Red Bull

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I love and hate most of the things I work on for a variety of reasons. It’s hard to pick one. Gun to my head? Probably a short film for Red Bull Music by Petite Noir. It was shot by Deon Van Zyl in the Namib desert and had just the most exquisite visuals from the outset. I still watch it when I’m feeling down.

WHERE DO YOU FIND INSPIRATION? ART? PHOTOGRAPHY?
At the risk of sounding like a typical millennial, I use Instagram a heck of a lot. I get to see what the biggest and best colorists are doing around the world. Before Instagram, you would only see pieces of critical acclaim. Now, through Instagram and Vimeo, I get to see so many passion projects in which people are trying new things and pushing boundaries beyond what clients, brands and studios want. I can spend days in galleries and bask in the glory of Caravaggio and Vermeer, but I can also scroll quickly through very contemporary looks, innovations and trends.

Red Bull

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My phone. I hate it, but my life happens largely through that porthole. My NutriBullet. My Baselight. I’ve never loved an inanimate object like I love my Baselight.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Instagram as mentioned. I love the work of Joseph Bicknell, Kath Raisch, Sofie Borup, Craig Simonetti, Matt Osborne and then anything that comes from The Mill channel. Also, a wide range of directors and the associated Vimeo links. I can honestly get lost on an obscure Korean channel with magnificent images and languages I don’t understand.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I run. Even if I’m breaking 90-hour weeks, I always make sure I run three or four times a week. And I love cooking. It’s expressive. I get to make meals for my partner Katherine, who tends to be very receptive.

Creating Foley for FX’s Fosse/Verdon

Alchemy Post Sound created Foley for Fosse/Verdon, FX’s miniseries about choreographer Bob Fosse (Sam Rockwell) and his collaborator and wife, the singer/dancer Gwen Verdon (Michelle Williams). Working under the direction of supervising sound editors Daniel Timmons and Tony Volante, Foley artist Leslie Bloome and his team performed and recorded hundreds of custom sound effects to support the show’s dance sequences and add realistic ambience to its historic settings.

Spanning five decades, Fosse/Verdon focuses on the romantic and creative partnership between Bob Fosse and Gwen Verdon. The former was a visionary filmmaker and one of the theater’s most influential choreographers and directors, while the latter was one of the greatest Broadway dancers of all time.

Given the subject matter, it’s hardly surprising that post production sound was a crucial element in the series. For its many musical scenes, Timmons and Volante were tasked with conjuring intricate sound beds to match the choreography and meld seamlessly with the score. They also created dense soundscapes to back the very distinctive environments of film sets and Broadway stages, as well as a myriad of other exterior and interior locations.

For Timmons, the project’s mix of music and drama posed significant creative challenges but also a unique opportunity. “I grew up in upstate New York and originally hoped to work in live sound, potentially on Broadway,” he recalls. “With this show, I got to work with artists who perform in that world at the highest level. It was not so much a television show as a blend of Broadway music, Broadway acting and television. It was fun to collaborate with people who were working at the top of their game.”

The crew drew on an incredible mix of sources in assembling the sound. Timmons notes that to recreate Fosse’s hacking cough (a symptom of his overuse of prescription medicine), they pored over audio stems from the classic 1979 film All That Jazz. “Roy Scheider, who played Bob Fosse’s alter ego in the film, was unable to cough like him, so Bob went into a recording studio and did some of the coughing himself,” Timmons says. “We ended up using those old recordings along with ADR of Sam Rockwell. When Bob’s health starts to go south, some of the coughing you hear is actually him. Maybe I’m superstitious, but for me it helped to capture his identity. I felt like the spirit of Bob Fosse was there on the set.”

A large portion of the post sound effects was created by Alchemy Post Sound. Most notably, Foley artists meticulously reproduced the footsteps of dancers. Foley tap dancing can be heard throughout the series, not only in musical sequences, but also in certain transitions. “Bob Fosse got his start as a tap dancer, so we used tap sounds as a motif,” explains Timmons. “You hear them when we go into and out of flashbacks and interior monologues.”

Along with Bloome, Alchemy’s team included Foley artist Joanna Fang, Foley mixers Ryan Collison and Nick Seaman, and Foley assistant Laura Heinzinger.

Ironically, Alchemy had to avoid delivering sounds that were “too perfect.”  Fang points out that scenes depicting musical performances from films were meant to represent the production of those scenes rather than the final product. “We were careful to include natural background sounds that would have been edited out before the film was delivered to theaters,” she explains, adding that those scenes also required Foley to match the dancers’ body motion and costuming. “We spent a lot of time watching old footage of Bob Fosse talking about his work, and how conscious he was not just of the dancers’ footwork, but their shuffling and body language. That’s part of what made his art unique.”

Foley production was unusually collaborative. Alchemy’s team maintained a regular dialogue with the sound editors and were continually exchanging and refining sound elements. “We knew going into the series that we needed to bring out the magic in the dance sequences,” recalls production Foley editor Jonathan Fuhrer. “I spoke with Alchemy every day. I talked with Ryan and Nick about the tonalities we were aiming for and how they would play in the mix. Leslie and Joanna had so many interesting ideas and approaches; I was ceaselessly amazed by the thought they put into performances, props, shoes and surfaces.”

Alchemy also worked hard to achieve realism in creating sounds for non-musical scenes. That included tracking down props to match the series’ different time periods. For a scene set in a film editing room in the 1950s, the crew located a 70-year-old Steenbeck flatbed editor to capture its unique sounds. As musical sequences involved more than tap dancing, the crew assembled a collection of hundreds of pairs of shoes to match the footwear worn by individual performers in specific scenes.

Some sounds undergo subtle changes over the course of the series relative to the passage of time. “Bob Fosse struggled with addictions and he is often seen taking anti-depression medication,” notes Seaman. “In early scenes, we recorded pills in a glass vial, but for scenes in later decades, we switched to plastic.”

Such subtleties add richness to the soundtrack and help cement the character of the era, says Timmons. “Alchemy fulfilled every request we made, no matter how far-fetched,” he recalls. “The number of shoes that they used was incredible. Broadway performers tend to wear shoes with softer soles during rehearsals and shoes with harder soles when they get close to the show. The harder soles are more strenuous. So the Foley team was always careful to choose the right shoes depending on the point in rehearsal depicted in the scene. That’s accuracy.”

The extra effort also resulted in Foley that blended easily with other sound elements, dialogue and music. “I like Alchemy’s work because it has a real, natural and open sound; nothing sounds augmented,” concludes Timmons. “It sounds like the room. It enhances the story even if the audience doesn’t realize it’s there. That’s good Foley.”

Alchemy used Neumann KMR 81 and U 87 mics, Millennia mic preamps and Apogee converters, with a C24 mixer feeding Avid Pro Tools.

DP Chat: Autumn Durald Arkapaw on The Sun Is Also a Star

By Randi Altman

Autumn Durald Arkapaw always enjoyed photography and making films with friends in high school, so it was inevitable that her path would lead to cinematography.

“After a genre course in college where we watched Raging Bull and Broadway Danny Rose, I was hooked. From that day on I wanted to find out who was responsible for photographing a film. After I found out it was an actual job, I set out to become a DP. I immediately started learning what the job entailed and also started applying to film schools with my photography portfolio.”

The Sun Is Also a Star

Her credits are vast and include James Franco’s Palo Alto, the indie film One & Two, and music videos for the Jonas Brothers and Arcade Fire. Most recently she worked on Emma Forrest’s feature film Untogether, Max Minghella’s feature debut Teen Spirit and director Ry Russo-Young’s The Sun Is Also a Star, which follows two young people who hit it off immediately and spend one magical day enjoying each other and the chaos that is New York City.

We recently reached out to Durald Arkapaw to find out more about these films, her workflow and more.

You’ve been busy with three films out this year — Untogether, Teen Spirit and The Sun Is Also a Star. What attracts you to a project?
I’m particular when it comes to choosing a narrative project. I have mostly worked with friends in the past and continue to do so. When making feature films, I throw myself into it. So it’s usually the relationship with the director and their vision that draws me first to a project.

Tell us about The Sun Is Also a Star. How would you describe the overall look of the film?
Director Ry Russo-Young and I wanted the film to feel grounded and not like the usual overlit/precious versions of these films we’ve all encountered. We wanted it to have texture and darks and lights, and the visuals to have a soulfulness. It was important that the world we created felt like an authentic and emotional environment.

Autumn Durald Arkapaw

How early did you get involved in the production? What were some of the discussions about conveying the story arc visually?
Ry and I met early on before she left for prep in New York. She shared with me her passion for wanting to make something new in this genre. That was always the basis for me when I thought about the story unfolding over one day and the arc of these characters. It was important for us to have the light show their progression through the city, but also have it highlight their love.

How did you go about choosing the right camera and lenses to achieve the look?
Ry was into anamorphic before I signed on, so it was already alluring to me once she sent me her look book and visual inspirations. I tend to shoot mostly in the Panavision anamorphic format, so my love goes deep for this medium. As for our camera, the ARRI Alexa Mini was our first choice since it renders a filmic texture, which is very important to me.

Any challenging scene or scenes that you are particularly proud of?
One of my favorite scenes/shots in the film is when Daniel (Charles Melton) sees Natasha (Yara Shahidi) for the first time in Grand Central Station. We had a Scorpio 23-foot telescopic crane on the ground floor. It is a beautiful shot that pulls out, booms down from Daniel’s medium shot in the glass staircase windows, swings around the opposite direction and pushes in while also zooming in on a 12:1 into an extreme closeup of Natasha’s face. We only did two takes and we nailed it on the first one. My focus puller, Ethan Borsuk, is an ace, and so is my camera operator, Andrew Fletcher. We all celebrated that one.

The Sun is Also a Star

Were you involved in the final color grading? What’s important to you about the collaboration between DP and colorist?
Yes, we did final color at Company 3 in New York. Drew Geary was our DI colorist. I do a lot of color on set, and I like to use the on-set LUT for the final as well. So, it’s important that my colorist and I share the same taste. I also like to work fast. Drew was amazing. He was fantastic to work with and added a lot to the overall look and feel.

What inspires you artistically?
Talented, inspiring, hardworking people. Because filmmaking is a team effort, and those around me inspire me to make better art.

How do you stay on top of advancing technology that serves your vision?
Every opportunity I get to shoot is an opportunity to try something new and tell a story differently. Working with directors that like to push the envelope is always a plus. Since I work a lot in commercials, that always affords me the occasion to try new technology and have fun with it.

Has any recent or new technology changed the way you work, looking back over the past few years?
I recently wrapped a film where we shot a few scenes with the iPhone. Something I would have never considered in the past, but the technology has come a long way. Granted the film is about a YouTube star, but I was happily surprised at how decent some of the stuff turned out.

What are some of your best practices or rules you try to follow on each job?
Always work fast and always make it look the best you can while, most importantly, telling the story.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Review: Western Digital’s Blue SN500 NVMe SSD

By Brady Betzel

Since we began the transfer of power from the old standard SATA 3.5-inch hard drives to SSD drives, multimedia-based computer users have seen a dramatic uptick in read and write speeds. The only issue has been price. You can still find a 3.5-inch brick drive, ranging in size from 2TB to 4TB, for under $200 (maybe closer to $100), but if you upgraded to an SSD drive over the past five years, you were looking at a huge jump in price. Hundreds, if not thousands, of dollars. These days you are looking at just a couple of hundred for 1TB and even less for 256GB or 512GB SSD.

Western Digital hopes you’ll think of NVMe SSD drives as more of an automatic purchase than a luxury with the Western Digital Blue SN500 NVMe M.2 2280 line of SSD drives.

Before you get started, you will need a somewhat modern computer with an NVMe M.2-compatible motherboard (also referred to as a PCIe Gen 3 interface). This NVMe SSD is a “B+M key” configuration, so you will need to make sure you are compatible. Once you confirm that your motherboard is compatible, you can start shopping around. The Western Digital Blue series has always been the budget-friendly level of hard drives. Western Digital also offers the next level up: the Black series. In terms of NVMe SSD M.2 drives, the Western Digital Blue series drives will be budget-friendly, but they also use two fewer PCIe lanes, which results in slower read/write speeds. The Black series uses up to four PCIe lanes and also has a heat sink to dissipate the heat. But for this review, I am focusing on the Blue series and how it performs.

On paper, the Western Digital Blue SN500 NVMe SSD is available in either 250GB or 500GB sizes, measures approximately 80mm long and uses the M.2 2280 form factor for the PCIe Gen 3 interface in up to two lanes. Technically, the 500GB drive can achieve up to 1,700MB/s read and 1,450MB/s write speeds, and the 250GB version can achieve up to 1,700MB/s read and 1,300MB/s write speeds.

As of this review, the 250GB version sells for $53.99, while the 500GB version sells for $75.99. You can find specs on the Western Digital website and learn more about the Black series as well.

One of the coolest things about these NVMe drives is that they come standard with a five-year limited warranty (or until the drive reaches its max endurance limit, whichever comes first). The max endurance (aka TBW — terabytes written) for the 250GB SSD is 150TB, while the max endurance for the 500GB version is 300TB. Both versions have a MTTF (mean time to failure) of 1.75 million hours.
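Some quick arithmetic puts those TBW figures in everyday terms. A minimal sketch (the `endurance_per_day` helper is illustrative, not a Western Digital tool):

```python
def endurance_per_day(tbw_tb, warranty_years, capacity_gb):
    """Rough endurance math: GB writable per day and drive-writes-per-day (DWPD)."""
    days = warranty_years * 365
    gb_per_day = tbw_tb * 1000 / days      # total TBW spread over the warranty period
    dwpd = gb_per_day / capacity_gb        # how many full-drive writes per day that allows
    return gb_per_day, dwpd

# Figures from the spec sheet: 250GB -> 150 TBW, 500GB -> 300 TBW, 5-year warranty
for capacity, tbw in ((250, 150), (500, 300)):
    gb_day, dwpd = endurance_per_day(tbw, 5, capacity)
    print(f"{capacity}GB: ~{gb_day:.0f} GB/day, {dwpd:.2f} DWPD")
```

In other words, both capacities work out to roughly a third of a full drive write per day for five years, which is plenty for typical editing scratch-disk use.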

In addition, the drive uses an in-house controller and 3D NAND logic. Now those words might sound like nonsense, but the in-house controller is what tells the NVMe what to do and when to do it (it’s essentially a dedicated processor), while 3D NAND is a way of cramming more memory into smaller spaces. Instead of adding more memory on the same plane along the x- or y-axis, hard drive manufacturers achieve more storage space by stacking memory layers vertically along the z-axis.

Testing Read and Write Speeds
Keep in mind that I ran these tests on a Windows-based PC. Doing a straight file transfer, I was getting about 1GB/s. When using Crystal Disk Mark, I would get a burst of speed at the top, slow down a little and then mellow out. Using a 4GB sample, my speeds were:
“Seq Q32T1” – Read: 1749.5 MB/s – Write: 1456.6 MB/s
“4KiB Q8T8” – Read: 1020.4 MB/s – Write: 1039.9 MB/s
“4KiB Q32T1” – Read: 732.5 MB/s – Write: 676.5 MB/s
“4KiB Q1T1” – Read: 35.77 MB/s – Write: 185.5 MB/s

If you would like to read exactly what these types of tests entail, check out the Crystal Disk Mark info page. In the AJA System Test I had a little drop-off, but with a 4GB test file size, I got an initial read speed of 1457MB/s and a write speed of 1210MB/s, which seems to fall more in line with what Western Digital is touting. The second time I ran the AJA System Test, I got a read speed of 1458MB/s and a write speed of 883MB/s. I wanted a third opinion, so I ran the Blackmagic Design Disk Speed Test (you’ll have to install drivers for a Blackmagic card, like the UltraStudio 4K). On my first run, I got a read speed of 1359.6MB/s and a write speed of 1305.8MB/s. On my second run, I got a read speed of 1340.5MB/s and a write speed of 968.3MB/s. My read numbers were generally above 1300MB/s, and my write numbers varied between 800 and 1000MB/s. Not terrible for a sub-$100 hard drive.
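For the curious, the basic idea behind a sequential-throughput test is simple: time how long it takes to write and then read back a large file. A rough Python sketch follows (names like `sequential_write_read` are mine, and note that the read pass will often be inflated by the OS page cache, which is why dedicated tools like Crystal Disk Mark and the AJA System Test are more rigorous):

```python
import os
import tempfile
import time

CHUNK = 4 * 1024 * 1024     # 4MB blocks, mimicking a large sequential transfer
TOTAL = 64 * 1024 * 1024    # 64MB test file, small enough for a quick run

def sequential_write_read(path):
    """Time a sequential write, then a sequential read, and return MB/s for each."""
    data = os.urandom(CHUNK)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(TOTAL // CHUNK):
            f.write(data)
        f.flush()
        os.fsync(f.fileno())  # force data to the device, not just the OS write cache
    write_mbs = (TOTAL / (1024 * 1024)) / (time.perf_counter() - start)

    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(CHUNK):
            pass
    read_mbs = (TOTAL / (1024 * 1024)) / (time.perf_counter() - start)
    return read_mbs, write_mbs

if __name__ == "__main__":
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        test_path = tmp.name
    try:
        read_mbs, write_mbs = sequential_write_read(test_path)
        print(f"Read: {read_mbs:.1f} MB/s - Write: {write_mbs:.1f} MB/s")
    finally:
        os.remove(test_path)
```

Queue-depth and random 4KiB tests like the ones in the Crystal Disk Mark results require asynchronous or multi-threaded I/O, which is beyond this simple sketch.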

Summing Up
In the end, the Western Digital Blue SN500 NVMe SSD is an amazing value at under $100, and hopefully we will get expanded sizes in the future. The drive is a B+M key configuration, so when you are looking at compatibility, make sure to check which key your PCIe card, external drive case or motherboard supports. It is typically M or B+M key, but I found a PCI card that supported both. If you need more space and speed than the WD Blue series can offer, check out Western Digital’s Black series of NVMe SSDs.

The Black series’ sticker price starts to go up significantly at the 1TB and 2TB marks: $279.99 and $529.99, respectively (with the heat sink attachment). If you stick to the 500GB Blue, you are looking at a more modest price tag.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Steinberg’s SpectraLayers Pro 6: visual audio editing with ARA support

Steinberg’s SpectraLayers Pro 6 audio editing software is now available. First distributed by Sony Creative Software and then by Magix Software, the developers behind SpectraLayers have joined forces with Steinberg to release its sixth iteration.

Unlike most audio editing tools, SpectraLayers offers a visual approach to audio editing, allowing users to visualize audio in the spectral domain (in 2D and 3D) and to manipulate its spectral data in many different ways. While many dedicated audio pros typically edit with their ears, this offering targets those who are more comfortable with visuals leading their editing decisions.

With its 25 advanced tools, SpectraLayers Pro 6 provides precision-editing within the spectral domain, comparable with the editing capabilities applied in high-performance photo editing software: modification, selection, measurement and drawing. Think Adobe Photoshop for audio editing.

The features newly introduced in SpectraLayers Pro 6 include ARA 2 support; next to the standalone application, Version 6 offers an ARA plug-in that seamlessly integrates into every ARA 2-compatible DAW, such as Nuendo and Cubase, to be used as a native editor. Fades along the selection border are one of the innovative features in SpectraLayers, and Pro 6 now includes visible fade masks and allows users to select from the many available fade types.

SpectraLayers’ advanced selection engine now features nine revamped selection tools — including the new Transient Selector — making selections more flexible. The new Move tool helps users transform audio intuitively: grab layers to activate and move or scale them. SpectraLayers Pro 6 also provides external editor integration, allowing users to include other editor software so that any selection can be processed by them as well.

“This new version of SpectraLayers offers a refined and more intuitive user interface inspired by picture editors and a new selection system combining multiple fade masks, bringing spectral editing and remixing to a whole new level. We’re also excited by the possibilities unlocked by the new ARA connection between SpectraLayers, Cubase and Nuendo, bringing spectral mixing and editing right within your DAW,” says Robin Lobel, creator of SpectraLayers.

The user interface of SpectraLayers Pro 6 has been completely redesigned, building on conventions familiar from image editing software. The menus have been redesigned and the panels are collapsible; the Layers panel is customizable; and users can now refer to comprehensive tool tip documentation and a new user manual.

The full retail version of SpectraLayers Pro 6 is available as download through the Steinberg Online Shop at the suggested retail price of $399.99, together with various downloadable updates from previous versions.

Behind the Title: Cinematic Media head of sound Martin Hernández

This audio post pro’s favorite part of the job is the start of a project — having a conversation with the producer and the director. “It’s exciting, like any new relationship,” he says.

Name: Martin Hernández

Job Title: Supervising Sound Editor

Company: Mexico City’s Cinematic Media

Can you describe Cinematic Media and your role there?
I lead a new sound post department at Cinematic Media, Mexico’s largest post facility focused on television and cinema. We take production sound through the full post process: effects, backgrounds, music editing… the whole thing. We finish the sound on our mix stages.

What would surprise people most about what you do?
We want the sound to go unnoticed. The viewer shouldn’t be aware that something has been added or is unnatural. If the viewer is distracted from the story by the sound, it’s a lousy job. It’s like an actor whose performance draws attention to himself. That’s bad acting. The same applies to every aspect of filmmaking, including sound. Sound needs to help the narrative in a subjective and quiet way. The sound should be unnoticed… but still eloquent. When done properly, it’s magical.

Hernández has been working on Easy for Netflix.

What’s your favorite part of the job?
Entering the project for the first time and having a conversation with the team: the producer and the director. It’s exciting, like any new relationship. It’s beautiful. Even if you’re working with people you’ve worked with before, the project is newborn.

My second favorite part is the start of sound production, when I have a picture but the sound is a blank page. We must consider what to add. What will work? What won’t? How much is enough or too much? It’s a lot like cooking. The dish might need more of this spice and a little less of that. You work with your ingredients, apply your personal taste and find the right flavor. I enjoy cooking sound.

What’s your least favorite part of the job?
Me.

What do you mean?
I am very hard on myself. I only see my shortcomings, which are, to tell you the truth, many. I see my limitations very clearly. In my perception of things, it is very hard to get where I want to go. Often you fail, but every once in a while, a few things actually work. That’s why I’m so stubborn. I know I am going to have a lot of misses, so I do more than expected. I will shoot three or four times, hoping to hit the mark once or twice. It’s very difficult for me to work with me.

What is your most productive time of the day?
In the morning. I’m a morning person. I work from my own place, very early, like 5:30am. I wake up thinking about things that I left behind in the session. It’s useless to remain in bed, so I go to my studio and start working on these ideas. It’s amazing how much you can accomplish between 6am and 9am. You have no distractions. No one’s calling. No emails. Nothing. I am very happy working in the mornings.

If you didn’t have this job, what would you be doing?
That’s a tough question! I don’t know anything else. Probably, I would cook. I’d go to a restaurant and offer myself as an intern in the kitchen.

For most people I know, their career is not something they’ve chosen; it was embedded in them when they were born. It’s a matter of realizing what’s there inside you and embracing it. I never, in my wildest dreams, expected to be doing this work.

When I was young, I enjoyed watching films, going to the movies, listening to music. My earliest childhood memories are sound memories, but I never thought that would be my work. It happened by accident. Actually, it was one accident after another. I found myself working with sound as a hobby. I really liked it, so I embraced it. My hobby then became my job.

So you knew early on that audio would be your path?
I started working in radio when I was 20. It happened by chance. A neighbor told me about a radio station that was starting up from scratch. I told my friend from school, Alejandro Gonzalez Iñárritu, the director. Suddenly, we’re working at a radio station. We’re writing radio pieces and doing production sound. It was beautiful. We had our own on-air, live shows. I was on in the mornings. He did the noon show. Then he decided to make films and I followed him.

Easy

What are some of your recent projects?
I just finished a series for Joe Swanberg, the third season of Easy. It’s on Netflix. It’s the fourth project I’ve done with Joe. I’ve also done two shows here in Mexico. The first one is my first full-time job as supervisor/designer for Argos, the company led by Epigmenio Ibarra. Yankee is our first series together for Netflix, and we’re cutting another one to be aired later in the year. It’s very exciting for me.

Is there a project that you’re most proud of?
I am very proud of the results that we’ve been getting on the first two series here in Mexico. We built the sound crew from scratch. Some are editors I’ve worked with before, but we’ve also brought in new talent. That’s a very joyful process. Finding talent is not easy, but once you do, it’s very gratifying. I’m also proud of this work because the quality is very good. Our clients are happy, and when they’re happy, I’m happy.

What pieces of technology can you not live without?
Avid Pro Tools. It’s the universal language for sound. It allows me to share sound elements and sessions from all over the world, just like we do locally, between editing and mixing stages. The second is my converter. We are using the Red system from Focusrite. It’s a beautiful machine.

This is a high-stress job with deadlines and client expectations. What do you do to de-stress from it all?
Keep working.

Veteran episodic colorist Scott Klein joins Light Iron

Colorist Scott Klein has joined post house Light Iron, which has artists working on feature films, episodic series and music videos at its Los Angeles- and New York-based studios. Klein brings with him 40 years of experience supervising a variety of episodic series.

“While Light Iron was historically known for its capabilities with feature films, we have developed an equally strong episodic division, and Scott builds upon our ongoing commitment to providing the talent and technology necessary for supporting all formats and distribution platforms,” says GM Peter Cioni of Light Iron.

Klein’s list of credits includes Fox’s Empire, HBO’s Deadwood: The Movie and Showtime’s Ray Donovan. He also collaborated on the series Bosch, True Blood, The Affair, Halt and Catch Fire, Entourage and The Sopranos. Klein is also an associate member of the American Society of Cinematographers (ASC). He will be working on Blackmagic’s DaVinci Resolve.

“I really enjoy the artistic collaboration with filmmakers,” he says. “It is great to be part of a facility with such a pure passion for supporting the creative through technology. Colorists need strong technology that serves as a means to best express the feelings being conveyed in the images and further enhance the moods that draw audiences into a story.”

Also joining Klein are his colleagues and fellow colorists Daniel Yang, Jesús Borrego and Ara Thomassian. They join Light Iron after working together at Warner Bros. and then Technicolor.

In addition to growing its team of artists to support the expanding market and client needs, Light Iron has also expanded its physical footprint with a second Hollywood-based location a short distance from its flagship facility. A full breadth of creative finishing services for feature films and episodic series is available at both locations. Light Iron also has locations in Atlanta, Albuquerque, Chicago and New Orleans.


Quick Chat: Robert Ryang on editing Netflix’s Zion doc

Back in May, Cut+Run’s Robert Ryang took home a Sports Emmy in the Outstanding Editing category for the short film Zion. The documentary, which premiered at Sundance and was released on Netflix, tells the story of Zion Clark, a young man who was born without legs, grew up in foster care and found community and hope in wrestling.

Robert Ryang and his Emmy for his work on Zion.

Clark began wrestling in second grade against his able-bodied peers. The physical challenge became a therapeutic outlet and gave him a sense of family. Moving from foster home to foster home, wrestling became the one constant in his childhood.

Editor Ryang and Zion’s director, Floyd Russ, had worked together previously — on the Ad Council’s Fans of Love and SK-II’s Marriage Market Takeover, among other projects — and developed a creative shorthand that helped tell this compelling, feel-good story.

We spoke with Ryang about the film, his process and working with the director.

How and when did you become involved in this project?
In the spring of 2017, my good friend director Floyd Russ asked me to edit his passion project. Initially, I was hesitant, since it was just after the birth of my second child. Two years later, both the film and my kid have turned out great.

You’ve worked with him before. What defines the way you work together?
I think Floyd and I work really well together because we’re such good friends; we don’t have to be polite. He’ll text me ideas any time of day, and I feel comfortable enough to tell him if I don’t like something. He wins most of the fights, but I think this dialectic probably makes the work better.

How did you approach the edit on the film? How did you hone the story structure?
At first, Floyd had a basic outline that I followed just to get something on the timeline. But from there, it was a pretty intense process of shuffling and reshaping. At one point, we tried to map the beats onto a whiteboard, and it looked like a Richter scale. Editor Adam Bazadona helped cut some of these iterations while I was on paternity leave.

How does working on a short film like this differ — hats worn, people involved, etc. — from advertising projects?
The editing process was a lot different from most commercial projects in that it was only Floyd and me in the room. Friends floated a few thoughts here and there, but we were only working toward a director’s cut.

What tools did you use?
Avid Media Composer for editing, some Adobe After Effects for rough comps.

What are the biggest creative and technical challenges you faced in the process?
With docs, there are usually infinite ways to put it together, so we did a lot of exploration. Floyd definitely pushed me out of my comfort zone in prescribing the more abstract scenes, but I think those touches ultimately made the film stand out.

From Sundance to Netflix to Sports Emmy Awards. Did you ever imagine it would take this journey?
There wasn’t much precedent for a studio or network acquiring a 10-minute short, so our biggest hope was that it would get into Sundance and then live on Vimeo. It really exceeded everyone’s expectations. And I would never have imagined receiving an Emmy, but I’m really honored I did.

CVLT hires Katya Pavlova as head of post

Bi-coastal video production studio CVLT has added Katya Pavlova as head of post production. She will be based in the studio’s New York location.

Pavlova joins the team after six years at The Mill, where she produced projects that include David Bowie’s Life on Mars music video remake directed by photographer Mick Rock, as well as Steven Klein’s augmented reality experience for W Magazine’s cover, featuring an interactive 3D digital portrait of Katy Perry.

In her new role, Pavlova will focus on growing CVLT’s post production operation and developing new partnerships. She brings career expertise in a broad range of editorial, VFX, design and CG disciplines across digital and broadcast. In her time as a producer at The Mill, she worked on a variety of projects for brands including Netflix, Facebook, Ralph Lauren, Jimmy Choo and Vogue.

“We have seen post production needs shift focus from traditional media channels to multi-platform requirements, including emerging technology like augmented reality and crafting short-form videos for social media and mobile audiences,” says Pavlova. “At CVLT, I intend to adapt our team to execute post on advanced AR projects as well as quick turnaround videos for social channels.”

Mixing sounds of fantasy and reality for Rocketman

By Jennifer Walden

Paramount Pictures’ Rocketman is a musical fantasy about the early years of Elton John. The story is told through flashbacks, giving director Dexter Fletcher the freedom to bend reality. He blended memories and music to tell an emotional truth as opposed to delivering hard facts.

Mike Prestwood Smith

The story begins with Elton John (Taron Egerton) attending a group therapy session with other recovering addicts. Even as he’s sharing details of his life, he’s stretching the truth. “His recollection of the past is not reliable. He often fantasizes. He’ll say a truth that isn’t really the case, because when you flash back to his memory, it is not what he’s saying,” says BAFTA-winning re-recording mixer Mike Prestwood Smith, who handled the film’s dialogue and music. “So we’re constantly crossing the line of fantasy even in the reality sections.”

For Smith, finding the balance between fantasy and reality was what made Rocketman unique. There’s a sequence in which pre-teen Elton (Kit Connor) evolves into grown-up Elton to the tune of “Saturday Night’s Alright for Fighting.” It was a continuous shot, so the camera tracks pre-teen Elton playing the piano, who then gets into a bar fight that spills into an alleyway that leads to a fairground where a huge choreographed dance number happens. Egerton (whose actual voice is featured) is singing the whole way, and there’s a full-on band under him, but specific effects from his surrounding environment poke through the mix. “We have to believe in this layer of reality that is gluing the whole thing together, but we never let that reality get in the way of enjoying the music.”

Smith helped the pre-recorded singing feel in situ by adding different reverbs — like Audio Ease’s Altiverb, Exponential Audio’s PhoenixVerb and Avid’s ReVibe. He created custom reverbs from impulse responses taken from the rooms on set to ground the vocal in that space and help sell the reality of it.
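The technique Smith describes, convolving a dry vocal with an impulse response measured in the actual room, is the basis of every convolution reverb. Here is a minimal sketch of the idea, assuming NumPy; the function name, toy impulse response and mix values are illustrative only, not anything from Smith's actual plugin chain:

```python
import numpy as np

def convolve_reverb(dry, impulse_response, wet_mix=0.3):
    """Convolution reverb: the dry signal is convolved with a room's
    measured impulse response, then blended back with the dry signal.
    wet_mix of 0.0 is fully dry, 1.0 is fully reverberant."""
    wet = np.convolve(dry, impulse_response)   # full room response
    wet = wet[: len(dry)]                      # trim the tail to length
    peak = float(np.max(np.abs(wet))) or 1.0
    wet = wet / peak * np.max(np.abs(dry))     # rough level match
    return (1.0 - wet_mix) * dry + wet_mix * wet

# Toy example: a single click through a decaying-echo "impulse response."
dry = np.zeros(8)
dry[0] = 1.0
ir = np.array([1.0, 0.0, 0.5, 0.0, 0.25])     # stand-in for a measured IR
out = convolve_reverb(dry, ir, wet_mix=0.5)
```

In a real workflow the impulse response is a recording of how the room reacts to a sweep or clap, so the convolution stamps that room's reflections onto the close-mic'd studio vocal.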

For instance, when Elton is in the alleyway, Smith added a slap verb to Egerton’s voice to make it feel like it’s bouncing off the walls. “But once he gets into the main verses, we slowly move away from reality. There’s this flux between making the audience believe that this is happening and then suspending that belief for a bit so they can enjoy the song. It was a fine line and very subjective,” he says.

He and re-recording mixer/supervising sound editor Matthew Collinge spent a lot of time getting it to play just right. “We had to be very selective about the sound of reality,” says Smith. “The balance of that whole sequence was very complex. You can never do those scenes in one take.”

Another way Smith helped the pre-recorded vocals to sound realistic was by creating movement using subtle shifts in EQ. When Elton moves his head, Smith slightly EQ’d Egerton’s vocals to match. These EQ shifts “seem little, but collectively they have a big impact on selling that reality and making it feel like he’s actually performing live,” says Smith. “It’s one of those things that if you don’t know about it, then you just accept it as real. But getting it to sound that real is quite complicated.”
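One way to picture those EQ shifts: when a singer turns off-axis, high frequencies fall away faster than lows, so even a simple first-order low-pass filter, swept subtly, can suggest head movement. This toy Python illustration is my own assumption of the principle, not Smith's actual EQ moves:

```python
import numpy as np

def one_pole_lowpass(x, cutoff_hz, sample_rate=48000.0):
    """First-order low-pass: rolls off high frequencies, roughly what
    happens to a voice as the singer turns away from the listener."""
    dt = 1.0 / sample_rate
    rc = 1.0 / (2.0 * np.pi * cutoff_hz)      # RC time constant
    alpha = dt / (rc + dt)
    y = np.zeros_like(x, dtype=float)
    acc = 0.0
    for i, sample in enumerate(x):
        acc += alpha * (sample - acc)         # smooth toward the input
        y[i] = acc
    return y

sr = 48000.0
t = np.arange(4800) / sr
voice_tone = np.sin(2 * np.pi * 100 * t)      # low harmonic: barely touched
sibilance = np.sin(2 * np.pi * 10000 * t)     # high harmonic: rolled off
dulled = one_pole_lowpass(sibilance, cutoff_hz=1000.0, sample_rate=sr)
```

A real mix move would automate the cutoff (or a shelf gain) frame by frame against the picture, which is exactly why the shifts read as performance rather than processing.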

For example, there’s a scene in which Egerton is working out “Your Song,” and the camera cuts from upstairs to downstairs. “We are playing very real perspectives using reverb and EQ,” says Smith. Then, once Elton gets the song, he gives Bernie Taupin (Jamie Bell) a knowing look. The music gets fleshed out with a more complicated score, with strings and guitar. Next, Elton is recording the song in a studio. As he’s singing, he’s looking down and playing piano. Smith EQ’d all of that to add movement, so “it feels like that performance is happening at that time. But not one single sound of it is from that moment on set. There is a laugh from Bernie, a little giggle that he does, and that’s the only thing from the on-set performance. Everything else is manufactured.”

In addition to EQ and reverb, Smith used plugins from Helsinki-based sound company Oeksound to help the studio recordings to sound like production recordings. In particular, Oeksound’s Spiff plugin was useful for controlling transients “to get rid of that close-mic’d sound and make it feel more like it was captured on set,” Smith says. “Combining EQ and compression and adding reverb helped the vocals to sound like sync, but at the same time, I was careful not to take away too much from the quality of the recording. It’s always a fine line between those things.”

The most challenging transitions were going from dialogue into singing. Such was the case with quiet moments like “Your Song” and “Goodbye Yellow Brick Road.” In the latter, Elton quietly sings to his reflection in a mirror backstage. The music slowly builds up under his voice as he takes off down the hallway, and by the time he hops into a cab outside, it’s a full-on song. Part of what makes the fantasy feel real is that his singing feels like sync. The vocals had to sound impactful and engage the audience emotionally, but at the same time they had to sound believable — at least initially. “Once you’re into the track, you have the audience there. But getting in and out is hard. The filmmakers want the audience to believe what they’re seeing, that Taron was actually in the situations surrounded by a certain level of reality at any given point, even though it’s a fantasy,” says Smith.

The “Rocketman” song sequence is different though. Reality is secondary and the fantasy takes control, says Smith. “Elton happens to be having a drug overdose at that time, so his reality becomes incredibly subjective, and that gives us license to play it much more through the song and his vocal.”

During “Rocketman,” Elton is sinking to the bottom of a swimming pool, watching a younger version of himself play piano underwater. On the music side, Smith was able to spread the instruments around the Dolby Atmos surround field, placing guitar parts and effect-like orchestrations into speakers discretely and moving those elements into the ceiling and walls. The bubble sound effects and underwater atmosphere also add to the illusion of being submerged. “Atmos works really well when you have quiet, and you can place sounds in the sound field and really hear them. There’s a lot of movement musically in Rocketman and it’s wonderful to have that space to put all of these great elements into,” says Smith.

That sequence ends with Elton coming on stage at Dodger Stadium and hitting a baseball into the massive crowd. The whole audience — 100,000 people — sing the chorus with him. “The moment the crowd comes in is spine-tingling. You’re just so with him at that point, and the sound and the music are doing all of that work,” he explains.

The Music
The music was a key ingredient to the success of Rocketman. According to Smith, they were changing performances from Egerton and also orchestrations right through the post sound mix, making sure that each piece was the best it could be. “Taron [Egerton] was very involved; he was on the dub stage a lot. Once everything was up on the screen, he’d want to do certain lines again to get a better performance. So, he did pre-records, on-set performances and post recording as well,” notes Smith.

Smith needed to keep those tracks live through the mix to accommodate the changes, so he and Collinge chose Avid S6 control surfaces and mixed in-the-box as opposed to printing the tracks for a mix on a traditional large-format console. “To have locked down the music and vocals in any way would have been a disaster. I’ve always been a proponent of mixing inside Pro Tools mainly because workflow-wise, it’s very collaborative. On Rocketman, having the tracks constantly addressable — not just by me but for the music editors Cecile Tournesac and Andy Patterson as well — was vital. We were able to constantly tweak bits and pieces as we went along. I love the collaborative nature of making and mixing sound for film, and this workflow allows for that much more so than any other. I couldn’t imagine doing this any other way,” says Smith.

Smith and Collinge mixed in native Dolby Atmos at Goldcrest London in Theatre 1 and Theatre 2, and also at Warner Bros. De Lane Lea. “It was such a tight schedule that we had all three mixing stages going for the very end of it, because it got a bit crazy as these things do,” says Smith. “All the stages we mixed at had S6s, and I just brought the drives with me. At one point we were print mastering and creating M&Es on one stage and doing some fold-downs on a different stage, all with the same session. That made it so much more straightforward and foolproof.”

As for the fold-down from Atmos to 5.1, Smith says it was nearly seamless. The pre-recorded music tracks were mixed by music producer Giles Martin at Abbey Road. Smith pulled those tracks apart, spread them into the Atmos surround field and then folded them down to 5.1. “Ultimately, the mixing that Giles Martin did at Abbey Road was a great thing because it meant the fold-downs really had the best backbone possible. Also, the way that Dolby has been tweaking their fold-down processing, it’s become something special. The fold-downs were a lot easier than I thought they’d be,” concludes Smith.
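The simplest form of a fold-down is a fixed mixing matrix. Dolby's Atmos renderer is far more sophisticated (it re-renders objects per speaker layout), but the basic matrixing idea can be sketched as a 5.1-to-stereo fold-down using the common -3 dB center/surround coefficients. Everything here is illustrative, not Dolby's actual processing:

```python
import numpy as np

# Center and surrounds folded in at -3 dB, i.e. a gain of 1/sqrt(2).
G = 1.0 / np.sqrt(2.0)

def folddown_51_to_stereo(ch):
    """ch: dict with 'L','R','C','LFE','Ls','Rs' equal-length arrays.
    Returns a stereo pair; the LFE channel is typically omitted."""
    left = ch["L"] + G * ch["C"] + G * ch["Ls"]
    right = ch["R"] + G * ch["C"] + G * ch["Rs"]
    return left, right

# Toy signal: dialogue sitting only in the center channel.
n = 4
ch = {k: np.zeros(n) for k in ["L", "R", "C", "LFE", "Ls", "Rs"]}
ch["C"][:] = 1.0
left, right = folddown_51_to_stereo(ch)
```

Center-channel dialogue lands equally in both stereo speakers at reduced gain, which is why a well-balanced multichannel mix, like Martin's music stems, folds down gracefully.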


Jennifer Walden is a New Jersey-based audio engineer and writer. Follow her on Twitter @audiojeney.

Review: The Loupedeck+ editing console for stills and video

By Brady Betzel

As an online editor, I am often tasked with wearing multiple job hats, including VFX artist, compositor, offline editor, audio editor and colorist, which requires me to use special color correction panel hardware. I really love photography and cinematography but have never been able to use that color correction hardware in Adobe’s Photoshop or Lightroom, so for the most part I’ve only done basic photo color correction.

You could call it a hobby, although this knowledge definitely helps many aspects of my job. I’ve known Photoshop for years and use it for things like building clean plates to use in apps like Boris FX Mocha Pro and After Effects, but I had never really mastered Lightroom. However, that changed when I saw the Loupedeck. I was really intrigued with its unique layout but soon dismissed it since it didn’t work on video… until now. I’m happy to say the new Loupedeck+ works with both photo and video apps.

Much like the Tangent Element and Wave or the Blackmagic Micro and Mini panels, the Loupedeck+ is made to adjust parameters like contrast, exposure, saturation, highlights, shadows and individual colors. But, unlike the Tangent or Blackmagic products, the Loupedeck+ functions not only in Adobe Premiere and Apple Final Cut Pro X, but also in image editing apps like Lightroom 6, Photoshop CC and Skylum Aurora HDR, the audio editing app Adobe Audition and the VFX app Adobe After Effects. There’s also beta integration with Capture One.

It works via a USB 2.0 connection on Windows 10 and Mac OS 10.12 or later. In order to use the panel and adjust its keys, you must also download the Loupedeck software, which you can find here. The Loupedeck+ costs just $249, which is significantly less than many of the other color correction panels on the market that offer this many functions.

Digging In
In this review, I am going to focus on Loupedeck+’s functionality with Premiere, but keep in mind that half of what makes this panel interesting is that you can jump into Lightroom Classic or Photoshop and have the same, if not more, functionality. Once you install the Loupedeck software, you should restart your system. When I installed the software I had some weird issues until I restarted.

When inside Premiere, you need to tell the app you are using this specific control panel by going to Edit > Preferences > Control Surface, clicking “Add” and selecting Loupedeck 2. (That’s the path on a PC; Mac OS works in a similar way.) From there you are ready to use the Loupedeck+. If you have any customized keyboard shortcuts (like I do), I would suggest resetting them to default for the time being, since custom mappings might cause the Loupedeck+ to trigger different keypresses than you intended.

Once I got inside of Premiere, I immediately opened up the Lumetri color panels and began adjusting contrast, exposure and saturation, which are all clearly labeled on the Loupedeck+. Easy enough, but what if you want to use the Loupedeck+ as an editing panel as well as a basic color correction console? That’s when you will want to print out pages six through nine of the Premiere Pro Loupedeck+ manual, which you can find here. (If you like to read on a tablet you could pull that up there, but I like paper for some reason… sorry trees.) In these pages, you will see that there are four layers of controls built into the Loupedeck+.

Shortcuts
Not only can you advance frames using the arrow keypad, jump to different edit points with the jog dial, change LUTs, add keyframes and extend edits, you also have three more layers of shortcuts. To get to the second layer of shortcuts, press the “Fn” button located toward the lower left, and the Fn layer will appear. Here you can do things like adjust the shadows and midtones on the X and Y axes, access the Type Tool or add edits to all tracks. To go even further, you can access the “Custom” mode, which has defaults but can be customized to whichever keypress and functions the Loupedeck+ app allows.

Finally, while in the Custom mode, you can press the Fn button again and enter “Custom Fn” mode — the fourth and final layer of shortcuts. Man, that is a lot of customizable buttons. Do I need all those buttons? Probably not, but still, they are there — and it’s better to have too much than not enough, right?

Beyond the hundreds of shortcuts on the Loupedeck+ console, you have eight color-specific scroll wheels. In Lightroom Classic, these tools are self-explanatory, as they adjust each color’s intensity.

In Premiere they work a little differently. To the left of the color scroll wheels are three buttons: hue, saturation and luminance (Hue, Sat and Lum, respectively). In the standard mode, they each equate to a different color wheel: Hue = highlights, Sat = midtones and Lum = shadows. The scroll wheel above red will adjust the up/down movement on the selected color wheel’s y-axis, orange will adjust the left/right movement on the x-axis, and yellow will adjust the intensity (or luminance) of the color wheel.

Controlling the Panel
In traditional color correction panels, color is controlled by roller balls, each surrounded by a literal wheel to control intensity. It’s another way to skin a cat. I personally love the feel of the Tangent Element Tk panel, which simply has three roller balls and rings to adjust the hue, but some people might like the ability to precisely control the color wheels on the x- and y-axes.

To solve my issue, I used both. In the preferences, I enabled both Tangent and Loupedeck options. It worked perfectly (once I restarted)! I just couldn’t get past the lack of hue balls and rings in the Loupedeck, but I really love the rest of the knobs and buttons. So in a weird hodge-podge, you can combine a couple of panels to get a more “affordable” set of correction panels. I say affordable in quotes because, as of this review, the Tangent Element Tk panels are over $1,100 for one panel, while the entire set is over $3,000.

So if you already have the Tangent Element Tk panel, but want a more natural button and knob layout, the Loupedeck+ is a phenomenal addition as long as you are staying within the Adobe or FCP X world. And while I clearly like the Tangent Elements panels, I think the layout and design of the Loupedeck+ is more efficient and more modern overall.

Summing Up
In the end, I really like the Loupedeck+. I love being able to jump back and forth between photo and video apps seamlessly with one panel. What I think I love the most is the “Export” button in the upper right corner of the Loupedeck+. I wish that button existed on all panels.

When using the Loupedeck+, you can really get your creative juices flowing by hitting the “Full Screen” button and color correcting away, even using multiple adjustments at once to achieve your desired look — similar to how a lot of people use other color correction panels. And at $249, the Loupedeck+ might be the best overall value of any editing/color correction panel currently out there.

Can I see using it when editing? I can, but I am such a diehard keyboard and Wacom tablet user that I have a hard time using a panel for editing functions like trimming and three-point edits. I did try the trimming functionality, and it was great, not only on a higher-end Intel Xeon-based system but also on an older Windows laptop. The responsiveness was pretty impressive, and I am a sucker for adjustments using dials, sliders and roller balls.

If you want to color correct using panels, I think the Loupedeck+ is going to fit the bill for you if you work in Adobe Creative Suite or FCP X. If you are a seasoned colorist, you will probably start to freak out at the lack of rollerballs to adjust hues of shadows, midtones and highlights. But if you are a power user who stays inside the Adobe Creative Cloud ecosystem, there really isn’t a better panel for you. Just print up the shortcut pages of the manual and tape them to the wall by your monitor for constant reference.

As with anything, you will only get faster with repetition. Not only did I test out color correcting footage for this review, I also used the Loupedeck+ in Adobe Lightroom Classic to correct my images!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Perpetual Grace’s DPs, colorist weigh in on show’s gritty look

You don’t have to get very far into watching the Epix series Perpetual Grace LTD to realize just how ominous this show feels. It begins with the opening shots, and by the time you’ve spent a few minutes with the dark, mysterious characters who populate this world — and gathered hints of the many schemes within schemes that perpetuate the story — the show’s tone is clear. With its black-and-white flashbacks and the occasional, gritty flash-forwards, Perpetual Grace gets pretty dark, and the action goes in directions you won’t see coming.

This bizarre show revolves around James (Westworld’s Jimmi Simpson), who gets caught up in what initially seems like a simple con that quickly gets out of control. Sir Ben Kingsley, Jacki Weaver, Chris Conrad and Luis Guzmán also star as an assortment of strange and volatile characters.

The series comes from the minds of executive producer Steve Conrad, who also served in that role on Amazon’s quirky drama Patriot, and Bruce Terris, who was both a writer and a first AD on that show.

These showrunners developed the look with other Patriot veterans: cinematographers James Whitaker and Nicole Hirsch Whitaker, who incorporated colorist Sean Coleman’s input before commencing principal photography.

Coleman left his grading suite at Company 3 in Santa Monica to spend several days at the series’ New Mexico location. While there he worked with the DPs to build customized LUTs for them to use during production. This meant that everyone on set could get a strong sense of how lighting, costumes, sets and locations would read with the show’s signature looks applied.

The Whitakers on set

“I’ve never been able to work with the final colorist this way,” says Whitaker, who also alternated directing duties with Conrad. “It was great having him there on set where we could talk about the subtleties of color. What should the sky look like? What should blood look like? Faces? Clothes?” Using Resolve, Coleman made two LUTs: the main one for the color portions and a different one specifically for the black-and-white parts.

The main look of the show is inspired by film noir and western movie tropes, all with a tip of the hat to Roger Deakins’ outstanding work on The Assassination of Jesse James by the Coward Robert Ford. “For me,” says Whitaker, “it’s about strong contrast, deep blacks and desert colors … the moodier the better. I don’t love very blue skies, but we wanted to keep some tonality there.”

“It’s real sweaty, gritty, warm, nicotine-stained kind of thing,” Coleman elaborates.

“When we showed up in New Mexico,” Whitaker recalls, “all these colors did exist at various times of the day, and we just leaned into them. When you have landscapes with big, blue skies, strong greens and browns, you could lean into that and make it overly saturated. We leaned the other way, holding the brown earth tones but pulling out some of the color, which is always better for skin tones.”

The LUTs, Whitaker notes, offer a lot more flexibility than the DPs would have if they used optical filters. Beyond the nondestructive aspect of a LUT, it also allows for a lot more complexity. “If you think about a ‘sepia’ or ‘tobacco’ filter or something like that, you think of an overall wash that goes across the entire frame, and I get immediately bored by that. It’s tricky to do something that feels like it’s from a film a long time ago without dating the project you’re working on now; you want a lot of flexibility to get [the imagery] where you want it to go.”

The series was shot in November through February, often in brutally cold environments. Almost the entire series (the present-day scenes and black-and-white flashbacks) was shot on ARRI Alexa cameras in a 2.0:1 aspect ratio. A frequent Whitaker/Hirsch Whitaker collaborator, DIT Ryan Kunkleman applied and controlled the LUTs so the set monitors reflected their effect on the look.

The flash-forwards, which usually occur in very quick spurts, were shot on a 16mm Bolex camera using Kodak’s 7203 (50D) and 7207 (250D) color negative film, which was pushed two stops in processing to enhance the grain for Coleman to build on in post.

Final color was done at Company 3’s Santa Monica facility, working primarily alongside the Whitakers. “We enhanced the noir look with the strong, detailed blacks,” says Coleman. Even though a lot of the show exudes dry desert heat, it was actually shot over a particularly cold winter in New Mexico. “Things were sometimes kind of cold-looking, so we’d twist things a bit. We also added some digital ‘grain’ to sort of muck it up a little.”

For the black and white, Coleman took the color material in Resolve and isolated just the blue channel in order to manipulate it independent of the red and green, “to make it more inky,” he says. “Normally, you might just drain the color out, but you can really go further than that if you want a strong black-and-white look. When you adjust the individual channel, you affect the image in a way that’s similar to the effect of shooting black-and-white film through a yellow filter. It helps us make darker skies and richer blacks.”
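The channel trick Coleman describes can be sketched as a channel-mixer conversion: instead of an equal-weight desaturation, the blue channel's contribution is reduced or subtracted, which darkens blue skies the way a yellow filter does on black-and-white film. Here is a toy version in Python; the weights are illustrative assumptions, not Coleman's Resolve grade:

```python
import numpy as np

def bw_channel_mix(rgb, r_w=0.5, g_w=0.6, b_w=-0.1):
    """Black-and-white conversion via a channel mixer. Down-weighting
    (here, subtracting) the blue channel mimics shooting B&W film
    through a yellow filter: blue skies render darker, blacks richer.
    Weights are illustrative, not a real grade."""
    rgb = np.asarray(rgb, dtype=float)
    gray = r_w * rgb[..., 0] + g_w * rgb[..., 1] + b_w * rgb[..., 2]
    return np.clip(gray, 0.0, 1.0)

sky = [0.4, 0.6, 1.0]     # blue-ish sky pixel
cloud = [0.9, 0.9, 0.9]   # neutral bright pixel
```

A plain Rec. 709 luminance conversion would render the sky pixel at roughly 0.59; down-weighting blue pulls it to about 0.46, reading as a darker, inkier sky, while the neutral cloud pixel is essentially unchanged.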

Sean Coleman

“We’ve booked a whole lot of hours together, and that provides a level of comfort,” says Hirsch Whitaker about her and Whitaker’s work with Coleman. “He does some wonderful painting [in Resolve] that helps make a character pop in the frame or direct the viewer’s eye to a specific part of the frame. He really enjoys the collaborative element of color grading.”

Whitaker seconds that emotion: “As a cinematographer, I look at color grading a bit like working on set. It’s not a one-person job. It takes a lot of people to make these images.”


Glassbox’s virtual camera toolset for Unreal, Unity, Maya

Virtual production software company Glassbox Technologies has released its virtual camera plugin DragonFly from private beta for public use. DragonFly offers professional virtual cinematography tools to filmmakers and content creators, allowing users to view character performances and scenes within computer-generated virtual environments in realtime, through the camera’s viewfinder, an external monitor or iPad.

Available for Unreal Engine, Unity 3D and Autodesk Maya, DragonFly delivers an inclusive virtual cinematography workflow that allows filmmakers and content creators to make and test creative decisions faster and earlier in the process, whittling down production cost on projects of all scopes and sizes.

This off-the-shelf toolkit allows users to create previz through postviz without the need for large teams of operators, costly hardware or proprietary tools. It is platform-agnostic and fits seamlessly into any workflow out of the box. Users can visualize and explore a CG virtual environment, then record, bookmark, create snapshots and replicate real camera movement as seamlessly as conducting a live-action shoot.

“Virtual production poses great potential for creators, but there were no off-the-shelf filming solutions available that worked out of the box,” notes co-founder/CPO Mariana Acuña. “In response, we made DragonFly: a virtual window that allows users to visualize complex sets, environments and performances through a viewfinder. Without the need for a big stage or mocap crew, it brings greater flexibility to the production and post pipeline for films, animation, immersive content, games and realtime VFX.”

The product was developed in collaboration with top Hollywood visualization and production studios, including The Third Floor for best-in-class results.

“Prior to DragonFly, each studio created its own bespoke virtual production workflow, which is costly and time-consuming per project. DragonFly makes realtime virtual production usable for all creators,” says Evelyn Cover, global R&D manager for The Third Floor. “We’re excited to collaborate with the Glassbox team to develop and test DragonFly in all kinds of production scenarios from previz to post, with astounding success.”

Glassbox’s second in-beta virtual production software solution, BeeHive — the multi-platform, multi-user collaborative virtual scene syncing, editing and review solution — is slated to launch later this summer.

DragonFly is now available for purchase or can be downloaded as a free 15-day trial from the Glassbox website. Pricing includes a permanent license option costing $750 (including $250 for the first year of support and updates) and a rental option costing $420 per year.

Behind the Title: Spot Welders’ editor Matt Osborne

After the time-consuming and sometimes stressful part of doing selects and putting together an assembly alone, I enjoy sitting in a room with the director and digging into the material.

NAME: Matt Osborne

COMPANY: Spot Welders

CAN YOU DESCRIBE YOUR COMPANY?
Spot Welders is a creative editorial company.

WHAT’S YOUR JOB TITLE?
Offline Editor

WHAT DOES THAT ENTAIL?
I take all the footage that production shoots, make selects on the best shots and performances, and craft it into a cohesive narrative or visually engaging film. I then work with the director, agency and client to get the best out of the material and try to make sure everyone is happy with the final result.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Probably the amount of footage that editors get these days. The average person might think we just cut out the bad bits or choose the best takes and string them together, but we might get up to 30 hours of footage or more for a single 60-second commercial with no storyboard. It’s our job to somehow make sense of it all.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I love working with directors. After the time-consuming and sometimes quite stressful part of doing selects and putting together an assembly alone, I really enjoy sitting in a room with the director and digging into the material. I like making sure we have the best moments and are telling the story in the most interesting way possible.

WHAT’S YOUR LEAST FAVORITE?
Sitting for long hours. I really want to try out one of those standing desks! Also, trying to wrangle 100 different opinions into the edit without butchering it.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
Probably the morning after sleeping on an edit. There’s something about coming in with fresh eyes and marveling at your wondrous edit from the previous night. Or, conversely, crying about the disaster you have in front of you that needs immediate fixing. Either way, I find this is the best time to get in the flow with new ideas and work very quickly at improving the edit.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
After high school, I spent about five years working at various ski resorts in Australia and Canada and snowboarding every day, so I guess I’d still be doing that.

WHY DID YOU CHOOSE THIS PROFESSION?
It sounds cheesy but in hindsight I was probably destined to be an editor. I was always drawn to puzzles and figuring out how things go together, and editing is a lot like a giant puzzle with no correct answers.

I made skate videos with two VHS decks as a teenager, and then I realized a few years later that you could do it on a computer and could get paid to basically do the same thing. That’s when I knew it was the job for me.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Audi, Nike, BMW and a couple of very cool passion projects, which will hopefully be released soon.

DO YOU PUT ON A DIFFERENT HAT WHEN CUTTING FOR A SPECIFIC GENRE?
Not really. It almost always comes down to storytelling. Whether that’s narrative or purely visual, you want to make the viewer feel something, so that’s always the goal. The methods to get there are usually pretty much the same.

Cayenne

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Commercially, probably the Porsche film, Cayenne, with Rob Chiu at Iconoclast. It was a big project for the global release of a new car. They shot in amazing locations, and the footage was incredible, so I felt a lot of pressure on that one. I’m really pleased with how it turned out.

Personally, the Medicine music video I did with Salomon Ligthelm and Khalid Mohtaseb was a humbling experience and something I’m very proud of. It’s actually more of a narrative short film than a music video and tells the fictionalized story of a real-life couple, in which the wife is blind.

It was a very sensitive story, shot beautifully and using non-actors. It might be the only time I’ve cried watching the rushes. I think we successfully managed to instill that raw emotion into the final edit.

Medicine

WHAT DO YOU USE TO EDIT?
I grew up on Final Cut Pro. I taught myself to edit back on Version 2 by reading the manual while working in a factory packing carrots. I was pretty upset when they ditched it, but I moved over to Avid Media Composer and haven’t looked back. I love it now. Well, maybe except for the effects tool.

ARE YOU OFTEN ASKED TO DO MORE THAN EDIT?
Yes, it’s sometimes expected these days that the offline will look and sound like the final product, so color grading, sound design, music editing, comping, etc.

Personally, if I have time, I’ll try and do some of these things on a basic level to get the edit approved, but it’s all going to be taken over by very talented professionals in their own craft who will do a much better job than I ever could. So I prefer to focus on the actual nuts and bolts — am I using the best shots? Am I telling this story in the most compelling, engaging and entertaining way possible? But sometimes you’ve got to throw a whoosh in to make people happy.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
iPhone, Garmin, MacBook. Although I spent a couple of weeks on beaches last year and learned we don’t really need any of it. Well, at least while you’re on the beach and not working!

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I’ve gotten completely addicted to running the past couple of years. I find there’s nothing better than a run at 5am to clear the mind.

Accusonus intros plugin bundles for sound and video editors

Accusonus is bringing its single-knob audio cleaning and noise reduction technology to its new ERA 4 Bundles for video editors, audio engineers and podcasters.

The ERA 4 Bundles (Enhancement and Repair of Audio) are a collection of single-knob audio cleaning plugins designed to reduce the complexity of the sound design and audio workflow without compromising sound quality or fidelity.

Accusonus says that its patented single-knob design appeals to professional editors, filmmakers and podcasters because it reduces the time-consuming audio repair workflow to a twist of a dial. Additionally, the ERA 4 Standard family of plugins enables aspiring content creators, YouTubers and film and audio students to quickly master audio workflows with minimal effort or expertise.

ERA 4 Bundles are available in two collections: The Standard Bundle and the Pro Bundle.

The ERA 4 Standard Bundle features audio cleaning plugins designed for speed and fidelity with minimal effort, even if users have never edited audio before. The Standard Bundle offers professional sound design and includes: Noise Remover, Reverb Remover, De-esser, Plosive Remover, Voice Leveler and De-clipper.

The ERA 4 Pro Bundle targets professional editors, audio engineers and podcasters in advanced post and music production environments. It includes all of the plugins from the Standard Bundle and adds the sophisticated ERA De-Esser Pro plugin. In addition to the large main knob, ERA De-Esser Pro offers extra controls for greater granularity and fine-tuning when fixing an especially rough recording.

The Accusonus ERA Bundle is fully supported in Avid Pro Tools 12.6 (or higher), Audacity 2.2.2, Apple Logic Pro 10.4.3 (or higher), Ableton Live 9 (or higher), Cockos Reaper v5.9, Image Line FL Studio 12, Presonus Studio One 3 (or higher), Steinberg Cubase 8 (or higher), Adobe Audition CC 2017 (or higher) and Apple GarageBand 10.3.2.

The ERA Bundle supports Adobe Premiere CC 2017 (or higher), Apple Final Cut Pro X 10.4 (or higher), Blackmagic DaVinci Resolve 14 (or higher), Avid Media Composer 2018.12 and Magix Vegas Pro 15 (or higher).

The ERA 4 Standard Bundle is available at a special introductory price of $119 until July 31. After that, the price will be $149. The ERA 4 Pro Bundle is available at a special introductory price of $349 until July 31. After that, the price will be $499.

Assimilate Scratch 9.1: productivity updates, updated VFX workflow

Assimilate’s Scratch 9.1 dailies and finishing software now includes extensive new performance and productivity features, including integration with Foundry Nuke and Adobe After Effects. It’s available now.

“A primary goal for us is to quickly respond to the needs of DITs and post artists, whether it’s for more advanced features, new format support, or realtime bug-fixes,” said Mazze Aderhold, Scratch product manager at Assimilate. “Every feature introduced in Scratch 9.1 is based on feedback we received from our users before and during the beta cycle.”

The software now features native touch controls for grading by clicking and dragging directly on the image. Thanks to this intuitive way to color and manipulate images, an artist can grade the overall image or even control curves and secondaries — all without a panel and directly where the cursor is dragging.

There is also a redesigned color management system, enabling deep control over how camera-specific gamut and gamma spaces are handled and converted. Additionally, there is a new color-space conversion plugin (any color space to any other) that can be applied at any stage of the color/mastering process.

Also new is integration with After Effects and Nuke. Within Scratch, users can now seamlessly send shots to and from Nuke and After Effects, including transparencies and alphas. This opens up Scratch to high-end tracking, compositing, 3D models, advanced stabilization, motion graphics and more.

Within the VFX pipeline, Scratch can act as a central hub for all finishing needs. It provides realtime tools for any format, data management, playback and all color management in a timeline with audio, including to and from After Effects and Nuke.

Other new features include:

• Integration with Avid, including all metadata in the Avid MXF. Additionally, Scratch includes all the source-shot metadata, such as the genuine Sound TC in Avid MXF, which is important later on in post for something like a Pro Tools roundtrip
• Per-frame metadata on ARRIRAW files, allowing camera departments to pass through camera roll and tilt, lens focus distance metadata items, and more. Editorial and VFX teams can benefit from per-frame info later in the post process.
• Faster playback and rendering
• Realtime, full-res Red 8K DeBayer on GPU
• A deep set of options to load media, including sizing options, LUTs and automatic audio-sync, speeding up the organizational process when dealing with large amounts of disparate media
• A LUT cycler that allows for quick preview and testing of large numbers of looks on footage
• Preset outputs for Pix, Dax, MediaSilo and Copra, simplifying the delivery of industry-standard web dailies
• Vector tool for advanced color remapping using a color grid
• Automatic installation of free Matchbox Shaders, opening Scratch up to a wealth of realtime VFX effects, including glows, lens effects, grain add/remove, as well as more advanced creative FX
• Built-in highlight glow, diffusion, de-noise and time-warp FX
• Added support for AJA’s Io 4K Plus and Kona 5 SDI output devices using the latest SDKs.
• Support for Apple’s new ProRes RAW compressed-acquisition format and Blackmagic RAW support on both OS X and Windows

Scratch 9.1 starts at $89 monthly and $695 annually.

Conductor boosts its cloud rendering with Amazon EC2

Conductor Technologies’ cloud rendering platform will now support Amazon Web Services (AWS) and Amazon Elastic Compute Cloud (Amazon EC2), bringing the virtual compute resources of AWS to Conductor customers. This new capability will provide content production studios working in visual effects, animation and immersive media access to new, secure, powerful resources that will allow them — according to the company — to quickly and economically scale render capacity. Amazon EC2 instances, including cost-effective Spot Instances, are expected to be available via Conductor this summer.

“Our goal has always been to ensure that Conductor users can easily access reliable, secure instances on a massive scale. AWS has the largest and most geographically diverse compute, and the AWS Thinkbox team, which is highly experienced in all facets of high-volume rendering, is dedicated to M&E content production, so working with them was a natural fit,” says Conductor CEO Mac Moore. “We’ve already been running hundreds of thousands of simultaneous cores through Conductor, and with AWS as our preferred cloud provider, I expect we’ll be over the million simultaneous core mark in no time.”

Simple to deploy and highly scalable, Conductor is equally effective as an off-the-shelf solution or customized to a studio’s needs through its API. Conductor’s intuitive UI and accessible analytics provide a wealth of insightful data for keeping studio budgets on track. Apps supported by Conductor include Autodesk Maya and Arnold; Foundry’s Nuke, Cara VR, Katana, Modo and Ocula; Chaos Group’s V-Ray; Pixar’s RenderMan; Isotropix’s Clarisse; Golaem; Ephere’s Ornatrix; Yeti; and Miarmy. Additional software and plug-in support are in progress, and may be available upon request.

Some background on Conductor: it’s a secure cloud-based platform that enables VFX, VR/AR and animation studios to seamlessly offload rendering and simulation workloads to the public cloud. As the only rendering service that is scalable to meet the exact needs of even the largest studios, Conductor easily integrates into existing workflows, features an open architecture for customization, provides data insights and can implement controls over usage to ensure budgets and timelines stay on track.

Andy Greenberg on One Union Recording’s fire and rebuild

San Francisco’s One Union Recording Studios has been serving the sound needs of ad agencies, game companies, TV and film producers, and corporate media departments in the Bay Area and beyond for nearly 25 years.

In the summer of 2017, the facility was hit by a terrible fire that affected all six of its recording studios. The company, led by president John McGleenan, immediately began an ambitious rebuilding effort, which it completed earlier this year. One Union Recording is now back up to full operation and its five recording studios, outfitted with the latest sound technologies including Dolby Atmos capability, are better than ever.

Andy Greenberg is One Union Recording’s facility engineer and senior mix engineer; he works alongside engineers Joaby Deal, Eben Carr, Matt Wood and Isaac Olsen. We recently spoke with Greenberg about the company’s rebuild and plans for the future.

Rebuilding the facility after the fire must have been an enormous task.
You’re not kidding. I’ve worked at One Union for 22 years, and I’ve been through every growth phase and upgrade. I was very proud of the technology we had in place in 2017. We had six rooms, all cutting-edge. The software was fully up to date. We had few if any technical problems and zero downtime. So, when the fire hit, we were devastated. But John took a very business-oriented approach to it, and within a few days he was formulating a plan. He took it as an opportunity to implement new technology, like Dolby Atmos, and to grow. He turned sadness into enthusiasm.

How did the facility change?
Ironically, the timing was good. A lot of new technology had just come out that I was very excited about. We were able to consolidate what were large systems into smaller units while increasing quality 10-fold. We moved leaps and bounds beyond where we had been.

Prior to the fire, we were running Avid Pro Tools 12.1. Now we’re on Pro Tools Ultimate. We had just purchased four Avid/Euphonix System 5 digital audio consoles with extra DSP in March of 2017 but had not had time to install them before the fire due to bookings. These new consoles are super powerful. Our number of inputs and outputs quadrupled. The routing power and the bus power are vastly improved. It’s phenomenal.

We also installed Avid MTRX, an expandable interface designed in Denmark and very popular now, especially for Atmos. The box feels right at home with the Avid S5 because it’s MADI and takes the physical outputs of our Pro Tools systems up to 64 or 128 channels.

That’s a substantial increase.
A lot of delivered projects use from two to six channels. Complex projects might go to 20. Being able to go far beyond that increases the power and flexibility of the studio tremendously. And then, of course, our new Atmos room requires that kind of channel count to work in immersive surround sound.

What do you do for data storage?
Even before the fire, we had moved to a shared storage network solution. We had a very strong infrastructure and workflow in terms of data storage, archiving and the ability to recall sessions. Our new infrastructure includes 40TB of active storage of client data. Forty terabytes is not much for video, but for audio, it’s a lot. We also have 90TB of instantly recallable data.

We have client data archived back 25 years, and we can have anything online in any room in just a few minutes. It’s literally drag and drop. We pride ourselves on maintaining triple redundancy in backups. Even during the fire, we didn’t lose any client data because it was all backed up on tape and off site. We take backup and data security very seriously. Backups happen automatically every day; actually, every three hours.

What are some of the other technical features of the rebuilt studios?
There’s actually a lot. For example, our rooms — including the two Dolby-certified Atmos rooms — have new Genelec SAM studio monitors. They are “smart” speakers that are self-tuning. We can run some test tones and in five minutes the rooms are perfectly tuned. We have custom tunings set up for 5.1 and Atmos. We can adjust the tuning via computer, and the speakers have built-in DSP, so we don’t have to rely on external systems.

Another cool technology that we are using is Dante, which is part of the Avid MTRX interface. Dante is basically audio-over-IP or audio-over-Cat6. It essentially replaced our AES router. We were one of the first facilities in San Francisco to have a full audio AES router, and it was very strong for us at the time. It was a 64×64 stereo-paired AES router. It has been replaced by the MTRX interface box that has, believe it or not, a three-inch by two-inch card that handles 64×64 routing per room. So, each room’s routing capability went up exponentially by 64.

We use Dante to route secondary audio, like our ISDN and web-based IP communication devices. We can route signals from room to room and over the web securely. It’s seamless, and it comes up literally into your computer. It’s amazing technology. The other day, I did a music session and used a 96K sample rate, which is very high. The quality of the headphone mix was astounding. Everyone was happy and it took just one, quick setting and we were off and running. The sound is fantastic and there is no noise and no latency problems. It’s super-clean, super-fast and easy to use.

What about video monitoring?
We have 4K monitors and 4K projection in all the rooms via Sony XBR 55A1E Bravia OLED monitors, Sony VPL-VW885ES True 4K Laser Projectors and a DLP 4K550 projector. Our clients appreciate the high-quality images and the huge projection screens.

Technicolor opens prepro studio in LA

Technicolor is opening a new studio in Los Angeles dedicated to creating a seamless pipeline for feature projects — from concept art and visualization through virtual production, production and into final VFX.

As new distribution models increase the demand for content, Technicolor Pre-Production will provide the tools, the talent and the space for creatives to collaborate from day one of their project – from helping set the vision at the start of a job to ensuring that the vision carries through to production and VFX. The result is a more efficient filmmaking process.

Technicolor Pre-Production studio is headed by Kerry Shea, an industry veteran with over 20 years of experience. She is no stranger to this work, having held executive positions at Method Studios, The Third Floor, Digital Domain, The Jim Henson Company, DreamWorks Animation and Sony Pictures Imageworks.

Kerry Shea

Credited on more than 60 feature films including The Jungle Book, Pirates of the Caribbean: Dead Men Tell No Tales and Guardians of the Galaxy Vol. 2, Shea has an extensive background in VFX and post production, as well as live action, animatronics and creature effects.

While the Pre-Production studio stands apart from Technicolor’s visual effects studios — MPC Film, Mill Film, MR. X and Technicolor VFX — it can work seamlessly in conjunction with one or any combination of them.

The Technicolor Pre-Production Studio will comprise five key departments:
– The Business Development Department will work with clients, from project budgeting to consulting on VFX workflows, to help plan and prepare projects for a smooth transition into VFX.
– The VFX Supervisors Department will offer creative supervision across all aspects of VFX on client projects, whether delivered by Technicolor’s studios or third-party vendors.
– The Art Department will work with clients to understand their vision – including characters, props, technologies, and environments – creating artwork that delivers on that vision and sets the tone for the rest of the project.
– The Virtual Production Department will partner with filmmakers to bridge the gap between them and VFX through the production pipeline. Working on the ground and on location, the department will deliver a fully integrated pipeline and shooting services with the flexibility of a small, manageable team — allowing critical players in the filmmaking process to collaborate, view and manipulate media assets and scenes across multiple locations as the production process unfolds.
– The Visualization Department will deliver visualizations that will assist in achieving on screen exactly what clients envisioned.

“With the advancements of tools and technologies, such as virtual production, filmmaking has reached an inflection point, one in which storytellers can redefine what is possible on-set and beyond,” says Shea. “I am passionate about the increasing role and influence that the tools and craft of visual effects can have on the production pipeline and the even more important role in creating more streamlined and efficient workflows that create memorable stories.”

Editing Roundtable

By Randi Altman

The world of the editor has changed over the years as a result of new technology, the types of projects they are being asked to cut (looking at you, social media) and the various deliverables they must create. Are deadlines still getting tighter and budgets still getting smaller? The answer is yes, but some editors are adapting to the trends, and companies that make products for editors are helping by making the tools more flexible and efficient so pros can get to where they need to be.

We posed questions to various editors working in TV, short form and indies, who do a variety of jobs, as well as to those making the tools they use on a daily basis. Enjoy.

Cut+Run Editor/Partner Pete Koob

What trends do you see in commercial editing? Good or bad?
I remember 10 years ago a “colleague,” who was an interactive producer at the time, told me rather haughtily that I’d be out of work in a few years when all advertising became interactive and lived online. Nothing could have been further from the truth, of course, and I think editors everywhere have found that the viewer migration from TV to online has yielded an even greater need for content.

The 30-second spot still exists, both online and on TV, but the opportunities for brands to tell more in-depth stories across a wide range of media platforms mean that there’s a much more diverse breadth of work for editors, both in terms of format and style.

For better or worse, we’ve also seen every human being with a phone become their own personal brand manager with a highly cultivated and highly saturated digital presence. I think this development has had a big impact on the types of stories we’re telling in advertising and how we’re telling them. The genre of “docu-style” editing is evolving in a very exciting way as more and more companies are looking to find real people whose personal journeys embody their brands. Some of the most impressive editorial work I see these days is a fusion of styles — music video, fashion, documentary — all being brought to bear on telling these real stories, but doing it in a way that elevates them above the noise of the daily social media feed.

Selecting the subjects in a way that feels authentic — and not just like a brand co-opting someone’s personal struggle — is essential, but when done well, there are some incredibly inspirational and emotional stories to be told. And as a father of a young girl, it’s been great to show my daughter all the empowering stories of women being told right now, especially when they’re done with such a fresh and exciting visual language.

What is it about commercial editing that attracted you and keeps attracting you?
Probably the thing that keeps me most engaged with commercial editing is the variety and volume of projects throughout the year. Cutting commercials means you’re on to the next one before you’ve really finished the last.

The work feels fresh when I’m constantly collaborating with different people every few weeks on a diverse range of projects. Even if I’m cutting with the same directors, agencies or clients, the cast of characters always rotates to some degree, and that keeps me on my toes. Every project has its own unique challenges, and that compels me to constantly find new ways to tell stories. It’s hard for me to get bored with my work when the work is always changing.

Conoco’s Picnic spot

Can you talk about challenges specific to short-form editing?
I think the most obvious challenge for the commercial editor is time. Being able to tell a story efficiently and poignantly in a 60-, 30-, 15- or even six-second window reveals the spot editor’s unique talent. Sometimes that time limit can be a blessing, but more often than not, the idea on the page warrants a bigger canvas than the few seconds allotted.

It’s always satisfying to feel as if I’ve found an elegant editorial solution to telling the story in a concise manner, even if that means re-imagining the concept slightly. It’s a true testament to the power of editing and one that is specific to editing commercials.

How have social media campaigns changed the way you edit, if at all?
Social media hasn’t changed the way I edit, but it has certainly changed my involvement in the campaign as a whole. At its worst, the social media component is an afterthought, where editors are asked to just slap together a quick six-second cutdown or reformat a spot to fit into a square framing for Instagram. At its best, the editor is brought into the brainstorming process and has a hand in determining how the footage can be used inventively to disperse the creative into different media slots. One of the biggest assets of an editor on any project is his or her knowledge of the material, and being able to leverage that knowledge to shape the campaign across all platforms is incredibly rewarding.

Phillips 76 “Jean and Gene”

What system do you edit on, and what else other than editing are you asked to supply?
We edit primarily on Avid Media Composer. I still believe that nothing else can compete when it comes to project sharing, and as a company it allows for the smoothest means of collaboration between offices around the world. That being said, clients continue to expect more and more polish from the offline process, and we are always pushing our capabilities in motion graphics and visual effects in After Effects and color finessing in Blackmagic DaVinci Resolve.

What projects have you worked on recently?
I’ve been working on some bigger campaigns that consist of a larger number of spots. Two campaigns that come to mind are a seven-spot TV campaign for Phillips 76 gas stations and 13 short online films for Subaru. It’s fun to step back and look at how they all fit together, and sometimes you make different decisions about an individual spot based on how it sits in the larger group.

The “Jean and Gene” spots for 76 were particularly fun because it’s the same two characters who you follow across several stories, and it almost feels like a mini TV series exploring their life.

Earlier in the year I worked on a Conoco campaign, featuring the spots Picnic, First Contact and River, via Carmichael Lynch.

Red Digital Cinema Post and Workflow Specialist Dan Duran

How do you see the line between production and post blurring?
Both post and on-set production are evolving with each other. There has always been a fine line between them, but as technology grows and becomes more affordable, you’re seeing tools that previously would have been used only in post bleed onto the set.

One of my favorite trends is seeing color-managed workflows on location. With full color-control pipelines and calibrated SDR and HDR monitors, crews get a much more accurate representation of what the final image will look like. I’ve also seen growth in virtual productions, where you’re able to see realtime CGI and environments on set directly through camera while shooting.

What are the biggest trends you’ve been facing in product development?
Everyone is always looking for the highest image quality at the best price point. As sensor technology advances, we’re seeing users ask for more and more out of the camera. Higher sensitivity, faster frame rates, more dynamic range and a digital RAW that allows them to effortlessly shape the images into a very specific creative look that they’re trying to achieve for their show. 8K provides a huge canvas to work with, offering flexibility in what they are trying to capture.

Smaller cameras can easily adapt to a whole new array of support accessories to achieve shots in ways that weren’t always possible. Along with the camera/sensor revolution, Red has seen a lot of new cinema lenses emerge, each adding its own character to the image as it hits the photosites.

What trends do you see from editors these days? What enables their success?
I’ve seen post production really take advantage of modern tech to improve and innovate new workflows. Being able to view higher resolutions, process footage faster and play it back off a laptop shows how far hardware has come.

We have been working more with partners to help give pros the post tools they need to be more efficient. As an example, Red recently teamed up with Nvidia to not only get realtime full resolution 8K playback on laptops, but also allow for accelerated renders and transcode times much faster than before. Companies collaborating to take advantage of new tech will enable creative success.

AlphaDogs Owner/Editor Terence Curren

What trends do you see in editing? Good or bad?
There is a lot of content being created across a wide range of outlets and formats, from theatrical blockbusters and high-end TV shows all the way down to one-minute videos for Instagram. That’s positive for people desiring to use their editing skills to do a lot of storytelling. The flip side is that with so much content being created, the dollars to pay editors get stretched much thinner. Barring high-end content creation, the overall pay rates for editors have been going down.

The cost of content capture is a tiny fraction of what it was back in the film days. The good part of that is there is a greater likelihood that the shot you need was actually captured. The downside is that without the extreme expense of shooting associated with film, we’ve lost the disciplines of rehearsing scenes thoroughly, only shooting while the scene is being performed, only printing circled takes, etc. That, combined with reduced post schedules, means for the most part editors just don’t have the time to screen all the footage captured.

The commoditization of the toolsets (some editing systems are actually free), combined with the plethora of training materials readily available on the internet and in most schools, means that video storytelling is now a skill available to everyone. This means that the next great editors won’t be faced with the barriers to entry that past generations experienced, but it also means that there’s a much larger field of editors to choose from. The rules of supply and demand tell us that increased availability and competition of a service reduces its cost. Traditionally, many editors have been able to make upper-middle-class livings in our industry, and I don’t see as much of that going forward.

To sum it up, it’s a great time to become an editor, as there’s plenty of work and therefore lots of opportunity. But along with that, the days of making a higher-end living as an editor are waning.

What is it about editing that attracted you and keeps attracting you?
I am a storyteller at heart. The editor, in my opinion, shares responsibility with the writer and director for the structural part of telling the story. The writer has to invent the actual story out of whole cloth. The director has to play traffic cop with a cornucopia of moving pieces under a very tight schedule, all while trying to maintain the vision and capture the pieces of the story necessary to deliver the final product. The editor takes all those pieces and gives the story its final rewrite for the audience to, hopefully, enjoy.

Night Walk

As with writing, there are plenty of rules to guide an editor through the process. Those rules, combined with experience, make the basic job almost mechanical much of the time. But there is a magic thing that happens when the muse strikes and I am inspired to piece shots together in some way that just perfectly speaks to the audience. Being such an important part of the storytelling process is uniquely rewarding for a storyteller like me.

Can you talk about challenges specific to short-form editing versus long-form?
Long-form editing is a test of your ability to maintain a fresh perspective of your story to keep the pacing correct. If you’ve been editing a project for weeks or months at a time, you know the story and all the pieces inside out. That can make it difficult to realize you might be giving too much information or not enough to the audience. Probably the most important skill for long form is the ability to watch a cut you’ve been working on for a long time and see it as a first-time viewer. I don’t know how others handle it, but for me there is a mental process that just blanks out the past when I want to take a critical fresh viewing.

Short form brings the challenge of being ruthless. You need to eliminate every frame of unnecessary material without sacrificing the message. While the editors don’t need to keep their focus for weeks or months, they have the challenge of getting as much information into that short time as possible without overwhelming the audience. It’s a lot like sprinting versus running a marathon. It exercises a different creative muscle that also enjoys an immediate reward.

Lafayette Escadrille

I can’t say I prefer either one over the other, but I would be bored if I didn’t get to do both over time, as they bring different disciplines and rewards.

How have social media campaigns changed the way you edit, if at all? Can you talk about the variety of deliverables and how that affects things?
Well, there is the horrible vertical framing trend, but that appears to be waning, thankfully. Seriously, though, the Instagram “one minute” limit forces us all to become commercial editors. Trying to tell the story in as short a timeframe as possible, knowing it will probably be viewed on a phone in a bright and noisy environment, is a new challenge for seasoned editors.

There is a big difference between having a captive audience in a theater or at home in front of the TV and having a scattered audience whose attention you are trying to hold exclusively amid all the distractions. This seems to require more overt attention-grabbing tricks, and it’s unfortunate that storytelling has come to this point.

As for deliverables, they are constantly evolving, which means each project can bring all new requirements. We really have to work backward from the deliverables now. In other words, one of our first questions now is, “Where is this going?” That way we can plan the appropriate workflows from the start.

What system do you edit on and what else other than editing are you asked to supply?
I primarily edit on Media Composer, as it’s the industry standard in my world. As an editor, I can learn to use any tool; I have cut with Premiere and FCP. Knowing where to make the edit is far more important than knowing how to make the edit.

When I started editing in the film days, we just cut picture and dialogue. There were other editors for sound beyond the basic location-recorded sound. There were labs from which you ordered something as simple as a dissolve or a fade to black. There were color timers at the film lab who handled the look of the film. There were negative cutters that conformed the final master. There were VFX houses that handled anything that wasn’t actually shot.

Now, every editor has all the tools at hand to do all those tasks themselves. While this is helpful in keeping costs down and not slowing the process, it requires the editor to be a jack-of-all-trades. However, what typically follows that term is “and master of none.”

Night Walk

One of the main advantages of separate people handling different parts of the process is that they could become really good at their particular art. Experience is the best teacher, and you learn more doing the same thing every day than occasionally doing it. I’ve met a few editors over the years that truly are masters in multiple skills, but they are few and far between.

Using myself as an example, if the client wants some creatively designed show open, I am not the best person for that. Can I create something? Yes. Can I use After Effects? Yes, to a minor degree. Am I the best person for that job? No. It is not what I have trained myself to do over my career. There is a different skill set involved in deciding where to make a cut versus how to create a heavily layered, graphically designed show open. If that is what I had dedicated my career to doing, then I would probably be really good at it, but I wouldn’t be as good at knowing where to make the edit.

What projects have gone through the studio recently?
We work on a lot of projects at AlphaDogs. The bulk of our work is on modest-budget features, documentaries and unscripted TV shows. Recent examples include a documentary on World War I fighter pilots called The Lafayette Escadrille and an action-thriller called Night Walk, starring Eric Roberts and Mickey Rourke.

Unfortunately for me, I have become so focused on running the company that I haven’t personally been working on the creative side as much as I would like. While keeping a post house running in the current business climate is its own challenge, I don’t find it as rewarding as “being in the chair.”

That feeling is offset by looking back at all the careers I have helped launch through our internship program and by offering entry-level employment. I’ve also tried hard to help editors over the years through venues like online user groups and, of course, our own Editors’ Lounge events and videos. So I guess that even running a post house can be rewarding in its own way.

Luma Touch Co-Founder/Lead Designer Terri Morgan

Have there been any talks among NLE providers about an open timeline, that is, being able to go between Avid, Resolve or Adobe with one file like an AAF or XML?
Because every edit system uses its own editing paradigms (think Premiere versus FCP X), creating an open exchange is challenging. However, there is an interesting effort by Pixar (https://github.com/PixarAnimationStudios/OpenTimelineIO) that includes adapters for the wide range of structural differences of some editors. There are also efforts for standards in effects and color correction. The core editing functionality in LumaFusion is built to allow easy conversion in and out to different formats, so adapting to new standards will not be challenging in most cases.

With AI becoming a popular idea and term, at what point does it stop? Is there a line where AI won’t go?
Looking at AI strictly as it relates to video editing, we can see that its power is incrementally increasing, and automatically generated movies are getting better. But while a neural network might be able to put together a coherent story, and even mimic a series of edits to match a professional style, it will still be cookie-cutter in nature, rather than being an artistic individual endeavor.

What we understand from our customers — and from our own experience — is that people get profound joy from being the storyteller or the moviemaker. And we understand that automatic editing does not provide the creative/ownership satisfaction that you get from crafting your own movie. You only have to make one automatic movie to learn this fact.

It is also clear that movie viewers feel a lack of connection or even annoyance when watching an automatically generated movie. You get the same feeling when you pay for parking at an automated machine, and the machine says, “Thank you, have a nice day.”

Here is a question from one of our readers: There are many advancements in technology coming in NLEs. Are those updates coming too fast and at an undesirable cost?
It is a constant challenge to maintain quality while improving a product. We use software practices like Agile, engage in usability tests and employ testing as robust as possible to minimize the effects of any changes in LumaFusion.

In the case of LumaFusion, we are consistently adding new features that support more powerful mobile video editing and features that support the growing and changing world around us. In fact, if we stopped developing so rapidly, the app would simply stop working with the latest operating system or wouldn’t be able to deliver solutions for the latest trends and workflows.

To put it all in perspective, I like to remind myself of the amount of effort it took to edit video 20 years ago compared to how much more efficient and fun it is to edit a video now. It gives me reason to forgive the constant changes in technology and software, and reason to embrace new workflows and methodologies.

Will we ever be at a point where an offline/online workflow will be completely gone?
Years ago, the difference in image quality provided a clear separation between offline and online. But today, online is differentiated by the ability to edit with dozens of tracks, specialized workflows, specific codecs, high-end effects and color. Even more importantly, online editing typically uses the specialized skills that a professional editor brings to a project.

Since you can now edit a complex timeline with six tracks of 4K video with audio and another six tracks of audio, basic color correction and multilayered titles straight from an iPad, for many projects you might find it unnecessary to move to an online situation. But there will always be times that you need more advanced features or the skills of a professional editor. Since not everybody wants to understand the complex world of post production, it is our challenge at Luma Touch to make more of these high-end features available without greatly limiting who can successfully use the product.

What are the trends you’re seeing in customer base from high-end post facility vs. independent editor/contractor?
High-end post facilities tend to have stationary workstations that employ skilled editor/operators. The professionals that find LumaFusion to be a valuable tool in their bag are often those who are responsible for the entire production and post production, including independent producers, journalists and high-end professionals who want the flexibility of starting to edit while on location or while traveling.

What are the biggest trends you’ve been seeing in product development?
In general, it’s a move away from lengthy periods of development without user feedback. Getting feedback from users early and often is an Agile-based practice that really makes a difference in product development and greatly increases the joy that our team gets from developing LumaFusion. There’s nothing more satisfying than talking to real users and responding to their needs.

New development tools, languages and technologies are always welcome. At WWDC this year, Apple announced it would make it easier for third-party developers to port their iOS apps over to the desktop with Project Catalyst. This will likely be a viable option for LumaFusion.

You come from a high-end editing background, with deep experience editing at the workstation level. When you decided to branch off and do something on your own, why did you choose mobile?
Mobile offered a solution to some of the longest-running wishes in professional video editing: to be liberated from the confines of an edit suite, to be able to start editing on location, to have a closer relationship to the production of the story in order to avoid the “fix it in post” mentality, and to take your editing suite with you anywhere.

It was only after starting to develop for mobile that we fully understood one of the most appealing benefits. Editing on an iPad or iPhone encourages experimentation, not only because you have your system with you when you have a good idea, but also because you experience a more direct relationship to your media when using the touch interface; it feels more natural and immersive. And experimentation equals creativity. From my own experience I know that the more you edit, the better you get at it. These are benefits that everyone can enjoy whether they are a professional or a novice.

Hecho Studios Editor Grant Lewis

What trends do you see in commercial editing? Good or bad.
Commercials are trending away from traditional, large-budget cinematic pieces to smaller, faster, budget-conscious ones. You’re starting to see it now more and more as big brands shy away from big commercial spectacles and pivot toward a more direct reflection of the culture itself.

Last year’s #CODNation work for the latest installment of the Call of Duty franchise exemplifies this by forgoing a traditional live-action cinematic trailer in favor of a larger number of game-capture, meme-like films. This pivot away from more dialogue-driven narrative structures is changing what we think of as a commercial. For better or worse, I see commercial editing leaning more into the fast-paced, campy nature of meme culture.

What is it about commercial editing that attracted you and keeps attracting you?
What excites me most about commercial editing is that it runs the gamut of the editorial genre. Sometimes commercials are a music video; sometimes they are dramatic anthems; other times they are simple comedy sketches. Commercials have the flexibility to exist as a multitude of narrative genres, and that’s what keeps me attracted to commercial editing.

Can you talk about challenges specific to short form versus long form?
The most challenging thing about short-form editing is finding time for breath. In a 30-second piece, where do you find a moment of pause? There’s always so much information being packed into smaller timeframes; the real challenge is editing at a sprint, but still having it feel dynamic and articulate.

How have social media campaigns changed the way you edit, if at all? Can you talk about the variety of deliverables and how that affects things?
All campaigns will either live on social media or have specific social components now. I think the biggest thing that has changed is being tasked with telling a compelling narrative in 10 or even five or six seconds. Now, the 60-second and 90-second anthem film has to be able to work in six seconds as well. It is challenging to boil concepts down to just a few seconds and still maintain a sense of story.

#CODNation

The multitude of deliverable aspect ratios editors are asked to produce is also a growing challenge. Unless a campaign is strictly shot for social, the DP probably shot for a traditional 16×9 framing. That means the editor is tasked with reframing all social content to work in all the different deliverable formats. This makes the editor act almost as the DP for social in the post process. Shorter deliverables and a multitude of aspect ratios have become another layer of editing and demand a whole new editorial lens through which to view and process the project.
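The reframing arithmetic behind those deliverables is simple enough to sketch. Below is a hypothetical, minimal Python example (the function name and frame sizes are illustrative, not from the article) that computes the largest centered crop of a 16×9 master for a given target aspect ratio:

```python
def center_crop(src_w, src_h, ar_w, ar_h):
    """Largest centered crop of a src_w x src_h frame matching ar_w:ar_h.

    Returns (crop_w, crop_h, x_offset, y_offset) in pixels.
    """
    if src_w * ar_h > src_h * ar_w:
        # Source is wider than the target aspect: keep full height, trim sides.
        crop_h = src_h
        crop_w = src_h * ar_w // ar_h
    else:
        # Source is taller than (or equal to) the target: keep full width.
        crop_w = src_w
        crop_h = src_w * ar_h // ar_w
    x = (src_w - crop_w) // 2
    y = (src_h - crop_h) // 2
    return crop_w, crop_h, x, y

# A UHD 16x9 master reframed for 9x16 vertical and 1x1 square delivery:
print(center_crop(3840, 2160, 9, 16))  # (1215, 2160, 1312, 0)
print(center_crop(3840, 2160, 1, 1))   # (2160, 2160, 840, 0)
```

In practice an editor would reposition the crop shot by shot rather than always centering, but the math behind each deliverable format is the same, which is also why shooting and working in 4K leaves so much more room to reframe.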

What system do you edit on and what else other than editing are you asked to supply?
I currently cut in Adobe Premiere Pro. I’m often asked to supply graphics and motion graphics elements for offline cuts as well. That means being comfortable with the whole Adobe suite of tools, including Photoshop and After Effects. From typesetting to motion tracking, editors are now asked to be well-versed in all tangential aspects of editorial.

What projects have you worked on recently?
I cut the launch film for Razer’s new Respawn energy drink. I also cut Toms Shoes’ most recent campaign, “Stand For Tomorrow.”

EditShare Head of Marketing Lee Griffin

What are the biggest trends you’ve been seeing in product development?
We see the need to produce more video content — and produce it faster than ever before — for social media channels. This means producing video in non-broadcast standards/formats and, more specifically, producing square video. To accommodate, editing tools need to offer user-defined options for manipulating size and aspect ratio.

What changes have you seen in terms of the way editors work and use your tools?
There are two distinct changes: One, productions are working with editors regardless of their location. Two, there is a wider level of participation in the content creation process.

In the past, the editor was physically located at the facility and was responsible for assembling, editing and finishing projects. However, with the growing demand for content production, directors and producers need options to tap into a much larger pool of talent, regardless of their location.

EditShare AirFlow and Flow Story enable editors to work remotely from any location. So today, we frequently see editors who use our Flow editorial tools working in different states and even on different continents.

With AI becoming a popular idea and term, at what point does it stop?
I think AI is quite exciting for the industry, and we do see its potential to significantly advance productions. However, AI is still in its infancy with regard to the content creation market. So from our point of view, the road to AI and its limits are yet to be defined. But we do have our own roadmap strategy for AI and will showcase some offerings integrated within our collaborative solutions at IBC 2019.

Will we ever be at a point where an offline/online workflow will be completely gone?
It depends on the production. Offline/online workflows are here to stay in the higher-end production environment. However, for fast turnaround productions, such as news, sports and programs (for example, soap operas and reality TV), there is no need for offline/online workflows.

What are the trends you’re seeing in your customer base, from high-end post facilities vs. independent editors, and how is that informing your decisions on products and pricing?
With the increase in the number of productions thanks to OTTs, high-end post facilities are tapping into independent editors more and more to manage the workload. Often the independent editor is remote, requiring the facility to have a media management foundation that can facilitate collaboration beyond the facility walls.

So we are seeing a fundamental shift in how facilities are structuring their media operations to support remote collaborations. The ability to expand and contract — with the same level of security they have within the facility — is paramount in architecting their “next-generation” infrastructure.

What do you see as untapped potential customer bases that didn’t exist 10 to 20 years ago, and how do you plan on attracting and nurturing them? What new markets are you seeing?
We are seeing major growth beyond the borders of the media and entertainment industry in many markets. From banks to real estate agencies to insurance companies, video has become one of the main ways for them to communicate to their media-savvy clientele.

While EditShare solutions were initially designed to support traditional broadcast deliverables, we have evolved them to accommodate these new customers. And today, these customers want simplicity coupled with speed. Our development methodology puts this at the forefront of our core products.

Puget Systems Senior Labs Technician Matt Bach

Have there been any talks between NLE providers about an open timeline, essentially being able to go between Avid, Resolve or Adobe with one file like an AAF or XML?
I have not heard anything on this topic from any developers, so keep in mind that this is pure conjecture, but the pessimistic side of me doesn’t see an “open timeline” being something that will happen anytime soon.

If you look at what many of the NLE developers are doing, they are moving more and more toward a pipeline that is completely contained within their ecosystem. Adobe has been pushing Dynamic Link in recent years in order to make it easier to move between Premiere Pro and After Effects. Blackmagic is going even a step further by integrating editing, color, VFX and audio all within DaVinci Resolve.

These examples are both great advancements that can really improve your workflow efficiency, but they are being done in order to keep the user within their specific ecosystem. As great as an open timeline would be, it seems to be counter to what Adobe, Blackmagic, and others are actively pursuing. We can still hold out hope, however!

With AI becoming a popular idea and term, at what point does it stop?
There are definitely limitations to what AI is capable of, but that line is moving year by year. For the foreseeable future, AI is going to take on a lot of the tedious tasks like tagging of footage, content-aware fill, shot matching, image enhancement and other similar tasks. These are all perfect use cases for artificial intelligence, and many (like content-aware fill) are already being implemented in the software we have available right now.

The creative side is where AI is going to take the longest time to become useful. I’m not sure if there is a point where AI will stop from a technical standpoint, but I personally believe that even if AI were perfect, there is value in the fact that an actual person made something. That may mean that the masses of videos that get published will be made by AI (or perhaps simply AI-assisted), but just like furniture, food or even workstations, there will always be a market for high-quality items crafted by human hands.

I think the main thing to keep in mind with AI is that it is just a tool. Moving from black and white to color, or from film to digital, was something that, at the time, people thought was going to destroy the industry. In reality, however, they ended up being a huge boon. Yes, AI will change how some jobs are approached — and may even eliminate some job roles entirely — but in the end, a computer is never going to be as creative and inventive as a real person.

There are many advancements in technology coming to NLEs seemingly daily. Are those updates coming too fast and at an undesirable cost?
I agree that this is a problem right now, but it isn’t limited to just NLEs. We see the same thing all the time in other industries, and it even occurs on the hardware side, where a new product will be launched simply because the company could, not because there is an actual need for it.

The best thing you can do as an end-user is to provide feedback to the companies about what you actually want. Don’t just sit on those bugs, report them! Want a feature? Most companies have a feature request forum that you can post on.

In the end, these companies are doing what they believe will bring them the most users. If they think a flashy new feature will do it, that is what they will spend money on. But if they see a demand for less flashy, but more useful, improvements, they will make that a priority.

Will we ever be at a point where an offline/online workflow will be completely gone?
Unless we hit some point where camera technology stops advancing, I don’t think offline editing is ever going to fully go away. It is amazing what modern workstations can handle from a pure processing standpoint, but even if the systems themselves could handle online editing, you also need to have the storage infrastructure that can keep up. With the move from HD to 4K, and now to 8K, that is a lot of moving parts that need to come together in order to eliminate offline editing entirely.

With that said, I do feel like offline editing is going to be used less and less. We are starting to hit the point where people feel their footage is higher quality than they need without having to be on the bleeding edge. We can edit 4K ProRes or even Red RAW footage pretty easily with the technology that is currently available, and for most people that is more than enough for the foreseeable future.

What are the trends you’re seeing in customer base from high-end post facility vs. independent editor, and how is that informing your decisions on products and pricing?
From a workstation side, there really is not too much of a difference beyond the fact that high-end post facilities tend to have larger budgets that allow them to get higher-end machines. Technology is becoming so accessible that even hobbyist YouTubers often end up getting workstations from us that are very similar to what high-end professionals use.

The biggest differences typically revolve not around the pure power or performance of the system itself, but rather around how it interfaces with the other tools the editor is using. Things like whether the system has 10GbE (or fiber) networking, or whether it needs a video monitoring card to connect to a color-calibrated display, are often what set them apart.

What are the biggest trends you’ve been seeing in product development?
In general, the two big things that have come up over and over in recent years are GPU acceleration and artificial intelligence. GPU acceleration is a pretty straightforward advancement that lets software developers get a lot more performance out of a system for color correction, noise reduction and other tasks that are very well suited to running on a GPU.

Artificial intelligence is a completely different beast. We do quite a bit of work with people who are at the forefront of AI and machine learning, and it is going to have a large impact on post production in the near future. It has been a topic at conferences like NAB for several years, but with platforms like Adobe Sensei starting to take off, it is going to become even more important.

However, I do feel that AI is going to be more of an enabling technology than one that replaces jobs. Yes, people are using AI to do crazy things like cut trailers without any human input, but I don’t think that is going to be its primary use anytime in the near future. Things like assisting with shot matching, tagging footage, noise reduction and image enhancement are where it is going to be truly useful.

What do you see as untapped potential customer bases that didn’t exist 10-20 years ago, and how do you plan on attracting and nurturing them? What new markets are you seeing?
I don’t know if there are any customer bases that are completely untapped, but I do believe that there is going to be more overlap between industries in the next few years. One example is how much realtime raytracing has improved recently, which is spurring the use of video game engines in film. This has been done for previsualization for quite a while, but the quality is getting so good that there are some films already out that include footage straight from the game engine.

For us on the workstation side, we regularly work with customers doing post and customers who are game developers, so we already have the skills and technical knowledge to make this work. The biggest challenge is really on the communication side. Both groups have their own set of jargon and general language, so we often find ourselves having to be the “translator” when a post house is looking at integrating realtime visualization in their workflow.

This exact scenario is likely to happen with VR/AR as well.

Lucky Post Editor Marc Stone

What trends do you see in commercial editing?
I’m seeing an increase in client awareness of the mobility of editing. It’s freeing knowing you can take the craft with you as needed, and for clients, it can save the ever-precious commodity of time. Mobility means we can be an even greater resource to our clients with a flexible approach.

I love editing at Lucky Post, but I’m happy to edit anywhere I am needed — be it on set or on location. I especially welcome it if it means you can have face-to-face interaction with the agency team or the project’s director.

What is it about commercial editing that attracted you and keeps attracting you?
The fact that I can work on many projects throughout the year, with a variety of genres, is really appealing. Cars, comedy, emotional PSAs — each has a unique creative challenge, and I welcome the opportunity to experience different styles and creative teams. I also love putting visuals together with music, and that’s a big part of what I do in a 30- or 60-second, or even a two-minute, branded piece. That just wouldn’t be possible, to the same extent, in features or television.

Can you talk about challenges specific to short-form editing?
The biggest challenge is telling a story in 30 seconds. To communicate emotion and a sense of character and get people to care, all within a very short period of time. People outside of our industry are often surprised to hear that editors take hours and hours of footage and hone it down to a minute or less. The key is to make each moment count and to help make the piece something special.

Ram’s The Promise spot

How have social media campaigns changed the way you edit, if at all?
It hasn’t changed the way I edit, but it does allow some flexibility. Length isn’t constrained in the same way as broadcast, and you can conceive of things in a different way in part because of the engagement approach and goals. Social campaigns allow agencies to be more experimental with ideas, which can lead to some bold and exciting projects.

What system do you edit on, and what else other than editing are you asked to supply?
For years I worked on Avid Media Composer, and at Lucky Post I work in Adobe Premiere. As part of my editing process, I often weave sound design and music into the offline so I can feel if the edit is truly working. What I also like to do, when the opportunity presents, is to be able to meet with the agency creatives before the shoot to discuss style and mood ahead of time.

What projects have you worked on recently?
Over the last six months, I have worked on projects for Tazo, Ram and GameStop, and I am about to start a PSA for the Salvation Army. It gets back to the variety I spoke about earlier and the opportunity to work on interesting projects with great people.

Billboard Video Post Supervisor/Editor Zack Wolder

What trends do you see in editing? Good or bad.
I’m noticing a lot of glitch transitions and RGB splits being used. Much flashier edits, probably for social content to quickly grab the viewer’s attention.

Can you talk about challenges specific to short-form editing versus long-form?
With short-form editing, the main goal is to squeeze as much useful information as possible into a short period of time without overloading the viewer. How do you fit an hour-long conversation into a three-minute clip while hitting all the important talking points and not overwhelming people? With long-form editing, the goal is to keep viewers’ attention over a long period of time while always surprising them with new and exciting info.

What is it about editing that attracted you and keeps attracting you?
I loved the fact that I could manipulate time. That hooked me right away. The fact that I could take a moment that lasts only a few seconds and drag it out for a few minutes was incredible.

Can you talk about the variety of deliverables for social media and how that affects things?
Social media formats have made me think differently about framing a shot or designing logos. Almost all the videos I create start in the standard 16×9 framing but will eventually be delivered as a vertical. All graphics and transitions I build need to easily work in a vertical frame. Working in a 4K space and shooting in 4K helps tremendously.

Rainn Wilson and Billie Eilish

What system do you edit on, and what else other than editing are you asked to supply?
I edit in Adobe Premiere Pro. I’m constantly asked to supply design ideas and mockups for logos and branding and then to animate those ideas.

What projects have you worked on recently?
Recently, I edited a video that featured Rainn Wilson — who played Dwight Schrute on The Office — quizzing singer Billie Eilish, who is a big-time fan of the show.

Main Image: AlphaDogs editor Herrianne Catolos


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Editing for Short Form

By Karen Moltenbrey

Unlike features or even series, short-form projects such as commercials give the editor the opportunity for a fresh start with each new job. Indeed, some brands have a specific style that they adhere to, but even so, there is a good deal of creative flexibility placed in the hands of the editor.

The challenge here is to condense a story into 30, 60 or 90 seconds. And more and more, there are other deliverables associated with a job aside from the traditional commercial, as editors also may be asked to provide social media spots, cinema spots and more. And as some editors point out, it’s no longer enough to excel at solely working with video; today, it is helpful to have a wider range of skills, such as audio editing and basic animation, to support the workflow.

Here we examine the editing work on a trio of spots and the approach each editor took to deliver a compelling piece.

Nespresso: The Quest
George Clooney has been the brand ambassador for coffee-machine maker Nespresso since 2006, and his commercials have been featured in Europe and around the world. In a recent spot airing in North America, Clooney embarks on a quest for the perfect cup of coffee, and does so with true Hollywood flair.

In The Quest, the actor plays a medieval knight who throws the head of a dragon he has just slain at the feet of his queen. Thankful, she asks what he desires as his reward. He pauses, then steps through a movie screen and enters the modern world, where he wanders the streets in his armor until he finds a coffee shop and his long-sought-after cup of Nespresso coffee. Satisfied, he heads back, walks down the theater aisle, through the movie screen once again and is back in the medieval world. When the queen asks if he has enough coffee for the kingdom, the actor gives a sheepish look, and soon we see the queen and court riding in a double-decker city bus, merrily on their way to get their own cup of Nespresso coffee.

Clooney’s producing partner, Grant Heslov, directed the spot, which was filmed against greenscreen on a backlot in Los Angeles. The background plates were shot in New York City, and compositing was done by VFX supervisor Ryan Sears from Big Sky Edit. The spot was edited by Chris Franklin, who launched New York-based Big Sky Edit in 1992.

Chris Franklin

“Ryan and I were working as a team on this. As I’m cutting, he’s compositing scenes so we can really get an idea of what everything looks like, and then I properly sound-designed it,” says Franklin. “He dealt with everything in terms of George on the movie screen and popping out of the screen and walking through New York, while I dealt with the sound design and the editing. It helped keep the job efficient, so Grant could come in and see everything pretty much completed.”

Having the various departments under one roof at Big Sky Edit enables Franklin to show work to clients, agencies or directors with effects integrated into the cut, so they do not have to rely on their imaginations to visualize the spot. “They’re judging the story as opposed to the limitations of the footage if effects work isn’t done yet,” he explains.

This is not Franklin’s first Nespresso ad, having worked on the very first one for the US market, and all of them have been directed by Heslov (who also directed Clooney in the Hulu series Catch-22). “He has shorthand with George, so the shoots go beautifully,” Franklin says, noting there is also a feeling of trust with everyone who has a responsibility on the post side.

When asked to describe the editing style he used for The Quest, Franklin was hard-pressed to pinpoint one specifically, saying “sometimes you just go by instinct in terms of what feels right. The fact that this was a movie within a movie, you’re kind of looking at it like an epic. So, you deal with it as a bigger type of thing. And then once [the story] got to New York, we were feeding off the classic man-on-the-street vibe.” So, rather than using a specific editing style on the spot, Franklin says he concentrated on making sure the piece was put together well, had a good storytelling aspect and that everything clicked.

The footage was delivered to Big Sky Edit as transcoded dailies, which were downloaded overnight from LA. Franklin cut the spot on an Avid Media Composer, and the completed spot was delivered in standard HD for 60- and 30-second versions, as well as pullouts and social media material. “There are so many deliverables attached to things now, and a job tends to be longer than it used to be due to all the elements and pieces of content you’re delivering to finish the job,” Franklin says. While time-consuming, these demands also force him to tell the story in different ways for the various deliverables.

Franklin describes his general workflow as fairly straightforward. He will string the entire shoot together – “literally every piece of film that was exposed” — and go through the material, then whittle that down and review it a second time. After that, he starts breaking it down in terms of sequences for all the pieces he needs, and then he starts building the edit. Without question, this process takes a substantial amount of time on the front end, as it takes an editor roughly four hours to go through one hour of footage in order to screen it properly, learn it, understand the pieces in it, break it apart, label it and prepare it — all before any assembly can be done. “It’s not unusual to have 10 or 12 hours of footage, so it’s going to take 40 hours to go through that material and break it down before I can start assembling,” he says.
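Franklin’s rule of thumb makes that front-end cost easy to estimate: roughly four hours of screening, labeling and prep for every hour of dailies. A minimal sketch of the arithmetic (the 4:1 ratio is his figure; the function name is ours):

```python
# Estimate footage-prep time from Franklin's rough 4:1 rule of thumb:
# about four hours of screening, labeling and breakdown per hour of dailies.
REVIEW_HOURS_PER_FOOTAGE_HOUR = 4  # Franklin's figure, not an industry standard

def prep_hours(footage_hours: float) -> float:
    """Hours needed to screen, learn and break down the dailies."""
    return footage_hours * REVIEW_HOURS_PER_FOOTAGE_HOUR

# A typical spot shoot of 10 to 12 hours of footage:
print(prep_hours(10))  # 40 -- matching Franklin's estimate
print(prep_hours(12))  # 48
```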

As Franklin points out, he does his own sound design — his favorite part of the process — while editing. In fact, he started out as an audio engineer years ago, and doing both the audio and editing simultaneously “helps me see the story,” he says. “If I wasn’t doing sound design while I am working, I would get totally lost.” (Tom Jucarone at Sound Lounge mixed The Quest.)

Franklin has edited features, documentaries and even short films, and his workflow remains fairly constant across the genres. “It’s just longer sometimes. You have to learn the footage, so you’ve got to watch everything. It’s a lot of watching and thinking,” he notes. “Deadlines give you an end that you have to shoot for, but you can’t rush things. It takes time to do the work properly.”

Despite his experience with other genres, commercials have been Franklin’s bread and butter for the past 30 years. He says he likes the challenge of whittling down 10 hours of footage into 30 or 60 seconds of storytelling.

M&M’s ‘Hazelnut Spread’ Campaign
Over the years, audiences have been treated to commercial spots featuring the various spokescandies for Mars Incorporated’s M&M’s, from the round-bodied, regular-flavored character to the egg-shaped yellow peanut character, as well as other new-flavor characters. Most recently, the company introduced its latest addition: hazelnut spread M&M’s. Helping to launch the product are PS260 owner/editor Maury Loeb and assistant editor Sara Sachs, who “divided and conquered” on the campaign, which features three spots promoting the new flavor and the ever-popular M&M’s chocolate bar, released in 2018.

The first spot, New Spokescandy, is currently airing. The two other spots, which will be launching next year, are called Injury Attorney and Psychiatrist. Sachs focused on the latter, a comical session between a therapist and the yellow M&M, who is “feeling stuck.” The therapist points out that it’s because he is stuck in a chocolate bar. “We played around a lot with the humor of that moment. It was scripted with three progressively wider shots to ultimately reveal the candy bar, but in the edit, we decided the humor was more impactful if it was just one single reveal at the end,” says Sachs.

Helping to unite the three spots, aside from the brand’s humor and characters, is a consistent editing style. “The pacing is consistent. M&M’s as a whole doesn’t really do very music-heavy spots; they are more real-world in nature,” Sachs notes.

At PS260, the editors often collaborate on client campaigns, so as ideas are being worked out and implemented in one suite, revisions are made in another, allowing the clients to move from space to space to view the work progression.

Sara Sachs

To edit the spot, Sachs worked primarily in Adobe Premiere, using After Effects and Photoshop for some of the quick graphics, as PS260’s graphics department did the heavy lifting for the bigger moving elements, such as the M&M’s characters. The biggest challenge came from getting the tonality of the actor just right. “When a person is talking on camera to an empty couch or stage, you really have to think about both sides of the emotion,” she explains. “VO talent comes in after you have a cut in place, so even though these things are recorded a month apart, it still needs to feel like the characters are talking to each other and come across emotionally true.”

Having to do some minor graphics work is not so unusual these days; Sachs points out that editors today are becoming multitalented and handling other aspects of a project aside from cutting. “It’s not enough to just know the edit side; you also need a base in graphics, audio fine-tuning and color correction. More and more we try to get the rough cut closer to what the final picture will actually look like,” she says. “In this campaign, they even took a lot of the graphics that we applied in the rough and used them directly for air.”

Most of Sachs’ experience has come from commercials, but she has also done shorts, features, documentaries, music videos, promotional and internal videos, pitch and instructional videos, web series and so on. Of those, she prefers short-form projects because they afford her the opportunity to painstakingly watch every frame of a video “900 times and put some love into every 24th of a second,” she adds.

That level of focus is usually not practical on longer-form projects, which tend to be organized scene to scene rather than with the granularity possible on 15- and 30-second spots. “Shorter content maintains the same basic project structure but tends to get more attention on the little things like line-by-line sequences, which are every time a character says something in any situation,” she explains.

Nike Choose Phenomenal
Charlie Harvey recently finished a unique spot for Nike Korea for the South Korean market titled Choose Phenomenal, an empowering ad for women created by Wieden + Kennedy Tokyo that has over 10 million views on YouTube. The spot opens on a young girl dressed in traditional Korean clothing before evolving into a fast-paced, split-screen succession of images — video, animation, graphic elements, pictures and more, mainly of women in action — set to an inspiring narration.

“The agency always wanted it to be split-screen,” says Harvey of Whitehouse Post, who edited the spot. The DP shot the majority of the “moments” in a few different ways and from different angles, giving her the ability to find the elements that complemented each other from a split-screen standpoint. Yet it was up to Harvey to sort through the plethora of clips and images and select the most appropriate ones.

“There’s a Post-it note moment in there, too. That’s a big thing in South Korea, where people write messages on Post-its and stick them on a wall, so it’s culturally significant,” Harvey explains. Foremost in her mind while editing was making the spot culturally authentic and inspiring to young women, which led her to delve deep into the country’s traditions to find elements that would resonate with the intended audience.

Charlie Harvey

Harvey initially began cutting the spot in Los Angeles but then traveled to Tokyo to do the majority of the edit.

In fact, when Harvey began the project, she didn’t have an opportunity to work one-on-one with the director – something that would always be her preference. “I always want to create what the director has envisioned. I always like to make that [vision] come to life while adding my own point of view, too,” she says.

Working with split screens or multiple screens is always trickier because you need to work with multiple layers while maintaining the rhythm of the film, Harvey says. “Making what seems like a small change in one shot will affect not only the shot that comes before and after it, but also the shots next to those. It’s more a puzzle you are solving,” she adds.

The visual element, however, was just one aspect of the project; here, as on many other projects, finding the right musical accompaniment was not easy. “You end up going around and around trying to find exactly what you are looking for, and music is always a challenge. If you find the right track, it makes all the difference. It can elevate a spot or hurt it,” Harvey points out. “Music is so important.”

In addition, the split-screen concept forced Harvey to concentrate on both sides of the screen – akin to concentrating on two shorts playing at the same time. “You have to make sure they work together and they link to the next page, where you have another two shorts,” she explains. “You need that harmonious relationship, and there needs to be a rhythm. Otherwise, it could get choppy, and then you are looking at one side or the other, not both together in unison.”

Indeed, dealing with the multiple split-screen images was difficult, but perhaps even more daunting was ensuring that the spot respected the culture of the young women to whom it was directed. To this end, Harvey incorporated as much reference as she could that would resonate with the audience, as opposed to using more generic references geared for audiences outside of that country. “I’m sure it meant a lot to these girls,” she says of the inspirational spot and the effort put into it.

Harvey performed the edit on an Avid system, preferring its streamlined interface to those of other systems. “It has everything for what I want to do,” she says. “There are no extra tabs here and there. It’s just really easy to use, and it’s very stable and steady.”

For the most part, Harvey sticks with shorter-form projects like commercials, though she has experience with longer formats. “I think you get into a routine with commercials, so you know you have a certain number of days to do what you need to do. I know where I need to be at certain points, and where I need to get to by the time I see the director or the agency,” she explains. “I have a very specific routine. I have a way that I work, and I am comfortable with it. It works for me.”


Karen Moltenbrey is a veteran writer/editor covering VFX and post production.

Yesterday director Danny Boyle

By Iain Blair

Yesterday, everyone knew The Beatles. Today, only a struggling singer-songwriter in a tiny English seaside town remembers their songs. That’s the brilliant-yet-simple setup for Yesterday, the new rock ’n’ roll comedy from Academy Award-winning director Danny Boyle (Slumdog Millionaire, Trainspotting) and Oscar-nominated screenwriter Richard Curtis (Four Weddings and a Funeral, Love Actually, Notting Hill).

Danny Boyle on set with lead actor Himesh Patel

Jack Malik (Himesh Patel of BBC’s EastEnders) is the struggling singer-songwriter whose dreams of fame are rapidly fading, despite the fierce devotion and support of his childhood best friend/manager, Ellie (Lily James, Mamma Mia! Here We Go Again). But after a freak bus accident during a mysterious global blackout, Jack wakes up to discover that only he remembers The Beatles and their music, and his career becomes supercharged when he ditches his own mediocre songs and instead starts performing hit after hit by the Fab Four — as if he’d written them.

Yesterday co-stars Ed Sheeran and James Corden (playing themselves) and Emmy Award-winner Kate McKinnon as Jack’s Hollywood agent. Along with new versions of The Beatles’ most beloved hits, Yesterday features a seasoned group of collaborators, including DP Christopher Ross (Terminal, the upcoming Cats), editor Jon Harris (Kingsman: The Secret Service, 127 Hours), music producer Adem Ilhan (The Ones Below, In the Loop) and composer Daniel Pemberton (Steve Jobs, Spider-Man: Into the Spider-Verse).

I recently spoke with Boyle, whose eclectic credits include Shallow Grave, The Beach, A Life Less Ordinary, Trance, Steve Jobs, Sunshine and 127 Hours, about making the film and the workflow.

What was your first reaction when you read this script?
I was a big fan of Richard’s work, and we’d worked together on the opening ceremony for the 2012 London Olympics, when we did this Chariots of Fire spoof with Rowan Atkinson, and I casually said to him, “If you’ve ever got anything for me, send it over.” And he said, “Funnily enough, I do have a script that might suit you,” and he sent it over, and I was just overwhelmed when I read it. He’d managed to take these two fairly ordinary people and their love story, and then intertwine it, like a double helix, with this love letter to The Beatles, which is the whole texture and feeling of this film.

It comes across as this very uplifting and quite emotional film.
I’m glad you said that, as I thought this whole simple idea — and it’s not sci-fi, but it’s not really explained — of this global amnesia about The Beatles and all their songs was just so glorious and wonderful, and just like listening to one of their songs. It really moved me, especially the scene at the end, which affected me in a very personal way. It’s about the wonder of cinema and its relationship to time, and film is the only art form that really looks at time in such detail, because film is time. And that relates directly to editing, where you’re basically compressing time, stretching it, speeding it up, freezing it — and even stopping it. No other art form can do that.

The other amazing aspect of film is that going to the movies is also an expression of time. The audience says, “I’m yours for the next two hours,” and in return you give them time that’s manipulated and squeezed and stretched, and even stopped. That’s pretty amazing, I think. That’s what I tried to do with this film, do something that brings back The Beatles and all that sense of pure joy in their music, and how it changed people’s lives forever.

Is it true that Jack is partly based on Ed Sheeran’s own life story?
It is, absolutely, and he’s good friends with Richard Curtis. Ed played all the little pubs and small festivals where we shot, and very unsuccessfully when he started out. Then he was propelled into superstardom, and that also appeared to happen overnight. Where did all his great songs come from? Then, like in the film, Ed actually returned to his childhood sweetheart and they ended up getting married, and you go, “Wow! OK. That’s amazing.” So all that gave us the exo-skeleton of the film, and Ed’s also done some acting — he was in Game of Thrones and Bridget Jones’ Baby, and then he also wrote the song at the end, so it was really perfect he was also in it.

What did Himesh bring to the role of Jack?
The only trepidation I had was when I began auditioning people for the part, as it was basically, “Come in and sing a couple of Beatles songs.” And some were probably better technically than Himesh, but I soon realized it was going to be far harder than I thought to get the right guy. We had great actors who weren’t great singers, and vice versa, and we didn’t want just a karaoke version of 17 songs.

And making it more complicated was that, unlike in the film, we all do remember The Beatles. But then Himesh walked in, played “Yesterday” and “Back in the USSR,” and even though I was oversaturated by The Beatles music at this point, they just grabbed me. He made them his own, as if they were his songs. He was also very modest with it as well, in his demeanor and approach. He doesn’t rethink the wheel. He says, “This is the song you’ve missed, and I’m bringing it back to you.” And that’s the quality he brings to his performance. There’s a genuine simplicity, but he’s also very funny and subtle. He doesn’t try and hijack The Beatles and lay on extra notes that you don’t need. He’s a very gentle guy, and he lets you see the song for what it is, the beauty of them.

Obviously, the music and sound were crucial in this, and usually films have the actors lipsync, but Himesh sang live?
Totally. He played and sang live — no dubs or pre-records. Early on I sat down with Simon Hayes, who won the Oscar for mixing Les Mis, and told him that’s what I wanted. It’s very difficult to do live recording well, but once Simon heard Himesh sing, he got it.

The songs in this help tell the story, and they’re as important as all the dialogue, so every time you hear Himesh play and sing it live. Then for all the big concerts, like at Wembley, we added extra musicians, which we over-dubbed. So even if there were mistakes or problems with Himesh’s performances, we kept it, as you’ve got to believe it’s him and his songs. It had to be honest and true.

We screened the premiere in Dolby Vision Atmos in London, and it’s got such a fantastic range. The sound is so crisp and clean — and not just the effects, but all the dialogue, which is a big tribute to Simon. It’ll be so sad if we lose cinema to streaming on TV and watching films on tiny phones because we’ve now achieved a truly remarkable technical standard in sound.

Where did you do all the post?
We edited at a few places. We were based at Pinewood to start with, as I was involved with the Bond film, and then we moved to some offices in central London. Finally, we ended up at Working Title, where they have a great editing setup in the basement. Then as usual we did all the sound mixing at Pinewood with Glenn Freemantle and his team from Sound 24. They’ve done a lot of my films.

We did all the visual effects with my usual guy, VFX supervisor Adam Gascoyne over at Union Visual Effects in London. He’s done all my films for a very long time now, and they did a lot of stuff with crowd and audience work for the big shows. Plus, a lot of invisible stuff like extensions, corrections, cleanup and so on.

You also reteamed with editor Jon Harris, whose work on 127 Hours earned him an Oscar nom. What were the big editing challenges?
We had quite a few. There was this wonderful scene of Jack going on the James Corden show and playing “Something,” the George Harrison song, and we ultimately had to cut the whole thing. On its own, it was this perfect scene, but in the context of the film it came too late, and it was also too reminiscent of “Yesterday” and “The Long and Winding Road.”

The film just didn’t need it, and it was quite a long sequence, and it was really sad to cut it, but it just flowed better without it. Originally, we started the film with a much longer sequence showing Jack being unsuccessful, and once we tested that, it was immediately obvious that the audience understood it all very quickly. We just didn’t need all that, so we had to cut a lot of that. It’s always about finding the right rhythm and pace for the story you’re telling.

L-R: Iain Blair and Danny Boyle

Where was the DI done?
At Goldcrest with colorist Adam Glasman, who has worked a lot with DP Chris Ross. It was a very joyous film to make and I wanted it to look joyful too, with a summer spirit, but also with a hint of melancholy. I think Himesh has that too, and it doesn’t affect the joy, but it’s a sub-note. It’s like the English countryside, where we tried to capture all its beauty but also that feeling it’s about to rain all the time. It’s that special bittersweet feeling.

I assume Paul and Ringo gave you their blessing on this project?
Yeah, you have to get their agreement as they monitor the use of the songs, and Working Title made a great deal with them. It was very expensive, but it gave us the freedom to be able to change the songs in the edit at the last minute if need be, which we did a few times. We got beautiful letters back, very touching, and Paul was very funny as he gave us permission to use “Yesterday,” which we also used as the film title. He told us that his original lyric title was “Scrambled Eggs,” and if the film turned out to be a mess, we could just call it Scrambled Eggs instead.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

EP Nick Litwinko leads Nice Shoes’ new long-form VFX arm

NYC-based creative studio Nice Shoes has hired executive producer Nick Litwinko to lead its new film and episodic VFX division. Litwinko, who has built a career on bringing a serial entrepreneur’s approach to developing creative studios, will grow the division, recruiting talent to deliver a boutique, collaborative approach to visual effects for long-form entertainment, with a focus on feature film and episodic projects.

Since coming on board with Nice Shoes, Litwinko and his team already have three long-form projects underway and will continue working to sign on new talent.

Litwinko launched his career at MTV during the height of its popularity, working as a senior producer for MTV Promos/Animation before stepping up as executive producer/director for MTV Commercials. After that decade-long tenure, he launched his own company, Rogue Creative, where he served dual roles as EP and director and oversaw a wide range of animated, live-action and VFX-driven branded campaigns. He was later named senior producer for Psyop New York before launching the New York office of Blind. He moved on to join the team at First Avenue Machine as executive producer/head of production, and was then recruited to join Shooters Inc. as managing director, building the company’s NYC offices and playing an instrumental part in its strategic rebrand to Alkemy X.

Behind the Title: Neko founder Lirit Rosenzweig Topaz

NAME: Lirit Rosenzweig Topaz

COMPANY: Burbank’s Neko Productions

CAN YOU DESCRIBE YOUR COMPANY?
We are an animation studio working on games, TV, film, digital, AR, VR and promotional projects in a variety of styles, including super-cartoony and hyper-realistic CG and 2D. We believe in producing the best product for the budget, and giving our clients and partners peace of mind.

WHAT’S YOUR JOB TITLE?
Founder/Executive Producer

WHAT DOES THAT ENTAIL?
I established the company and built it from scratch. I am the face of the company and the force behind it. I am in touch with our clients and potential clients to make sure all are getting the best service possible.

Dr. Ruth doc

I am a part of the hiring process, making sure our team meets the standards of creativity, communication ability, responsibility and humanness. It is important for me to make sure all of our team members are great human beings, as well as being amazing and talented artists. I oversee all projects and make sure the machine is working smoothly to everyone’s satisfaction.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I am always looking at the big picture, but at the micro level as well. I need to be aware of so many smaller details, making sure everything is running smoothly for both sides: employees and clients.

WHAT HAVE YOU LEARNED OVER THE YEARS ABOUT RUNNING A BUSINESS?
I have learned that it is a roller coaster and one should enjoy the ride, and that one day doesn’t look like another day. I learned that if you are true to yourself, stick to your objectives and listen to your inner voice while doing a great job, things will work out. I always remember we are all human beings; you can succeed as a business person and have people and clients love working with you at the same time.

A LOT OF IT MUST BE ABOUT TRYING TO KEEP EMPLOYEES AND CLIENTS HAPPY. HOW DO YOU BALANCE THAT?
For sure! That is the key for everything. When employees are happy, they give their heart and soul. As a result, the workplace becomes a place they appreciate, not just a place they need to go to earn a living. Happy clients mean that you did your job well. I balance it by checking in with my team to make sure all is well by asking them to share with me any concerns they may have. At the end of the day, when the team is happy, they do a good job, and that results in satisfied clients.

WHAT’S YOUR FAVORITE PART OF THE JOB?
It is important for me that everybody comes to work with a smile on their face and that we are a united team with the goal of creating great projects. This usually results in thinking outside the box and looking for ways to be efficient, to push the envelope and to make sure creativity is always at the highest level. I also enjoy working on projects similar to ones we’ve done in the past, as well as on projects and styles we haven’t done before.

Dr. Ruth doc

I like the fact that I am a woman running a company. Being a woman allows me to juggle well, be on top of a few things at the same time and still be caring and loving.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
I have two. One is the beginning of the day when I know I have a full day ahead of me to create work, influence, achieve and do many things. Two is the evening, when I am back home with my family.

CAN YOU NAME SOME RECENT CLIENTS?
Sega, WayForward and the recent Ask Dr. Ruth documentary.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My iPhone, my iPad and my computer.

Tribeca: Emily Cohn on editing her own film, CRSHD

By Amy Leland

At this year’s Tribeca Film Festival, I went to a screening of the college comedy CRSHD, a feature film written, directed and edited by first-time feature filmmaker Emily Cohn. CRSHD tells the story of a group of college freshmen trying to gain entry to an invitation-only “crush” party in order to fulfill a pledge one of them made to lose her virginity before the end of the year. The story is told in part through dramatizations of their social networking activity, in unique and creative ways I hadn’t seen before.

While I had initially set out to write something about editors working on feature films edited with Adobe Premiere, Emily turned out to be an interesting story all on her own.

Emily Cohn

A fairly recent college graduate, this enterprising and incredibly talented young woman managed to write her first feature, find a way to get it produced and handle the bulk of the post production process herself. I needed to know more about how she navigated all of this and ended up with a film that looked like far more than a first-time, small-budget indie.

You wrote, directed, edited and produced this film. What was it like bringing your baby to life?
While it’s been so much work, I never really saw it that way. When I look back at it, I’ve loved every step of the process, and I feel so grateful that through this process, I’ve had the most extreme film school boot camp. I was a creative writing major in college, but this was beyond any prior film experience I had, especially in post. I didn’t understand the full process of getting a feature through post.

When I was in high school and was part of the Film Fellows program at the Tribeca Film Institute, I thought that I wanted to be an editor. So that’s something that I really took to.

But on this film, I learned all about OMFs for sound deliverables, and XMLs and all of that. I couldn’t believe I had never heard of all this, because I’d been editing for a while. But for short films or web series, you don’t have to do the back and forth, the round tripping, you don’t have to worry about it as much.

Did that play a part in your decision to edit in Premiere?
Choosing Premiere was definitely a strategic decision. I learned editing on Final Cut, and then I was editing on Avid for a while. I was an assistant editor on a feature doc, doing transcoding and organizing the files in Avid. There was a big part of me that thought, “We should just use Avid,” but it’s an indie film, and I knew we were going to have to use a lot of After Effects, so Premiere was going to be the most streamlined. It was also something that I could operate on my laptop, which is where 90 percent of the editing happened.

What did you shoot the film with?
We had an equipment sponsorship from Canon, so we shot on the C100 with some really nice prime lenses they gave us. The big question was, “Do we try to shoot 4K or not?” Ultimately, we didn’t, because that meant I didn’t have to spend as much money on hard drives, and my laptop could handle the footage. Obviously, I could’ve transcoded it to work with proxies, but this was all-around much easier and more manageable.
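The hard-drive math behind that call is simple to sketch. A back-of-the-envelope comparison, assuming illustrative bitrates (about 24 Mb/s approximates the C100’s 1080p AVCHD, and 100 Mb/s stands in for a hypothetical 4K acquisition codec; neither figure comes from the interview):

```python
# Rough storage comparison for an indie shoot at two assumed bitrates.
# ~24 Mb/s approximates the C100's 1080p AVCHD; ~100 Mb/s stands in for
# a hypothetical 4K acquisition codec. Both numbers are illustrative.
MBIT_PER_GB = 8 * 1000  # decimal gigabytes, as drive makers count them

def shoot_size_gb(hours: float, mbit_per_sec: float) -> float:
    """Approximate storage needed for `hours` of footage at a given bitrate."""
    return hours * 3600 * mbit_per_sec / MBIT_PER_GB

hd = shoot_size_gb(30, 24)    # ~324 GB for a hypothetical 30-hour 1080p shoot
uhd = shoot_size_gb(30, 100)  # ~1350 GB at the assumed 4K bitrate
print(round(hd), round(uhd))
```

At these assumptions, 4K would need roughly four times the drive space, which is the tradeoff Cohn describes.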

You used a lot of VFX in this, but nothing where having the extra resolution of 4K would make a huge difference, right?
It probably would have been better if we had shot 4K for some of those things, yeah. But it was manageable, and it’s … I mean, it’s an indie film. It’s a baby film. The fact that it made it to Tribeca was beyond cool to say the least.

You said something during the Q&A at Tribeca about focusing on the people using social media versus showing screens —you didn’t want to show technology that would look out of date as fast. That was smart.
I’ve done a lot of reading on it, and it’s a subject I’m very interested in. I’ve been to many filmmaker Q&A panels where the question was, “What makes you want to set it in the ‘90s?” and the answer was, “Because there are no cell phones.” That feels so sad to me.

I’ve read articles that say rom-coms from the ‘90s, or other older movies, wouldn’t exist if they were set in the 2000s because you would just get a text, and the central conflict would be over.

Some people talk about issues editing long form. Some complain that the project becomes too bloated, or that things don’t perform as well once the project is too long. Did you have any issues like that?
I organized the project carefully. I know a lot of people who say the bin system in Avid is superior to Premiere’s, but I didn’t find that to be a problem, although I did have a major crash that I ended up solving. I do wish Adobe had some sort of help line because that would have helped a lot.

CRSHD was shot with a Canon C100.

But that was toward the beginning, and essentially what it came down to was that I’m still running the 2017 OS on my laptop with no upgrade, because when I upgraded my OS, Premiere started crashing. It just needed it all to stay where it was, which, now I know!

In addition to After Effects, what other post tools did you use on the film?
We used After Effects for the VFX, Resolve for color and Avid Pro Tools for sound. My cousin Jim Schultz, who is a professional music editor, really held my hand through all of that. He’s in LA, but whenever I had a question, he would answer it. Our whole post sound team, which included Summit Post, also in LA, was amazing and made CRSHD feel like a real film. I loved that process.

Did you still keep everything grounded in Premiere, and roundtrip everything out to Resolve or Pro Tools, and then bring it back?
Yeah, I onlined everything in Premiere myself.

I’m so overwhelmed thinking about you doing all of this yourself.
I didn’t know any other way to do it. I think it would be a lot easier if I had a bigger team. But that’s also something that’s been really funny with the sales agents (which we secured through the Tribeca Film Festival). They’re like, “Can your team do that?” and I’m like, “No, that’s all me. It’s just me.”

I mean, I have a colorist and a really great sound team. But I was the one who onlined everything.

When the festivals are saying, “We need a cut like this,” you are probably the one pushing all of those exports and deliverables out yourself.
Yeah. I wouldn’t have had it any other way.

During that whole process, did you have any parts where you were happy you did it in Premiere, or things that were frustrating?
The film is really funky. We have some YouTube clips in there, a lot of VFX and other types of footage. There were so many moments where I needed to test things out quickly and easily. In Premiere, you don’t have to worry about transcoding everything and different file formats. That was definitely the best in terms of accessibility and how quickly you can play around with your timeline and experiment.

My least favorite thing is, as I said, there is no real help line for Premiere. I was asking friends of friends of friends about a weird exporting glitch that I had. The forums are fine but, yeah, I wish there was a help line.

This film is your baby, but you did collaborate with some others in post. How did that process go?
I feel really lucky. I had an amazing post team. When I watch the movie, I’m happiest about all the things everyone else added that I never would’ve added myself.

One of the sound designers, Taylor Flinn — there’s a moment where our comical security guard character leans forward in his car, and Taylor added a little siren “whoop” sound. It’s so funny to me, and I like it more because I’m not the one who did it. It’s the one moment I always laugh at in the film now, even after seeing it hundreds of times. We had an amazing animator as well, Sean Buckelew. That was another portion of post for us.

I just didn’t realize how long it was going to take. We finished shooting in August 2017, and I was like, “We’re going to submit in December 2017 to Tribeca!” And then it was a full other year of insanity. I did a first cut by September 2017, did a lot of test screenings in my apartment and kept hammering away. I was working with another co-editor, Michelle Botticelli, for about six weeks leading up to submissions, and she was also giving her opinion on all of the future cuts and color and sound.

Any updates on where the film is heading next?
I hope we get distribution. It was at the Cannes marketplace, and we have sales agents. They came on as soon as we got into Tribeca. Tribeca recommended them, and I’m learning as I go.

What’s next for you?
I have a pilot and a show bible for the TV version of CRSHD. Then I have another rom-com that I’m writing. I’m still editing, and since making the movie, I’ve been doing a ton of other side work, like camera operating. But, ultimately, I hope to be writing and directing.


Amy Leland is a film director and editor. Her short film, “Echoes”, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on social media on Twitter at @amy-leland and Instagram at @la_directora.

DP Chat: Good Omens cinematographer Gavin Finney

By Randi Altman

London-born cinematographer Gavin Finney, BSC, has a wealth of television series and film experience under his belt, including Wolf Hall, The Fear and the upcoming Hulu series Four Weddings and a Funeral, based on the film of the same name. One of his most recent projects was the six-episode Amazon series Good Omens, starring Michael Sheen (Aziraphale) and David Tennant (Crowley) as an angel and a demon with a very long history, who are tasked with saving the world. It’s based on the book by Neil Gaiman and Terry Pratchett.

Finney was drawn to cinematography by his love of still photography and telling stories. He followed that passion to film school and fell in love with what could be done with moving images.

Let’s find out more about Finney and his work on Good Omens.

How would you describe the look of Good Omens? How did you work with the directors/producers to achieve the look they wanted?
There is a progression through the story where things get increasingly strange as Adam (who our main characters believe is the antichrist) comes into his powers, and things in his head start manifesting themselves. It is also a 6,000-year-long buddy movie between an angel and a demon! There is Adam’s world — where everything is heightened and strangely perfect — and Aziraphale and Crowley’s world of heaven and hell. At some point, all these worlds intersect. I had to keep a lot of balls in the air in regard to giving each section its own look, but also making sure that when these worlds collide, it still makes sense.

Each era depicted in the series had a different design treatment — obviously in the case of costume and production design — but also in the way we shot each scene and the way they were lit. For instance, Neil Gaiman had always imagined the scene in the church during the Blitz in Episode 3 as an homage to the film noir style of the time, and we lit and photographed it in that style. Ancient Rome was given the patina of an Alma-Tadema oil painting, and we shot Elizabethan London in an exact recreation of Shakespeare’s Globe Theatre. The ‘60s were shot mainly on our Soho set, but redressed with posters from that time, and we changed the lighting to use more neon and used bare bulbs for signage.

I also graded the dailies throughout production on DaVinci Resolve, adding film grain and different looks to different time periods to help anchor where we were in the story. Neil wanted heaven and hell to feel like two parts of the same celestial building, so heaven occupied the best penthouse offices, and hell was stuck in the damp, moldy basement where nothing works properly.

We found a huge empty building for the heaven set that had shiny metal flooring and white walls. I frosted all the windows and lit them from outside using 77 ARRI SkyPanels linked to a dimmer desk so we could control the light over the day. We also used extremely wide-angle lenses such as the Zeiss rectilinear 8mm lens to make the space look even bigger. The hell set used a lot of old, slightly greenish fluorescent fittings, some of them flickering on and off. Slimy dark walls and leaking pipes were added into the mix.

For another sequence, Neil and Douglas wanted an old-film look. To do this, ARRI Media in London constructed a hand-cranked digital camera out of an old ARRI D21 camera and connected it to an ARRI 435 hand-crank wheel and then to a Codex recorder. This gave us a realistic, organic vari-speed/vari-exposure look. I added a Lensbaby in a deliberately loose mount to emulate film weave and vignetting. In this way I was able to reproduce very accurately the old-style, hand-cranked black-and-white look of the first days of cinema.

How early did you get involved in the production?
I’d worked with the director Douglas Mackinnon a few times before (on Gentlemen’s Relish and The Flying Scotsman), and I’d wanted to work with him again a number of times but was never available. When I heard he was doing this project, I was extremely keen to get involved, as I loved the book and especially the kind of world that Neil Gaiman and Terry Pratchett were so good at creating. Fortunately, he asked me to join the team, and I dropped everything I was doing to come on board. I joined the show quite late and had to fly from London to Cape Town on an early scout the day after getting the job!

How did you go about choosing the right camera and lenses for this project?
We shot on Leica Summilux primes and ARRI Alura zooms (15.5-45mm and 45-250mm) and ARRI Alexa SXT and Alexa Mini cameras outputting UHD 4K files. The Alexa camera is very reliable, easy to work with, looks great and has very low noise in the color channels, which is useful for green/bluescreen work. It can also shoot at 120fps without cutting into the sensor size. We also had to make sure that both cameras and lenses were easily available in Cape Town, where we filmed after the UK section.

The Alexa output is also very flexible in the grade, and we knew we were going to be pushing the look in a number of directions in post. We also shot with the Phantom Flex 4K high-speed camera at 1,000fps for some scenes requiring ultra-slow motion, and for one particular sequence, a specially modified ARRI D21 that could be “hand-cranked” like an old movie camera.

You mentioned using Resolve on set. Is this how you usually work? What benefit did you get from doing this?
We graded the dailies on Blackmagic’s DaVinci Resolve with our DIT Rich Simpson. We applied different looks to each period of the story, often using a modified film emulation plugin. It’s very important to me that the dailies look great and that we start to establish a look early on that can inform the grade later.

Rich would bring me a variety of looks each day and we’d pick the one we liked for that day’s work. Rich was also able to export our selected looks and workflow to the South African DIT in Cape Town. This formed the starting point of the online grade done at Molinare on FilmLight Baselight under the hugely capable hands of Gareth Spensley. Gareth had a big influence on the look of the series and did some fantastic work balancing all the different day exteriors and adding some magic.

Any challenging scenes you are particularly proud of?
We had some very big sets and locations to light, and the constantly moving style of photography we employed is always a challenge to light — you have to keep all the fixtures out of shot, but also look after the actors and make sure the tone is right for the scene. A complicated rig was the Soho street set that Michael Ralph designed and built on a disused airbase. This involved four intersecting streets with additional alleyways, many shops and a main set — the bookshop belonging to Aziraphale.

This was a two-story composite set (the interior led directly to the exterior). Not only did we have to execute big crane moves that began looking down at the whole street section and then flew down and “through” the windows of the bookshop and into an interior scene, but we also had to rig the set knowing that we were going to burn the whole thing down.

Another challenge was that we were filming in the winter and losing daylight at 3:30pm but needing to shoot day exterior scenes to 8pm or later. My gaffer (Andy Bailey) and I designed a rig that covered the whole set (involving eight cranes, four 18kW HMIs and six six-meter helium hybrid balloons) so that we could seamlessly continue filming daylight scenes as it got dark and went to full night without losing any time. We also had four 20×20-foot mobile self-lighting greenscreens that we could move about the set to allow for the CGI extensions being added later.

What inspires you artistically? And how do you simultaneously stay on top of advancing technology that serves your vision?
The script inspires me artistically. If I don’t love the story and can’t immediately “see” how it might look, I don’t do it. After that, I’m inspired by real life and the way changing light utterly transforms a scene, be it a landscape or an interior. I also visit art galleries regularly to understand how other people see, imagine and communicate.

What new technology has changed the way you work (looking back over the past few years)?
Obviously, digital cinematography has had a huge impact. I trained in film and spent the first 16 years of my career shooting film exclusively, but I was happy to embrace digital when it came in. I love keeping up with all the advances.

Lighting is also going digital with the advent of LED fixtures with on-board computers. I can now dial any gel color or mix my own at any dimmer level from an app on my phone and send it to dozens of fixtures. There is an incredible array of tools now at our disposal, and I find that very exciting and creatively liberating.

What are some of your best practices or rules you try to follow on each job?
I tend to work on quite long jobs — my last two shows shot for 109 and 105 days, respectively. So keeping to sensible hours is critical. Experienced producers who are concerned with the welfare, health and safety of their crew keep to 10 hours on camera, a one-hour lunch and five-day weeks only. Anything in excess of that results in diminishing returns and an exhausted and demoralized crew.

I also think prep time is incredibly important, and this is another area that’s getting squeezed by inexperienced producers to the detriment of the production. Prep time is a comparatively cheap part of the process but one that reaps huge dividends on the shoot. Being fully prepared, making the right location and set design choices, and having enough to time to choose equipment and crew and work out lighting designs all make for a smooth-running shoot.

Explain your ideal collaboration with the director when setting the look of a project.
This goes back to having enough prep time. The more time there is to visit possible locations and simply talk through all the options for looks, style, movement and general approach the better. I love working with visual directors who can communicate their ideas but who welcome input. I also like being able to ditch the plan on the day and go with something better if it suddenly presents itself. I like being pushed out of my comfort zone and challenged to come up with something wonderful and fresh.

What’s your go-to gear — things you can’t live without?
I always start a new production from scratch, and I like to test everything that’s available and proven in the field. I like to use a selection of equipment — often different cameras and lenses that I feel suit the aesthetic of the show. That said, I think ARRI Alexa cameras are reliable and flexible and produce very “easy to work with” images.

I’ve been using the Letus Helix Double and Infinity (provided by Riz at Mr Helix) with an Exhauss exoskeleton support vest quite a lot. It’s a very flexible tool that I can operate myself, and it produces great results. The Easyrig is also a great back-saver when doing a lot of handheld work, as the best cameras aren’t getting any lighter.

Apart from that, comfortable footwear and warm, waterproof clothing are essential!


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Tips for inside —and outside — the edit suite

By Brady Betzel

Over the past 15 years, I’ve seen lots of good and bad while working in production and post — from people being overly technical (only looking at the scopes and not the creative) to being difficult just for the sake of being difficult. I’ve worked on daytime and nighttime talk shows, comedies, reality television, Christmas parades, documentaries and more. All have shaped my editing skills as well as my view on the work-life balance.

Here are some tips I try to keep in mind that have helped me get past problems I’ve encountered in and out of the edit bay:

No One Cares
This one is something I constantly have to remind myself. It’s not necessarily true all the time, but it’s a good way to keep my own ego in check, especially on social media. When editing and coloring, I constantly ask myself, “Does anyone care about what I’m doing? If not, why not?” And if the answer is that they don’t, then something needs to change. I also ask myself, “Does anything about my comment or edit further the conversation or the story, or am I taking away from the story to draw attention to myself?” In other words, am I making an edit just to make an edit?

It’s an especially good thing to think about when you get trolled on Twitter by negative know-it-alls telling you why you’re wrong about working in certain NLEs. Really, who cares? After I write my response and edit it a bunch of times, I tell myself, “No one cares.” This philosophy not only saves me from feeling bad about not answering questions that no one really cares about, but it also helps improve my editing, VFX and color correction work.

Don’t be Difficult!
As someone who has worked everywhere and in all sorts of positions — from a computer tech at Best Buy (before Geek Squad), a barista at Starbucks and a post PA for the Academy Awards to assistant editor, editor, offline editor and online editor — I’ve seen the power of being amenable.

I am also innately a very organized person, both at work and at home, digitally and in real life — sometimes to my wife’s dismay. I also constantly repeat this mantra to my kids: “If you’re not early, you’re late.”

But sometimes I need to be reminded that it’s OK to be late, and it’s OK not to do things the technically “correct” way. The same applies to work. Someone might have a different way of doing something that’s slower than the way I’d do it, but that doesn’t mean that person is wrong. Avoiding confrontation is the best way to go. Sure, go ahead and transcode inside of Adobe Premiere Pro instead of batch transcoding in Media Encoder. If the outcome is the same and it helps avoid a fight, just let it slide. You might also learn something new by taking a back seat.

Sometimes Being Technically Perfect Isn’t Perfect
I often feel like I have a few obsessive traits: leaving on time, having a tidy desktop and doing things (I feel) correctly. One of the biggest examples is when color correcting. It is true that scopes don’t lie; they give you the honest truth. But when I hear about colorists bragging that they turn off the monitors and color using only Tektronix Double Diamond displays, waveforms and vectorscopes — my skeptical hippo eyes come out. (Google it; it’s a thing).

While scopes might get you to a technically acceptable spot in color correction, you need to have an objective view from a properly calibrated monitor. Sometimes an image with perfectly white whites and perfectly black blacks is not the most visually pleasing image. I constantly need to remind myself to take a step back and really blend the technical with the creative. That is, I sit back and imagine myself as the wide-eyed 16-year-old in the movie theater being blown away and intrigued by American Beauty.

You shouldn’t do things just because you think that is how they should be done. Take a step back and ask yourself if you, your wife, brother, uncle, mom, dad, or whoever else might like it.

Obviously, being technically correct is vital when creating things like deliverables, and that is where there might be less objectivity, but I think you understand my point. Remember to add a little objectivity into your life.

Live for Yourself, Practice and Repeat
While I constantly tell people to watch tutorials on YouTube and places like MZed.com, you also need to physically practice your craft. This idea becomes obvious when working in technically creative positions like editing.

I love watching tutorials on lighting and photography since so much can be translated over to editing and color correcting. Understanding relationships between light and motion can help guide scenes. But if all you do is watch someone tell you how light works, you might not really be absorbing the information. Putting into practice the concepts you learn is a basic but vital skill that is easy to forget. Don’t just watch other people live life, live it for yourself.

For example, a lot of people don’t understand trimming and re-linking in Media Composer. They hear about it but don’t really use these skills to their fullest unless they actively work them out. Same goes for people wanting to use a Wacom tablet instead of a mouse. It took me two weeks of putting my mouse in the closet to even get started on the Wacom tablet, but in the end, it is one of those things I can’t live without. But I had to make the choice to try it for myself and practice, practice, practice to know it.

If you dabble and watch videos on a Wacom tablet, using it once might turn you off. Using trimming once might not convince you it is great. Using roles in FCPX once might not convince you that it is necessary. Putting those skills into practice is how you will live editing life for yourself and discover what is important to you … instead of relying on other people to tell you what’s important.

Put Your Best Foot Forward
This bit of advice came to me from a senior editor on my first real professional editing job after being an assistant editor. I had submitted a rough cut and — in a very kind manner — the editor told me that it wasn’t close to ready for a rough cut title. Then we went through how I could get there. In the end, I essentially needed to spend a lot more time polishing the audio, checking for lip flap, polishing transitions and much more. Not just any time, but focused time.

Check your edit from a 30,000-foot view for things like story and continuity, but also those 10-foot view things like audio pops and interviews that sound like they are all from one take. Do all your music cues sting on the right beat? Is all your music panned for stereo and your dialogue all center-panned to cut up the middle?

These are things that take time to learn, but once you get it in your head, it will be impossible to forget … if you really want to be a good editor. Some might read this and say, “If you don’t know these workflows, you shouldn’t be an editor.” Not true! Everyone starts somewhere, but regardless of what career stage you’re in, always put your best foot forward.

Trust Your Instincts
I have always had confidence in my instincts, and I have my parents to thank for that. But I have noticed that a lot of up-and-coming production and post workers don’t want to make decisions. They also are very unsure if they should trust their first instinct. In my experience, your first instinct is usually your best instinct. Especially when editing.

Obviously there are exceptions to this rule, but generally I rely heavily on my instincts even when others might not agree. Take this with a grain of salt, but also throw that salt away and dive head first!

This notion really came to a head for me when I was designing show titles in After Effects. The producers really encouraged going above and beyond when making opening titles of a show I worked on. I decided to go into After Effects instead of staying inside of the NLE. I used crazy compositing options that I didn’t often use, tried different light leaks, inverted mattes … everything. Once I started to like something, I would jump in head first and go all the way. Usually that worked out, but even if it didn’t, everyone could see the quality of work I was putting out, and that was mainly because I trusted my instincts.

Delete and Start Over
When you are done trusting your instincts and your project just isn’t hitting home — maybe the story doesn’t quite connect, the HUD you are designing just doesn’t quite punch or the music you chose for a scene is very distracting — throw it all away and start over. One of the biggest skills I have acquired in my career thus far seems to be the ability to throw a project away and start over.

Typically, scenes can go on and on with notes, but if you’re getting nowhere, it might be time to start over if you can. Not only will you have a fresh perspective, but you will have a more intimate knowledge of the content than you had the first time you started your edit — which might lead to an alternate pathway into your story.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Editing for Features and Docs

By Karen Moltenbrey

When editing a feature film, the cutter can often feel as if he or she is assembling a puzzle, putting together a plethora of pieces, from the acting, to the lighting, to the production design, and turning those raw elements into a cohesive, comprehensive story. With so much material to sort through, so many shots to select from and so many choices to be made overall, the final cut indeed is reflective of these many choices made by the editor over a significant period of time.

Here, we examine the unique workflow of two editors, one who worked on a drama with an up-and-coming director, and another who cut a documentary with a director who is very well known throughout the film world.

Clemency
The feature film Clemency will be released at the end of the year, but already it commanded attention at the 2019 Sundance Film Festival and beyond, taking home the Grand Jury Prize. The drama, directed by Chinonye Chukwu, stars Alfre Woodard as prison warden Bernadine Williams, who’s preparing for the execution of another inmate and struggling with the emotional toll that task has taken on her. The film was edited by Phyllis Housen.

As is often the case, while cutting Clemency, the editing style evolved organically from the story being told. “That happens all the time, unless it’s a very specific genre piece, like a thriller or horror film,” Housen says. For this film, she describes the style as “deliberate.” “There is no racing through the day. We feel the time pass. In prison, time stretches and changes, and we wanted to recreate that feeling of time. Often you don’t know if it’s night or day,” she says.

Also, the director wanted the film to feel as though the audience was there, living inside the prison. “So, there is a lot of repetition. We visit and revisit some of the routines of daily life like the prisoners do,” Housen adds.

Phyllis Housen

According to Housen, the film looks at what it is like for the warden to have a day-to-day relationship with the inmates on the ward as well as what happens when the warden, ultimately and occasionally, has to perform an execution. “By creating the routine of daily life, we would get to know how the prisoners live and experience that through them,” she explains.

The focus in the film turns to one prisoner in particular, Anthony Woods, a convicted felon on death row. “You get attached to these people, and to him,” Housen points out.

For instance, there is a good deal of walking in the movie, as the camera follows the warden while she sets out from her office and through the prison hallways, passing prisoners all the way to the death row ward, which is separated from everything and everyone — even the general population of prisoners. “You feel that length and distance as she is walking. You get a sense of how far away — literally and figuratively — the death row prisoners are,” Housen explains.

Housen (Cargo, the I, Witness TV documentary series and much more) cut the film at Tunnel in Santa Monica, California, using Adobe Premiere Pro, after having migrated from Apple Final Cut Pro years back. She says she finds the Adobe platform very intuitive.

In terms of her workflow, Housen believes it is fairly consistent across all projects. She receives dailies that are transcoded from raw footage, an assistant organizes all the footage for her, then she starts putting scenes together, maybe one or two days after principal photography begins so there is some footage to work with. “I am thinking of the footage as if I am building a house, with the scenes as the bricks,” she says. “So, I might get footage from, say, scenes 12, 84 and 105 on the first day, and I start lightly sketching those scenes out. I watch the dailies to get a feeling of what the scenes might eventually become. It takes longer than it sounds! And then the next day, four more scenes might come in, and the day after, seven. You’re always getting a bit behind the eight ball during dailies, but you sketch as best you can.”

Once Housen begins building out the scenes, she starts creating what she calls “reels,” an assembly of the film in 20-minute segments — a habit from the days of working with film in the predigital age. “Once you create your reels, then you end up, when they are done shooting, with a rather long, but not paced, assembly of the film that serves as a blueprint,” she says. “When post begins, we roll up our sleeves and start at Scene 1 and dig in.”

Housen finds that on an independent film, there isn’t a lot of time to interact with the director while the film is being shot, and this held true for Clemency; but once they got to the cutting room, “we were there every day together, all day, attached at the hip,” she says.

Clemency was the first time Housen had the opportunity to work with Chukwu. “She’s a very bold director, and there are some very bold choices in this film,” Housen says. She points out that some liberties were taken in terms of pacing and editing style, especially toward the end of the film, which she believes really pays off. However, Housen stops short of revealing too much about the scenes prior to the film’s release.

“It is a very heavy film, a difficult film,” Housen continues. “It’s thought-provoking. For such a heavy movie, we had a light time making it. We laughed a lot and enjoyed the process very much. I was just so pleased with [Chukwu’s] vision. She is a young and up-and-coming filmmaker. I think we are going to continue hearing a lot about her.”

Pavarotti
In the documentary Pavarotti, in limited release starting June 7, director Ron Howard tells the story of the opera legend Luciano Pavarotti through an assemblage of unique footage, concert performances and interviews. The film was edited by Paul Crowder, ACE.

Director Ron Howard and editor Paul Crowder take a selfie at the Pavarotti premiere.

As Crowder notes, it seemed logical to approach the story of Pavarotti in the format of an opera. His art and his life lent themselves to a natural three-act arc: The tenor starts his career as an opera singer and becomes successful. Then there is a period of self-doubt, followed by the meteoric success of The Three Tenors, his philanthropic period and then his marriage to a much younger woman. “You have these dramatic moments in his life story like you do in an opera, and we thought if we could use Pavarotti’s music and operas, that would tell the story, so we gave the documentary an operatic feel,” he says.

A musician himself, Crowder well understood this unique dimension to the documentary. Still, approaching the film in this way required careful navigation in the editing suite. “It’s not like editing pop music or something like that. You can’t just drop in and out of arias. They don’t come in four-bar sections or a middle-eight section,” he points out. “They’re all self-contained. Each section is its own thing. You have to select the moments when you can get in and out of them. Once you commit to them, you have to really commit to a degree, and it all becomes part of the style and approach of the editing.”

Crowder edited the film on an Avid system at his home studio. “I am an Avid Media Composer guy and will be to the day I die,” he says. “I was brought up on Avid, and that will always be my go-to choice.”

The biggest editing challenge on the documentary was dealing with the large mix of media and formats, as the film integrates footage from past concerts and interviews that took place all over the world at different points in time. In all, music was pulled from 22 different operas – not opera pieces, but different operas themselves. The footage was digitized in Avid using native frame rates “because Avid is so adept at dealing with multiple frame rates on a single timeline,” he says, noting that his assistant, Sierra Neal, was instrumental in keeping all the various media in check.

Nevertheless, dealing with various frame rate issues in the online was tough. Everyone has a way to do it, Crowder says, but “there is no definitive excellent way to go from standard def to HD.”

The mixed frame rates and formats also made it difficult to spot flaws in the imported footage: The overall transfer might look good, but there might be a frame or two that did not transfer well. “We kept spotting them throughout the online in the same clips we had already fixed, but then we’d find another flaw that we hadn’t seen,” Crowder says.

The film was built in pieces. The first section Crowder and writer Mark Monroe built pertained to Pavarotti’s children. “It was a leaping-off point for the film. We knew the girls were going to be in it and they would have something fun to say,” he says. He then worked forward from that point.

Crowder credits the research team at Pavarotti production company White Horse Pictures with assembling the tremendous amount of research and documentation for the film, organizing the various content and clips in a way that made it easier for him and Neal to locate those with the best potential for particular scenes. “Still, it was essential to really look closely at everything and know where it was,” he adds. “Otherwise, you don’t know what you might miss.”

In fact, Crowder has something he calls his “hip pocket,” interesting material that hasn’t been placed yet. “It’s just a bin that contains material when I need something strong,” he says. On this project, footage of Pavarotti’s trip to the Amazon in 1995 is in that bin. And, they found an ideal place for it in the opening of the film.

“The film always talks to you, and sometimes you can’t find something you’re looking for but it’s staring you right in the face,” Crowder says. In this case, it was the Amazon footage. “It was vital that we hear Luciano’s voice at the opening of the film. If you don’t hear him sing, then there’s no point because it’s all about his voice. And everything we are going to tell you from that point on comes off the back of his voice.”

While he has worked on series as well as other kinds of projects, Crowder prefers films — documentaries, in particular. Series work, he says, can become a little too “factory-esque” for his taste: when there is a deadline and you know what worked before, it can become easy to fall into a rhythm and lose creative drive. “With a film, you lead audiences on an emotional journey, but you can take it to completion in one sitting, and not drag it out week after week,” Crowder says.

And, how did the editor feel working alongside famed director Ron Howard? Crowder says it was a fantastic experience, calling Howard very decisive and knowledgeable. “A wonderful and generous person to learn from. It was a great working relationship where we could discuss ideas honestly.”

Another bonus about this project: Crowder’s mom was an amateur opera singer, a soprano, while his grandfather was an amateur tenor. “I wanted to work on the film for my mom. She passed away, unfortunately, but she would have loved it.” Surely others will as well.


Karen Moltenbrey is a veteran writer/editor covering VFX and post production.

Editing for Episodics

By Karen Moltenbrey

Consistency is important when editing series. Initially, the editor may collaborate with the director and DP on the style of the show, and once it is set, the focus is on supporting that direction, reflecting the essence and feel of the show and the performance of the characters.

However, not every series is the same, and editors adapt their editing pattern and pacing based on the project’s genre. For instance, the pacing of a comedy should elevate the punchline, not detract from it, whereas with a drama, choosing the best performance that fits the story is paramount. Additionally, in a drama the use of music and sound design can heighten the emotion or tension far more than in, say, a comedy.

Here we look at two very different series — a comedy and a drama — and examine how the editors approached the cut on their respective shows.

Insecure
Life is complicated, especially for the main characters of HBO’s Insecure, which focuses on two African American women as they navigate modern-day life in Los Angeles. Best friends since college and now in their late 20s, they are trying to find their footing both personally and professionally, with Issa Dee working at a nonprofit school and living with her longtime boyfriend, and Molly Carter finding success as a lawyer but less so in her dating life.

The comedy has been renewed for a fourth season, which will be released sometime in 2019. The series debuted in 2016 and was created by Issa Rae — who plays the main character, Issa Dee — and Larry Wilmore. A number of people have directed and served as DP over its run, and there have been four editors, including Nena Erb, ACE, who came aboard during Season 3.

“The series is built around [Issa’s and Molly’s] experiences as they try to find their place in the world. When I approach a scene, I do so from their point of view,” says Erb. “South LA is also portrayed as a character in the series; we do our best to incorporate shots of the various neighborhoods in each episode so viewers get a flavor of the city.”

According to Erb, the composition for the series is cinematic and unconventional compared with the typical television series. “The editing pattern is also not the typical start with a master, go to medium shots, close-up and so forth,” she says. “Having unique composition and coming up with interesting ways to transition in and out of a scene give this series a distinct visual style that’s unlike other television shows out right now.”

Nena Erb

Scenes wherein Issa is the focus are shot mostly handheld. The shots have more movement and convey a sense of uncertainty and flux, which is in keeping with the character, who is trying to find herself when it comes to her career. On the other hand, Molly’s scenes are typically locked-off to convey steadiness, as she is a little more settled in her career as an attorney. For example, in “Fresh-Like” (Season 3 Episode 4), Molly has a difficult time establishing herself after taking a job at a new law firm, and things are not going as smoothly as she had hoped. When she discusses her frustrations with her therapist, the scene was shot with locked-off cameras since it focuses on Molly, but camera moves were then added in the edit to give it a handheld look to convey she was on unsteady ground at that moment.

Erb edits the series on an Avid Media Composer, and temp effects are done in Adobe Photoshop and After Effects.

Erb’s workflow for Insecure is similar to other series she has edited. She reads the script a few times, and before starting dailies, will re-read the scene she is working on that day, paying particular attention to the screen direction. “That is extremely helpful in letting me know the tone of the scene. I like having that fresh in my mind when I watch the dailies,” says Erb. She also reviews all the circle as well as non-circle takes — a step that is time-consuming but ensures she is using all the best performances. “And sometimes there are hidden gems in the non-circle takes that make all the difference, so I feel it’s worth the time to watch them all,” she adds.

While watching the dailies, Erb often jots down notes while cutting it in her head. Then she sits down and starts putting the scene together in the actual edit.

When Erb signed on to do the series, the style and tone were already established, and the crew had been together since the beginning. “It’s never easy to come into a show like that,” she says. “I was the new kid on the block who had to figure out team dynamics in addition to learning the style of the show. My biggest challenge was to make sure my work was in the language of the series, while still maintaining my own sense of style.”

As social media has become a big part of everyone’s life, it is now turning up in series such as Insecure, where it has become a recurring character — although in the episode titled “Obsessed-Like,” it is much more than that. As Erb explains, Insecure uses social media graphics as elements that play on the screen next to the person texting or tweeting. But in that episode, the editor wanted the audience alongside Issa as she checks on her new love interest, Nathan, so she used social media graphics in a completely different way than had been done previously on the show.

“I told my assistant editor, Lynarion Hubbard, that I wanted her to create all these graphics in a way that they could be edited as if they were dailies. Doing so enabled me to use them full-screen, and I could animate them so they could crash-zoom into the shot of this woman kissing Nathan and then tilt down to the caption, which is when you realize the woman is his mom, as she delivers the punchline, ‘Oh, it’s your mom. She looks young for 50,’” says Erb.

“I felt the graphics made me more invested and allowed me to experience the emotional roller coaster with Issa as she obsesses over being ghosted. It was a risk to use them that way because it wasn’t in the language of the show. Fortunately for me, the producers loved it, and that episode was nominated for an ACE Eddie Award earlier this year.”

Erb might be new to Insecure, but she feels a personal connection to the series: When she and her family first immigrated to the US, they settled in Ladera Heights, and she attended school in Inglewood. “I remember this awkward girl who didn’t speak a word of English, and yet the neighbors welcomed us with open arms,” she recalls. “That community will always be special to me. The series pokes fun at Ladera Heights, but I think it’s great that they are highlighting a part of South LA that was my first connection in the US.”

Erb predominantly edits television series, but she has also edited feature films and documentaries. “I’d say I am drawn to powerful stories and captivating characters rather than a genre or format. Performance is paramount. Everything is in service of the story and the characters, regardless of whether it’s a series or a film,” she states.

On a series, “it’s a sprint to the finish, especially if it’s a series that has started airing while you’re still shooting and editing the later episodes. You’ll have anywhere from one to three days after the last day of dailies to do your editor’s cut, and then it’s off to the director, producers, the studio and so forth,” Erb explains. Conversely, with the features she has done, the schedule has offered more wiggle room – more time to do the editor’s cut and more time for the directors’ involvement. “And you have the luxury to experiment and sit with the cut to make sure it is working.”

In addition to Insecure, Erb has worked on Crazy Ex-Girlfriend, Being Mary Jane and Project Greenlight, to name a few. And each has its own recipe. For instance, Crazy Ex has music videos in each episode that run the gamut from the ’50s to present day, from a Fosse-inspired number to ’80s rock, ’90s hip-hop and three decades of the Beach Boys. “In an industry where it is easy to get pigeonholed, being able to work on a show that allows you to challenge yourself with different genres is rare, and I loved the experience.”

Ozark
At first glance, the Ozarks seem to be a tranquil place, a wholesome, idyllic location to raise a family. But, looks can be deceiving, especially when it comes to the Netflix family crime drama Ozark, which will be starting its third season sometime this year.

The series follows financial planner Marty Byrde, who relocates with his family from Chicago to the summer resort area of Osage Beach, Missouri, in the Ozark Mountains. The move is not voluntary. To make amends for a scheme that went awry, he must launder millions of dollars belonging to a Mexican drug cartel through the Ozarks. Soon he becomes entangled with local criminals.

Jason Bateman, who plays Marty, directed some of the episodes in Seasons 1 and 2, with other directors filling that role as well. Editing the series since it began is Cindy Mollo, ACE, and Heather Goodwin Floyd, who have a longtime working relationship. Goodwin Floyd, who was Mollo’s assistant editor for many years, started on both seasons of Ozark in the assistant role but also edited and co-edited episodes in each season.

Cindy Mollo

When Mollo first met with Bateman to talk about the project, they discussed the series as being akin to a 10-hour feature. “He wanted to spend time in moments, giving time to the performances, and not be too ‘cutty’ or too manipulative,” she says. “There’s a tendency with someone like Bateman to always be looking for the comedy and to cut for comedy, but ours is a dramatic show where sometimes things just happen to be funny; we don’t cut for that.”

The show has a naturalistic feel, and many scenes are shot outdoors, but there is always a lingering sense of threat, played up with heavy shadows. The look, like the humor, is dark, in a figurative and literal way. And the editors play into the suspense. “By letting moments play out, it helps put you in the head of the character, figuring things out as you go along. So, you’re not ever letting the audience get ahead of the character by showing them something that the character doesn’t see,” explains Mollo. “There’s a little bit of a remoteness in that, so you’re not really spoon-feeding the audience.”

On Ozark, the editors make sure they do not get in the way of the material. The writing is so solid, says Mollo, and the performances are so good, “the challenge is to resist the temptation to impose too much on the material and to just achieve the goals of the scene. Doing things simply and elegantly, that is how I approach this series.”

Goodwin Floyd agrees. “We support the material and let it speak for itself, and tell the story in the most authentic way possible,” she adds.

The series is set in the Ozarks but is filmed outside Atlanta, where the dailies are processed before they are sent to editorial. Assistants pull all the media into a Media Composer, where the cut is done.

Heather Goodwin Floyd

According to Mollo, she and Goodwin Floyd have four days to work on their cut. Then the directors have four days per episode to work with them. “We’re cross-boarded, so that ends up being eight days with the director for two episodes, for the most part,” she says. After that, the producers are brought in, and as Mollo points out, Bateman is very involved in the edit. Once the producers sign off, the final cut is sent to producer Media Rights Capital (MRC) and Netflix.

The first two seasons of Ozark were shot at 4K; this season, it is shot at nearly 6K, though delivery to Netflix is still at 4K.

Both editors have a range of experience in terms of genres. Goodwin Floyd started out in features and now primarily edits TV dramas. Mollo got her start in commercials and then migrated to dramatic series, with some TV movies and features, as well. “I love the mix. Honestly, I love doing both [series and films]. I have fun when I’m on a series, and then it seems like every two years or so I get to do a feature. With everyone editing digitally, the feature process has become very similar to the television process,” she says. “It’s just a little more director-focused rather than producer/writer-focused.”

For Goodwin Floyd, she’s drawn more to the content as opposed to the format. “I started in features and at the time thought I wanted to stay in features, but the quality of series on television has evolved and gotten so great that I love working in TV as much as in features,” she says.

With the rise of cable, then premium movie channels and now streaming services, Mollo says there is a feeling that the material can be trusted more, that there is no longer the need to feel like you have to be cutting every couple of seconds to keep the audience excited and engaged. For instance, when she worked on House of Cards, the MRC and Netflix executives were very hands-off — they wanted to have a fantastic opening episode every season and a really compelling cliffhanger, and for everything in between, they trusted the filmmakers to take care of it.

“I really gravitated toward that trend of trusting the filmmakers, and it is resulting in some really great television,” says Mollo.

Inasmuch as we are in a golden age of television, Mollo also believes we are in a golden age of editing, where people understand more of what an editor does and appreciate the results more. Editing is basically a final rewrite of the script, she says. “You’re the last line of defense; sometimes you need to guide the story back to its original direction [if it veers off course].”


Karen Moltenbrey is a veteran writer/editor covering VFX and post production.

Bipolar Studio gives flight to Uber Air campaign

Tech company Uber has announced its latest transportation offering — an aerial ride-sharing service. With plans to begin service within cities as soon as 2023, Uber Elevate launched its marketing campaign today at the Uber Elevate Summit. Uber Elevate picked LA-based creative and production boutique Bipolar Studio to create the integrated campaign, which includes a centerpiece film, an experiential VR installation, print stills and social content.

The campaign’s centerpiece film, Airborne, includes stunning imagery that is 100 percent CGI. Beginning on an aerial mass transit platform at Mission Bay HQ, the flight travels across San Francisco, above landmarks like the arena where the Warriors play and the Transamerica Tower. Lights blink on the ground below and buildings appear as silhouettes in the far background. The Uber Air flight lands in Santa Clara on Uber’s skytower with a total travel time of 18 minutes — compared to an hour or more driving through rush-hour traffic. Multi-floor docking will allow Uber Air to land up to 1,000 eVTOLs (those futuristic-looking vehicles that hover, take off and land vertically) per hour.

At the Uber Elevate Summit, attendees had the opportunity to experience a full flight inside a built-out physical cabin via a high-fidelity four-minute VR installation. After the Summit, the installation will travel to Uber events globally. Still images and social media content will further extend the campaign’s reach.

Uber Elevate’s head of design, John Badalamenti, explains, “We worked shoulder-to-shoulder with Bipolar Studio to create an entirely photoreal VR flight experience, detailed at a high level of accuracy from the physics of flight and advanced flight patterns, down to the dust on the windows. This work represents a powerful milestone in communicating our vision through immersive storytelling and creates a foundation for design iteration that advances our perspective on the rider experience. Bipolar took things a step beyond that as well, creating Airborne, our centerpiece 2D film, enabling future Uber Air passengers to take in the breadth and novelty of the journey outside the cabin from the perspective of the skyways.”

Bipolar developed a bespoke AI-fueled pipeline that could capture, manage and process miles and miles of actual data, then faithfully mirror the real terrain, buildings, traffic and scattered people in cities. They then reused the all-digital assets, which gave them full freedom to digitally scout the city for “Airborne” locations. When shooting the spot, as with live-action production, they could place the CG camera anywhere in the digital city to capture the aircraft. This gave the team a lot of room to play.

For the animation work, they built a new system through Side Effects Houdini where the flight of the vehicle wasn’t animated but rather simulated with real physics. The team coded a custom plugin to be able to punch in a number for the speed of the aircraft, its weight, and direction, then have AI do everything else. This allowed them to see it turn on the flight path, respond to wind turbulence and even oscillate when taking off. It also allowed them to easily iterate, change variables and get precise dynamics. They could then watch the simulations play out and see everything in realtime.
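Bipolar’s actual plugin is proprietary Houdini code, so purely as an illustration of the idea — simulating a flight path from speed, weight and direction rather than keyframing it, with wind turbulence layered on — a point-mass sketch in Python might look like the following. Every parameter name and constant here is a hypothetical stand-in, not Bipolar’s:

```python
import random

def simulate_flight(mass_kg, cruise_speed, heading, duration_s,
                    dt=0.1, wind_gust=0.5, seed=42):
    """Integrate a point-mass 'aircraft' toward a commanded cruise velocity.

    Proportional thrust steers the velocity toward cruise_speed along
    `heading`, while a random per-step gust force adds turbulence, so the
    path oscillates slightly instead of following a clean keyframed curve.
    """
    rng = random.Random(seed)
    pos, vel = [0.0, 0.0], [0.0, 0.0]   # meters, m/s (2D for brevity)
    path = [tuple(pos)]
    for _ in range(round(duration_s / dt)):
        target = [cruise_speed * heading[0], cruise_speed * heading[1]]
        # Thrust force proportional to the velocity error.
        thrust = [(target[i] - vel[i]) * mass_kg * 0.5 for i in range(2)]
        # Turbulence: a random force in newtons; since the force is
        # independent of mass, a heavier craft is deflected less.
        gust = [rng.uniform(-wind_gust, wind_gust) * 1000.0 for _ in range(2)]
        for i in range(2):
            accel = (thrust[i] + gust[i]) / mass_kg
            vel[i] += accel * dt
            pos[i] += vel[i] * dt
        path.append(tuple(pos))
    return path

# Hypothetical numbers: a 1,800 kg craft cruising at 67 m/s for one minute.
path = simulate_flight(mass_kg=1800.0, cruise_speed=67.0,
                       heading=(1.0, 0.0), duration_s=60.0)
```

With this setup the craft accelerates to cruise and drifts only slightly off its heading; changing `mass_kg` or `wind_gust` immediately changes the dynamics, which mirrors the “change a variable, watch the sim play out” iteration the team describes.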

City Buildings
To bring this to life, Bipolar had to entirely digitize San Francisco. They spent a lot of time creating a pipeline and built the entire city with miles and miles of actual data that matched the terrain and buildings precisely. They then detailed the buildings and used AI to generate moving traffic — and even people, if you can spot them — to fill the streets. Some of the areas required a LIDAR scan for rebuilding. The end result is an incredibly detailed digital recreation of San Francisco. Each of the houses is a full model with windows, walls and doors. Each of the lights in the distance is a car. Even Alcatraz is there. They took the same approach to Santa Clara.

Data Management
Bipolar rendered out 32-bit EXRs in 4K, with each frame having multiple layers for maximum control by the client in the comp stage. That meant an enormous volume of data and a huge number of files to deal with. Thankfully, it wasn’t the studio’s first time handling massive amounts of data — its internal infrastructure is already set up to handle a high volume of data being worked on simultaneously. In certain cases, the team was also able to use the SSDs on its servers to render comps and pre-comps faster.
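To get a feel for why multi-layer 32-bit EXRs add up so quickly, a back-of-the-envelope size estimate helps. The layer and channel counts below are illustrative guesses, not Bipolar’s actual AOV setup:

```python
def exr_frame_bytes(width, height, channels, bytes_per_channel=4, layers=1):
    """Uncompressed pixel-data size of one multi-layer float EXR frame.

    Ignores EXR headers and compression (which shrink real files
    considerably) -- this is just the raw 32-bit float payload.
    """
    return width * height * channels * bytes_per_channel * layers

# One UHD "4K" frame (3840x2160), RGBA, with a guessed 8 AOV layers:
frame = exr_frame_bytes(3840, 2160, channels=4, layers=8)  # roughly 1 GiB
per_second = frame * 24                                    # at 24 fps
```

Even before compression, that works out to on the order of a gigabyte per frame and tens of gigabytes per second of footage, which is why SSD-backed servers and infrastructure built for simultaneous heavy I/O matter.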

Behind the Title: Artifex VFX supervisor Rob Geddes

NAME: Rob Geddes

COMPANY: Artifex Studios (@artifexstudios)

CAN YOU DESCRIBE YOUR COMPANY?
Artifex is a small to mid-sized independent VFX studio based in Vancouver, BC. We’ve built up a solid team over the years, with very low staff turnover. We try our best to be an artist-centric shop.

That probably means something different to everyone, but for me it means ensuring that people are being challenged creatively, supported as they grow their skills and encouraged to maintain a healthy work-life balance.

WHAT’S YOUR JOB TITLE?
VFX Supervisor

WHAT DOES THAT ENTAIL?
I guess the simplest explanation is that I have to interpret the needs and requests of our clients, and then provide the necessary context and guidance to our team of artists to bring those requests to life.

Travelers – “Ave Machina” episode

I have to balance the creative and technical challenges of the work, and work within the constraints of budget, schedule and our own studio resources.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
The seemingly infinite number of decisions and compromises that must be made each day, often with incomplete information.

HOW LONG HAVE YOU BEEN WORKING IN VFX?
I started out back in 2000 as a 3D generalist. My first job was building out environments in 3ds Max for a children’s animated series. I spent some years providing 3D assets, animation and programming to various military and private sector training simulations. Eventually, I made the switch over to the 2D side of things and started building up my roto, paint and compositing skills. This led me to Vancouver, and then to Artifex.

HOW HAS THE VFX INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING? 
The biggest change I have seen over the years is the growth in demand for content. All of the various content portals and streaming services have created this massive appetite for new stories. This has brought new opportunities for vendors and artists, but it’s not without challenges. The quality bar is always being raised, and the push to 4K for broadcast puts a lot of pressure on pipelines and infrastructure.

WHY DO YOU LIKE BEING ON SET FOR SHOTS? WHAT ARE THE BENEFITS?
As the in-house VFX supervisor for Artifex, I don’t end up on set — though there have been projects for which we were brought in prior to shooting and could help drive the creative side of the VFX in support of the storytelling. There’s really no substitute for getting all of the context behind what was shot in order to help inform the finished product.

DID A PARTICULAR FILM INSPIRE YOU ALONG THIS PATH IN ENTERTAINMENT?
When I was younger, I always assumed I would end up in classical animation. I devoured all of the Disney classics (Beauty and the Beast, The Lion King, etc.). Jurassic Park was a huge eye-opener, though, and seeing The Matrix for the first time made it seem like anything was possible in VFX at that point.

DID YOU GO TO FILM SCHOOL?
Not film school specifically. Out of high school I still wasn’t certain of the path I wanted to take. I went to university first and ended up with a degree in math and computing science. By the time I left university I was convinced that animation and VFX were what I wanted. I worked through two diploma programs in 3D modeling, animation and film production.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The best part of the job for me is seeing the evolution of a shot, as a group of artists come together to solve all of the creative and technical challenges.

WHAT’S YOUR LEAST FAVORITE?
Realizing the limits of what can be accomplished on any given day and then choosing what has to be deferred.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
That’s a tough one. When I wasn’t working in VFX, I was working toward it. I’m obsessed with video game development, and I like to write, so maybe in an alternate timeline I’d be doing something like that.

Zoo

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
This past year has been a pretty busy one. We’ve been on Travelers and The Order for Netflix, The Son for AMC, Project Blue Book for A&E, Kim Possible for Disney, Weird City for YouTube, and a couple of indie features for good measure!

WHAT IS THE PROJECT/S THAT YOU ARE MOST PROUD OF?
I’m a big fan of our work on Project Blue Book. It was an interesting challenge to contribute to a project with historical significance and I think our team really rose to the occasion.

WHAT TOOLS DO YOU USE DAY TO DAY?
At Artifex we run our shows through ftrack for reviews and management, so I spend a lot of time in the browser keeping tabs on things. For daily communication we use Slack and email. I use Google Docs for organizational stuff. I pop into Foundry Nuke to test out some things or to work with an artist. I use Photoshop or Affinity Photo on the iPad to do draw-overs and give notes.

WHERE DO YOU FIND INSPIRATION NOW?
It’s such an incredible time to be a visual artist. I try to keep an eye on work getting posted from around the world on sites like ArtStation and Instagram. Current films, but also any other visual mediums like graphic novels, video games, photography, etc. Great ideas can come from anywhere.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I play a lot of video games, drink a lot of tea, and hang out with my daughter.

Envoi’s end-to-end cloud solution for data migration, post 

Envoi launched a cloud-based data migration and post production workflow solution at the AWS M&E Symposium on June 18 in Los Angeles. Enabled by Cantemo and Teradici PCoIP technology, Envoi is offering this as a media “production-to-payment” platform.

Envoi is a cloud-based content management, distribution and monetization platform, giving broadcasters and video content providers a complete secure end-to-end video management and distribution system. Available on Amazon Web Services (AWS) Marketplace, Envoi is designed to provide simple and efficient data migration to the cloud and between services in the workflow.

Thanks to a partnership with Cantemo, Envoi has been integrated with Cantemo’s media asset management solution, Cantemo Portal and its cloud video management hub, Cantemo Iconik. Iconik makes it easy to collaborate on media, regardless of geographic location. Advanced Artificial Intelligence simplifies content discovery by improving metadata collection. By combining Envoi with Cantemo Portal, media companies of virtually all sizes can now monetize their video libraries within 48 hours.

Envoi also enables post in the cloud with the integration of Iconik with Teradici-enabled workstations on AWS. These workstations are configured to support a wide range of editing and post production tools. By supporting the entire post process on AWS, Envoi says it is providing a solution that increases the security, performance and collaboration potential within the creative process. Delivering the solution through AWS Marketplace simplifies procurement, delivery and deployment for Envoi’s customers.

 

SIGGRAPH making-of sessions: Toy Story 4, GoT, more

The SIGGRAPH 2019 Production Sessions program offers attendees a behind-the-scenes look at the making of some of the year’s most impressive VFX films, shows, games and VR projects. The 11 production sessions will be held throughout the conference week of July 28 through August 1 at the Los Angeles Convention Center.

With 11 total sessions, attendees will hear from creators who worked on projects such as Toy Story 4, Game of Thrones, The Lion King and First Man.

Other highlights include:

Swing Into Another Dimension: The Making of Spider-Man: Into the Spider-Verse
This production session will explore the art and innovation behind the creation of the Academy Award-winning Spider-Man: Into the Spider-Verse. The filmmaking team behind the first-ever animated Spider-Man feature film took significant risks to develop an all-new visual style inspired by the graphic look of comic books.

Creating the Immersive World of BioWare’s Anthem
The savage world of Anthem is volatile, lush, expansive and full of unexpected characters. Bringing these aspects to life in a realtime, interactive environment presented a wealth of problems for BioWare’s technical artists and rendering engineers. This retrospective panel will highlight the team’s work, alongside reflections on innovation and the successes and challenges of creating a new IP.

The VFX of Netflix Series
From the tragic tales of orphans to a joint force of super siblings to sinister forces threatening 1980s Indiana, the VFX teams on Netflix series have delivered some of the year’s best visuals. Creatives behind A Series of Unfortunate Events, The Umbrella Academy and Stranger Things will present the work and techniques that brought these worlds and characters into being.

The Making of Marvel Studios’ Avengers: Endgame
The fourth installment in the Avengers saga is the culmination of 22 interconnected films and has drawn audiences to witness the turning point of this epic journey. SIGGRAPH 2019 keynote speaker Victoria Alonso will join Marvel Studios, Digital Domain, ILM and Weta Digital as they discuss how the diverse collection of heroes, environments, and visual effects were assembled into this ultimate, climactic final chapter.

Space Explorers — Filming VR in Microgravity
Felix & Paul Studios, along with collaborators from NASA and the ISS National Lab, share insights from one of the most ambitious VR projects ever undertaken. In this session, the team will discuss the background of how this partnership came to be before diving into the technical challenges of capturing cinematic virtual reality on the ISS.

Production Sessions are open to conference participants with Select Conference, Full Conference or Full Conference Platinum registrations. The Production Gallery can be accessed with an Experiences badge and above.

London’s Media Production Show: technology for content creation

By Mel Lambert

The fourth annual Media Production Show, held June 11-12 at Olympia West, London, once again attracted a wide cross section of European production, broadcast, post and media-distribution pros. According to its organizers, the two-day confab drew 5,300 attendees and “showcased the technology and creativity behind content creation,” focusing on state-of-the-art products and services. The full program of standing-room-only discussion seminars covered a number of contemporary topics, while 150-plus exhibitors presented wares from the media industry’s leading brands.

The State of the Nation: Post Production panel.

During a session called “The State of the Nation: Post Production,” Rowan Bray, managing director of Clear Cut Pictures, said that “while [wage and infrastructure] costs are rising, our income is not keeping up.” And with salaries, facility rent and equipment amortization representing 85% of fixed costs, “it leaves little over for investment in new technology and services. In other words, increasing costs are preventing us from embracing new technologies.”

Focusing on the long-term economic health of the UK post industry, Bray pointed out that few post facilities in London’s Soho area are changing hands, which she says “indicates that this is not a healthy sector [for investment].”

“Several years ago, a number of US companies [including Technicolor and Deluxe] invested £100 million [$130 million] in Soho; they are now gone,” stated Ian Dodd, head of post at Dock10.

Some 25 years ago, there were at least 20 leading post facilities in London. “Now we have a handful of high-end shops, a few medium-sized ones and a handful of boutiques,” Dodd concluded. Other panelists included Cara Kotschy, managing director of Fifty Fifty Post Production.

The Women in Sound panel

During his keynote presentation called “How we made Bohemian Rhapsody,” leading production designer Aaron Haye explained how the film’s large stadium concert scenes were staged and supplemented with high-resolution CGI; he is currently working on Charlie’s Angels (2019) with director/actress Elizabeth Banks.

The panel discussion “Women in Sound” brought together a trio of re-recording mixers with divergent secondary capabilities and experience. Participants were Emma Butt, a freelance mixer who also handles sound editorial and ADR recordings; Lucy Mitchell, a freelance sound editor and mixer; plus Kate Davis, head of sound at Directors Cut Films. As the audience discovered, their roles in professional sound differ. While exploring these differences, the panel revealed helpful tips and tricks for succeeding in the post world.


LA-based Mel Lambert is principal of Content Creators. He can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

Review: Red Giant’s VFX Suite plugins

By Brady Betzel

If you have ever watched After Effects tutorials, you are bound to have seen the people who make up Red Giant. There is Aharon Rabinowitz, who you might mistake for a professional voiceover talent; Seth Worley, who can combine a pithy sense of humor and over-the-top creativity seamlessly; and my latest man-crush Daniel Hashimoto, better known as “Action Movie Dad” of Action Movie Kid.

In these videos, these talented pros show off some amazing things they created using Red Giant’s plugin offerings, such as the Trapcode Suite, the Magic Bullet Suite, Universe and others.

Now, Red Giant is trying to improve your visual effects workflow even further with the new VFX Suite for Adobe After Effects (although some work in Adobe Premiere as well).

The new VFX Suite is a compositing-focused toolkit that will complement many aspects of your work, from greenscreen keying to motion graphics compositing with tools such as Video Copilot’s Element 3D. Whether you want to seamlessly composite light and atmospheric fog with fewer pre-composites, easily add a reflection to an object or just use a better greenscreen keyer, the VFX Suite will help.

The VFX Suite includes Supercomp, Primatte Keyer 6, King Pin Tracker, Spot Clone Tracker, Optical Glow, Chromatic Displacement, Knoll Light Factory 3.1, Shadow and Reflection. The suite is priced at $999, or $499 if you qualify for the academic discount.

In this review, I will go over each of the plugins within the VFX Suite. Up first will be Primatte Keyer 6.

Overall, I love Red Giant’s GUIs. They seem to be a little more intuitive, allowing me to work more “creatively” as opposed to spending time figuring out technical issues.

I asked Red Giant what makes VFX Suite so powerful and Rabinowitz, head of marketing for Red Giant and general post production wizard, shared this: “Red Giant has been helping VFX artists solve compositing challenges for over 15 years. For VFX Suite, we looked at those challenges with fresh eyes and built new tools to solve them with new technologies. Most of these tools are built entirely from scratch. In the case of Primatte Keyer, we further enhanced the UI and sped it up dramatically with GPU acceleration. Primatte Keyer 6 becomes even more powerful when you combine the keying results with Supercomp, which quickly turns your keyed footage into beautifully comped footage.”

Primatte Keyer 6
Primatte is a chromakey/single-color keying technology used in tons of movies and television shows. I got familiar with Primatte when BorisFX included it in its Continuum suite of plugins. Once I used Primatte and learned the intricacies of extracting detail from hair and even just using their auto-analyze function, I never looked back. On occasion, Primatte needs a little help from others, like Keylight, but I can usually pull easy and tough keys all within one or two instances of Primatte.

If you haven’t used Primatte before, you essentially pick your key color by drawing a line or rectangle around the color, adjust the detail and opacity of the matte, and — boom — you’re done. With Primatte 6 you now also get Core Matte, a new feature that draws an inside mask automatically while allowing you to refine the edges — this is a real time-saver when doing hundreds of interview greenscreen keys, especially when someone decides to wear a reflective necklace or piece of jewelry that usually requires an extra mask and tracking. Primatte 6 also adds GPU optimization, gaining even more preview and rendering speed than previous versions.
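Primatte’s actual algorithm is proprietary, but the core idea of single-color keying is easy to illustrate: compute each pixel’s color distance from the key color and ramp the alpha over a softness band instead of using a hard cutoff. Here is a minimal sketch (the function name and parameters are invented for illustration, not Primatte’s API):

```python
import numpy as np

def chroma_key_matte(img, key_color, tolerance=0.25, softness=0.1):
    """Build an alpha matte from each pixel's distance to the key color.
    Pixels within `tolerance` of the key go fully transparent; a
    `softness` band ramps alpha up linearly to avoid hard edges."""
    key = np.asarray(key_color, dtype=np.float32)
    dist = np.linalg.norm(img - key, axis=-1)
    return np.clip((dist - tolerance) / max(softness, 1e-6), 0.0, 1.0)

# A 1x2 frame: a pure-green pixel next to a red foreground pixel.
frame = np.array([[[0.0, 1.0, 0.0], [1.0, 0.0, 0.0]]], dtype=np.float32)
matte = chroma_key_matte(frame, key_color=(0.0, 1.0, 0.0))
```

In practice a production keyer also handles spill suppression, edge detail and noise, which is where the real work lies.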

Supercomp
If you are an editor like me — who knows enough to be dangerous when compositing and working within After Effects — sometimes you just want (or need) a simpler interface without having to figure out all the expressions, layer order, effects and compositing modes to get something to look right. And if you are an Avid Media Composer user, you might have encountered the Paint Effect Tool, which is one of those one-for-all plugins. You can paint, sharpen, blur and much more from inside one tool, much like Supercomp. Think of the Supercomp interface as a Colorista or Magic Bullet Looks-type interface, where you can work with composite effects such as fog, glow, lights, matte chokers, edge blend and more inside of one interface with much less pre-composing.

The effects are all GPU-accelerated and are context-aware. Supercomp is a great tool to use with your results from the Primatte Keyer, adding in atmosphere and light wraps quickly and easily inside one plugin instead of multiple.

King Pin Tracker and Spot Clone Tracker
As an online editor, I am often tasked with sign replacements, paint-out of crew or cameras in shots, as well as other clean-ups. If I can’t accomplish what I want with BorisFX Continuum while using Mocha inside of Media Composer or Blackmagic’s DaVinci Resolve, I will jump over to After Effects and try my hand there. I don’t practice as much corner pinning as I would like, so I often forget the intricacies when tracking in Mocha and copying Corner Pin or Transform Data to After Effects. This is where the new King Pin Tracker can ease any difficulties, especially when you are corner pinning relatively simple objects but still need to keyframe positions or perform a planar track without using multiple plugins or applications.
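Under the hood, a corner pin is a homography: a 3x3 projective transform mapping four tracked source corners to four destination corners. A minimal sketch of that solve (illustrative only, not King Pin Tracker’s implementation; the helper name is invented):

```python
import numpy as np

def corner_pin_homography(src_pts, dst_pts):
    """Solve the 3x3 homography mapping four source corners to four
    destination corners -- the math behind a corner-pin warp."""
    A, b = [], []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.asarray(A, float), np.asarray(b, float))
    return np.append(h, 1.0).reshape(3, 3)

# Unit square translated by (2, 3): the solve recovers a pure translation.
src = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
dst = [(2.0, 3.0), (3.0, 3.0), (3.0, 4.0), (2.0, 4.0)]
H = corner_pin_homography(src, dst)
```

The tracker’s job is estimating those four corner positions on every frame; the warp itself comes down to this one matrix.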

The Spot Clone Tracker is exactly what it says it is. Much like Resolve’s Patch Replace, Spot Clone Tracker allows you to track one area while replacing that same area with another area from the screen. In addition, Spot Clone Tracker has options to flip vertical, flip horizontal, add noise, and adjust brightness and color values. For such a seemingly simple tool, the Spot Clone Tracker is the dark horse in this race. You’d be surprised how many clone and paint tools don’t have adjustments, like flipping and flopping or brightness changes. This is a great tool for quick dead-pixel fixes and painting out GoPros when you don’t need to mask anything out. (Although there is an option to “Respect Alpha.”)
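The clone-and-replace operation described above, copying one region over another with an optional flip and brightness tweak, reduces to a few lines of array work. A toy sketch (an invented helper for illustration, not the plugin’s code):

```python
import numpy as np

def clone_patch(img, src_xy, dst_xy, size, flip_h=False, brightness=1.0):
    """Copy a square patch from one spot to another, with an optional
    horizontal flip and brightness scale -- a basic clone/patch op."""
    sx, sy = src_xy
    dx, dy = dst_xy
    patch = img[sy:sy + size, sx:sx + size].copy()
    if flip_h:
        patch = patch[:, ::-1]
    out = img.copy()
    out[dy:dy + size, dx:dx + size] = np.clip(patch * brightness, 0.0, 1.0)
    return out

# Patch a dead region at (2, 2) with the clean area at (0, 0).
frame = np.zeros((4, 4), dtype=np.float32)
frame[0:2, 0:2] = 0.5
fixed = clone_patch(frame, src_xy=(0, 0), dst_xy=(2, 2), size=2)
```

The “tracker” part of the plugin moves the source and destination positions over time; the per-frame operation stays this simple.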

Optical Glow and Knoll Light Factory 3.1
Have you ever been in an editing session that needed police lights amplified or a nice glow on some text but the stock plugins just couldn’t get it right? Optical Glow will solve this problem. In another amazing, simple-yet-powerful Red Giant plugin, Optical Glow can be applied and gamma-adjusted for video, log and linear levels right off the bat.

From there you can pick an inner tint, outer tint and overall glow color via the Colorize tool and set the vibrance. I really love the Falloff, Highlight Rolloff and Highlights Only functions, which allow you to fine-tune the glow and control exactly what it affects. It’s so simple that it is hard to mess up, but the results speak for themselves and render out quicker than with other glow plugins I’ve used.
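Most screen glows follow the same basic recipe: isolate the highlights above a threshold, blur them, then add the result back over the image. A toy version of that idea (a sketch, not Optical Glow’s implementation):

```python
import numpy as np

def box_blur(img, radius):
    """Naive box blur: average each pixel over its (2r+1)^2 neighborhood."""
    k = 2 * radius + 1
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def simple_glow(img, threshold=0.8, radius=1, intensity=0.5):
    """Threshold the highlights, blur them and add them back -- the
    classic glow recipe in its simplest form."""
    highlights = np.where(img > threshold, img, 0.0)
    return np.clip(img + intensity * box_blur(highlights, radius), 0.0, 1.0)

frame = np.zeros((5, 5), dtype=np.float32)
frame[2, 2] = 1.0  # a single hot pixel
glowed = simple_glow(frame)  # the highlight now bleeds into its neighbors
```

What separates a plugin like Optical Glow from this toy is the falloff shaping and the video/log/linear gamma handling mentioned above.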

Knoll Light Factory has been newly GPU-accelerated in Version 3.1 to decrease render times when using its more than 200 presets or when customizing your own lens flares. Optical Glow and Knoll Light Factory really complement each other.

Chromatic Displacement
Since watching an Andrew Kramer tutorial covering displacement, I’ve always wanted to make a video that showed huge seismic blasts but didn’t really want to put the time into properly making chromatic displacement. Lucky for me, Red Giant has introduced Chromatic Displacement! Whether you want to make rain drops appear on the camera lens or add a seismic blast from a phaser, Chromatic Displacement will allow you to offset your background with a glass-, mirror- or even heatwave-like appearance quickly. Simply choose the layer you want to displace from and adjust parameters such as displacement amount, spread and spread chroma, and whether you want to render using the CPU or GPU.
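Conceptually, chromatic displacement is per-channel warping: each of R, G and B is offset by the displacement map with a slightly different strength, which is what produces the color fringing. A minimal horizontal-only sketch (names and defaults are invented for illustration; this is not Red Giant’s code):

```python
import numpy as np

def chromatic_displace(img, disp, amount=1.0, chroma_spread=1.0):
    """Shift each RGB channel horizontally by the displacement map,
    using a slightly larger shift per channel for color fringing."""
    h, w, _ = img.shape
    xs = np.arange(w)[None, :]
    out = np.empty_like(img)
    for c in range(3):
        shift = np.rint(disp * (amount + c * chroma_spread)).astype(int)
        src_x = np.clip(xs - shift, 0, w - 1)
        out[..., c] = np.take_along_axis(img[..., c], src_x, axis=1)
    return out

# A 1x5 ramp whose pixel value equals its column index, uniformly displaced.
ramp = np.repeat(np.arange(5, dtype=np.float32)[None, :, None], 3, axis=2)
disp = np.ones((1, 5))
shifted = chromatic_displace(ramp, disp)
```

A real implementation displaces in both axes with subpixel interpolation, but the per-channel offset is the whole trick.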

Shadow and Reflection
Red Giant packs Shadow and Reflection plugins into the VFX Suite as well. The Shadow plugin not only makes it easy to create shadows in front of or behind an object based on alpha channel or brightness, but, best of all, it gives you an easy way to identify the point where the shadow should bend. The Shadow Bend option lets you identify where the bend exists, what color the bend axis should be, the type and size of the seam, and even allows for motion blur.

The Reflection plugin is very similar to the Shadow plugin and produces quick and awesome reflections without any After Effects wizardry. Just like Shadow, the Reflection plugin allows you to identify a bend. Plus, you can adjust the softness of the reflection quickly and easily.

Summing Up
In the end, Red Giant always delivers great and useful plugins. VFX Suite is no different, and the only downside some might point to is the cost. While $999 is expensive, if compositing is a large portion of your business, the efficiency you gain might outweigh the cost.

Much like Shooter Suite does for online editors, Trapcode Suite does for VFX masters and Universe does for jacks of all trades, VFX Suite will take all of your ideas and help them blend seamlessly into your work.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

LumaFusion mobile filmmaking editing app updated

Luma Touch has updated LumaFusion, its video editing application for iOS. Created by video editing industry veterans Chris Demiris and Terri Morgan, LumaFusion Version 2 introduces new features and a new UI, and effectively doubles the number of audio/video tracks supported to 12 tracks, with six video tracks supporting 4K video in realtime.

The UI now features all-new vector icons that streamline editing, new track headers for locking, hiding and muting all tracks, and a timeline overview that lets users jump to any location in their edit with a single touch.

Keying

Additional updates include:
• New Timeline Overview, which makes it quick and easy to see your whole project and jump to a specific location in your edit
• New Shuttle Control: Press-and-hold the Play button to scrub at different rates to find the right frame
• Track Headers with track link/unlink, track locking, hide and mute
• Flexible Editing: Video and audio clips on the primary (anchor) track let users edit the way they want
• External Display: Users can view their video on the large screen and get more room for their timeline and library with new UI layouts
• Support for Gnarbox 2.0 SSD, as well as improvements for supporting Gnarbox 1.0
• Dozens of editing and media management improvements

Ryan Connolly is a filmmaker, writer, director and creator of the YouTube channel Film Riot. He has been testing LumaFusion 2.0. “LumaFusion is surprisingly fast and fluid, and is also perfect for doing previs on location scouts.”

LumaFusion Version 2 is available now on the App Store for $29.99, but the company is offering a 50% discount until June 27, 2019.

DP Chat: Catch-22’s Martin Ruhe, ASC

By Randi Altman

For the bibliophiles out there, you know Catch-22 as the 1961 book by Joseph Heller. Cinephiles might remember the 1970 film of the same name starring Alan Arkin. And for those who are familiar with the saying, but not its origins, a Catch-22 is essentially a no-win situation. The famous idiom comes from the book — specifically the main character, Captain John Yossarian, a World War II bombardier who finds himself needing to escape the war, but rules and regulations hold him back.

Martin Ruhe (right) on-set with George Clooney.

Now there is yet another Catch-22 to point to: Hulu’s miniseries, which stars Christopher Abbott, Kyle Chandler, Hugh Laurie and George Clooney. Clooney is also an executive producer, alongside Grant Heslov, Luke Davies, David Michôd, Richard Brown, Steve Golin and Ellen Kuras. The series was written by Davies and Michôd and directed by Clooney, Heslov and Kuras, who each directed two episodes. It was shot entirely in Italy.

We recently reached out to the show’s German-born DP, Martin Ruhe, ASC, to find out about his workflow on the series and how he became a cinematographer.

Tell us about Catch-22. How would you describe the look of the film that you and the directors wanted to achieve?
George was very clear — he wanted to push the look of the show toward something we don’t see very often these days in TV or films. He wanted to feel the heat of the Italian summer.

We also wanted to contrast the absurdity of what happens on the ground with the claustrophobia and panic of the aerial work. We ended up with a strong warm tone and a lot of natural light. And we move the camera as if we’re always with our hero (Abbott). Very often we travel with him in fluent camera moves, and then we contrast that with shaky hand-held camera work in the air. It was good fun to be able to have such a range to work with.

Were you given examples of the look that was wanted?
We looked at newsreel footage from the period and at stills, and benefitted from production designer David Gropman’s research. Then I took stills when we did camera tests with our actors in costume. I worked on those on my computer until we got to a place we all liked.

Stefan Sonnenfeld at Company 3 did the grading for the show and loved it. He gave us a LUT that we used for our dailies. Later, when we did the final grade, we added film grain and refined our look to what it is now.

How early did you get involved in the production?
I spoke with George Clooney and Grant Heslov for the first time four months before we started to shoot. I had eight weeks of prep.

How did you go about choosing the right camera and lenses for this project?
A lot of the scenes were happening in very small spaces. I did a lot of research on smaller cameras, and since we would have a lot of action scenes in those planes, I did not want to use any cameras with a rolling shutter.

I ended up using ARRI Alexa Minis with Cooke S4 lenses and also some Flare cameras by IO Industries, which could record 4K raw to Odyssey7Q recorders. We mounted those little ones on the planes whenever they were flying for real. We also used them for the parachute jump.

This is a period piece. How did that affect your choices?
The main effect was the choice of light sources when we shot interiors and night scenes. I love fluorescents, and they existed in the period, but just not in those camps and not in the streets of Rome at night. We used a lot of practicals and smaller sources, which we spread out in the little streets of a small town where we shot, called Viterbo (standing in for Rome).

Another thing I learned was that in those camps at night, lights were blacked out. That meant we were stuck with moonlight and general ambience for night scenes, which we created with HMI sources — sometimes direct if we needed to cover big areas, like when the air base gets attacked at night in Episode 5.

Any challenging scenes that you are particularly proud of or found most challenging? 
In the end of Episode 5, Yossarian’s plane loses both engines in combat and goes down. We see YoYo and others escape the plane, while the pilot takes the plane over water and tries to land it. It’s a very dramatic scene.

We shot some exteriors of the real B25 Mitchell over Sardinia. We mounted camera systems in a DC3 and our second Mitchell to get the shots with the real planes. The destruction on the engines and the additional planes were added in post. The interiors of our actors in the plane were shot at Cinecitta Studios in Rome. We had a fuselage of a real B-25 on a gimbal. The studio was equipped with a 360-degree screen and a giant top light.

In the plane, we shot with a hand-held ARRI Alexa Mini camera. It was only the actors, myself and my focus puller inside. We never altered the physical space of the plane but instead embraced the claustrophobia. We see all of the crew members getting out — only the pilot stays on board. There was so little physical space for our actors since the fuselage was rigged to the gimbal, and then we also had to create the lighting for them to jump into within a couple of feet of space.

Then, when Yossarian leaves the plane, we actually put a small camera on a stuntman while another stuntman in Yossarian’s wardrobe did a real jump. We combined that with some plate shots from a helicopter (with a 3D plane in it) and some shots of our actor on a rig on the backlot of Cinecitta.

It all worked out. It was always our goal to shoot as many real elements as we could and leave the rest with post.

Stepping away from Catch-22. How did you become interested in cinematography?
I grew up in a small town in western Germany. No one in my family had anything to do with film. I loved movies and wanted to work on them as a director. After a little journey, I got an internship at a camera rental in London. It was then I saw for the first time what cinematographers do. I loved it and knew that was it. Then I studied in Berlin, became a focus puller for a couple of years and started working as a DP on music videos, then commercials and then, a little later, films.

What inspires you artistically?
Photography and movies. There is a lot of good work out there by a lot of talented DPs. I love to look at photographers I like as well as some documentary stills like the ones you see in the World Press Photo contest once a year. I love it when it is real. There are so many images around us every day, but if I don’t believe them (where they seem real to me), they are just annoying.

Looking back over the last few years, what new technology has changed the way you work?
Maybe LED lighting and maybe the high sensitivity of today’s digital cameras. You are so much more free in your choice of locations, days and, especially, night work because you can work with fewer lights.

What are some of your best practices or rules you try to follow on each job?
Keep it as simple as you can, and stay true to your vision.

Explain your ideal collaboration with the director when setting the look of a project.
I’m not sure there is just one way to go. After reading the script, you have an idea of what it can be, and then you start getting the information of the where and in what frame you will work.

Martin Ruhe behind the ARRI Alexa.

I love to spend time with my directors in prep — going to the locations, seeing them in different light, like mornings, noon or during night. Then I love to work with stills and sometimes also reference pictures to show what I think it can be and present a way we can get there. It’s always very important to leave some space for things to develop.

What’s your go-to gear — things you can’t live without?
I look for the right gear for each project. I like ARRI cameras, but I’ve also shot two movies with Panavision cameras.

I have shot movies in various countries, and the early ones didn’t have big budgets, so I tried to work with local crew and gear that was available. The thing I like about that is you get to know different ways of doing things, and also you might work with gear you would have never picked yourself. It keeps you flexible. When I start a project, I am trying to develop a feel for the story and the places it lives. Once I have that feel, I start into how and decide what tools I’ll use.

Photo Credit: Philippe Antonello


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Yoomin Lee joins MPC London as senior colorist

Yoomin Lee has joined Moving Picture Company’s color team in London. Lee got her start working for some of Australia’s top post houses, including Frame Set & Match, The Lab and Cutting Edge, before joining Jogger Studios London in 2016.

While at Jogger, she worked on many campaigns, including those for Google, Valentino, FIFA and Samsung. A collaboration with director Anton Corbijn has seen her grade projects for Depeche Mode and U2, including the visuals for the latter’s The Joshua Tree Tour in 2017, which played across the world’s largest concert screen.

When asked what brings her inspiration, Lee says, “I get inspired by any visual art form, and often from nature, especially for light. I become more observant of how things are lit. Color grading is such a unique art form and technology, and it’s all about details and finesse. I find it very inspiring when I collaborate with creative people who are always eager to push the boundaries to achieve their craft.”

Lee will be working on FilmLight’s Baselight.

You can check out her work here.

Amazon’s Sneaky Pete: DP Arthur Albert on the look of Season 3

By Karen Moltenbrey

Crime has a way of finding Pete Murphy, or should we say Marius Josipovic (Giovanni Ribisi). Marius is a con man who assumed his cellmate’s identity when he was paroled from prison. His plan was twofold: first, pretend to be the still-incarcerated Pete, from whom the family has been estranged for the past 20 years, and hide out on their farm in Connecticut. Second, con the family out of money so he can pay back a brutal mobster (Bryan Cranston, who also produces).

Arthur Albert

Marius’s plan, however, is flawed. The family is lovable, quirky and broke. Furthermore, they are in the bail bond business and one of his “cousins” is a police officer — not ideal for a criminal. Ultimately, Marius starts to really care for the family while also discovering that his cover is not that safe.

Similar to how Marius’ plans on Sneaky Pete have changed, so has the show’s production on the current and final Season 3, which is streaming on Amazon now. This season, the story shifts from New York to California, in tandem with the storylines. Blake Masters also took over as showrunner, and cinematographer Arthur Albert (ER, The Blacklist, Breaking Bad, Better Call Saul) came on as director of photography, infusing his own aesthetic into the series.

“I asked Blake if he wanted me to maintain the look they had used previously, and he said he wanted to put his own stamp on it and raise the bar in every department. So, I had free rein to change the look,” notes Albert.

The initial look established for Sneaky Pete had a naturalistic feel, and the family’s bail office was lit with fluorescent lighting. Albert, in contrast, opted for a more cinematic look with portrait-style lighting. “It’s just an aesthetic choice,” he says. “The sets, designed by (Jonathan) Carlson, are absolutely brilliant, and I tried to keep them as rich and layered as possible.”

For Manhattan scenes, Masters wanted a mid-century, modern look. “I made New York moody and as interesting as I could — cooler, more contrasty,” says Albert. When the story shifts to Southern California, Masters asked for a bright, more vibrant look. “There’s a big location change. For this season, you want to feel that change. It’s a big decision for the whole family to pick up their operation and move it, so I wanted the overall look of the show to feel new and different.”

The edginess and feeling of danger, though, comes less from the lighting in this show and more from the camera movement. The use of Steadicam gives it a bit of a stalking feel, serving as a moving viewpoint.

When Albert first met with Masters, they discussed what they thought worked in previous episodes. They liked the ones that used handheld and close-up shots that were wide and close to the actor, but in the end they went with a more traditional approach used by Jon Avnet, who directed four of the 10 episodes this season.

Season 3 was primarily shot with two cameras (Albert’s son, Nick, served as second-unit DP and A-camera operator, and Jordan Keslow, B-camera/Steadicam operator). A fan of Red cameras — Albert used an early incarnation for the last six episodes of ER — he employed Red’s DSMC2 with the new Gemini 5K S35 sensor for Season 3. The Gemini leverages dual sensitivity modes to provide greater flexibility for a variety of shooting environments.

The DP also likes the way it renders skin tones without requiring diffusion. “The color is really true and good, and the dynamic range is great. It held for really bright window areas and really dark areas, both with amazing range,” he says. The interiors of the sets were filmed on a stage in Los Angeles, and the exteriors were shot on location afterward. With the Gemini’s two settings (standard mode for well-lit conditions and a low-light setting), “You can shoot a room where you can barely see anyone, and it looks fully lit, or if it’s a night exterior where you don’t have enough time, money or space to light it, or in a big set space where suddenly you want to shoot high speed and you need more light. You just flip a switch, and you’ve got it. It was very clean with no noise.”

This capability came in handy for a shoot in Central Park at night. The area was heavily restricted in terms of using lights. Albert used the 3200 ISO setting and the entire skyline of 59th Street was visible — the clouds and how they reflected the light of the buildings, the detail of the night sky, the silhouettes of the buildings. In another similar situation, he used the low-light setting of the camera for a night sequence filmed in Grand Central Terminal. “It looked great, warm and beautiful; there is no way we could have lit that vast space at night to accommodate a standard ISO,” says Albert.

As far as lenses on Sneaky Pete, they used the Angenieux short zooms because they are lightweight and compact, can be put on a Steadicam and are easy to hold. “And I like the way they look,” Albert says. He also used the new Sigma prime lenses, especially when an extreme wide angle was needed, and was impressed with their sharpness and lack of distortion.

Throughout filming, the cinematographer relied on Red’s IPP2 (image processing pipeline) in-camera, which made for a more effective post process, as it is designed for the HDR workflow that Amazon requires for Sneaky Pete.

The color grade for the series was done at Level 3 Post by Scott Ostrowsky, who had also handled all the previous seasons of Sneaky Pete and with whom Albert had worked on The Night Shift and other projects. “He shoots a very cinematic look and negative. I know his style and was able to give him that look before he came into the suite. And when we did the reviews together, it was smooth and fast,” Ostrowsky says. “At times Sneaky Pete has a very moody look, and at times it has a very open look, depending on the environment we were shooting in. Some of the dramatic scenes are moody and low-light. Imagine an old film noir movie, only with color. It’s that kind of feel, where you can see through the shadows. It’s kind of inky and adds suspense and anticipation.”

Ostrowsky worked with the camera’s original negative — “we never created a separate stream,” he notes. “It was always from the camera neg, unless we had to send a shot out for a visual effects treatment.”

Sneaky Pete was shot in 5K, from which a 3840×2160 UHD image was extracted, and that is what Ostrowsky color graded. “So, if I needed to use some kind of window or key, it was all there for me,” he says. Arthur or Nick Albert would then watch the second pass with Ostrowsky, who would make any further changes, and then the producers would watch it, adding their notes. Ostrowsky used Blackmagic DaVinci Resolve.

“I want to make the color work for the show. I don’t want the color to distract from the show. The color should tell the story and help the story,” adds Ostrowsky.

While not every change has been for the best for Pete himself since Season 1, the production changes on Sneaky Pete’s last season appear to be working just fine.


Karen Moltenbrey is a veteran VFX and post writer.

Remembering ARRI’s Franz Wieser

By Randi Altman

Franz Wieser passed away last week, and the world is worse for it. I’ve known Franz for over 20 years, going back to when he was still based in ARRI’s Blauvelt, New York, office and I was editor of Post Magazine.

We would meet in the city from time to time for an event or a meal. In fact, he introduced me to a hidden gem of a restaurant just off Washington Square Park that has become one of my favorites. It reminds me of him — warm, friendly and welcoming.

I always laugh when I remember him telling me about when his car broke down here in New York. Even though he had his hazard lights on and it was clear his car wasn’t cooperating, people kept driving by and giving him the finger. He was bemused but incredulous, which made it even funnier.

Then he moved to LA and I saw him less… a quick hello at trade shows a couple of times a year. When I think of Franz, I remember his smile first and how soft spoken and kind he was.

He touched many over the years and their stories are similar to mine.

“I have known Franz for nearly two decades, but it was during the earliest days of ARRI’s digital era that we truly connected,” shares Gary Adcock, an early ARRI digital adopter, writer and industry consultant. “We got together after one of the director of photography conferences I chaired at NAB to talk about ARRI’s early D20 and D21 digital cameras. Franz was just a great person, always a kind word, always wanting to know how your family and friends were. It will be that kindness that I will miss the most.”

“This is such sad news,” says Andy Shipsides, CTO at Burbank’s AbelCine. “Franz was a dear friend and will be greatly missed. He was an amazing person and brought fun and levity to his work every day. I had lunch with him several months ago, and I feel lucky to have shared that time with him. Franz was truly a delightful person. He took me out when I first moved to LA to welcome me to the city, which I will always remember. He always had a smile on his face, and his positive energy was contagious. He will be very much missed — a big loss for our industry.”

ARRI sent out the following about Franz.

It is with great sadness that we share the news of the passing of Franz Wieser, VP of marketing at ARRI Inc.

Franz Wieser grew up in Rosenheim in Bavaria, Germany. He was originally hired by ARRI CT in nearby Stephanskirchen, where ARRI’s Lighting factory is situated. Franz started at ARRI with an internship with Volker Bahnemann, a member of the supervisory board of the ARRI Group, at what was then called Arriflex Corporation in Blauvelt, NY, USA, and spent some time doing market research in New York and California.

In July 1994, Franz accepted a position as marketing manager at Arriflex with Volker Bahnemann and relocated to New York at that time. Franz had a distinguished 25-year career in marketing for Arriflex and ARRI Inc., leading to his current position of VP of marketing, based in the ARRI Burbank office. His contributions spanned the marketing of ARRI film and digital camera systems and analog and digital lighting fixtures. He also built sustaining relationships with the American Society of Cinematographers (ASC) and many others in the film and television industry. His ability to connect with people, his friendliness and reliability, and his deep understanding of the film industry were outstanding. He was a highly valued member of the global marketing network and a wonderful person and colleague.

Glenn Kennel, president and CEO of ARRI Inc., says, “Franz will be remembered by his colleagues and many friends in the industry as a friend and mentor, willing to listen and help. He always had a smile on his face and a gracious approach.”

We are very saddened by his early loss and will remember him well. Our deepest sympathy goes out to his wife and his parents. 

Lenovo intros next-gen ThinkPads

Lenovo has launched the next generation of its ThinkPad P Series with the release of five new ThinkPads: the ThinkPad P73, ThinkPad P53, ThinkPad P1 Gen 2, ThinkPad P53s and ThinkPad P43s.

The ThinkPad P53 features the Nvidia Quadro RTX 5000 GPU with RT and Tensor cores, offering realtime raytracing and AI acceleration. It now features Intel Xeon and 9th Gen Core-class CPUs with up to eight cores (including the Core i9), up to 128GB of memory and 6TB of storage.

This mobile workstation also boasts a new OLED touch display with Dolby Vision HDR for superb color and some of the deepest black levels ever. Building on the innovation behind the ThinkPad P1 power supply, Lenovo is also maximizing the portability of this workstation with a 35 percent smaller power supply. The ThinkPad P53 is designed to handle everything from augmented reality and VR content creation to the deployment of mobile AI or ISV workflows. The ThinkPad P53 will be available in July, starting at $1,799.

At 3.74 pounds and 17.2mm thin, Lenovo’s thinnest and lightest 15-inch workstation — the ThinkPad P1 Gen 2 — includes the latest Nvidia Quadro Turing T1000 and T2000 GPUs. The ThinkPad P1 also features eight-core Intel 9th Gen Xeon and Core CPUs and an OLED touch display with Dolby Vision HDR.

The ThinkPad P1 Gen 2 will be available at the end of June starting at $1,949.

With its 17.3-inch Dolby Vision 4K UHD screen and a 35% smaller power adapter, Lenovo’s ThinkPad P73 offers users maximum workspace and mobility. Like the ThinkPad P53, it features Intel Xeon and Core processors and the most powerful Nvidia Quadro RTX graphics. The ThinkPad P73 will be available in August starting at $1,849.

The ThinkPad P43s features a 14-inch chassis and will be available in July starting at $1,499.

Rounding out the line is the ThinkPad P53s which combines the latest Nvidia Quadro graphics and Intel Core processors — all in a thin and light chassis. The ThinkPad P53s will be available in June, starting at $1,499.

For the first time, Lenovo is adding new X-Rite Pantone Factory Color Calibration to the ThinkPad P1 Gen 2, ThinkPad P53 and ThinkPad P73. The unique factory color calibration profile is stored in the cloud to ensure more accurate recalibration. This profile allows for dynamic switching between color spaces, including sRGB, Adobe RGB and DCI-P3 to ensure accurate ISV application performance.

The entire ThinkPad portfolio is also equipped with advanced ThinkShield security features – from ThinkShutter to privacy screens to a self-healing BIOS that recovers when attacked or corrupted – to help protect users from every angle and give them the freedom to innovate fearlessly.

Quick Chat: Sinking Ship’s Matt Bishop on live-action/CG series

By Randi Altman

Toronto’s Sinking Ship Entertainment is a production, distribution and interactive company specializing in children’s live-action and CGI-blended programming. The company has 13 Daytime Emmys and a variety of other international awards on its proverbial mantel. Sinking Ship has over 175 employees across all its divisions, including its VFX and interactive studio.

Matt Bishop

Needless to say, the company has a lot going on. We decided to reach out to Matt Bishop, founding partner at Sinking Ship, to find out more.

Sinking Ship produces, creates visual effects and posts its own content, but are you also open to outside projects?
Yes, we work in co-production with other companies and contract our post production services to shows that are looking for cutting-edge VFX.

Have you always created your own content?
Sinking Ship has developed a number of shows and feature films, as well as worked in co-production with production companies around the world.

What came first, your post or your production services? Or were they introduced in tandem?
Both sides of the company evolved together as a way to push our creative visions. We started acquiring equipment on our first series in 2004, and we always look for new ways to push the technology.

Can you mention some of your most recent projects?
Some of our current projects include Dino Dana (Season 4), Dino Dana: The Movie, Endlings and Odd Squad Mobile Unit.

What is your typical path getting content from set to post?
We have been working with Red cameras for years, and we were the first company in Canada to shoot in 4K over a decade ago. We shoot a lot of content, so we create backups in the field before the media is sent to the studio.

Dino Dana

You work with a lot of data. How do you manage and keep all of that secure?
Backups, lots of backups. We use a massive LTO-7 tape robot, and we have over 2PB of backup storage on top of that. We recently added Qumulo to our workflow to ensure the most secure method possible.

What do you use for your VFX work? What about your other post tools?
We use a wide range of software, but our main tools in our creature department are Pixologic Zbrush and Foundry Mari, with all animation happening inside Autodesk Maya.

We also have a large renderfarm to handle the volume of shots, and our render engine of choice is Arnold, which is now an Autodesk product. In post we use an Adobe Creative Cloud pipeline, with 4K HDR color grading happening in DaVinci Resolve. Qumulo is going to be a welcome addition as we continue to grow and our outputs become more complex.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Axis provides 1,000 VFX shots for the TV series Happy!

UK-based animation and visual effects house Axis Studios has delivered 1,000 shots across 10 episodes on the second series of the UCP-produced hit Syfy show Happy!.

Based on Grant Morrison and Darick Robertson’s graphic novel, Happy! follows alcoholic ex-cop turned hitman Nick Sax (Christopher Meloni), who teams up with imaginary unicorn Happy (voiced by Patton Oswalt). In the second season, the action moves from Christmastime to “the biggest holiday rebranding of all time” and a plot to “make Easter great again,” courtesy of last season’s malevolent child-kidnapper, Sonny Shine (Christopher Fitzgerald).

Axis Studios, working across its three creative sites in Glasgow, Bristol, and London, collaborated with executive producer and director Brian Taylor and showrunner Patrick Macmanus to raise the bar on the animation of the fully CG character. The studio also worked on a host of supporting characters, including a “chain-smoking man-baby,” a gimp-like Easter Bunny and even a Jeff Goldblum-shaped cloud. Alongside the extensive animation work, the team’s VFX workload greatly increased from the first season — including two additional episodes, creature work, matte painting, cloud simulations, asset building and extensive effects and clean-up work.

Building on the success of the first season, the 100-person team of artists further developed the animation of the lead character, Happy, improving the rig, giving him more nuanced emotions and continually working to integrate him more convincingly into the real-world environments.