Author Archives: Randi Altman

The-Artery sees red, creates VFX for Huawei’s AppGallery

The-Artery recently worked on a global campaign with Israeli agency LH for AppGallery, the official app distribution platform of consumer electronics brand Huawei.

The campaign — set to an original musical track called Explore It by artist Tomer Biran — is meant to show AppGallery not just as a mobile app store, but as a gateway to an endless world of digital content that comes with data protection and privacy.

Each scene features the platform’s signature red square logo, shown in a variety of creative ways thanks to The-Artery’s visual effects work. These include floating Tetris-like cubes that change with the beat of the music, shifts in camera focus, red-seated subway cars with a floating red cube and more.

“Director Eli Sverdlov, editor Noam Weissman and executive producer Kobi Hoffman all have distinct artistic processes that are unforgiving to conventional storytelling,” explains founder/executive creative director Vico Sharabani. “We had ongoing conversations about how to create a deeper connection between the brand and audiences. The agency, LH, gave us the freedom to really explore the fun, convenience and security behind downloading apps on the Huawei AppGallery.”

Filming for the global campaign took place in Kiev, Ukraine, via production company Jiminy Creative Tel Aviv, while editing, design, animation, visual effects and color grading were all done under one roof at The-Artery’s New York studio. The entire production was completed in only 16 days.

The studio used Autodesk’s Flame and 3ds Max, Side Effects Houdini and Adobe’s After Effects and Photoshop for the visual effects and graphics. Colorist Steve Picano called on Blackmagic’s DaVinci Resolve, and Asaf Bitton provided sound design.

Quick Chat: Editing Leap Day short for Stella Artois

By Randi Altman

To celebrate February 29, otherwise known as Leap Day, beer-maker Stella Artois released a short film featuring real people who discover their time together is valuable in ways they didn’t expect. The short was conceived by VaynerMedia, directed by Division7’s Kris Belman and cut by Union partner/editor Sloane Klevin. Union also supplied Flame work on the piece.

The film begins with the words “There is a crisis sweeping the nation” set on a black screen. Then we see different women standing on the street talking about how easy it is to cancel plans. “You’re just one text away,” says one. “When it’s really cold outside and I don’t want to go out, I use my dog excuse,” says another. That’s when the viewer is told, through text on the screen, that Stella Artois has set out to right this wrong “by showing them the value of their time together.”

The scene changes from the street to a restaurant where friends are reunited for a meal and a goblet of Stella after not seeing each other for a while. When the check comes, the confused diners ask about it, and an employee explains that the menu lists prices in minutes, that Leap Day is a gift of 24 hours and that people should take advantage of it by “uncancelling” plans.

Prior to February 29, Stella encouraged people to #UnCancel plans and catch up with friends over a beer… paid for by the brand. Using the Stella Leap Day Fund — a $366,000 bank of beer reserved exclusively for those who spend time together (there are 366 days in a leap year) — people were able to claim up to a 24-pack by sharing the film using #UnCancelPromo and tagging someone they would like to catch up with.

Editor Sloane Klevin

For the short film, the diners were captured with hidden cameras. Union editor Klevin, who used Avid Media Composer 2018.12.03 with EditShare storage, was tasked with finding a story in their candid conversations. We reached out to her to find out more about the project and her process.

How early did you get involved in this project, and what kind of input did you have?
I knew I was probably getting the job about a week before they shot. I had no creative input into the shoot; that really only happens when I’m editing a feature.

What was your process like?
This was an incredibly fast turnaround. They shot on a Wednesday night, and it was finished and online the following Wednesday morning at 12am.

I thought about truncating my usual process in order to make the schedule, but when I saw their shooting breakdown for how they planned to shoot it all in one evening, I knew there wouldn’t be a ton of footage. Knowing this, I could treat the project the way I approach most unscripted longform branded content.

My assistant, Ryan Stacom, transcoded and loaded the footage into the Avid overnight, then grouped the four hidden cameras with the sound from the hidden microphones — and, brilliantly, production had time-of-day timecode on everything. The only thing that was tricky was when two tables were being filmed at once. Those takes had to be separated.

Simon Says transcription software was used to transcribe the short pre- and post-interviews we had, and Ryan put markers from the transcripts on those clips so I could jump straight to a keyword or line I was searching for during the edit process. I watched all the verité footage myself and put markers on anything I thought was usable in the spot, typing into the markers what was said.

How did you choose the footage you needed?
Sometimes the people had conversations that were neither here nor there, because they had no idea they were being filmed, so I skipped that stuff. Also, I didn’t know if the transcription software would be accurate with so much background noise from the restaurant on the hidden table microphones, so marking the footage myself seemed the best option. I used yellow markers for lines I really liked and red for stuff we might want to be able to find and audition but that wasn’t necessarily among my selects. That way I could open the markers tool and read through my yellow selects at a glance.

Once I’d seen everything, I did a music search of Asche & Spencer’s incredibly intuitive, searchable music library website, downloaded my favorite tracks and started editing. Because of the fast turnaround, the agency was nice enough to send an outline for how they hoped the material might be edited. I explored their road map, which was super helpful, but went with my gut on how to deviate. They gave me two days to edit, which meant I could post for the director first and get his thoughts.

Then I spent the weekend playing with the agency and trying other options. The client saw the cut and gave notes on both days I was with the agency, then we spent Monday and Tuesday color correcting (thanks to Mike Howell at Color Collective), reworking the music track, mixing (with Chris Afzal at Wave Studios), conforming and subtitling.

That was a crazy fast turnaround.
Considering how fast the turnaround was, it went incredibly smoothly. I attribute that to the manageable amount of footage, fantastic casting that got us really great reactions from all the people they filmed, and the amount of communication my producer at Union and the agency producer had in advance.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


Western Digital intros WD Gold NVMe SSDs  

Western Digital has introduced its new enterprise-class WD Gold NVMe SSDs designed to help small- and medium-sized companies transition to NVMe storage. The SSDs offer power loss protection and high performance with low latency.

The WD Gold NVMe SSDs will be available in four capacities — 0.96TB, 1.92TB, 3.84TB and 7.68TB — in early Q2 of this year. The WD Gold NVMe SSD is designed to be, according to the company, “the primary storage in servers delivering significantly improved application responsiveness, higher throughput and greater scale than existing SATA devices for enterprise applications.”

These new NVMe SSDs complement the recently launched WD Gold HDDs by providing a high-performance storage tier for applications and data sets that require low latency or high throughput.

The WD Gold NVMe SSDs are designed using Western Digital’s silicon-to-system technology, from its 3D TLC NAND SSD media to its purpose-built firmware and own integrated controller. The drives give users peace of mind knowing they’re protected against power loss and that data paths are safe. Secure boot and secure erase provide users with additional data-management protections, and the devices come with an extended five-year limited warranty.

Krista Liney directs Ghost-inspired promo for ABC’s The Bachelor

Remember the Ghost-inspired promo for ABC’s The Bachelor, which first aired during the 92nd Academy Awards telecast? ABC Entertainment Marketing developed the concept and wrote the script, which features current Bachelor lead Peter Weber in a send-up of the iconic pottery scene in Ghost between Demi Moore and Patrick Swayze. It even includes the Righteous Brothers song, Unchained Melody, which played over that scene in the film.

ABC Entertainment Marketing tapped Canyon Road Films to produce and Krista Liney to direct. Liney captured Peter taking off his shirt, sitting down at the pottery wheel and “getting messy” — a metaphor for how messy his journey to love has been. As he starts to mold the clay, he is joined by one set of hands, then another and another. As the clay collapses, Whoopi Goldberg appears to say, “Peter, you in danger, boy” — a take-off of the line she delivers to Moore’s character in the film.

This marks Liney’s first shoot as a newly signed director coming on board at Canyon Road Films, a Los Angeles-based creative production company that specializes in television promos and entertainment content.

Liney brings perspective from both the client and production house sides, having previously served as a marketing executive on the network side. “With promos, I aim to create pieces that will cut through the clutter and command attention,” she explains. “For me, it’s all about how I can best build the anticipation and excitement within the viewer.”

The piece was shot on an ARRI Alexa Mini with Primes and Optimo lenses. ABC finished the spot in-house.

Other credits include EP Lara Wickes and DP Eric Schmidt.

Sonnet intros USB to 5GbE adapter for Mac, Windows and Linux

Sonnet Technologies has introduced the Solo5G USB 3 to 5Gb Ethernet (5GbE) adapter. Featuring NBASE-T (multigigabit) Ethernet technology, the Sonnet Solo5G adapter adds 5GbE and 2.5GbE network connectivity to an array of computers, allowing for superfast data transfers over the existing Ethernet network cabling infrastructure found in most buildings today.

Measuring 1.5 inches wide by 3.25 inches deep by 0.7 inches tall, the Solo5G is a compact, fanless 5GbE adapter for Mac, Windows and Linux computers. Equipped with an RJ45 port, the adapter supports 5GbE and 2.5GbE (5GBASE-T and 2.5GBASE-T, respectively) connectivity via common Cat 5e (or better) copper cabling at distances of up to 100 meters. The adapter’s USB port connects to a USB-A, USB-C or Thunderbolt 3 port on the computer and is bus-powered for convenient, energy-efficient and portable operation.

Cat 5e and Cat 6 copper cables — representing close to 100% of the installed cable infrastructure in enterprises worldwide — were designed to carry data at only up to 1Gb per second. NBASE-T Ethernet was developed to boost the speed capability well beyond that limit. Sonnet’s Solo5G takes advantage of that technology.

When used with a multigigabit Ethernet switch or a 10Gb Ethernet switch with NBASE-T support — including models from Buffalo, Cisco, Netgear, QNAP, TrendNet and others — the Sonnet adapter delivers from 250% to 425% of the speed of Gigabit Ethernet without a wiring upgrade. When connecting to a multigigabit Ethernet-compatible switch is not possible, the Solo5G also supports 1Gb/s and 100Mb/s link speeds.
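
For a rough sense of what those multipliers mean for moving media, here is a quick back-of-the-envelope calculation. It uses idealized line rates only; real-world throughput will be lower due to protocol overhead.

```python
# Back-of-the-envelope only: idealized line rates, no protocol overhead.
# Compares how long a 500GB media transfer takes at each link speed.
def transfer_minutes(file_gb: float, link_gbps: float) -> float:
    file_gigabits = file_gb * 8            # gigabytes -> gigabits
    return file_gigabits / link_gbps / 60  # seconds -> minutes

for link_gbps in (1.0, 2.5, 5.0):
    mins = transfer_minutes(500, link_gbps)
    print(f"{link_gbps} Gb/s link: ~{mins:.0f} minutes for 500GB")
# 1 Gb/s -> ~67 min; 2.5 Gb/s -> ~27 min; 5 Gb/s -> ~13 min
```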

Sonnet’s Solo5G includes 0.5-meter USB-C to USB-C and USB-C to USB-A cables for connecting the adapter to the computer, saving users the expense of buying a second cable.

The Solo5G USB-C to 5 Gigabit Ethernet adapter is available now for $79.99.

DP Chat: Carnival Row cinematographer Chris Seager

By Randi Altman

For British DP Chris Seager, BSC, his path to movies and television began at film school. After graduation, he found his way to BBC Television’s film department, working on docs and TV movies, including John Schlesinger’s Cold Comfort Farm, starring Kate Beckinsale, Ian McKellen and Rufus Sewell. Soon after, he left the BBC and started his career as a freelance cinematographer.

Chris Seager

Seager’s CV is long, and includes the films A Kind of Murder, Retreat and New in Town and series such as The Alienist, Watchmen and most recently Amazon Prime’s Carnival Row. In fact, Seager, who has been working on Season 2 of the series, received an ASC Award nomination in the non-commercial TV series category for his work on Episode 5, “Grieve No More.”

The show, which stars Orlando Bloom and Cara Delevingne, follows a variety of mythical creatures who had to flee their homeland for the big city during what looks very much like a fantastical Victorian era. As you can imagine, tensions grow between the local humans and these magical immigrants, who are forced to live in a ghetto known as Carnival Row.

We reached out to DP Seager to find out more about the show’s look, his workflow and what inspires him.

Can you talk about the look of the show?
There had been many discussions — between the producers, writers, Legendary, Amazon, the production designer, costume designer, etc. — before I came on board. The production design team had produced some very fine concept drawings that firmly put the show in a, shall we say, “Victorian period.”

That decision led everyone to research the period: the shape, color and design of buildings, sets, costumes and practical lights. For me, that meant the use of candles, oil and gas lamps, and the warmth they generated in terms of the color and quality of the light emitted from each of those sources. The variety of locations — from The Row exteriors to government buildings, a brothel, bars, upper-class establishments and more — gave me many opportunities to use light sources to full effect.

The nighttime streets in The Row showed dark, seedy corners and alleyways intermixed with orange dancing flames from braziers, with the street people warming their hands. The streets were awash with rain and mud, horses and carriages, humans, faes and pucks, all fighting their way through the smoke from the fires. It was backlit with a sultry greenish moonlight that gave us cinematic images, bringing the viewer into the period. Daylight was slightly cold and threatening and was mixed with the warmth of oil lamps on interiors.

How did the showrunners/producers tell you what they wanted for the look?
It all starts with the scripts. I’m a firm believer that it is the words that conjure up the emotions within the script, and in turn they are echoed in the cinematography. And not only the cinematography but the production design, the costumes, the makeup, the visual effects, the editing.

In prep, I really like to spend time with the director and production designer going through the script page by page. It’s those first conversations that begin to bring life to the script: the mood of the actors in a scene, their emotions, their fear or anger, whether they are happy or sad. Just gaining that information from the director plants a suggestion of the feel of that particular scene, whether it’s hard shafts of light, high sun, moody sunset, soft silky light or dark dingy light.

A mood begins to be set and a discussion will take place about the use of the camera — whether it’s still, fast-moving, reflective or perhaps angry. Then comes the choice of the lens package. There are many choices and collectively — through the collaboration with the director — a mood and style emerges, which the team can take on board.

How early do you get involved in a production?
The cheeky answer to that is, probably never early enough. In truth, it does depend on the complexity of the production. You nearly always think you need more prep time, but invariably you just about get enough. Most departments in the filmmaking process would say the same.

Certainly, prep time is very important. It’s when you formulate the style, look and feeling of the piece. It’s also when you have time to meet, discuss, ask questions and get the answers that start to put shape to the project. Then you begin to plan scenes with the director and all the relevant departments that make up the team. On Carnival Row’s first season, I got six weeks of prep before we started shooting.

Can you talk about working with the show’s other departments?
One of the joys of being a cinematographer on a production is working with the other creative departments. Collectively, we are all responsible for giving the show a look. My first contact with other departments is usually the locations team and the production designer. Typically, these two teams have been busy before I arrive on the show, so some locations and set designs have already been looked at or even chosen.

During the first few days of my prep, I get up to speed quickly with their ideas and plans. This often happens in meetings with the showrunner, director, production designer, locations, 1st AD and visual effects to talk through the show’s concepts and journey. Then we discuss script requirements regarding locations or set builds and set extensions (CGI).

Do you enjoy working with the design team?
I love it. Lots of sets had to be built during the shooting of Carnival Row. Some were the mainstay sets, like the Constabulary, Spurnrose House, Balefire, Parliament and the boarding house. Then, of course, the backlot street set and numerous location sets, as well as real locations. Six stages were used at Prague’s Barrandov Studio. Discussions with the production designer were mostly about the size of a set, number of windows, entrances and ceilings, or whether to have them or not. And if you do have them, what’s their height, the set texture, color and darkness, etc. Not a day would go by without a discussion with the design team.

I would also talk with the costume and makeup departments about the colors of the costumes and hair styles, all important aspects of the show. There would be lots of “show and tell” with costumes and props. The makeup effects department was a fun place to visit. It was here where the wings of the fae were designed and built, along with the pucks’ horns and numerous dead bodies.

Can you talk about your camera team?
My A camera operator, Jakub Dvorsky, was a dream to work with. He somehow seemed to instantly understand what I would require with a shot. My gaffer, Enzo Cermak, was also exceptional, as was his team of talented friendly electricians. Thanks to Enzo’s help, I was able to effectively paint a scene.

Earlier you mentioned using candles, oil and gas lamps for lighting. Can you dive into that a bit further?
I liked the idea that the poorer streets of The Row had a cold daylight look to them, interspersed with firelight used for roasting chestnuts, cooking food or just for warmth. That, along with smoke from the fires, gave it a particular look. This cold light was used for the interior scenes as well. For the Burgue, or well-off areas, we used warmer light with more contrast, and windows on sets were larger. This allowed more light in.

What about the nights?
Night shoots on The Row backlot were backlit or cross-lit with a blue/green moonlight, as referenced earlier. I used ¼ or ½ or sometimes full Wendy lights (tungsten) depending on distance from the set, with ½ blue and ¼ plus green gels. Invariably, if I used a ½ Wendy, one section would have ¼ or ½ diffusion filter as well as the moonlight gels. This gave me options to have a softer moonlight if needed, and I also had the ability to switch off sections of the Wendy lights to get the exposure levels that I wanted.

I would also use an LED tube balloon from a crane over the mid sections of the street set to allow a soft top moonlight. On the busy Row streets, I made use of brazier fires where street sellers cooked food. This gave me the warm light that contrasted with the moonlight. I would then add smoke and the scene would be set.

For interiors, I used a mixture of candles and oil lamps and, occasionally, gas lamps. Candles were used in the Haruspex set to great effect, and also the brothel set. Oil lamps were used in houses and the Constabulary.

Real oil lamps?
Some were real oil lamps and others were look-alike oil lamps. For these I used 650W lighting bulbs dimmed down to around 18% and then a slight flicker was added; the result was a very convincing warm glowing oil lamp look.

You mentioned that the show was shot in Prague?
Yes, Carnival Row was shot in the Czech Republic. The production was based at the historic Barrandov Studio in Prague, and locations in and around Prague were also used.

You shot on the ARRI Alexa Mini. How did you go about choosing the right camera and lenses for this project?
Our frame size was a widescreen 2.40:1 format, and the lens package was ARRI Master Primes. The cameras gave us 3.2K upscaled to 4K picture quality. The widescreen format gave us the ability to use the wonderful width of the frame to our advantage. The Master Primes deliver the solid look they are known for: solid frames with little to no distortion and good contrast.

Why the Alexa Mini?
I’m a fan of the ARRI Alexa range of digital cameras. I was always keen to use ARRI film cameras when film was at its height. When digital cameras started to become the vogue, ARRI brought out the D21 camera. It was a heavy, rather large camera with what would be described today as having limited digital prowess. Strangely, I liked the “look” this D21 gave me and used it on quite a few TV shows up until the ARRI Alexa hit the scene.

The Alexa was a game changer for cinematographers. I believe that the Alexa Mini was designed to be used as its name suggests: as a compact camera to be used in tight corners or on weight-limited camera rigs, like Steadicam and stabilized rigs. However, it was soon being used as the studio camera on many productions, and thanks to upgrades over time, it has become my “must-have” camera. It has a wonderful look, and when used in low light, it seems to have a different life. You can push it and pull it into different exciting looks. It’s my friend.

Any scenes that you are particularly proud of?
There are many, but here is one example. It’s in Season 1, Episode 5, directed by Andy Goddard. Philo (Orlando Bloom) revisits his childhood orphanage to investigate a murder. Andy and I set up a series of shots following Philo into his old dormitory as memories of his childhood come flooding back.

With no words spoken, we tracked with him through the numerous beds in this grey stone room, with haze-filled soft light coming through tall, soulless windows. This gave the room a monochrome, going-back-in-time feel. We then devised a dolly shot with Philo standing on the dolly itself, moving with the camera as if he were floating. As we tracked, we panned along with him, and that developed into a flashback of him as a child with his friends. We then cut to a wider shot to reveal Philo standing alone, the flashback gone. We used this technique a couple of times within those scenes, and it was both telling and subtle.


Is this like any other project you’ve worked on?
A short answer to that is no. Working on Game of Thrones (Season 3) is probably the nearest it gets, but Carnival Row is a very different beast. Each episode had a wonderment about it and was magical in some way. The whole series was written as a fantasy world, set in a supposedly Victorian age, with humans, pucks and fae all thrown together. It was dark, mysterious, dangerous, intriguing and exciting.

How did you become interested in cinematography?
It all started when I was 11 years old. The family TV, one late September night, suddenly went bang, with a cloud of blue smoke and a flash. I was somehow fascinated by this event, and on my 12th birthday, my parents bought me a wonderfully illustrated book on how television worked from the television studio to the home. I was then on a mission to be involved somehow in the TV/film business. Art was one of my favorite subjects at school and my art teacher encouraged me to take up photography alongside my painting.

At 18 years old, off I went to art school to study photography. I enjoyed my first year, but I somehow became more interested in the team-oriented film and TV crowd. I moved from photography to cinematography, and the rest is history.

What inspires you artistically?
The obvious answer to that is art. Paintings inspire me. They always have. The way an artist uses light, shape, form, darkness, color, technique, composition, aspect ratio and sheer size or smallness of a canvas, how depth is created, senses of emotion, fear, happiness. Photography equally inspires me. Black and white versus color and that “decisive moment” when the shot is taken, a magnificent moment.

How do you stay on top of new technology?
Advancing technology comes at you from seemingly every direction today. The speed of that advancement during the 20th and 21st centuries is outstanding. I started shooting film at school with a clockwork wind-up 16mm Bolex camera with just two lenses, and look where we are now. I love the technical revolution. It’s important to embrace it, otherwise it overtakes you.

I seem to always be reading trade magazines to see the new developments. It’s also important to talk with other cinematographers to discuss their views and experiences on new technology.

Looking back, what technology has changed the way you work?
I suppose the biggest game changer, apart from digital cameras, is the advancement in LED light fixtures. For me, to be able to use light fixtures like the ARRI SkyPanel LED range — it offers low power and low-output heat, bi-color capabilities, dimming and effects… plus it has firmware that lets you produce gel colors across the spectrum — is just awesome. Camera and grip technology have also changed. The use of high-quality, small-footprint drones is an example, along with telescopic cranes with stabilized heads and cable camera systems.

Digital cameras have advanced over the last few years too, with 4K and 6K capability along with ISO changes from the base ISO of 800 to 2500 and 5000. There’s also the on-set color grading facility that enables the cinematographer to put his/her “look” onto the dailies.

What are some of your best practices you try to follow on each job?
Being well prepped and turning up on set early every day is a must for me. I have that nervous adrenaline hit at the start of the shooting day, a mixture of excitement and just nerves, which is the way I am. As soon as the rehearsal starts, I’m calm… well, mostly.

Getting into the director’s head [so to speak] is also important for me. Finding out what they like or dislike. We have to be a team, and that includes the 1st AD, production designer, operator and many more. It’s important to remember that filmmaking is a team effort and I, for one, encourage input from my team.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Foundry Nuke 12.1 offers upgrades across product line

Foundry has released Nuke 12.1, with UI enhancements and tool improvements across the entire Nuke family. The largest update to Blink and BlinkScript in recent years improves Cara VR node performance and introduces new tools for developers, while extended functionality in the timeline-based applications speeds up and enriches artist and team review.

Here are the upgrade highlights:
– New Shuffle node updates the classic checkboxes with an artist-friendly, node-based UI that supports up to eight channels per layer (Nuke’s limit) and consistent channel ordering, offering a more robust tool set at the heart of Nuke’s multi-channel workflow.
– Lens Distortion Workflow improvements: The LensDistortion node in NukeX is updated to have a more intuitive workflow and UI, making it easier and quicker to access the faster and more accurate algorithms and expanded options introduced in Nuke 11.
– Blink and BlinkScript improvements: Nuke’s architecture for GPU-accelerated nodes and the associated API can now store data on the GPU between operations, resulting in what Foundry says are “dramatic performance improvements to chains of nodes with GPU caching enabled.” This new functionality is available to developers using BlinkScript, along with bug fixes and a debug print out on Linux. (For a sense of what a BlinkScript kernel looks like in practice, see the sketch after these highlights.)
– Cara VR GPU performance improvements: The Cara VR nodes in NukeX have been updated to take advantage of the new GPU-caching functionality in Blink, offering performance improvements in viewer processing and rendering when using chains of these nodes together. Foundry’s internal tests on production projects show rendering time that’s up to 2.4 times faster.
– Updated Nuke Spherical Transform and Bilateral: The Cara VR versions of the Spherical Transform and Bilateral nodes have been merged with the Nuke versions of these nodes, adding increased functionality and GPU support in Nuke. Both nodes take advantage of the GPU performance improvements added in Nuke 12.1. They are now available in Nuke and no longer require a NukeX license.
– New ParticleBlinkScript node: NukeX now includes a new ParticleBlinkScript node, allowing developers to write BlinkScripts that operate on particles. Nuke 12.1 ships with more than 15 new gizmos, offering a starting point for artists who work with particle effects and developers looking to use BlinkScript.
– QuickTime audio and surround sound support: Nuke Studio, Hiero and HieroPlayer now support multi-channel audio. Artists can now import MOV containers holding audio on Linux and Windows without needing to extract and import the audio as a separate WAV file.

– Faster HieroPlayer launch and Nuke Flipbook integration: Foundry says new instances of HieroPlayer launch 1.2 times faster on Windows and up to 1.5 times faster on Linux in internal tests, improving the experience for artists using HieroPlayer for review. With Nuke 12.1, artists can also use HieroPlayer as the Flipbook tool for Nuke and NukeX, giving them more control when comparing different versions of their work in progress.
– High DPI Windows and Linux: UI scaling when using high-resolution monitors is now available on Windows and Linux, bringing all platforms in line with high-resolution display support added for macOS in Nuke 12.0 v1.
– Extended ARRI camera support: Nuke 12.1 adds support for ARRI formats, including Codex HDE .arx files, ProRes MXFs and the popular Alexa Mini LF. Foundry also says there are performance gains when debayering footage on CUDA GPUs, and there’s an SDK update.
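
As referenced in the Blink highlight above, here is a minimal sketch of loading a BlinkScript kernel from Nuke’s Python API. The kernel follows the standard pixel-wise structure from Foundry’s Blink documentation; the kernelSource knob name is our assumption from memory, so verify it against your Nuke version.

```python
# Minimal sketch: create a BlinkScript node and load a trivial
# pixel-wise invert kernel. Assumes it runs in Nuke's Script Editor;
# the 'kernelSource' knob name should be verified for your version.
import nuke

INVERT_KERNEL = """
kernel Invert : ImageComputationKernel<ePixelWise>
{
  Image<eRead, eAccessPoint, eEdgeClamped> src;  // input image
  Image<eWrite> dst;                             // output image

  void process() {
    dst() = 1.0f - src();  // invert each channel
  }
}
"""

node = nuke.createNode('BlinkScript')
node['kernelSource'].setValue(INVERT_KERNEL)
# Recompile (the node's Recompile button) to build the kernel, which
# then runs on the GPU and benefits from 12.1's GPU caching.
```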

Review: Loupedeck+ for Adobe’s Creative Cloud — a year later

By Mike McCarthy

It has been a little over a year since Loupedeck first announced support for Adobe’s Premiere Pro and After Effects thanks to its Loupedeck+ hardware interface panel. As you might know, Loupedeck was originally designed for Adobe Lightroom users. When Loupedeck was first introduced, I found myself wishing there was something similar for Premiere, so I was clearly pleased when that became a reality.

I was eager to test it and got one before starting editorial on a large feature film back in January. While I was knee-deep in the film, postPerspective’s Brady Betzel wrote a thorough review of the panel and how to use it in Premiere and Lightroom. My focus has been a bit different: finding ways to make the panel more user-friendly and to take advantage of its immense array of possible functions.

Loupedeck+

I was looking forward to using the panel on a daily basis while editing the film (which I can’t name yet, sorry) because I would have three months of consistent time in Premiere to become familiar with it. The assistant editor on the film ordered a Loupedeck+ when he heard I had one. To our surprise, both of the panels sat idle for most of the duration of that project, even though we were using Premiere for 12 to 16 hours a day. There are a few reasons for that, from my own experience and perspective:

1) Using Premiere Pro 12 involved a delay — which made the controls, especially the dials, much less interactive — but that has been solved in Premiere 13. Unfortunately, we were stuck in version 12 on the film for larger reasons.

2) That said, even in Premiere 13, every time you rotate a dial, it sends a series of individual commands to Premiere, so one or two adjustments can fill up your entire actions history. Pressing a dial resets its value to the default, which alleviates the need to undo that adjustment, but what about the other edit I just made before that? Long gone by that point. If you are just color correcting, this limitation isn’t an issue, but if you are alternating between color adjustments and other fixes as you work through a sequence, it is a potential problem.

Loupedeck+

A solution? Limit each adjustment so that it’s seen as a single action until another value is adjusted or until a second or two go by — similar in principle to linear keyframe thinning, when you use sliders to make audio level adjustments.

3) Lastly, there was the issue of knowing what each button and dial would do, since there are a lot of them (40 buttons and 14 dials), and they are only marked for their functionality in Lightroom. I also couldn’t figure out how to map it to the functions I wanted to use the most (the intrinsic motion effect values).

The first issue will solve itself as I phase out Premiere 12 once this project is complete. The second could be resolved by some programming work by Loupedeck or Adobe, depending on where the limitations lie. Also, adding direct access to more functions in the Loupedeck utility would make it more useful to my workflows — specifically, access to the motion effect values. But all of that hinges on me being able to remember the functions associated with each control, and on those functions being more efficient than doing it with my mouse and keyboard.
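
To make that second fix concrete, here is a rough sketch of the coalescing idea in Python. It is purely illustrative, not Loupedeck’s or Adobe’s actual code, and commit_adjustment is a hypothetical stand-in for the host application recording one undo step.

```python
# Illustrative sketch only: a burst of dial ticks becomes one logical
# action once the dial has been idle for a short window, so a single
# adjustment occupies a single slot in the undo history.
import time

class DialCoalescer:
    def __init__(self, idle_window=1.0):
        self.idle_window = idle_window  # seconds of inactivity that close an action
        self.pending = 0                # accumulated ticks in the open action
        self.last_tick = None

    def on_tick(self, delta):
        """Called for every raw increment the hardware sends."""
        now = time.monotonic()
        if self.last_tick is not None and now - self.last_tick > self.idle_window:
            self.flush()  # the previous burst becomes one undoable action
        self.pending += delta
        self.last_tick = now

    def flush(self):
        if self.pending:
            commit_adjustment(self.pending)
            self.pending = 0

def commit_adjustment(total_delta):
    # Hypothetical stand-in for the host app recording one undo step.
    print(f"one undoable action: {total_delta:+d} ticks")

# Usage: call coalescer.on_tick(+1) per hardware event, plus a final
# flush() when the control surface goes idle.
```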

What solution works for you?

Dedicated Interface vs. Mouse/Keyboard
The Loupedeck has led to a number of interesting debates about the utility of a dedicated interface for editing compared to a mouse and/or keyboard. I think this is a very interesting topic, as the interface between the system and the user is the heart of what we do. The monitor(s) and speakers are the flip side of that interface, completing the feedback loop. While I have little opinion on speakers because most of my work is visual, I have always been very into having the “best” monitor solutions and figuring out exactly what “best” means.

It used to mean two 24-inch WUXGA panels, and then it meant a 30-inch LCD. Then I discovered that two 30-inch LCDs were too much for me to use effectively. Similarly, 4K had too many pixels for a 27-inch screen in Windows 7. An ultrawide 34-inch 3440×1440 is my current favorite, although my 32-inch 8K display is starting to grow on me now that Windows 10 can usually scale content on it smoothly.

Our monitor is how our computer communicates with us, and the mouse and keyboard are how we communicate with it. The QWERTY keyboard is a relic from the typewriter era, designed to be inefficient, to prevent jamming the keys. Other arrangements have been introduced but have not gained widespread popularity. The mouse is a much more flexible analog input device for giving more nuanced feedback. (Keys are only on or off, no in-between.) But it is not as efficient at discrete tasks as a keyboard shortcut, provided that you can remember it.

Keyboard shortcuts

This conundrum has led to debates about the best or most efficient way of controlling applications on the system, and editors have some pretty strong opinions on the matter. I am not going to settle it once and for all, but I am going to attempt to step back and look at the bigger picture. Many full-time operators who have become accustomed to their applications are very fast using their keyboard shortcuts, and Avid didn’t even support mouse editing on the timeline until a few years ago. This leads many of those operators to think that keyboard shortcuts are the most efficient possible method of operating, dismissing the possibility that there might be better solutions. But I am confident that people starting from scratch could be at least as efficient, if not more so, using an interface that was actually designed for what they are doing.

Loupedeck is by no means the first or only option in that regard. I have had a Contour Shuttle Pro 2 for many years and have used it on rare occasions for certain highly repetitive tasks. Blackmagic sells a number of physical interface options for Resolve, including its new editing keyboard, and there have been many others for color correction, which is the focus of the Loupedeck’s design as well.

Shuttle Pro 2

Many people also use tablets or trackballs as a replacement for the mouse, but that is usually about ergonomics and doesn’t compete with keyboard functionality. These other dedicated interfaces are designed to replace some of the keyboard and mouse functionality, but none of them totally replace the QWERTY keyboard, as we will still have to be able to type, to name files, insert titles, etc. But that is what a keyboard is designed to do, compared to pressing the spacebar for playback or Ctrl+K to add a layer slice. Those functions have been assigned to the keyboard for convenience, but they are not intrinsically connected to those keys.

There is no denying the keyboard is a fairly flexible digital input tool, consistently available on nearly all systems and designed to give your fingers lots of easily accessible options. Editors are hardly the only people repurposing it or attempting to use it to maximize efficiency in ways it wasn’t originally designed for. Gamers wear out their WASD keys because their functionality has nothing to do with their letter values and is entirely based on their position on the board. And while other interfaces have been marketed, most gamers are still using a QWERTY keyboard and mouse as their primary physical interface. People are taught the QWERTY keyboard from an early age to develop unconscious muscle memory and, ideally, to allow them to type as they think.

QWERTY keyboard

Once those unconscious links are developed, it is relatively easy to repurpose them for other uses. You think “T” and you press it without thinking about where it is. This is why the keyboard is so efficient as an input device, even outside of the tasks it was originally designed for. But what is preventing people from becoming as efficient with their other physical interfaces? Time with the device and good design are required. Controls have to be able to be identified by touch, without looking, to make that unconscious link possible, which is the reason for the bumps on your F and J keys. But those mental finger mappings may compete with your QWERTY muscle memory, which you are still going to need to be an effective operator, so certain people might be better off sticking with that.

If you are super-efficient with your keyboard shortcuts, and they do practically everything you need, then you are probably not in the target market for the Loupedeck or other dedicated interfaces. If you aren’t that efficient on your keyboard, or you do more analog tasks (color correction) that don’t take place with the discrete steps provided by a keyboard, then a dedicated interface might be more attractive to you. Ironically, my primary temp color tool on my recent film was Lumetri curves, which aren’t necessarily controlled by the Loupedeck.


Mike’s solution

That was more about contrast because “color” isn’t really my thing, but for someone who uses those tools that the Loupedeck is mapped to, I have no doubt the Loupedeck would be much faster than using mouse and keyboard for those functions. Mapping the dials to the position, scale and opacity values would improve my workflow, and that currently works great in After Effects, especially in 3D, but not in Premiere Pro (yet). Other functions like slipping and sliding clips are mapped to the Loupedeck dials, but they are not marked, making them very hard to learn. My solution to that is to label them.

Labeling the Loupedeck
I like the Loupedeck, but I have trouble keeping track of the huge variety of functions available, with four possible tasks assigned to each dial per application. Obviously, it would help if the functions were fairly consistent across applications, but currently, by default, they are not. There are some simple improvements that can be made, but not all of the same functions are available, even between Premiere and After Effects. Labeling the controls would be helpful, even just in the process of learning them, but they change between apps, so I don’t want to take a Sharpie to the console itself.

Loupedeck CT

The solution I devised was to make cutouts, which can be dropped over the controls, with the various functions labeled with color-coded text. There are 14 dials, 40 buttons and four lights that I had to account for in the cutout. I did separate label patterns for Premiere, After Effects and Photoshop. They were initially based on the Loupedeck’s default settings for those applications, but I have created custom cutouts that have more consistent functionality when switching between the various apps.

Loupedeck recently introduced the new Loupedeck CT (Creative Tool), which sells for $550. At more than twice the price, it is half the size and labels its buttons and dials with LCD screens that change to reflect the functions available in whatever application and workspace you are in. The cheaper Loupedeck+ offers a similar capability, but statically, across its much larger set of controls.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

The Call of the Wild director Chris Sanders on combining live-action, VFX

By Iain Blair

The Fox family film The Call of the Wild, based on the Jack London tale, tells the story of a big-hearted dog named Buck who is stolen from his California home and transported to the Canadian Yukon during the Gold Rush. Director Chris Sanders called on the latest visual effects and animation technology to bring the animals in the film to life. The film stars Harrison Ford and is based on a screenplay by Michael Green.

Sanders’ crew included two-time Oscar–winning cinematographer Janusz Kaminski; production designer Stefan Dechant; editors William Hoy, ACE, and David Heinz; composer John Powell; and visual effects supervisor Erik Nash.

I spoke with Sanders — who has helmed the animated films Lilo & Stitch, The Croods and How to Train Your Dragon — about making the film, which features a ton of visual effects.

You’ve had a very successful career in animation, but wasn’t this a very ambitious project to take on for your live-action debut?
It was. It’s a big story, but I felt comfortable because it has such a huge animated element, and I felt I could bring a lot to the party. I also felt up to the task of learning — and having such an amazing crew made all of that as easy as it could possibly be.

Chris Sanders on set.

What sort of film did you set out to make?
As true a version as we could tell in a family-friendly way. No one’s ever tried to do the whole story. This is the first time. Before, people just focused on the last 30 pages of the novel and focused on the relationship between Buck and John Thornton, played by Harrison. And that makes perfect sense, but what you miss is the whole origin story of how they end up together — how Buck has to learn to become a sled dog, how he meets the wolves and joins their world. I loved all that, and also all the animation needed to bring it all alive.

How early on did you start integrating post and all the visual effects?
Right away, and we began with previs.

Your animation background must have helped with all the previs needed on this. Did you do a lot of previs, and what was the most demanding sequence?
We did a ton. In animation it’s called layout, a rough version, and on this we didn’t arrive on set without having explored the sequence many times in previs. It helped us place the cameras and block it all, and we also improvised and invented on set. But previs was a huge help with any heavy VFX element, like when Thornton’s going down river. We had real canoes in a river in Canada with inertial measurement devices and inertial recorders, and that was the most extensive recording we had to do. Later in post, we had to replace the stuntman in the canoe with Thornton and Buck in an identical canoe with identical movements. That was so intensive.


How was it working with Harrison Ford?
The devotion to his craft and professionalism… he really made me understand what “preparing for a role” really means, and he really focused on Thornton’s back story. The scene where he writes the letter to his wife? Harrison dictated all of that to me and I just wrote it down on top of the script. He invented all that. He did that quite a few times. He made the whole experience exciting and easy.

The film has a sort of retro look. Talk about working with DP Janusz Kaminski.
We talked about the look a lot, and we both wanted to evoke those old Disney films we saw as kids — something very rich with a magical storybook feel to it. We storyboarded a lot of the film, and I used all the skills I’d learned in animation. I’d see sequences a certain way and draw them out, and sometimes we’d keep them and cut them into editorial, which is exactly what you do in animation.

How tough was the shoot? It must have been quite a change of pace for you.
You’re right. It was about 50 days, and it was extremely arduous. It’s the hardest thing I’ve ever done physically, and I was not fully prepared for how exhausted you get — and there’s no time to rest. I’d be driving to set by 4:30am every day, and we’d be shooting by 6am. And we weren’t even in the Yukon — we shot here in California, a mixture of locations doubling for the Yukon and stage work.


Where did you post?
All on the Fox lot, and MPC Montreal did all the VFX. We cut it in relatively small offices. I’m so used to post, as all animation is basically post. I wish it was faster, but you can’t rush it.

You had two editors — William Hoy and David Heinz. How did that work?
We sent them dailies and they divided up the work since we had so much material. Having two great voices is great, as long as everyone’s making the same movie.

What were the big editing challenges?
The creative process in editorial is very different from animation, and I was floored by how malleable this thing was. I wasn’t prepared for that. You could change a scene completely in editorial, and I was blown away at what they could accomplish. It took a long time because we came back with over three hours of material in the first assembly, and we had to crush that down to 90 minutes. So we had to lose a huge amount, and what we kept had to be really condensed, and the narrative would shift a lot. We’d take comedic bits and make them more serious and vice versa.

Visual effects play a key role. Can you talk about working on them with VFX supervisor Erik Nash?
I love working with VFX, and they were huge in this. I believe there are fewer than 30 shots in the whole film that don’t have some VFX. And apart from creating Buck and most of the other dogs and animals, we had some very complex visual effects scenes, like the avalanche and the sledding sequence.

L-R: Director Chris Sanders and writer Iain Blair

We had VFX people on set at all times. Erik was always there supervising the reference. He’d also advise us on camera angles now and then, and we’d work very closely with him all the time. The cameras were hooked up to send data to our recording units so that we always knew what lens was on what camera at what focal length and aperture, so later the VFX team knew exactly how to lens the scenes with all the set extensions and how to light them.

The music and sound also play a key role, especially for Buck, right?
Yes, because music becomes Buck’s voice. The dogs don’t talk like they do in Lion King, so it was critical. John Powell wrote a beautiful score that we recorded on the Newman Stage at Fox, and then we mixed at 5 Cat Studios.

Where did you do the DI, and how important is it to you?
We did it at Technicolor with colorist Mike Hatzer, and I’m pretty involved. Janusz did the first pass and set the table, and then we fine-tuned it, and I’m very happy with the rich look we got.

Do you want to direct another live-action film?
Yes. I’m much more comfortable with the idea now that I know what goes into it. It’s a challenge, but a welcome one.

What’s next?
I’m looking at all sorts of projects, and I love the idea of doing another hybrid like this.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: Blue Bolt VFX supervisor Richard Frazer

“If we have done our job well, the viewers should never notice the work and instead just be enjoying the storytelling.”

Name: Richard Frazer

Company: London’s BlueBolt

Can you describe your company?
For the last four years, I’ve worked at BlueBolt, a Soho-based visual effects company in London. We work on high-end TV and feature films, and our main area of specialty is creating CG environments and populating them. BlueBolt is a privately owned company run by two women, which is pretty rare. They believe in nurturing good talent and training artists up to help them break through the glass ceiling, if they’re up for it.

What’s your job title?
I joined as a lead compositor with a view to becoming a 2D supervisor, and now I am one of the studio’s core VFX supervisors.

What does that entail?
It means I oversee all of the visual effects work for a specific TV show or movie — from script stage to final delivery. That includes working with the director and DP in preproduction to determine what they would like to depict on the screen. We then work out what is possible to shoot practically, or if we need to use visual effects to help out.

I’ll then often be on the set during the shoot to make sure we correctly capture everything we need for post work. I’ll work with the VFX producer to calculate the costs and time scales of the VFX work. Finally, I will creatively lead our team of talented artists to create those rendered images and make sure it all fits in with the show in a visually seamless way.

What would surprise people the most about what falls under that title?
The staggering amount of time and effort involved by many talented people to create something that an audience should be totally unaware exists. If we have done our job well, the viewers should never notice the work and instead just be enjoying the storytelling.

How long have you been working in VFX?
For around a decade. I started out as a rotoscope artist in 2008 and then became a compositor. I did my first supervisor job back in 2012.

How has the VFX industry changed in the time you’ve been working?
A big shift has been just how much more visual effects work there is on TV shows and how much the standard of that work has improved. It used to be that TV work was looked down on as the poor cousin of feature film work. But shows like Game of Thrones have set audience expectations so much higher now. I worked on nothing but movies for the first part of my career, but the majority of my work now is on TV shows.

Did a particular film inspire you along this path in entertainment?
I grew up on ‘80s sci-fi and horror, so movies like Aliens and The Thing were definitely inspirations. This was back when effects were almost all done practically, so I wanted to get into model-making or prosthetics. The first time I remember being blown away by digital VFX work was seeing Terminator 2 at the cinema. I’ve ended up doing the type of work I dreamed of as a kid, just in a digital form.

Did you go to film school?
No, I actually studied graphic design. I worked for some time doing animation, video editing and motion graphics, and I taught myself compositing for commercials using After Effects. But I always had a love of cinema and decided to try specializing in this area. Almost all of what I’ve learned has been on the job. I think there’s no better training than throwing yourself at the work, absorbing everything you can from the people around you and being passionate about what you do.

What’s your favorite part of the job?
Each project has its own unique set of challenges, and every day involves creative problem-solving. I love the process of translating what only exists in someone’s imagination and the journey of creating those images in a way that looks entirely real.

I also love the mix of being at the offices one day creating things that only exist in a virtual world, while the next day I might be on a film set shooting things in the real world. I get to travel to all kinds of random places and get paid to do so!

What’s your least favorite?
There are so many moving parts involved in creating a TV show or movie — so many departments all working together trying to complete the task at hand, as well as factors that are utterly out of your control. You have to have a perfectly clear idea of what needs to be done, but also be able to completely scrap that and come up with another idea at a moment’s notice.

If you didn’t have this job, what would you be doing instead?
Something where I can be creative and make things that physically exist. I’m always in awe of people who build and craft things with their hands.

Can you name some recent projects you have worked on?
Recent work has included Peaky Blinders, The Last Kingdom and Jamestown, as well as a movie called The Rhythm Section.

What is the project that you are most proud of?
I worked on a movie called Under the Skin a few years ago, which was a very technically and creatively challenging project. It was a very interesting piece of sci-fi that people seem to either love or hate, and everyone I ask seems to have a slightly different interpretation of what it was actually about.

What tools do you use day to day?
Almost exclusively Foundry Nuke. I use it for everything from drawing up concepts to reviewing artists’ work. If there’s functionality that I need from it that doesn’t exist, I’ll just write Python code to add features.
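
As a flavor of what that scripting can look like, here is a minimal, hypothetical sketch using Nuke’s Python API. The menu path, label text and helper function are illustrative examples, not BlueBolt’s actual tools.

```python
# Hypothetical sketch of a small Nuke Python extension: stamp selected
# nodes with a review tag and expose it as a one-click menu command.
import nuke

def label_selected_nodes(tag):
    """Set each selected node's label so it stands out during review."""
    for node in nuke.selectedNodes():
        node['label'].setValue(tag)

# Register the helper under a custom menu (typically from menu.py).
nuke.menu('Nuke').addCommand('Custom/Label for Review',
                             lambda: label_selected_nodes('REVIEW'))
```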

Where do you find inspiration now?
In the real world, if you just spend the time observing it in the right way. I often find myself distracted by how things look in certain light. And Instagram — it’s the perfect social media for me, as it’s just beautiful images, artwork and photography.

What do you do to de-stress from it all?
The job can be quite mentally and creatively draining, and you spend a lot of time in dark rooms staring at screens, so I try to do the opposite of that. Anything that involves being outdoors or doing something physical — I find cycling or boxing are good ways to unwind.

I recently went on a paragliding trip in the French Alps, which was great, but I found myself looking at all these beautiful views of sunsets over mountains and just analyzing how the sunlight was interacting with the fog and the atmospheric hazing. Apparently, I can never entirely turn off that part of my brain.

Kent Zambrana joins design house ATK PLN as senior producer

Dallas-based design studio ATK PLN has added Kent Zambrana as senior producer. Zambrana has over a decade of experience in production, overseeing teams of artists working on live action, animation and design projects. Over the years, he has worked at a number of creative shops across the agency and production sides of the business where he developed media campaigns for omni-channel video ecosystems, interactive projects and future tech.

Says Zambrana, “ATK PLN’s offerings across design, animation and live action fit nicely within my skill set. I’m looking forward to leveraging my production expertise and direct-to-brand and agency perspectives to better serve their clients and continue to grow their offerings.”

Zambrana, who studied radio, television and film at the University of Texas in Austin, started his pro career in Los Angeles, where he spent four years producing work across The Simpsons properties, including the television series, theme park ride, digital platforms and promotional campaigns. He brought that experience back to Texas, where he became a supervising producer at Invodo, building out the video content library of the startup’s digital video platform. He then became head of production, producing animation, live action, 2D and 3D work. When the company was acquired by Industrial Color in 2018, he led both the Dallas and Austin offices as senior director of production before joining The Marketing Arm to lead its in-house production shop.

How does Zambrana relax? He can be found rehearsing, recording and performing with his indie pop band, Letting Up Despite Great Faults.

Video Coverage: HPA Tech Retreat’s making of The Lost Lederhosen

By Randi Altman

At the HPA Tech Retreat in Rancho Mirage, California, the Supersession was a little different this year. Under the leadership of Joachim (JZ) Zell — who you might know from his day job as VP of technology at EFILM — the Supersession focused on the making of the short film, The Lost Lederhosen, in “near realtime,” in the desert. And postPerspective was there, camera in hand, to interview a few of the folks involved.

Watch our video coverage here.

While production for the film began a month before the Retreat — with Steve Shaw, ASC, directing and DP Roy H. Wagner Jr., ASC, lending his cinematography talents — some scenes were shot the morning of the session with data transfer taking place during lunch and post production in the afternoon. Peter Moss, ASC, and Sam Nicholson, ASC, also provided their time and expertise. After an active day of production, cloud-based post and extreme collaboration, the Supersession ended with the first-ever screening of The Lost Lederhosen, the story of Helga and her friend Hans making their way to Los Angeles, Zell and the HBA (Hollywood Beer Alliance). Check out HPA’s trailer here.

From acquisition to post (and with the use of multiple camera formats, frame rates and lenses), the film’s crew was made up of volunteers and included creatives and technologists from companies such as AWS, Colorfront, Frame.io, Avid, Blackmagic, Red, Panavision, Zeiss, FilmLight, SGO, Stargate, Unreal Engine, Sohonet and many more. One of the film’s goals was to use the cloud as much as possible in order to test out that particular workflow. While there were some minor hiccups along the way, the film got made — at the HPA Tech Retreat — and these industry pros got smarter about working in the cloud, something that will be increasingly employed going forward.

While we were only able to chat with a handful of the pros involved, like any movie, the list of credits and thank-yous is too extensive to mention here — there were dozens of individuals and companies who donated their services and time to make this possible.

Watch our video coverage here.

(A big thank you and shout out to Twain Richardson for editing our videos.)

Main Image Caption: AWS’ Jack Wenzinger and EFILM’s Joachim Zell

Matt Shaw on cutting Conan Without Borders: Ghana and Greenland

By Randi Altman

While Conan O’Brien was airing his traditional one-hour late night talk show on TBS, he and his crew would often go on the road to places like Cuba, South Korea and Armenia for Conan Without Borders — a series of one-hour specials. He would focus on regular folks, not celebrities, and would embed himself into the local culture… and there was often some very mediocre dancing, courtesy of Conan. The shows were funny, entertaining and educational, and he enjoyed doing them.

Conan and Matt on the road.

In 2019, Conan and his crew, Team Coco, switched the nightly show from one hour to a new 30-minute format. The format change allowed them to produce three to four hour-long Conan Without Borders specials per year. Two of the places the show visited last year were Ghana and Greenland. As you might imagine, they shoot a lot of footage, which all must be logged and edited, often while on the road.

Matt Shaw is one of the editors on Conan, and he went on the road with the show when it traveled to Greenland. Shaw’s past credits include Deon Cole’s Black Box and The Pete Holmes Show (both from Conan O’Brien’s Conaco production company) and The Late Late Show with James Corden (including Carpool Karaoke). One of his first gigs for Team Coco was editing Conan Without Borders: Made in Mexico. That led to a full-time editing gig on Conan on TBS and many fun adventures.

We reached out to Shaw to find out more about editing these specials and what challenges he faced along the way.

You recently edited Conan Without Borders — the Greenland and Ghana specials. Can you talk about preparing for a job like that? What kind of turnaround did you have?
Our Ghana special was shot back in June 2019, with the original plan to air in August, but it was pushed back to November 7 because of how fast the Greenland show came up.

In terms of prep for a show like Ghana, we mainly just know the shooting specs and will handle the rest once the crew actually returns. For the most part, that’s the norm. Ideally, we’ll have a working dark week (no nightly Conan show), and the three editors — me, Rob Ashe and Chris Heller — will take the time to offload, sync and begin our first cuts of everything. We’ll have been in contact with the writers on the shoot to get an idea of what pieces were shot and their general notes from the day.

With Greenland, we had to mobilize and adjust everything to accommodate a drastically different shoot/delivery schedule. The Friday before leaving, while we were prepping the Ghana show to screen for an audience, we heard there might be something coming up that would push Ghana back. On Monday, we heard the plan was to go to Greenland on Wednesday evening, after the nightly show, and turn around Greenland in place of Ghana’s audience screening. We had to adjust the nightly show schedule to still have a new episode ready for Thursday while we were in Greenland.

How did you end up on the Greenland trip?
Knowing we’d have only six days between returning from Greenland and finishing the show for broadcast, our lead editor, Rob Ashe, suggested we send an editor to work on location. We were originally looking into sending footage via Aspera from a local TV studio in Nuuk, Greenland, but we just wouldn’t have been able to turn it around fast enough. We decided about two days before the trip began that I’d go and do what I could to offload, backup, sync and do first cuts on everything.
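
As an aside for readers unfamiliar with that first step: offloading and backing up camera cards is typically verified with checksums so no clip is lost or corrupted in the copy. Below is a minimal sketch of the idea in Python, with invented paths; real productions generally rely on dedicated offload tools rather than hand-rolled scripts.

```python
# Minimal sketch of a checksum-verified card offload (paths invented
# for the example). Every file is copied, then both sides are hashed
# to confirm the copy is bit-for-bit identical.
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def offload(card: Path, backup: Path) -> None:
    """Copy a card's contents and verify each file by hash."""
    for src in card.rglob("*"):
        if not src.is_file():
            continue
        dst = backup / src.relative_to(card)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
        if sha256(src) != sha256(dst):
            raise IOError(f"Verification failed: {src}")

offload(Path("/Volumes/P2_CARD"), Path("/Volumes/GRAID/day01"))
```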

How much footage did you have per episode, and what did they shoot on?
Ghana had close to 17 hours of material shot over five days on Sony Z450s at 4K XAVC, 29.97. Greenland was closer to 12 hours shot over three days on Panasonic HPX 250s, P2 media recording at 1080 60i.

We also used iPhone/iPad/GoPro footage picked up by the rest of the crew as needed for both shows. I also had a DJI Osmo pocket camera to play with when I had a chance, and we used some of that footage during the montage of icebergs.

So you were editing segments while they were still shooting?
In Greenland, I was cutting daily in the hotel. Midday, I’d get a drop of cards, then offload, sync/group and start first cuts on everything. We had a simple offline edit workflow set up where I’d upload my cuts to Frame.io and email my project files to the team — Rob and Chris — in Burbank. They would then download and sync the Frame.io file to a top video layer in the timeline and continue cutting down, with any additional notes from the writers.

Generally, I’d have everything from Day One uploaded by the start of Day Two, and so on. It seemed to work out pretty well to set us up for success when we returned. I was also getting requests to cut a few highlights from our remotes to post on Team Coco’s Instagram account.

On our return day, we flew to Ilulissat for an iceberg expedition. We had about two hours on the ground before having to return to the airport and fly to Kangerlussuaq, where our chartered plane was waiting to take us back to California. On the flight back, I worked for another four hours or so to sort through the remaining segments and prep everything so we could hit the ground running the following morning. During the flight home, we screened some drone footage from the iceberg trip for Conan, and it really got everyone excited.

What are the challenges of working on the road and with such tight turnarounds?
The night we left for Greenland was preceded by a nightly show in Burbank. After the show ended, we hopped on a plane to fly eight hours to Kangerlussuaq for customs, then another to Nuuk. The minute we landed, we were filming for about three hours before checking into the hotel. I grabbed the morning’s camera cards, went to my room and began cutting. By the time I went to bed, I had cuts done of almost everything from the first day. I’m a terrible sleeper on planes, so the marathon start was pretty insane.

On top of the lack of sleep, our offload speeds were slower because we were using different cameras than usual — for the sake of traveling lighter, since the plane we flew in had specific weight restrictions. We actually had to hire local crew for audio and B and C camera because there wasn’t enough room for everyone in the plane to start.

In general, I think the overall trip went as smoothly as it could have. It would be interesting to see how it would play out for a longer shoot schedule.

What editing system did you use? What was your setup like? What kind of storage were you using?
On the road I had my MacBook Pro (2018 model), and we rented an identical backup machine in case mine died. For storage, we had four 1TB G-Tech USB-C drives and a 4TB G-RAID to back everything up. I had a USB-3.0 P2 card reader as well and multiple backup readers. A Bluetooth mouse and keyboard rounded out the kit, so I could travel with everything in a backpack.

We had to charter a plane in order to fly directly to Greenland. With such a tight turnaround between filming and delivering the actual show, this was the only way to actually make the special happen. Commercial flights fly only a few days per week out of neighboring countries, and once you’re in Greenland, you either have to fly or take a boat from city to city.

Matt Shaw editing on plane.

On the plane, there was a conference table in the back, so I set up there with one laptop and the G-RAID to continue working. The biggest trouble on the plane was making sure everything stayed secure on the table while taking off and making turns. There were a few close calls when everything started to slide away, and I had to reach to make sure nothing was disconnected.

How involved in the editing is Conan? What kind of feedback did you get?
In general, if Conan has specific notes, the writers will hear them during or right after a shoot is finished. Or we’ll test-screen something after a nightly show taping and indirectly get notes from the writers then.

There will be special circumstances, like our cold opens for Comic-Con, when Conan will come to edit and screen a close-to-final cut. And there just might be a run of jokes that isn’t as strong, but he lets us work with the writers to make what we all think is the best version by committee.

Can you point to some of the more challenging segments from Greenland or Ghana?
The entire show is difficult given the delivery-time constraints while handling the nightly show. We’ll sometimes be editing the versions for screening up to 10 minutes before they have to screen for an audience, as well as doing all the finishing (audio mix, color as needed, subtitling and deliverables).

For any given special, we’re each cutting our respective remotes during the day while working on any new comedy pieces for that day’s show, then one or two of us will split the work on the nightly show, while the other keeps working with the travel show writers. In the middle of it all, we’ll cut together a mini tease or an unfinished piece to play into that night’s show to promote the specials, so the main challenge is juggling 30 things at a time.

For me, I got to edit this 1980s-style action movie trailer based on an awesome poster Conan had painted by a Ghanaian artist. We had puppets built, a lot of greenscreen and a body double to composite Conan’s head onto for fight scenes. Story-wise, we didn’t have much of a structure to start, but we had to piece something together in the edit and hope it did the ridiculous poster justice.

The Thursday before our show screened for an audience was the first time Mike Sweeney (head writer for the travel shows) had a chance to look at any greenscreen footage, knowing we were test-screening it the following Monday or Tuesday. It started to take shape when one of our graphics/VFX artists, Angus Lyne, sent back some composites. In the end, it came together great and killed with the audience and our staff, who had already seen anything and everything.

Our other pieces seem to have a linear story, and we try to build the best highlights from any given remote. With something like this trailer, we have to switch our thought process to really build something from scratch. In the case of Greenland and Ghana, I think we put together two really great shows.

How challenging is editing comedy versus drama? Or editing these segments versus other parts of Conan’s world?
In a lot of the comedy we cut, the joke is king. There are always instances when we have blatant continuity errors, jump cuts, etc., but we don’t have to kill ourselves trying to make it work in the moment if it means hurting the joke. Our “man on the street” segments are great examples of this. Obviously, we want something to be as polished and coherent as possible, but there are cases when it just isn’t best, in our opinion, and that’s okay.

That being said, when we do our spoofs of whatever ad or try to recreate a specific style, we’re going to do everything to make that happen. We recently shot a bit with Nicholas Braun from Succession where he’s trying to get a job from Conan during his hiatus from the show. This was a mix of improv and scripted, and we had to match the look of Succession. It turned out well, funny and very much in the vein of the show.

What about for the Ghana show?
For Ghana, we had a few segments that were extremely serious and emotional. For example, Conan and Sam Richardson visited Osu Castle, a major slave trade port. This segment demands care and needs to breathe so the weight of it can really be expressed, versus earlier in the show, when Conan was buying a Ghana shirt from a street vendor, and we hard-cut to him wearing a shirt 10 sizes too small.

And Greenland?
Greenland is a place really affected by climate change. My personal favorite segment I’ve cut on these travel specials is one about the impact the melting ice caps could have on the world. Then there is a montage of the icebergs we saw, followed by Conan attempting to stake a “Sold” sign on an iceberg, signifying he had bought property in Greenland for the US. Originally, the montage had a few jokes within the segment, but we quickly realized it’s so beautiful we shouldn’t cheapen it. We just let it be beautiful.

Comedy or drama, it’s really about being aware of what you have in front of you and what the end goal is.

What haven’t I asked that’s important?
For me, it’s important to acknowledge how talented our post team is to be able to work simultaneously on a giant special while delivering four shows a week. Being on location for Greenland also gave me a taste of the chaos the whole production team and Team Coco goes through, and I think everyone should be proud of what we’re capable of producing.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

The Den editorial boutique launches in Los Angeles

Christjan Jordan, editor of award-winning work for clients including Amazon, GEICO and Hulu, has partnered with industry veteran Mary Ellen Duggan to launch The Den, an independent boutique editorial house in Los Angeles.

Over the course of his career, Jordan has worked with Arcade Edit, Cosmo Street and Rock Paper Scissors, among others. He has edited such spots as Alexa Loses Her Voice for Amazon, Longest Goal Celebration Ever for GEICO, #notspecialneeds for World Down Syndrome Day out of Publicis NY and Super Bowl 2020 ads Tom Brady’s Big Announcement for Hulu and Famous Visitors for Walmart. Jordan’s work has been recognized by the Cannes Lions, AICE, AICP, Clio, D&AD, One Show and Sports Emmy awards.

“With Mary Ellen, agency producers are guided by an industry veteran who knows exactly what agencies and clients are looking for,” says Jordan. “And for me, I love fostering young editors. It’s an interesting time in our industry and there is a lot of fresh creative talent.”

In her career, Duggan has headed production departments at both KPB and Cliff Freeman on the East Coast and, most recently, Big Family Table in Los Angeles. In addition, she has freelanced all over the country.

“The stars aligned for Christjan and I to work together,” says Duggan. “We had known each other for years and had recently worked on a Hulu campaign together. We had a similar vision for what we thought the editorial experience should be. A high end boutique editorial that is nimble, has a roster of diverse talent, and a real family vibe.”

Veteran producer Rachel Seitel has joined as partner and head of business development. The Den will be represented by Diane Patrone at The Family on the East Coast and by Ezra Burke and Shane Harris on the West Coast.

The Den’s founding roster also features editor Andrew Ratzlaff and junior editor Hannelore Gomes. The staff works on Avid Media Composer and Adobe Premiere.

ILM’s virtual production platform used on The Mandalorian

To bring The Mandalorian to life, Industrial Light & Magic (ILM) and Epic Games — along with production technology partners Fuse, Lux Machina, Profile Studios, Nvidia and ARRI — have introduced a new way to shoot VFX-heavy projects in collaboration with Jon Favreau’s Golem Creations.

The new virtual production workflow allows filmmakers to capture a significant amount of complex visual effects shots in-camera using realtime game engine technology (Epic’s Unreal Engine) and LED screens to represent dynamic photoreal digital landscapes and sets with creative flexibility previously unimaginable.

Also part of the news, ILM has made its new end-to-end virtual production solution, ILM StageCraft, available for use by filmmakers, agencies and showrunners worldwide.

Over 50 percent of The Mandalorian Season 1 was filmed using this new methodology, eliminating the need for location shoots entirely. Instead, actors in The Mandalorian performed within an immersive, massive 20-foot-high, 270-degree semicircular LED video wall and ceiling with a 75-foot-diameter performance space, where the practical set pieces were combined with digital extensions on the screens.

Digital 3D environments created by ILM played back interactively on the LED walls, edited in realtime during the shoot, which allowed for pixel-accurate tracking and perspective-correct 3D imagery rendered at high resolution via systems powered by Nvidia GPUs.

L-R: Jon Favreau and Richard Bluff

The environments were lit and rendered from the perspective of the camera to provide parallax in real time, as if the camera were really capturing the physical environment with accurate interactive light on the actors and practical sets, giving showrunner Favreau; executive producer/director Dave Filoni; visual effects supervisor Richard Bluff; cinematographers Greig Fraser and Barry “Baz” Idoine and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and achieve realtime in-camera composites on set.
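
The geometric trick behind those perspective-correct renders is, at its core, the classic generalized (off-axis) perspective projection: the view frustum is recomputed every frame from the tracked camera’s position relative to the fixed screen plane. The sketch below shows that textbook construction in Python with NumPy; it illustrates the principle only and is not ILM’s actual StageCraft code.

```python
# Textbook off-axis projection (Kooima's generalized perspective
# projection): frustum extents for an eye position relative to a
# fixed screen, e.g. an LED panel. Illustration of the principle,
# not ILM's StageCraft implementation.
import numpy as np

def offaxis_frustum(pa, pb, pc, pe, near):
    """pa/pb/pc: screen corners (lower-left, lower-right, upper-left)
    in world units; pe: eye/camera position; near: near-plane distance.
    Returns (left, right, bottom, top) for a glFrustum-style matrix."""
    pa, pb, pc, pe = [np.asarray(p, dtype=float) for p in (pa, pb, pc, pe)]
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal
    va, vb, vc = pa - pe, pb - pe, pc - pe            # eye-to-corner vectors
    d = -np.dot(va, vn)                               # eye-to-screen distance
    left   = np.dot(vr, va) * near / d
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d
    return left, right, bottom, top

# A 6m-wide by 3m-tall wall with the camera 4m back and 1m off-center:
print(offaxis_frustum((-3, 0, 0), (3, 0, 0), (-3, 3, 0),
                      (1.0, 1.5, 4.0), near=0.1))
```

Re-running this every frame as the camera moves is what keeps the parallax on the wall correct from the lens’s point of view.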

The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of all the partners involved.

“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of realtime, in-camera rendering,” explains Favreau, adding, “We are proud of what was achieved and feel that the system we built was the most efficient way to bring The Mandalorian to life.”

“Merging our efforts in the space with what Jon Favreau has been working toward using virtual reality and game engine technology in his filmmaking finally gave us the chance to execute the vision,” says Rob Bredow, executive creative director and head of ILM. “StageCraft has grown out of the culmination of over a decade of innovation in the virtual production space at ILM. Seeing our digital sets fully integrated, in real time on stage, providing the kind of in-camera shots we’ve always dreamed of while also providing the majority of the lighting was really a dream come true.”

Bluff adds, “Working with Kim Libreri and his Unreal team, Golem Creations, and the ILM StageCraft team has opened new avenues to both the filmmakers and my fellow key creatives on The Mandalorian, allowing us to shoot principal photography on photoreal, virtual sets that are indistinguishable from their physical counterparts while incorporating physical set pieces and props as needed for interaction. It’s truly a game-changer.”

ILM StageCraft’s production tools combine traditional filmmaking equipment and methodologies with all of the advantages of a fully digital workflow. With ILM StageCraft, a production can acquire many in-camera finals, allowing filmmakers immediate and complete creative control of work typically handed off and reinterpreted in post, improving the quality of visual effects shots with perfectly integrated elements and reducing visual effects requirements in post, which is a major benefit considering today’s compressed schedules.

LVLY adds veteran editor Bill Cramer

Bill Cramer, an editor known for his comedy and dialogue work, among other genres, has joined the editorial roster at LVLY, a content creation and creative studio based in New York City.

Cramer joins from Northern Lights. Prior to that he had spent many years at Crew Cuts, where he launched his career and built a strong reputation for his work on many ads and campaigns. Clients included ESPN, GMC, LG, Nickelodeon, Hasbro, MLB, Wendy’s and American Express. Check out his reel.

Cramer reports that he wasn’t looking to make a move but that LVLY’s VP/MD, Wendy Brovetto, inspired him. “Wendy and I knew of each other for years, and I’d been following LVLY since they did their top-to-bottom rebranding. I knew that they’re doing everything from live-action production to podcasting, VFX, design, VR and experiential, and I recognized that joining them would give me more opportunities to flex as an editor. Being at LVLY gives me the chance to take on any project, whether that’s a 30-second commercial, music video or long-form branded content piece; they’re set up to tackle any post production needs, no matter the scale.”

“Bill’s a great comedy/dialogue editor, and that’s something our clients have been looking for,” says Brovetto. “Once I saw the range of his work, it was an easy decision to invite him to join the LVLY team. In addition to being a great editor, he’s a funny guy, and who doesn’t need more humor in their day?”

Cramer, who works on both Avid Media Composer and Adobe Premiere, joins an editorial roster that includes Olivier Wicki, J.P. Damboragian, Geordie Anderson, Noelle Webb, Joe Siegel and Aaron & Bryan.

Senior colorist Tony D’Amore joins Picture Shop

Burbank’s Picture Shop has beefed up its staff with senior colorist Tony D’Amore, who will also serve as a director of creative workflow. In that role, he will oversee a team focusing on color prep and workflow efficiency.

Originally from rural Illinois, D’Amore made the leap to the West Coast to pursue an education, studying film and television at UCLA. He started his career in color in the early ‘90s, gaining valuable experience in the world of post. He has been working closely with color and post workflow since.

While D’Amore has experience working on Autodesk Lustre and FilmLight Baselight, he primarily grades in Blackmagic DaVinci Resolve. D’Amore has contributed color to several Emmy Award-winning shows nominated in the category of “Outstanding Cinematography.”

D’Amore has developed new and efficient workflows for Dolby Vision HDR and HDR10, coloring hundreds of hours of episodic programming for networks including CBS, ABC and Fox, as well as cable and streaming platforms such as HBO, Starz, Netflix, Hulu and Amazon.

D’Amore’s most notable project to date is having colored a Marvel series simultaneously for IMAX and ABC delivery. His list of color credits includes Barry (HBO), Looking for Alaska (Hulu), Legion (FX), Carnival Row (Amazon), Power (Starz), Fargo (FX), Elementary (CBS), Hanna (Amazon) and a variety of Marvel series, including Jessica Jones, Daredevil, The Defenders, Luke Cage and Iron Fist. All of these are available on streaming platforms.

Behind the Title: Dell Blue lead editor Jason Uson

This veteran editor started his career at LA’s Rock Paper Scissors, where he spent four years learning the craft from editors such as Bee Ottinger and Angus Wall. After freelancing at Lost Planet, Spot Welders and Nomad, he held staff positions at Cosmo Street, Harpo Films and Beast Editorial before opening his own post boutique in Austin, Foundation Editorial.

NAME: Jason Uson

COMPANY: Austin, Texas-based Dell Blue

Can you describe what Dell Blue does?
Dell Blue is the in-house agency for Dell Technologies.

What’s your job title?
Senior Lead Creative Editor

What does that entail?
Outside of the projects that I am editing personally, there are multiple campaigns happening simultaneously at all times. I oversee all of them and have my eyes on every edit, fostering and mentoring our junior editors and producers to help them grow in their careers.

I’ve helped establish and maintain the process regarding our workflow and post pipeline. I also work closely with our entire team of creatives, producers, project managers and vendors from the beginning of each project and follow it through from production to post. This enables us to execute the best possible workflow and outcome for every project.

To add another layer to my role, I am also directing spots for Dell when the project is right.

Alienware

That’s a lot! What else would surprise people about what falls under that title?
The number of hours that go into making sure the job gets done and is the best it can be. Editing is a process that takes time. Creating something of value that means something is an art no matter how big or small the job might be. You have to have pride in every aspect of the process. It shows when you don’t.

What’s your favorite part of the job?
I have two favorites. The first is the people. I know that sounds cliché, but it’s true. The team here at Dell is truly something special. We are family. We work together. Play together. Happy Hour together. Respect, support and genuinely care for one another. But, ultimately, we care about the work. We are all aligned to create the best work possible. I am grateful to be surrounded by such a talented and amazing group of humans.

The second, which is equally important to me, is the process of organizing my project, watching all the footage and pulling selects. I make sure I have what I need and check it off my list. Music, sound effects, VO track, graphics and anything else I need to get started. Then I create my first timeline. A blank, empty timeline. Then I take a deep breath and say to myself, “Here we go.” That’s my favorite.

Do you have a least favorite?
My least favorite part is wrapping a project. I spend so much time with my clients and creatives and we really bond while working on a project together. We end on such a high note of excitement and pride in what we’ve done and then, just like that, it’s over. I realize that sounds a bit dramatic. Not to worry, though, because lucky for me, we all come back together in a few months to work on something new and the excitement starts all over again.

What is your most productive time of day?
This also requires a two-part answer. The first is early morning. This is my time to get things done, uninterrupted. I go upstairs and make a fresh cup of coffee. I open my deck doors. I check and send emails, and get my personal stuff done. This clears out all of my distractions for the day before I jump into my edit bay.

The second part is late at night. I get to replay all of the creative decisions from the day and explore other options. Sometimes, I get lucky and find something I didn’t see before.

If you didn’t have this job, what would you be doing instead?
That’s easy. I’d be a chef. I love to cook and experiment with ingredients. And I love to explore and create an amazing dining experience.

I see similarities between editors and chefs. Both aim to create something impactful that elicits an emotional response from the “elements” they are given. For chefs, the ingredients, spices and techniques are creatively brought together to bring a dish to life.

For editors, the “elements” I am given, in combination with my style, techniques, sound design, graphics, music, etc., all give life to a spot.

How early did you know this would be your path?
I had originally moved to Los Angeles with dreams of becoming an actor. Yes, it’s groundbreaking, I know. During that time, I met editor Dana Glauberman (The Mandalorian, Juno, Up in the Air, Thank You for Smoking, Creed II, Ghostbusters: Afterlife). I had lunch with her at the studios one day in Burbank and went on a tour of the backlot. I got to see all the edit bays, film stages, soundstages and machine rooms. To me, this was magic. A total game-changer in an instant.

While I was waiting on that one big role, I got my foot in the door as a PA at editing house Rock Paper Scissors. One night after work, we all went for drinks at a local bar, and it seemed like every commercial on TV was one that editors Angus Wall and Adam Pertofsky had worked on within the last month. I was blown away. Something clicked.

This entire creative world behind the scenes was captivating to me. I made the decision at that moment to lean in and go for it. I asked the assistant editor the following morning if he would teach me — and I haven’t looked back. So, Dana, Angus and Adam… thank you!

Can you name some of your recent projects?
I edited the latest global campaign for Alienware called Everything Counts, which was directed by Tony Kaye. More recently, I worked on the campaign for Dell’s latest and greatest business PC laptop that launches in March 2020, which was directed by Mac Premo.

Dell business PC

Side note: I highly recommend Googling Mac Premo. His work is amazing.

What project are you most proud of?
There are two projects that stand out for me. The first one is the very first spot I ever cut — a Budweiser ad for director Sam Ketay and the Art Institute of Pasadena. During the edit, I thought, “Wow, I think I can do this.” It went on to win a Clio.

The second is the latest global campaign for Alienware, which I mentioned above. Director Tony Kaye is a genius. Tony and I sat in my edit bay for a week exploring and experimenting. His process is unlike any other director I have worked with. This project was extremely challenging on many levels. I honestly started looking at footage in a very different way. I evolved. I learned. And I strive to continue to grow every day.

Name three pieces of technology you can’t live without.
Wow, good question. I guess I’ll be that guy and say my phone. It really is a necessity.

Spotify, for sure. I am always listening to music in my car and trying to match artists with projects that are not even in existence yet.

My Bose noise cancelling headphones.

What social media channels do you follow?
I use Facebook and LinkedIn — mainly to stay up to date on what others are doing and to post my own updates every now and then.

I’m on Instagram quite a bit. Outside of the obvious industry-related accounts I follow, here are a few of my random favorites:

@nuts_about_birds
If you love birds as much as I do, this is a good one to follow.

@sergiosanchezart
This guy is incredible. I have been following his work for a long time. If you are looking for a new tattoo, look no further.

@andrewhagarofficial
I was lucky enough to meet Andrew through my friend @chrisprofera and immediately dove into his music. Amazing. Not to mention his dad is Sammy Hagar. Enough said.

@kaleynelson
She’s a talented photographer based in LA. Her concert stills are impressive.

@zuzubee
I love graffiti art and Zuzu is one of the best. Based in Austin, she has created several murals for me. You can see her work all over the city, as well as installations during SXSW and Austin City Limits, on Bud Light cans, and across the US.

Do you listen to music at work? What types?
I do listen to music when I work but only when I’m going through footage and pulling selects. Classical piano is my go-to. It opens my mind and helps me focus and dive into my footage.

Don’t get me wrong, I love music. But if I am jamming to my favorite, Sammy Hagar, I can’t drive…I mean dive… into my footage. So classical piano for me.

How do you de-stress from it all?
This is an understatement, but there are a few things that help me out. Sometimes during the day, I will take a walk around the block. Get a little vitamin D and fresh air. I look around at things other than my screen. This is something (editors) Tom Muldoon and John Murray at Nomad used to do every day. I always wondered why. Now I know. I come back refreshed and with my mind clear and ready for the next challenge.

I also “like” to hit the gym immediately after I leave my edit bay. Headphones on (Sammy Hagar, obviously), stretch it out and jump on the treadmill for 30 minutes.

All that is good and necessary for obvious reasons, but getting back to cooking… I love being in the kitchen. It’s therapy for me. Whether I am chopping and creating in the kitchen or out on the grill, I love it. And my wife appreciates my cooking. Well, I think she does at least.

Photo Credits: Dell PC and Jason Uson images – Chris Profera

Adam Milano joins Hecho Studios from Live Nation

Hecho Studios in LA has named Adam Milano as head of development, a new title at the company. He makes the move from Live Nation where he served as SVP of production. Milano will report directly to Hecho Studios’ president, Briony McCarthy.

In addition to creating and producing content, Hecho has 11 editorial bays, two audio suites, full finishing capabilities with color and picture, and an animation and motion graphics team.

During his time at Live Nation, Milano produced the GLAAD-awarded and Emmy-nominated Believer documentary with Imagine Dragons’ frontman Dan Reynolds. He also worked across Live Nation’s slate of music-driven projects, including The Afterparty, a hip-hop comedy directed by Ian Edelman, which premiered on Netflix in 2018, and Gaga: Five Foot Two.

Milano’s previous posts include SVP of film at Simon Cowell’s SYCO Entertainment, where he helped Cowell launch the US division and SYCO’s scripted film/television division. While there, he produced the One Direction movie, This is Us. Milano began his career at Sony Pictures Entertainment, working from 2002-2012 as VP of production.

“Adam is both radically in tune with culture and a consummate experienced creator and producer,” says McCarthy. “As our first head of development, he’ll focus on developing our upcoming slate of reality, pop culture and music content and support our newly formed branded content division.”

Milano will lead the charge in diversifying Hecho’s originals entertainment and branded content productions into more scripted, reality and music-based programming in the context of current culture.

PBS celebrates 50 years with new on-air graphics

LA-based Nathaniel Howe Studios (NHS) has partnered with PBS and creative consultancy Lippincott to create a new on-air graphics package to coincide with the public broadcaster’s updated identity. This includes a refreshed logo, bolder color palette and custom typeface. The new on-air look for PBS — home to shows such as Masterpiece, Nature and Frontline — will roll out throughout 2020 as the network celebrates its 50th anniversary.

PBS looked to NHS to translate its new identity for modern screens while providing brand coherency at both the national and local levels with its 300-plus member stations — the studio called on Adobe’s Creative Suite to create the look. “Nathaniel and his team took our multi-platform vision to heart and developed a broad range of inspired ideas,” says Don Wilcox, VP, multi-platform marketing and content at PBS.

“The design and animation play a supporting role, framing the content and delivering all the key information effortlessly within the new ‘digital-first’ brand architecture,” explains NHS founder/CD Howe, adding that he really enjoyed getting to work with people deeply connected to the PBS brand. “Some of our clients have been with PBS for over 20 years. It was rewarding to serve a brand that is so loved across this country, one that does so much good through storytelling, and to find the balance of respecting its history while subtly evolving for the future. In the process, we also got to meet so many diverse people across the country and help to solve their creative challenges.”

Howe explains that the PBS logo provided the perfect framework to keep the visual system focused while reinforcing the brand in a subtle yet unified way. “Its new flat design also lent itself well to the motion theory behind the package, which favors minimal design elements, gentle key frames and purposeful applications of accent colors to complement the hero PBS blue.”

NHS kicked off this massive project during the early phases of the rebrand strategy, working closely with PBS and Lippincott to help translate the updated identity for digital and broadcast screens. To address the unique needs of PBS’ member stations (a process that included multiple phases of testing and feedback), NHS delivered a customizable Adobe After Effects toolkit and led a nationwide on-boarding process. This included the production of video tutorials and webinars as well as in-studio training programs and presentations for PBS summits and conferences.

“Our greatest challenge was solving almost endless co-branding scenarios within an After Effects toolkit and maintaining balance between unification and local market self-expression,” explains Howe. “This project took place over the course of a year, so we had to keep the focus locked and the fire lit throughout. And we also had to fight off the challenge of adding extra design elements or complexity for the sake of it. Simplicity was the key here.”

According to Howe, the vitals of the PBS rebrand live within a master tool kit that is quick and easy to use for everyone. “The beauty of Lippincott’s minimalistic branding system came into play here as it enabled us to eliminate technical limitations, standardize the graphics creation process, and speed up workflows across the board.”

Howe is no stranger to PBS. For over a decade, he has collaborated with the channel on on-air graphics promos for several Ken Burns documentaries (Jackie Robinson, Prohibition), the Indian Summers series and the PBS Arts Fall Festival. He also helmed the brand refresh of PBS’ long-running anthology series, Great Performances, and sizzle reels for network summits.

“We simply wanted this package to generate excitement around PBS while honoring the integrity of the brand and the value it offers in our cluttered media landscape,” concludes Howe. “As a team, our hearts were aligned from the outset — and as a new father, I was personally inspired by the thought-provoking and educational nature of the content PBS offers to such a broad-reaching audience.”

Amazon’s The Expanse Season 4 gets HDR finish

The fourth season of the sci-fi series The Expanse, streaming via Amazon Prime Video, was finished in HDR for the first time. Deluxe Toronto handled end-to-end post services, including online editorial, sound remixing and color grading. The series was shot on ARRI Alexa Minis.

In preparation for production, cinematographer Jeremy Benning, CSC, shot anamorphic test footage at a quarry that would serve as the filming stand-in for the season’s new alien planet, Ilus. Deluxe Toronto senior colorist Joanne Rourke then worked with Benning, VFX supervisor Bret Culp, showrunner Naren Shankar and series regular Breck Eisner to develop looks that would convey the location’s uninviting and forlorn nature, keeping the overall look desaturated and removing color from the vegetation. Further distinguishing Ilus from other environments, production chose to display scenes on or above Ilus in a 2.39 aspect ratio, while those featuring Earth and Mars remained in a 16:9 format.

“Moving into HDR for Season 4 of our show was something Naren and I have wanted to do for a couple of years,” says Benning. “We did test HDR grading a couple seasons ago with Joanne at Deluxe, but it was not mandated by the broadcaster at the time, so we didn’t move forward. But Naren and I were very excited by those tests and hoped that one day we would go HDR. With Amazon as our new home [after airing on Syfy], HDR was part of their delivery spec, so those tests we had done previously had prepared us for how to think in HDR.

“Watching Season 4 come to life with such new depth, range and the dimension that HDR provides was like seeing our world with new eyes,” continues Benning. “It became even more immersive. I am very much looking forward to doing Season 5, which we are shooting now, in HDR with Joanne.”

Rourke, who has worked on every season of The Expanse, explains, “Jeremy likes to set scene looks on set so everyone becomes married to the look throughout editorial. He is fastidious about sending stills each week, and the intended directive of each scene is clear long before it reaches my suite. This was our first foray into HDR with this show, which was exciting, as it is well suited for the format. Getting that extra bit of detail in the highlights made such a huge visual impact overall. It allowed us to see the comm units, monitors, and plumes on spaceships as intended by the VFX department and accentuate the hologram games.”

After making adjustments and ensuring initial footage was even, Rourke then refined the image by lifting faces and story points and incorporating VFX. This was done with input provided by producer Lewin Webb; Benning; cinematographer Ray Dumas, CSC; Culp or VFX supervisor Robert Crowther.

To manage the show’s high volume of VFX shots, Rourke relied on Deluxe Toronto senior online editor Motassem Younes and assistant editor James Yazbeck to keep everything in meticulous order. (For that they used the Grass Valley Rio online editing and finishing system.) The pair’s work was also essential to Deluxe Toronto re-recording mixers Steve Foster and Kirk Lynds, who have both worked on The Expanse since Season 2. Once ready, scenes were sent in HDR via Streambox to Shankar for review at Alcon Entertainment in Los Angeles.

“Much of the science behind The Expanse is quite accurate thanks to Naren, and that attention to detail makes the show a lot of fun to work on and more engaging for fans,” notes Foster. “Ilus is a bit like the wild west, so the technology of its settlers is partially reflected in communication transmissions. Their comms have a dirty quality, whereas the ship comms are cleaner-sounding and more closely emulate NASA transmissions.”

Adds Lynds, “One of my big challenges for this season was figuring out how to make Ilus seem habitable and sonically interesting without familiar sounds like rustling trees or bird and insect noises. There are also a lot of amazing VFX moments, and we wanted to make sure the sound, visuals and score always came together in a way that was balanced and hit the right emotions story-wise.”

Foster and Lynds worked side by side on the season’s 5.1 surround mix, with Foster focusing on dialogue and music and Lynds on sound effects and design elements. When each had completed his respective passes using Avid ProTools workstations, they came together for the final mix, spending time on fine strokes, ensuring the dialogue was clear, and making adjustments as VFX shots were dropped in. Final mix playbacks were streamed to Deluxe’s Hollywood facility, where Naren could hear adjustments completed in real time.

In addition to color finishing Season 4 in HDR, Rourke also remastered the three previous seasons of The Expanse in HDR, using her work on Season 4 as a guide and finishing with Blackmagic DaVinci Resolve 15. Throughout the process, she was mindful to pull out additional detail in highlights without altering the original grade.

“I felt a great responsibility to be faithful to the show for the creators and its fans,” concludes Rourke. “I was excited to revisit the episodes and could appreciate the wonderful performances and visuals all over again.”

DP Chat: Watchmen cinematographer Greg Middleton

By Randi Altman

HBO’s Watchmen takes us to new dimensions in this recent interpretation of the popular graphic novel. In this iteration, we spend a lot of our time in Tulsa, Oklahoma, getting to know Regina King’s policewoman Angela Abar, her unconventional family and a shadowy organization steeped in racism called the Seventh Kavalry. We also get a look back — beautiful in black and white — at Abar’s tragic family back story. It was created and written for TV by Lost veteran Damon Lindelof.

Greg Middleton

Greg Middleton, ASC, CSC, who also worked on Game of Thrones and The Killing, was the series cinematographer. We reached out to him to find out about his process, workflow and where he gets inspiration.

When were you brought on to Watchmen, and what type of looks did the showrunner want from the show?
I joined Watchmen after the pilot for Episode 2. A lot of my early prep was devoted to discussions with the showrunner and producing directors on how to develop the look from the pilot going forward. This included some pilot reshoots due to changes in casting and the designing and building of new sets, like the police precinct.

Nicole Kassell (director of Episodes 1, 2 and 8), series production designer Kristian Milstead and I spent a lot of time breaking down the possibilities of how we could define the various worlds through color and style.

How was the look described to you? What references were you given?
We based the evolution of the look of the show on the scripts, the needs of the structure within the various worlds and on the graphic novel, which we commonly referred to as “the Old Testament.”

As you mention, it’s based on a graphic novel. Did the look give a nod to that? If so, how? Was that part of the discussion?
We attempted to break down the elements of the graphic novel that might translate well and those that would not. It’s an interesting bit of detective work because a lot of the visual cues in the comic are actually a commentary on the style of comics at the time it was published in 1985.

Those cues, if taken literally, would not necessarily work for us, as their context would not be clear. Things like color were very referential to other comics of the time. For example, they used only secondary colors instead of primaries, which were the norm. The graphic novel is also a film noir in many ways, so we got some of our ideas based on that.

What did translate well were compositional elements — tricks of transition like match cuts and the details of story in props, costumes and sets within each frame. We used some split diopters and swing/shift lenses to give us some deep-focus effects for large foreground objects. In the graphic novel, of course, everything is in focus, so those types of compositions are common!

This must have been fun because of the variety of looks the series has — the black-and-white flashbacks, the stylized version of Tulsa, the look of the mansion in Wales (Europa), Vietnam in modern day. Can you talk about each of the different looks?
Yes, there were so many looks! When we began prep on the series with the second episode, we were also simultaneously beginning to film the scenes in Wales for the “blond man” scenes. We knew that that storyline would have its own particular feel because of the location and its very separateness from the rest of the world.

A more classic traditional proscenium-like framing and style seemed very appropriate. Part of that intent was designed to both confuse and to make very apparent to the audience that we were definitely in another world. Cinematographer Chris Seager, BSC, was filming those scenes as I was still doing tests for the other looks and the rest of our show in Atlanta.

We discussed lenses, camera format, etc. The three major looks we had to design that we knew would go together initially were our “Watchmen” world, the “American hero story” show within the show, and the various flashbacks to 1921 Tulsa and World War I. I was determined to make sure that the main world of the show did not feel overly processed and colorized photographically. We shot many tests and developed a LUT that was mostly film-like. The other important aspects to creating a look are, of course, art direction and production design, and I had a great partner in Kristian Milstead, the production designer who joined the show after the pilot.

This was a new series. Do you enjoy developing the look of a show versus coming on board after the look was established?
I enjoy helping to figure out how to tell the story. For series, helping develop the look photographically in the visual strategy is a big part of that. Even if some of those are established, you still do similar decision-making for shooting individual scenes. However, I much prefer being engaged from the beginning.

So even when you weren’t in Wales, you were providing direction?
As I mentioned earlier, Chris Seager and I spoke and emailed regarding lenses and those choices. It was still early for us in Atlanta, but there were preliminary decisions to be made on how the “blond man” (our code name for Jeremy Irons) world would be compared to our Watchmen world. What I did was consult with my director, Nicole Kassell, on her storyboards for her sequences in Wales.

Were there any scenes or looks that stood out as more challenging than others? Can you describe?
Episode 106 was a huge challenge. We have a lot of long takes involving complex camera moves and dimmer cues as a camera would circle or travel between rooms. Also, we developed the black-and-white look to feel like older black-and-white film.

One scene in June’s apartment involved using the camera on a small Scorpio 10-foot crane and a mini Libra head to accomplish a slow move around the room. Then we had to push between her two actors toward the wall as an in-camera cue of a projected image of the black-and-white movie Trust in the Law reveals itself with a manual iris.

This kind of shot ends up being a dance with at least six people, not including the cast. The entire “nostalgia” part of the episode was done this way. And none of it would’ve been possible without an incredible cast able to hit these incredibly long takes and choreograph themselves with the camera. Jovan Adepo and Danielle Deadwyler were incredible throughout the episode.

I assume you did camera tests. Why did you choose the ARRI Alexa? Why was it right for this? What about lenses, etc.?
I have been working with the Alexa for many years now, so I was aware of what I could do with the camera. I tested a couple of others, but in the end the Alexa Mini was the right choice for us. I also needed a camera that was small so I could go on and off of a gimbal or fit into small places.

How did you work with the colorist? Who was that on this show? Were you in the suite with them?
Todd Bochner was our final colorist at Sim in LA. I shot several camera tests and worked with him in the suite to help develop viewing LUTs for the various worlds of the show. We did the same procedure for the black and white. In the end, we mimicked some techniques from black-and-white film (like red filters), except for us it was a matter of adjusting the channels accordingly.
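
For readers unfamiliar with the channel-based approach Middleton describes, the sketch below shows the general idea in Python with NumPy: a red-filter-style black-and-white conversion reweights RGB toward the red channel before collapsing to a single luminance value. The weights are illustrative only, not the show’s actual grade.

```python
# Rough illustration of a red-filter-style black-and-white
# conversion via channel mixing, the general technique described
# above (not the show's actual Resolve pipeline). A plain luminance
# mix would be roughly (0.21, 0.72, 0.07); weighting red mimics
# shooting B&W through a red filter: skies darken, skin tones lift.
import numpy as np

def bw_channel_mix(rgb, weights=(0.7, 0.2, 0.1)):
    """rgb: float array (..., 3) in [0, 1]; returns a mono image."""
    w = np.asarray(weights, dtype=rgb.dtype)
    w = w / w.sum()                     # keep overall exposure constant
    return np.clip(rgb @ w, 0.0, 1.0)

frame = np.random.rand(1080, 1920, 3).astype(np.float32)  # stand-in frame
mono = bw_channel_mix(frame)
```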

Do you know what they used on the color?
Yes, it was Blackmagic DaVinci Resolve 16.

How did you get interested in cinematography?
I was always making films as a kid, then in school and then in university. In film school, when we started splitting apart the various jobs, I seemed to have some aptitude for cinematography, so after school I decided to try making it my focus. I came to it more out of a love of storytelling and filmmaking and less out of a love of photography.

Greg Middleton

What inspires you? Other films?
Films that move me emotionally.

What’s next for you?
A short break! I’ve been very fortunate to have been working a lot lately. A film I shot just before Watchmen called American Woman, directed by Semi Chellas, should be coming out this year.

And what haven’t I asked that’s important?
I think the question all filmmakers should ask themselves is, “Why am I telling this story, and what is unique about the way in which I’m telling it?”


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Review: GoPro’s Hero 8 and GoPro Max 360 cameras

By Brady Betzel

Every year GoPro outdoes itself, giving users more reasons to either upgrade an old GoPro action camera or invest in the GoPro ecosystem for the first time. Late last year, GoPro introduced the Hero 8 (reviewed here in its Black Edition) and the GoPro Max — a.k.a. the reimagined Fusion 360 camera.

If you aren’t sure whether you want to buy one or both of the new GoPro cameras, take a look at the GoPro TradeUp program, where you can send in any camera with an original retail value of $99.99 or more and receive $100 off the Hero 8 or $50 off the Hero 7. Check out the TradeUp program. That at least will take the sting off of the $399.99-or-higher price.

GoPro Hero 8 Black Edition
Up first, I’ll take a look at the GoPro Hero 8 Black Edition. If you own the Hero 7 Black Edition, you will be familiar with many of the Hero 8’s features, but there are some major improvements from the Hero 7 to the Hero 8 that will make you think hard about upgrading. The biggest update, in my opinion, is the increase in the max bit rate to 100 Mb/s. With that increase comes better-quality video (more data = more information = more details). GoPro also allows you to use the HEVC (H.265) codec to compress videos, a much-improved successor to the aging H.264. You can get into the weeds on the H.264 vs. H.265 codecs over at Frame.io’s blog, where they have some really great (and nerdy) info.

Anyway, with the bit rate increased, any video from the GoPro Hero 8 Black Edition has the potential to look better than the Hero 7’s. I say “potential” because if the bit rate doesn’t need to be that high, the GoPro won’t force it to be — it’s a variable bit rate codec that only uses data when it needs to.
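
To put those numbers in perspective, here is the quick back-of-the-envelope storage math (a sketch; real files come in under the ceiling because of the variable bit rate):

```python
# Storage ceiling implied by a video bit rate. Actual GoPro files
# are usually smaller, since the codec only spends bits as needed.
def gb_per_minute(mbps: float) -> float:
    bits = mbps * 1e6 * 60      # bits recorded in one minute
    return bits / 8 / 1e9       # convert to gigabytes

for rate in (100, 78):          # Hero 8 ceiling vs. GoPro Max ceiling
    print(f"{rate} Mb/s ~= {gb_per_minute(rate):.2f} GB per minute")
# -> 100 Mb/s ~= 0.75 GB/min; 78 Mb/s ~= 0.58 GB/min
```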

Beyond the increased bit rate, there are some other great features. The menu system has been improved even further than the Hero 7’s, making it easier to get shooting without setting a bunch of advanced settings. There are presets to set up your GoPro quickly and easily, depending on what you are filming: Standard (1080, 60, Wide), Activity (2.7K, 60, SuperView) and Cinematic (4K, 30, Linear). Live streaming has been upped from 720p to 1080p, but we are still stuck with streaming from the GoPro through your phone natively to YouTube and Facebook. You can stream to other sites, like Twitch, by setting up an RTMP URL — but Instagram is still off the list. This is unfortunate because I think live Instagram streams/stories would be its biggest feature.

Hypersmooth 2.0 is an improved version of the original Hypersmooth (still on the Hero 7). You can stabilize even further by enabling the "Boost" function, which adds more stabilization at the cost of a larger crop on your video. Hypersmooth is an incredible, gimbal-replacing stabilization that GoPro has engineered with the help of automatic horizon leveling, GPS and other telemetry data.

Coming soon are external attachments called "Mods": a Media Mod adding an HDMI output, a 3.5mm microphone jack and two cold-shoe mounts; a Display Mod adding a flip-up screen so you can see yourself from the front (think vlogging); and a Light Mod, which adds an LED to the camera for lighting. The Light Mod ($49.99) and Display Mod ($79.99) each require the Media Mod ($79.99) to be purchased as well. Keep an eye on the GoPro store for info on pre-orders and purchases.

GoPro cameras have always taken great pictures and video… when the lighting is perfect. Even with the Hero 8, low light is GoPro's Achilles' heel — the video becomes muddy and hard to view. The Hero 8 shares the same camera sensor as the Hero 7 and even the same GP1 processor, but GoPro squeezed more out of the hardware with features like TimeWarp 2.0 and Hypersmooth 2.0. So you will get similar images out of both cameras at a base level, but given the added technology in the Hero 8 Black Edition, if you have the extra $100, you should get the latest version. Overall, the Hero 8 Black Edition is physically thinner (albeit a little taller), the battery compartment is no longer on the bottom, and the Hero 8 now has a built-in mount! No more need for extra cages if you don't need them.

GoPro Max
The last time I used a GoPro 360 camera, it was the Fusion: a hard-to-use, bulky 360 camera that required two separate memory cards. It was not my favorite camera to test; in fact, it felt like it needed another round of beta testing. Fast forward to today and the recently released GoPro Max, which is what the Fusion should have been. While I think the Max is a marked improvement over the Fusion, it is not necessarily for everyone. If you specifically need to make 360 content, or you want a new GoPro but also want to keep your options open to filming in 360 degrees, then the Max is for you.

The GoPro Max costs $499.99, so it's not much more than a Hero 8 Black, and it can shoot in 360 degrees. One of its best features is the ability to shoot like a traditional GoPro (i.e., with one lens). Unfortunately, you are limited to 1080p/1440p at 60/30/24fps — and, maybe worst of all, there's no slow-mo. You don't always need to shoot in 360 with the Max, but you don't quite get the full lineup of frame rates offered by the traditional Hero 8 Black Edition.

In addition, the bit rate of the Max maxes out (pun intended) at 78Mb/s, not the 100Mb/s of the Hero 8. But if you want to shoot in 360, the GoPro Max is pretty simple to get running. It will even capture 360 audio with its six-mic array and shoot 24fps or 30fps at 5.6K resolution for stitched spherical video. Best of all, editing the spherical video is much easier in the latest GoPro phone app update. Unfortunately, editing on a Windows computer is not as easy as on the phone.

I am using a computer with Windows 10, an Intel i7 6-core CPU, 32GB of memory and an Nvidia RTX 2080 GPU — so not a slow machine. You can find all of the Windows software and plugins for 360 video on GoPro's website, though it seems like GoPro is trying to bury its desktop software, because these Windows apps weren't easy to find. Nonetheless, you will want to download the GoPro Max Exporter and the GoPro FX Reframe plugin for Adobe apps if you plan on editing your footage. Before I go any further: if you want a video tutorial, I suggest Abe Kislevitz's tutorial on the GoPro Max workflow in Adobe Premiere. He is really good at explaining it. Abe works at GoPro in media production and always has great videos; check out his website while you're at it.

Moving on: in macOS, iOS and Android, you can work with the GoPro Max's native 360 file format via GoPro's own apps. The format uses a Google-created mapping known as EAC, or equi-angular cubemap. Get all the nerdy bits here.

Unfortunately, in Windows you need to convert the 360 videos into the "old school" equirectangular format we're used to. To do this, you run your 360 videos through the GoPro Max Exporter, which can use the power of both the CPU and GPU to create new files. This extra step stinks; I can't sugarcoat it. But with better compression and mapping structures come growing pains… I guess. GoPro is supposedly building a Windows-based reframer and exporter like the macOS version, but after trawling some GoPro forums, it seems the company either has a lot of work to do or has concluded that most users will stick to their phones or a macOS-based system.
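As an aside, FFmpeg's v360 filter can also remap an equi-angular cubemap to equirectangular, which may be worth a try on Windows if the GoPro Exporter isn't in your pipeline. This is a hedged sketch, not GoPro's supported path: .360 containers have quirks (some clips carry the image across two video streams), so the file names and the assumption that the first video stream holds the whole EAC image need verifying against your own footage.

```python
# Sketch: remap an EAC (equi-angular cubemap) clip to equirectangular with
# FFmpeg's v360 filter. File names are illustrative; the "-map" line assumes
# the first video stream carries the full EAC image, which isn't true of
# every GoPro .360 file.
import subprocess

src = "GS010001.360"             # hypothetical GoPro Max clip
dst = "GS010001_equirect.mp4"

subprocess.run([
    "ffmpeg", "-i", src,
    "-map", "0:v:0",             # first video stream only
    "-vf", "v360=input=eac:output=equirect",
    "-c:v", "libx264", "-crf", "18",  # near-visually-lossless H.264
    "-an",                       # drop the spatial audio for this quick pass
    dst,
], check=True)
```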

Either way, the process to reframe your 360 video and export it as 1920×1080 (or 1080×1920) goes like this:
• Convert your 360 files to a more usable CineForm, H.265 or H.264 QuickTime.
• Import the converted files into Adobe Premiere.
• Create a sequence that matches your largest output size (i.e., 1920×1080).
• Apply the GoPro FX Reframe plugin and set it to your output size.
• Keyframe rotation, edit, reframe, etc.
• Export.

It's not awful as long as you know what you are getting yourself into. But take that with a grain of salt; I am a professional video editor who works in video 10-12 hours a day, at least five days a week, so I am very comfortable with this kind of stuff. It would take more work if I weren't an editor by trade. If I were to try to explain this to my wife or kids, they might roll their eyes at me and just ask me to do it… understandably. If you want to upload to a site like YouTube without editing, you will still need to convert the 360 videos to something more manageable for YouTube, like H.264 or H.265.

When working in Premiere with the GoPro FX Reframe plugin, I actually found the effect-controls overlay intuitive for panning and tilting. The hard part is keyframing those moves: adjusting the bezier-curve keyframe graph is not exactly intuitive, which makes it a challenge to get smooth starts and ends on your pans and tilts… even with Easy Ease In/Out set. With a little work you can get it done, but I really hope GoPro gets its Windows-based GoPro Max 360 editor up and running so I don't have to use Adobe.
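For what it's worth, the thing the keyframe graph is trying to express is just an easing curve: a pan that ramps its speed up and down instead of moving linearly. Here's a standard cubic ease-in-out as a rough illustration of what Easy Ease approximates; the math is generic animation easing, not anything from the GoPro plugin.

```python
# Sketch: cubic ease-in-out, the classic "slow in, slow out" curve used
# for smooth pans. t runs from 0 (start of the move) to 1 (end).
def ease_in_out(t):
    return 4 * t**3 if t < 0.5 else 1 - ((-2 * t + 2) ** 3) / 2

def pan_angle(frame, total_frames, start_deg, end_deg):
    """Interpolate a pan between two angles with eased timing."""
    t = frame / (total_frames - 1)
    return start_deg + (end_deg - start_deg) * ease_in_out(t)

# A 48-frame pan from 0 to 90 degrees: note the gentle start and finish.
for f in (0, 6, 12, 24, 36, 42, 47):
    print(f, round(pan_angle(f, 48, 0.0, 90.0), 2))
```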

After I keyframed a couple of pans and tilts on footage I took while riding It's a Small World at Disneyland with my three sons, I exported a quick H.264. About 50 seconds of footage took four minutes to export. Not exactly speedy. But playing back my footage in a 1920×1080 timeline in Premiere at ¼ resolution with the GoPro plugin applied was actually pretty smooth.

You can check out the video I edited with the GoPro FX Reframe plugin in Premiere on my YouTube channel: https://youtu.be/frhJ3T8fzmE. I then wanted to take my video of the Tea Cup ride at Disneyland straight from the camera and upload it to YouTube. Unfortunately, you still need to convert the video in the GoPro Max Exporter to an H.264 or the like, but this time it took only 1:45 to export a two-minute 4K video. You can check out the 4K 360 Tea Cup ride on YouTube.

Final Thoughts
Are the GoPro Hero 8 Black Edition and the GoPro Max 360 camera worth an upgrade or a new purchase? I definitely think the Hero 8 is worth it. GoPro always makes great cameras that are waterproof down to 33 feet, take great pictures when there is enough available light, create amazing hyperlapses and timewarps in-camera with little work, and let you fine-tune your shutter speed or capture flat-color footage with ProTune settings. And all of it fits in your pocket.

I used to take a DSLR to Disneyland, but with Hypersmooth 2.0 and the improved HDR images that come out of the Hero 8, this is the only camera you will need.

The GoPro Max is a different story. I like what the Max does technically: it films 360 video easily and can double as a 1440p Hero camera. But I found myself fumbling a little with the Max because of its larger footprint compared to the Hero 8 (though it's definitely smaller than the old Fusion). When editing on your phone or doing a straight upload, the videos are relatively easy to process, as long as you don't mind the export time. Unfortunately, using Premiere to edit these videos is a little rough; it's better than with the Fusion, but it's not simple.

Maybe I'm just getting older and don't want to give in to VR/360 video yet, or maybe the process just isn't as smooth as I had hoped. But if you want to work in 360 and don't mind the lack of slow motion, I wouldn't tell you not to buy the GoPro Max. It's a great camera with durable lenses and a durable exterior case.

Check out the videos on YouTube and see what amazing people like Abe Kislevitz are doing; they may just show you what you need to see. And check out www.gopro.com for more info including when those new Hero 8 Mods will be available.


 

Destin Daniel Cretton talks directing Warner’s Just Mercy

By Iain Blair

An emotionally powerful and thought-provoking true story, Just Mercy is the latest film from award-winning filmmaker Destin Daniel Cretton (The Glass Castle, Short Term 12), who directed the film from a screenplay he co-wrote. Based on famed lawyer and activist Bryan Stevenson’s memoir, “Just Mercy: A Story of Justice and Redemption,” which details his crusade to defend, among others, wrongly accused prisoners on death row, it stars Michael B. Jordan and Oscar winners Jamie Foxx and Brie Larson.

The story starts when, after graduating from Harvard, Stevenson (Jordan) — who had his pick of lucrative jobs — instead heads to Alabama to defend those wrongly condemned or who were not afforded proper representation, with the support of local advocate Eva Ansley (Larson).

One of his first cases is that of Walter McMillian (Foxx), who, in 1987, was sentenced to die for the murder of an 18-year-old girl, despite evidence proving his innocence. In the years that follow, Stevenson becomes embroiled in a labyrinth of legal and political maneuverings as well as overt racism as he fights for Walter, and others like him, with the odds — and the system — stacked against them.

This case becomes the main focus of the film, whose cast also includes Rob Morgan as Herbert Richardson, a fellow prisoner who also sits on death row; Tim Blake Nelson as Ralph Myers, whose pivotal testimony against Walter McMillian is called into question; Rafe Spall as Tommy Chapman, the DA who is fighting to uphold Walter’s conviction and sentence; O’Shea Jackson Jr. as Anthony Ray Hinton, another wrongly convicted death row inmate whose cause is taken up by Stevenson; and Karan Kendrick as Walter’s wife, Minnie McMillian.

Cretton’s behind-the-scenes creative team included DP Brett Pawlak, co-writer Andrew Lanham, production designer Sharon Seymour, editor Nat Sanders and composer Joel P. West, all of whom previously collaborated with the director on The Glass Castle.

Destin Daniel Cretton

I spoke with the director about making the film, his workflow and his love of post.

When you read Bryan's book, did you feel compelled to take this on?
I did. It was his voice and the way he tells the story about these characters, who seem so easy to judge at first. Then he starts peeling off all the layers, using humor in certain areas and devastation in others. Somehow it still makes you feel hopeful and inspired to do something about all the injustice – all of it just hit me so hard, and I felt I had to be involved in it in some way.

Did you work very closely with him on the film?
I did. Before we even began writing a word, we went to meet him in Montgomery, and he introduced us to the real Anthony Ray Hinton and a bunch of lawyers working on cases. Bryan was with us through the whole writing process, filling in the blanks and helping us piece the story together. We did a lot of research, and we had the book, but it obviously couldn't include everything. Bryan gave us all the transcripts of all the hearings, and a lot of the lines were taken directly from those.

This is different from most other courtroom dramas, as the trial’s already happened when the movie begins. What sort of film did you set out to make?
We set out to make the book in as compelling a way as possible. And it’s a story about this young lawyer who’s trying to convince the system and state they made a terrible mistake, with all the ups and downs, and just how long it takes him to succeed. That’s the drama.

What were the main challenges in pulling it all together?
Telling a very intense, true story about people, many of whom are still alive and still doing the work they were doing then. So accuracy was a huge thing, and we all really felt the burden and responsibility to get it right. I felt it more than on any film I've ever done because I respect Bryan's work so much. We're also telling stories about people who were very vulnerable.

Trying to figure out how to tell a narrative that still moved at the right pace and gave you an emotional ride, but stayed completely accurate to the facts and to a legal process that moves incredibly slowly, was very challenging. A big moment for me was when Bryan first saw the film and gave me a big hug and a thank-you; he told me it was not for how he was portrayed but for how we took care of his clients. That was his big concern.

What did Jamie and Michael bring to their roles?
They’ve been friends for a long time, so they already had this great natural chemistry, and they were able to play through scenes like two jazz musicians and bring a lot of stuff that wasn’t there on the page.

I heard you actually shot in the south. How tough was the shoot?
Filming in some of the real locations really helped. We were able to shoot in Montgomery — such as the scenes where Bryan's doing his morning jogs, the Baptist church where MLK Jr. was the pastor, and the cotton fields and places where Walter and his family actually lived. Being there and feeling the weight of history was very important to the whole experience. Then we shot the rest of the film in Atlanta.

Where did you post?
All in LA on the Warner lot.

Do you like the post process?
I love post and I hate it (laughs). And it depends on whether you’re finding a solution to a problem or you’re realizing you have a big problem. Post, of course, is where you make the film and where all the problems are exposed… the problems with all the choices I made on set. Sometimes things are working great, but usually it’s the problems you’re having to face. But working with a good post team is so fulfilling, and you’re doing the final rewrite, and we solved so many things in post on this.

Talk about editing with your go-to Nat Sanders, who got an Oscar nom for his work (with co-editor Joi McMillon) on Moonlight and also cut If Beale Street Could Talk.
Nat wasn't on set. He began cutting material here in LA while we shot on location in Atlanta and Alabama, and we talked a lot on the phone. He did the first assembly, which was just over three hours long. All the elements were there, but shaping and fine-tuning all the material took nearly a year as we went through every scene, talking them out.

Finding the correct emotional ride and balance was a big challenge, as this film has so many emotional highs and lows, and you can easily tire an audience out. We had to cut some storylines that were working because they sent people on another down when they needed something lighter. The other part of it was performance, and you can craft so much of that in the edit; our leads gave us so many takes and options to play with. Dealing with that is one of Nat's big strengths. Both of us are meticulous, and we did a lot of test screenings and kept making adjustments.

Writer Iain Blair (left) and director Destin Daniel Cretton.

Nat and I both felt the hardest scene to cut and get right was Herb’s execution scene, because of the specific tone needed. If you went too far in one direction, it felt too much, but if you went too far the other way, it didn’t quite hit the emotional beat it needed. So that took a lot of time, playing around with all the cross-cutting and the music and sound to create the right balance.

All period films need VFX. What was entailed?
Crafty Apes did them. We did a lot of fixes, added period elements and did a lot of wig fixes — more than you'd think (laughs). We weren't allowed to shoot at the real prison, so we had to create all the backdrops and set extensions for the death row sequences.

Can you talk about the importance of sound and music?
It’s always huge for me, and I’ve worked with my composer, Joel, and supervising sound editor/re-recording mixer Onnalee Blank, who was half of the sound team, since the start. For both of them, it was all about finding the right tone to create just the right amount of emotion that doesn’t overdo it, and Joel wrote the score in a very stripped-down way and then got all these jazz musicians to improvise along with the score.

Where did you do the DI and how important is it to you?
That’s huge too, and we did it at Light Iron with colorist Ian Vertovec. He’s worked with my DP on almost every project I’ve done, and he’s so good at grading and giving you a very subtle palette.

What’s next?
We're currently in preproduction on Shang-Chi and the Legend of the Ten Rings, featuring Marvel's first Asian superhero. It's definitely a change of pace after this.


 

Colorist Chat: Nice Shoes’ Maria Carretero on Super Bowl ads, more

This New York-based colorist, who worked on four Super Bowl spots this year, talks workflow, inspiration and more.

Name: Maria Carretero

Company: Nice Shoes

What kind of services does Nice Shoes offer?
Nice Shoes is a creative studio with color, editorial, animation, VFX, AR and VR services. It’s a full-service studio with offices in NYC, Chicago, Boston, Minneapolis and Toronto, as well as remote locations throughout North America.

Michelob Ultra’s Jimmy Works It Out

As a colorist, what would surprise people the most about what falls under that title?
I think people are surprised when they discover that there is a visual language in every single visual story that connects your emotions through all the imagery that we’ve collected in our brains. This work gives us the ability to nudge the audience emotionally over the course of a piece. Color grading is rooted in a very artistic base — core, emotional aspects that have been studied in art and color theory that make you explore cinematography in such an interesting way.

What system do you work on?
We use FilmLight Baselight as our primary system, but the team is also versed in Blackmagic Resolve.

Are you sometimes asked to do more than just color on projects?
Sometimes. If you have a solid relationship with the DP or the director, they end up consulting you about palettes, optics and references, so you become an active part of the creativity in the film, which is very cool. I love when I can get involved in projects from the beginning.

What’s your favorite part of the job?
My favorite moment is when you land on the final look and you see that the whole film is making visual sense and you feel that the story, the look and the client are all aligned — that’s magic!

Any least favorites?
No, I love coloring. Sometimes the situation becomes difficult because there are technical issues or disagreements, but it's part of the work to push through those moments and make things work.

If you didn’t have this job, what would you be doing instead?
I would probably be a visual artist… always struggling to keep the lights on. I’m kidding! I have so much respect for visual artists, I think they should be treated better by our society because without art there is no progress.

How early did you know this would be your path?
I was a visual artist for seven years. I was part of Nives Fernandez's roster, and all I wanted at that time was to tell my stories as an artist. I was freelancing in VFX to earn money to survive, and from VFX, color was a very easy switch. When I landed at Deluxe Spain 16 years ago and started to explore color, I quickly fell in love.

It’s why I like to say that color chose me.

Avocados From Mexico: Shopping Network

You recently worked on a number of Super Bowl spots. Can you talk a bit about your work on them, and any challenges relating to deadlines?
This year I worked on four Super Bowl spots: Michelob Ultra PureGold: 6 for 6 Pack; Michelob Ultra: Jimmy Works It Out; Walmart: United Towns; and Avocados From Mexico: Shopping Network.

Working on these kinds of projects is definitely an interesting experience. The deadlines are tight and the pressure is enormous, but at the same time, the amount of talent and creativity involved is gigantic, so if you survive (laughs), you will always come out a better professional. As a colorist, I love to be challenged. I love dealing with difficult situations where all your resources and energy are put to the test.

Any suggestions for getting the most out of a project from a color perspective?
Thousands! Technical understanding, artistic involvement, there are so many… But definitely trying to create something new, special, different; embracing the challenges and pushing beyond the boundaries are the keys to delivering good work.

How do you prefer to work with the DP or director?
I like working with both. Debating with any kind of artist is the best. It’s really great to be surrounded by someone that uses a common “language.” As I mentioned earlier, I love when there’s the opportunity to get the conversation going at the beginning of a project so that there’s more opportunity for collaboration, debate and creativity.

How do you like getting feedback in terms of the look? Photos, films, etc.?
Every single bit of information is useful. I love when they verbalize what they’re going for using stories, feelings — when you can really feel they’re expressing personality with the film.

Where do you find inspiration? Art? Photography?
I find inspiration in living! There are so many things that surround us that can be a source of inspiration. Art, landscapes, the light that you remember from your childhood, a painting, watching someone that grabs your attention on a train. New York is teeming with more than enough life and creativity to keep any artist going.

Name three pieces of technology you can’t live without.
The Tracker, Spotify and FaceTime.

This industry comes with tight deadlines. How do you de-stress from it all?
I have a sense of humor and lots of red wine (smiles).

London’s Molinare launches new ADR suite

Molinare has officially opened a new ADR suite in its Soho studio in anticipation of increased ADR output and to complement last month's CAS award-winning ADR work on Fleabag. Other recent ADR credits for the company include Good Omens, The Capture and Strike Back. Molinare sister company Hackenbacker also picked up some award love with a BAFTA TV Craft Award and an AMPS Award for Killing Eve.

Molinare and Hackenbacker’s audio setup includes nine mixing theaters, three of which have Dolby 5.1/7.1 Theatrical or Commercials & Trailers Certification, and one has full Dolby Atmos home entertainment mix capability.

Molinare works on high-end TV dramas, feature films, feature documentaries and TV reality programming. Recent audio credits include BBC One’s Dracula, The War of the Worlds from Mammoth Screen and Worzel Gummidge. Hackenbacker has recently worked on HBO’s Avenue 5 for returning director Armando Iannucci and Carnival Film’s Downton Abbey and has contributed to the latest season of Peaky Blinders.

Sohonet intros ClearView Pivot for 4K remote post

Sohonet is now offering ClearView Pivot, a solution for realtime remote editing, color grading, live screening and finishing reviews at full cinema quality. The new solution will provide connectivity and collaboration services for productions around the world.

ClearView Pivot offers 4K HDR with 12-bit color depth and 4:4:4 chroma sampling for full-color-quality video streaming with ultra-low latency over Sohonet's private media network, avoiding the extreme compression required by the contention and latency of public internet connections.
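Some quick arithmetic shows why the public-internet alternative means heavy compression. This back-of-the-envelope sketch assumes 24fps, which the announcement doesn't specify:

```python
# Back-of-the-envelope: uncompressed bandwidth for 4K video at
# 12-bit 4:4:4. The frame rate is an assumption for illustration.
width, height = 3840, 2160
bits_per_sample = 12
samples_per_pixel = 3          # 4:4:4 keeps full chroma resolution
fps = 24                       # assumed

bits_per_frame = width * height * samples_per_pixel * bits_per_sample
gbps = bits_per_frame * fps / 1e9
print(f"{gbps:.2f} Gb/s uncompressed")   # ~7.17 Gb/s
```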

“Studios around the world need a realtime 4K collaboration tool that can process video at lossless color fidelity using the industry-standard JPEG 2000 codec between two locations across a network like ours. Avoiding the headache of the current ‘equipment only’ approach is the only scalable solution,” explains Sohonet CEO Chuck Parker.

Sohonet says its integrated solution is approved by ISE (Independent Security Evaluators), the industry's gold standard for security. Sohonet's solution provides an encrypted stream between endpoints and an auditable usage trail for every session. The Soho Media Network (SMN) connection offers ultra-low latency (measured in milliseconds), and the company says that unlike equipment-only solutions, which require the user to navigate firewall and security issues and perform a "solution check" before each session, ClearView Pivot works immediately. As a point-to-multipoint solution, it also lets the user pivot easily from one endpoint to the next to collaborate with multiple people at the click of a button, or even stream to multiple destinations at the same time.

Sohonet has been working closely with productions on lots and on locations over the past few years in the ongoing development of ClearView Pivot. In those real-world settings, ClearView Pivot has been put through its paces with trials across multiple departments, and the color technologies have been fully inspected and approved by experts across the industry.

Sundance Videos: Editor to Editor

Our own Brady Betzel headed out to Park City this year to talk to a few editors whose films were being screened at the Sundance Film Festival.

As an editor himself, Betzel wanted to know about the all-important workflow, but also about how they got their start in the business and how important it is to find a balance between work life and personal life.

Among those he sat down with were Scare Me editor Patrick Lawrence, Boys State editors Jeff Gilbert and Connor Hall, Save Yourselves! editor Sofi Marshall, Aggie editor Gil Seltzer, Miss Juneteenth editor Courtney Ware, Black Bear editor Matthew L. Weiss, Spree editor Benjamin Moses Smith and Dinner in America writer/director/editor Adam Carter Rehmeier.

Click here to see them all.

MPI restores The Wizard of Oz in 4K HDR

By Barry Goch

The classic Victor Fleming-directed film The Wizard of Oz, which was released by MGM in 1939 and won two of its six Academy Award nominations, has been beautifully restored by Burbank’s Warner Bros. Motion Picture Imaging (MPI).

Bob Bailey

To share its workflow on the film, MPI invited a group of journalists to learn about the 4K UHD HDR restoration of this classic film. The tour guide for our high-tech restoration journey was MPI’s VP of operations and sales Bob Bailey, who walked us through the entire restoration process — from the original camera negative to final color.

The Wizard of Oz, which starred Judy Garland, was shot on a Technicolor three-strip camera system. According to Bailey, it ran three black and white negatives simultaneously. “That is why it is known as three-strip Technicolor. The magazine on top of the camera was triple the width of a normal black and white camera because it contained each roll of negative to capture your red, green and blue records,” explained Bailey.

“When shooting in Technicolor, you weren’t just getting the camera. You would rent a package that included the camera, a camera crew with three assistants, the film, the processing and a Technicolor color consultant.”

George Feltenstein, SVP of theatrical catalog marketing for Warner Bros. Home Entertainment, spoke about why the film was chosen for restoration. “The Wizard of Oz is among the crown jewels that we hold,” he said. “We wanted to embrace the new 4K HDR technology, but nobody’s ever released a film that old using this technology. HDR, or high dynamic range, has a color range that is wider than anything that’s come before it. There are colors [in The Wizard of Oz] that were never reproducible before, so what better a film to represent that color?”

Feltenstein went on to explain that this is the oldest film to get released in the 4K format. He hopes that this is just the beginning and that many of the films in Warner Bros.’ classic library will also be released on 4K HDR and worked on at MPI under Bailey’s direction.

The Process
MPI scanned each of the three-strip Technicolor nitrate negatives at 8K 16-bit and composited them together before applying a new color grade. The film was scanned on the Lasergraphics Director 10K scanner. "We have just under 15 petabytes of storage here," said Bailey. "That's working storage, because we're working on 8K movies since [some places in the world] are now broadcasting 8K."
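Conceptually, the compositing step is straightforward: each strip is a monochrome record of one primary, and stacking three aligned scans rebuilds the color frame. Here's a minimal sketch of that idea; the file names and the imageio library are assumptions, and the real pipeline adds registration, cleanup and grading far beyond this.

```python
# Conceptual sketch: rebuild a color frame from three aligned monochrome
# scans of the Technicolor red, green and blue records. File names are
# illustrative; real restoration registers and cleans the records first.
import numpy as np
import imageio.v3 as iio

red   = iio.imread("frame_0001_red.tif")    # each scan: 2D, 16-bit
green = iio.imread("frame_0001_green.tif")
blue  = iio.imread("frame_0001_blue.tif")

color = np.stack([red, green, blue], axis=-1)  # H x W x 3
iio.imwrite("frame_0001_rgb.tif", color)
```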

Steven Anastasi

Our first stop was to look at the Lasergraphics Director. We then moved on to MPI’s climate-controlled vault, where we were introduced to Steven Anastasi, VP of technical operations at Warner Bros. Anastasi explained that the original negative vault has climate-controlled conditions with 25% humidity at 35 degrees Fahrenheit, which is the combination required for keeping these precious assets safe for future generations. He said there are 2 million assets in the building, including picture and sound.

It was amazing to see film reels for 2001: A Space Odyssey sitting on a shelf right in front of me. In addition to the feature reels, MPI also stores millions of negatives captured throughout the years by Warner productions. “We also have a very large library,” reported Anastasi. “So the original negatives from the set, a lot of unit photography, head shots in some cases and so forth. There are 10 million of these.”

Finally, we were led into the color bay to view the film. Janet Wilson, senior digital colorist at MPI, has overseen every remaster of The Wizard of Oz for the past 20 years. Wilson used a FilmLight Baselight X system for the color grade. The grading suite housed multiple screens: a Dolby Pulsar for the Dolby Vision pass, a Sony X300 and a Panasonic EZ1000 OLED 4K HDR.

“We have every 4K monitor manufactured, and we run the film through all of them,” said Bailey. “We painstakingly go through the process from a post perspective to make sure that our consumers get the best quality product that’s available out in the marketplace.”

“We want the consumer experience on all monitors to be something that’s taken into account,” added Feltenstein. “So we’ve changed our workflow by having a consumer or prosumer monitor in these color correction suites so the colorist has an idea of what people are going to see at home, and that’s helped us make a better product.”

Our first view of the feature was a side-by-side comparison of the scanned black-and-white negative and the sepia-corrected footage. The first part of the film, which takes place in Kansas, was shot in black and white, and then a sepia look was applied to it. The reveal scene, when Dorothy passes through the door going into Oz, was originally shot in color. For this new release, the team generated a matte so Wilson could add the sepia look to the inside of the house as Dorothy transitions into Oz.

“So this is an example of some of the stuff that we could do in this version of the restoration,” explained Wilson. “With this version, you can see that the part of the image where she’s supposed to be in the monochrome house is not actually black and white. It was really a color image. So the trick was always to get the interior of the house to look sepia and the exterior to look like all of the colors that it’s supposed to. Our visual effects team here at MPI — Mike Moser and Richie Hiltzik — was able to draw a matte for me so that I could color inside of the house independently of the exterior and make them look right, which was always a really tricky thing to do.”

Wilson referred back to the Technicolor three-strip, explaining that because you've got three different pieces of film — the different records — they're receiving the light in different ways. "So sometimes one will be a little brighter than the other. One will be a little darker than the other, which means that the Technicolor is not a consistent color. It goes a little red, and then it goes a little green, and then it goes a little blue, and then it goes a little red again. So if you stop on any given frame, it's going to look a little different than the frames around it, which is one of the tricky parts of color correcting Technicolor. When that's being projected by a film projector, it's less noticeable than when you're looking at it on a video monitor, so it takes a lot of little individual corrections to smooth those kinds of things out."
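One way to picture those little corrections is as smoothing the balance of the three records over time. The sketch below is my own illustration rather than MPI's method: it measures each frame's mean channel levels and gains every channel toward a moving average so the color stops breathing from frame to frame.

```python
# Illustration (not MPI's actual pipeline): damp frame-to-frame color
# breathing by gaining each channel toward a temporal moving average.
import numpy as np

def smooth_color_breathing(frames, window=5):
    """frames: float array of shape (T, H, W, 3). Returns a corrected copy."""
    means = frames.mean(axis=(1, 2))              # (T, 3) per-frame channel means
    kernel = np.ones(window) / window
    target = np.stack(                            # moving average per channel
        [np.convolve(means[:, c], kernel, mode="same") for c in range(3)],
        axis=1)
    gains = target / np.maximum(means, 1e-6)      # (T, 3) per-frame gains
    return frames * gains[:, None, None, :]
```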

Wilson reported seeing new things with the 8K scan and 4K display. “The amount of detail that went into this film really shows up.” She said that one of the most remarkable things about the restoration was the amazing detail visible on the characters. For the first time in many generations, maybe ever, you can actually see the detail of the freckles on Dorothy’s face.

In terms of leveraging the expanded dynamic range of HDR, I asked Wilson whether she tried to map the HDR into a sweet spot, so that it's spectacular yet not overpowering.

“I ended up isolating the very brightest parts of the picture,” she replied. “In this case, it’s mostly the sparkles on their shoes and curving those off so I could run those in, because this movie is not supposed to have modern-day animation levels of brightness. It’s supposed to be much more contained. I wanted to take advantage of brightness and the ability to show the contrast we get from this format, because you can really see the darker parts of the picture. You can really see detail within the Wicked Witch’s dress. I don’t want it to look like it’s not the same film. I want it to replicate that experience of the way this film should look if it was projected on a good print on a good projector.”

Dorothy’s ruby slippers also presented a challenge to Wilson. “They are so red and so bright. They’re so light-reflective, but there were times when they were just a little too distracting. So I had to isolate this level at the same track with slippers and bring them down a little bit so that it wasn’t the first and only thing you saw in the image.”

If you are wondering whether audio was part of this most recent restoration, the answer is no, but it had been remastered for a previous version. "As early as 1929, MGM began recording its film music using multiple microphones. Those microphone angles allowed the mixer to get the most balanced monophonic mix, and they were preserved," explained Feltenstein. "Twenty years ago, we created a 5.1 surround mix that was organically made from the original elements that were created in 1939. It is full-frequency, lossless audio, and a beautiful restoration job was made to create that track so you can improve upon what I consider to be close to perfection without anything that would be disingenuous to the production."

In all, it was an amazing experience to go behind the scenes and see how the wizards of MPI created a new version of this masterpiece for today and preserved it for future generations.

This restored version of The Wizard of Oz is a must-see visual extravaganza, and there is no better way to see it than in UHD, HDR, Dolby Vision or HDR10+. What I saw in person took my breath away, and I hope every movie fan out there can have the opportunity to see this classic film in its never-before-seen glory.

The 4K version of The Wizard of Oz is currently available via an Ultra HD Blu-ray Combo Pack and digital.


Barry Goch is a finishing artist at LA's The Foundation as well as a UCLA Extension instructor in post production. You can follow him on Twitter at @Gochya.

Behind the Title: Harbor sound editor/mixer Tony Volante

“As re-recording mixer, I take all the final edited elements and blend them together to create the final soundscape.”

Name: Tony Volante

Company: Harbor

Can you describe what Harbor does?
Harbor was founded in 2012 to serve the feature film, episodic and advertising industries. Harbor brings together production and post production under one roof — what we like to call “a unified process allowing for total creative control.”

Since then, Harbor has grown into a global company with locations in New York, Los Angeles and London. Harbor hones every detail throughout the moving-image-making process: live-action, dailies, creative and offline editorial, design, animation, visual effects, CG, sound and picture finishing.

What’s your job title?
Supervising Sound Editor/Re-Recording Mixer

What does that entail?
I supervise the sound editorial crew for motion pictures and TV series along with being the re-recording mixer on many of my projects. I put together the appropriate crew and schedule along with helping to finalize a budget through the bidding process. As re-recording mixer, I take all the final edited elements and blend them together to create the final soundscape.

What would surprise people the most about what falls under that title?
How almost all the sound that someone hears in a movie has been replaced by a sound editor.

What’s your favorite part of the job?
Creatively collaborating with co-workers and hearing it all come together in the final mix.

What is your most productive time of day?
Whenever I can turn off my email and concentrate on mixing.

If you didn’t have this job, what would you be doing instead?
Fishing!

When did you know this would be your path?
I played drums in a rock band and got interested in sound at around 18 years old. I was always interested in the “sound” of an album along with the musicality. I found myself buying records based on who had produced and engineered them.

Can you name some recent projects?
Fosse/Verdon (FX) and Boys State, which just won the Grand Jury Prize at Sundance.

How has the industry changed since you began working?
Technology has improved workflows immensely and has helped us with the creative process. It has also opened up the door to accelerating schedules to the point of sacrificing artistic expression and detail.

Name three pieces of technology you can't live without.
Avid Pro Tools, my iPhone and my car’s navigation system.

How do you de-stress from it all?
I stand in the middle of a flowing stream, fishing with my fly rod. If I catch something, that's a bonus!

Sony adds 4K HDR reference monitors to Trimaster range

Sony is offering a new set of high-grade 4K HDR monitors as part of its Trimaster range. The PVM-X2400 (24-inch) and the PVM-X1800 (18.4-inch) professional 4K HDR monitors were demo’d at the BSC Expo 2020 in London. They will be available in the US starting in July.

The monitors provide ultra-high definition with a resolution of 3840×2160 pixels and an all-white luminance of 1,000 cd/m². For film production, their wide color gamut matches that of the BVM-HX310 Trimaster HX master monitor. This means both monitors feature accurate color reproduction and grayscale, which helps filmmakers make critical imaging decisions and deploy faithful color matching throughout the workflow.

The monitors, which are small and portable, are designed to expand footprints in 4K HDR production, including applications such as on-set monitoring, nonlinear video editing, studio wall monitoring and rack-mount monitoring in OB trucks or machine rooms.

The monitors also feature the new Black Detail High/Mid/Low function, which helps maintain accurate color reproduction by reducing the brightness of the backlight to reproduce correct colors and gradations in low-luminance areas. Another new function, Dynamic Contrast Drive, changes the backlight luminance to adapt to each scene or frame when moving images between the PVM-X2400/X1800 and an existing Sony OLED monitor. This functionality allows filmmakers to check the highlight and low-light balance of content containing both bright and dark scenes.

Other features include:
• Dynamic contrast ratio of 1,000,000:1 by Dynamic Contrast Drive, a new backlight driving system that dynamically changes the backlight luminance to adapt for each frame of a scene.
• 4K/HD waveform and vectorscope monitoring with HDR scales.
• Quad View display and User 3D LUT functionality.
• 12G/6G/3G/HD-SDI with auto configuration.

An online editor’s first time at Sundance

By Brady Betzel

I’ve always wanted to attend the Sundance Film Festival, and my trip last month did not disappoint. Not only is it an iconic industry (and pop-culture) event, but the energy surrounding it is palpable.

Once I got to Park City and walked Main Street — with the sponsored stores (Canon and Lyft among others) and movie theaters, like the Egyptian — I started to feel an excitement and energy that I hadn't felt since I was making videos in high school and college… when there were no thoughts of limits or what I should or shouldn't do.

A certain indescribable nervousness and love started to bubble up. Sitting in the luxurious Park City Burger King with Steve Hullfish (Art of the Cut) and Joe Herman (Cinemontage) before my second screening of Sundance 2020, Dinner in America, I thought about how lucky I was to be in a place packed with creatives. It sounds cliché and trite, but it really is reinvigorating to surround yourself with positive energy — especially if you can get caught up in cynicism like me.

It brought me back to my college classes, taught by Daniel Restuccio (another postPerspective writer), at California Lutheran University, where we would cut out pictures from magazines, draw pictures, blow up balloons, eat doughnuts and do whatever we could to get our ideas out in the open.

While Sundance occasionally felt like an amalgamation of the thirsty-hipster Coachella crowd mixed with a high school video production class (but with million-dollar budgets), it still had me excited to create. Sundance 2020 in Park City was a beautiful resurgence of ideas and discussions about how we as an artistic community can offer accessibility to everyone and anyone who wants to tell their own story on screen.

Inclusiveness Panel
After arriving in Park City, my first stop was a panel hosted by Adobe called "Empowering Every Voice in Film and the World." Maybe it was the combination of the excitement of Sundance and the discussion about accessibility, but it really got me thinking. The panel was expertly hosted by Adobe's Meagan Keane and included producer/director Yance Ford (Disclosure: Trans Lives on Screen, Oscar-nominated for Strong Island); editor Eileen Meyer (Crip Camp); editor Stacy Goldate (Disclosure: Trans Lives on Screen); and director Crystal Kayiza (See You Next Time).

I walked away feeling inspired and driven to increase my efforts in accessibility. Eileen said one of her biggest opportunities came from the Karen Schmeer Film Editing Fellowship, a year-long fellowship for emerging documentary editors.

Yance drove home the idea of inclusivity and re-emphasized the importance of access to equipment. But it's not simply about access — you also have to tell a great story and figure out things like distribution. I was really struck by all the speakers on stage, but Yance really spoke to me. He feels like the voice we need to represent marginalized groups and to push for more content from these creatives. The more content we see, the better.

Crystal spoke about the community needing to tell stories that don't necessarily have standard plot points and stakes. The idea is to encourage people to create their stories, and for those in power to help and support those stories and trust the filmmakers, regardless of whether they identify with the ideas and themes.

Rebuilding Paradise

Screenings
One screening I attended was Rebuilding Paradise, directed by Ron Howard. He was at the premiere, along with some of the people who lost everything in the Paradise, California, fires. In the first half of November 2018, several fires raged out of control in California. One surrounded the city of Simi Valley and worked its way toward the Pacific Coast. (It was way too close to my home in Simi Valley for comfort; we eventually evacuated but were fine.)

Another fire hit the town of Paradise, burning almost the entire town to the ground. Watching Rebuilding Paradise filled me with great sadness for those who lost family members and homes. Some of the "found footage" was absolutely breathtaking. One clip in particular showed a father racing out of what appeared to be hell, surrounded by flames, in his car with his child asking if they were going to die. Absolutely incredible and heart-wrenching.

Dinner in America

Another film I saw was Dinner in America, as referenced earlier in this piece. I love a good dark comedy/drama, so when I got a ticket to Adam Carter Rehmeier’s Dinner in America I was all geared up. Little did I know it would start off with a disgruntled 20-something throwing a chair through a window and lighting the front sidewalk on fire. Kudos to composer John Swihart, who took a pretty awesome opening credit montage and dropped the heat with his soundtrack.

Dinner in America is a mid-'90s Napoleon Dynamite cross-pollinated with the song "F*** Authority" by Pennywise. (Coincidentally, Swihart also composed the soundtrack for Napoleon Dynamite.) Seriously, the soundtrack to Dinner in America is worth the ticket price alone, in my opinion; it adds so much to one of the main characters' attitude. The parallel editing, mixed with the fierce anti-authoritarian love story lived by Kyle Gallner and Emily Skeggs, makes for a movie you probably won't forget.

Adam Rehmeier

During the Q&A at the end, writer, director and editor Rehmeier described how he essentially combined two ideas that led to Dinner in America. As I watched the first 20 minutes, it felt like two separate movies, but once it came together it really paid off. Much like the cult phenomenon Napoleon Dynamite, Dinner in America will resonate with a wide audience. It’s worth watching when it comes to a theater (or streaming platform) near you. In the meantime, check out my video interview with him.

Adobe Productions
During Sundance, Adobe announced an upcoming feature for Premiere called "Productions." While in Park City, I got a small demo of Productions at Adobe's Sundance Production House. It took about 15 minutes before I realized that Adobe has added the one feature that has set Avid Media Composer apart for over 20 years — bin locking. Heads up, Avid: Adobe is about to release a multi-user workflow that is much easier to understand and use than anything in previous iterations of Premiere.

The only thing that caught me off guard was the nomenclature of Productions and Projects. Productions is the feature's name, but really a "Production" is a project, and what Adobe calls a "Project" functions as a bin. If you're familiar with Media Composer, you create a project and, inside it, folders and bins; bins are what house media links, sequences, graphics and everything else. In the new Productions update, a "Production" houses all of your "Projects" (i.e., a project full of bins).

Additionally, you will be able to lock "Projects." This means that in a multi-user environment (on shared storage such as a SAN, or even an Avid Nexis), a project and its media can live on the shared server and be accessed by multiple users. These users can be named and identified inside the Premiere preferences. And much like Blackmagic's DaVinci Resolve, you can update "Projects" when you want to — individually or all at once. On its face, Productions looks like the answer to what every editor has said is one of the only reasons Avid is still such a powerhouse in "Hollywood" — the ability to work relatively flawlessly among tons of editors simultaneously. If what I saw works the way it should, Adobe is looking to take a piece of the multi-user environment pie Avid has controlled for so long.
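Under the hood, this style of multi-user editing can be pictured as advisory locking on the shared volume: opening a project drops a lock file naming the editor, and everyone else gets read-only access until it clears. The sketch below is purely my illustration of that general pattern, not how Adobe (or Avid) actually implements it.

```python
# Illustration of advisory project locking on shared storage -- the
# general idea only, not Adobe's or Avid's actual implementation.
import os

def try_lock(project_path, editor_name):
    """Claim a project by creating its lock file; fail if already claimed."""
    lock = project_path + ".lock"
    try:
        # O_EXCL makes creation atomic: only one editor can win the lock.
        fd = os.open(lock, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.write(fd, editor_name.encode())
        os.close(fd)
        return True                      # caller may open read-write
    except FileExistsError:
        with open(lock) as f:
            print(f"Locked by {f.read()}; opening read-only.")
        return False

def unlock(project_path):
    os.remove(project_path + ".lock")
```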

Summing Up
In the end, the Sundance Film Festival 2020 in Park City was likely a once-in-a-lifetime experience for me. From seeing celebrities, meeting other journalists, getting some free beanies and hand warmers (it was definitely not 70 degrees like California), to attending parties hosted by Canon and Light Iron — Sundance can really reinvigorate your filmmaking energy.

It’s hard to keep going when you get burnt out by just how hard it is to succeed and break through the barriers in film and multimedia creation. But seeing indie films and meeting like-minded creatives, you can get excited to create your own story. And you realize that there are good people out there, and sometimes you just have to fly to Utah to find them.

Walking down Main Street, I found a coffee shop named Atticus Coffee and Tea House. My oldest son's name is Atticus, so I naturally had to stop in and get him something; I ended up getting him a hat and me a coffee. It was good. But what I really did was sit out front, pretending to shoot b-roll and eavesdropping on conversations. It really is true that being around thoughtful energy is contagious. And while some parts of Sundance feel like a hipster popularity contest, other attendees are there to improve and to absorb culture from all around.

The 2020 Sundance Film Festival’s theme in my eyes was to uplift other people’s stories. As Harper Lee wrote in “To Kill a Mockingbird” when Atticus Finch is talking with Scout: “First of all, if you learn a simple trick, Scout, you’ll get along a lot better with all kinds of folks. You never really understand a person until you consider things from his point of view . . . until you climb into his skin and walk around in it.”


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Alkemy X adds all-female design collective Mighty Oak

Alkemy X has added animation and design collective Mighty Oak to its roster for US commercial representation. Mighty Oak combines handmade animation techniques and design with live action for brands and networks including General Electric, Netflix, Luna Bar, HBO, Samsung, NBC, Airbnb, Conde Nast, Adult Swim and The New York Times.

Led by CEO/EP Jess Peterson, head of creative talent Emily Collins and CD Michaela Olsen, the collective has garnered over 3 billion online views. Mighty Oak's first original short film, Under Covers, premiered at the 2019 Sundance Film Festival. Helmed by Olsen, the quirky stop-motion short uses handmade puppets and forced-perspective sets to offer a glimpse into the unsuspecting lives and secrets that rest below the surface of a small town.

“I was immediately struck by the extreme care that Mighty Oak takes on each and every frame of their work,” notes Alkemy X EP Eve Ehrich. “Their handmade style and fresh approach really make for dynamic, memorable animation, regardless of the concept.”

Mighty Oak’s Peterson adds, “We are passionate about collaborating with our clients from the earliest stages, working together to craft original character designs and creating work that is memorable and fun.”

Post house DigitalFilm Tree names Nancy Jundi COO

DigitalFilm Tree (DFT) has named Nancy Jundi as chief operating officer. She brings a wealth of experience to her new role, after more than 20 years working with entertainment and technology companies.

Jundi has been an outside consultant to DFT since 2014 and will now be based in DFT’s Los Angeles headquarters, where she joins founder and CEO Ramy Katrib in pioneering new offerings for DFT.

Jundi began her career in investment banking and asset protection before segueing into the entertainment industry. Her experience includes leading sales and marketing at Runway during its acquisition by The Post Group (TPG). She then joined that team as director of marketing and communications to unify the end-to-end post facilities into TPG’s singular brand narrative. She later co-founded Mode HQ (acquired by Pacific Post) before transitioning into technology and SaaS companies.

Since 2012, Jundi has served as a consultant to companies in industries as varied as financial technology, healthcare, eCommerce, and entertainment, with brands such as LAbite, Traffic Zoom and GoBoon. Most recently, she served as interim senior management for CareerArc.

"Nancy is simply one of the smartest and most creative thinkers I know," says Katrib. "She is that rare interdisciplinary talent – a creative and technological thinker who can exist in both arenas with clarity and tenacity."

Marriage Story director Noah Baumbach

By Iain Blair

Writer/director Noah Baumbach first made a name for himself with The Squid and the Whale, his 2005 semi-autobiographical, bittersweet story about his childhood and his parents’ divorce. It launched his career, scoring him an Oscar nomination for Best Original Screenplay.

Noah Baumbach

His latest film, Marriage Story, is also about the disintegration of a marriage — and the ugly mechanics of divorce. Detailed and emotionally complex, the film stars Scarlett Johansson and Adam Driver as the doomed couple.

In all, Marriage Story scooped up six Oscar nominations — Best Picture, Best Actress, Best Actor, Best Supporting Actress, Best Original Screenplay and Best Original Score. Laura Dern walked away with a statue for her supporting role.

The film co-stars Dern, Alan Alda and Ray Liotta. The behind-the-scenes team includes director of photography Robbie Ryan, editor Jennifer Lame and composer Randy Newman.

Just a few days before the Oscars, Baumbach — whose credits also include The Meyerwitz Stories, Frances Ha and Margot at the Wedding — talked to me about making the film and his workflow.

What sort of film did you set out to make?
It’s obviously about a marriage and divorce, but I never really think about a project in specific terms, like a genre or a tone. In the past, I may have started a project thinking it was a comedy but then it morphs into something else. With this, I just tried to tell the story as I initially conceived it, and then as I discovered it along the way. While I didn’t think about tone in any general sense, I became aware as I worked on it that it had all these different tones and genre elements. It had this flexibility, and I just stayed open to all those and followed them.

I heard that you were discussing this with Adam Driver and Scarlett Johansson as you wrote the script. Is that true?
Yes, but it wasn’t daily. I’d reached out to both of them before I began writing it, and luckily they were both enthusiastic and wanted to do it, so I had them as an inspiration and guide as I wrote. Periodically, we’d get together and discuss it and I’d show them some pages to keep them in the loop. They were very generous with conversations about their own lives, their characters. My hope was that when I gave them the finished script it would feel both new and familiar.

What did they bring to the roles?
They were so prepared and helped push for the truth in every scene. Their involvement from the very start did influence how I wrote their roles. Nicole has that long monologue and I don’t know if I’d have written it without Scarlett’s input and knowing it was her. Adam singing “Being Alive” came out of some conversations with him. They’re very specific elements that come from knowing them as people.

You reunited with Irish DP Robbie Ryan, who shot The Meyerowitz Stories. Can you talk about how you collaborated on the look, and why you shot on film?
I grew up with film and feel it’s just the right medium for me. We shot The Meyerowitz Stories on Super 16, and we shot this on 35mm, and we had to deal with all these office spaces and white rooms, so we knew there’d be all these variations on white. So there was a lot of discussion about shades and the palette, along with the production and costume designers, and also how we were going to shoot these confined spaces, because it was what the story required.

You shot on location in New York and LA. How tough was the shoot?
It was challenging, but mainly because of the sheer length of many of the scenes. There’s a lot of choreography in them, and some are quite emotional, so everyone had to really be up for the day, every day. There was no taking it easy one day. Every day felt important for the movie.

Where did you do the post?
All in New York. I have an office in the Village where I cut my last two films, and we edited there again. We mixed on the Warner stage, where I’ve mixed most of my movies. We recorded the music and orchestra in LA.

Do you like the post process?
I really love it. It’s the most fun and the most civilized part of the whole process. You go to work and work on the film all day, have dinner and go home. Writing is always a big challenge, as you’re making it up as you go along, and it can be quite agonizing. Shooting can be fun, but it’s also very stressful trying to get everything you need. I love working with the actors and crew, but you need a high level of energy and endurance to get through it. So then post is where you can finally relax, and while problems and challenges always arise, you can take time to solve them. I love editing, the whole rhythm of it, the logic of it.


Talk about editing with Jennifer Lame. How did that work?
We work so well together, and our process really starts in the script stage. I’ll give her an early draft to get her feedback and, basically, we start editing the script. We’ll go through it and take out anything we know we’re not going to use. Then during the shoot she’ll sometimes come to the set, and we’ll also talk twice a day. We’ll discuss the day’s work before I start, and then at lunch we’ll go over the previous day’s dailies. So by the time we sit down to edit, we’re really in sync about the whole movie. I don’t work off an assembly, so she’ll put together stuff for herself to let me know a scene is working the way we designed it. If there’s a problem, she’ll let me know what we need.

What were the big editing challenges?
Besides the general challenges of getting a scene right, I think for some of the longer ones it was all about finding the right rhythm and pacing. And it was particularly true of this film that the pace of something early on could really affect something later. Then you have to fix the earlier bit first, and sometimes it’s the scene right before. For instance, the scene where Charlie and Nicole have a big argument that turns into a very emotional fight is really informed by the courtroom scene right before it. So we couldn’t get it right until we’d got the courtroom scene right.

A lot of directors do test screenings. Do you?
No, I have people I show it to and get feedback, but I’ve never felt the need for testing.

VFX play a role. What was involved?
The-Artery did them. For instance, when Adam cuts his arm, we used VFX in addition to the practical effects, and then there’s always cleanup.

Talk about the importance of sound to you as a filmmaker, as it often gets overlooked in this kind of film.
I’m glad you said that because that’s so true, and this doesn’t have obvious sound effects. But the sound design is quite intricate, and Chris Scarabosio (working out of Skywalker Sound), who did Star Wars, did the sound design and mix; he was terrific.

A lot of it was taking the real-world environments in New York and LA and building on that, and maybe taking some sounds out and playing around with all the elements. We spent a lot of time on it, as both the sound and image should be unnoticed in this. If you start thinking, “That’s a cool shot or sound effect,” it takes you out of the movie. Both have to be emotionally correct at all times.

Where did you do the DI and how important is it to you?
We did it at New York’s Harbor Post with colorist Marcy Robinson, who’s done several of my films. It’s very important, but we didn’t do anything too extreme, as there’s not a lot of leeway for changing the look that much. I’m very happy with the look and the way it all turned out.

Congratulations on all the Oscar noms. How important is that for a film like this?
It’s a great honor. We’re all still the kids who grew up watching movies and the Oscars, so it’s a very cool thing. I’m thrilled.

What’s next?
I don’t know. I just started writing, but nothing specific yet.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

DejaEdit collaborative editing platform available worldwide

DejaSoft has expanded the availability of its DejaEdit collaborative editing solution for Avid Media Composer, Avid Nexis and EditShare workflows. Already well-established in Scandinavia and parts of Europe, the software-defined network solution is now accessible across the UK, Europe, Latin America, Middle East, Asia, Africa, China and North America.

DejaEdit allows editors to transfer media files and timelines automatically and securely around the world without having to be online continuously. It effectively acts as a media file synchronizer for multiple remote Avid systems.

DejaEdit allows multi-site post facilities to work as one, enabling multiple remote editors to work together, allowing media exchanges with VFX houses and letting editors easily migrate between office and home or mobile-based editing installations throughout the lifecycle of a project.

DejaEdit is available in two applications: Client and Nexus. The Client version works directly with Media Composer, whereas the Nexus variant further enables synchronization with projects stored on Nexis or EditShare storage systems.
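DejaSoft doesn’t detail DejaEdit’s internals, but the underlying pattern (checksum local media, diff against the remote site’s manifest, queue the differences for the next connection window) is easy to sketch. Here is a minimal Python illustration; the names are invented and make no claim to match the actual implementation:

```python
import hashlib
import pathlib

def manifest(root: pathlib.Path) -> dict[str, str]:
    """Checksum every file under a media root so two sites can compare bins."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*")
        if p.is_file()
    }

def pending_transfers(local: dict[str, str], remote: dict[str, str]) -> list[str]:
    """Files the remote site is missing, or holds stale copies of."""
    return [name for name, digest in local.items() if remote.get(name) != digest]

# The diff is queued and pushed whenever a link is available, which is why
# neither site has to stay online continuously.
```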

DejaSoft and the DejaEdit platform are a collaboration between CEO Clas Hakeröd and CTO Nikolai Waldman, both editors and post pros and founders of boutique post facility Can Film based in Sweden.

The tool is already being used by Oscar-nominated editor Yorgos Mavropsaridis, ACE, of The Favourite, The Lobster and recently Suicide Tourist; Scandinavian producer Daniel Lägersten, who has produced TV series such as Riverside and The Spiral; editor Rickard Krantz, who used it on The Perfect Patient (aka Quick), which has been nominated for Sweden’s Guldbagge Award (similar to a BAFTA) for editing; and post producer Anna Knochenhauer, known for her work on Euphoria featuring Alicia Vikander, The 100-Year-Old Man Who Climbed Out the Window and Disappeared, Lilya 4-Ever and Together.

Bill Baggelaar promoted at Sony Pictures, Sony Innovation Studios

Post industry veteran Bill Baggelaar has been promoted to executive VP and CTO, technology development at Sony Pictures and executive VP and general manager of Sony Innovation Studios. Prior to joining Sony Pictures almost nine years ago, he spent 13 years at Warner Bros. as VP of technology/motion picture imaging and head of technology/feature animation. His new role will start in earnest on April 1.

“I am excited for this new challenge that combines roles as both CTO of Sony Pictures and GM of Sony Innovation Studios,” says Baggelaar. “The CTO’s office works both inside the studio and with the industry to develop key standards and technologies that can be adopted across the various lines of business. Sony Innovation Studios is developing groundbreaking tools, methods and techniques for realtime volumetric virtual production — or as we like to say, ‘the future of movie magic’ — with a level of fidelity and quality that is best in class. With the technicians, engineers and artisans at Sony Innovation Studios combined with our studio technology team, we will be able to bring new experiences and technologies to all areas of production and delivery.”

Baggelaar’s promotion is part of a larger announcement by Sony, which involves a new team established within Sony Pictures — the Entertainment Innovation & Technology Group, Sony Pictures Entertainment, which encompasses the following departments: Sony Innovation Studios (SIS), Technology Development, IP Acceleration and Branded Integration.

The group is headed by Yasuhiro Ito, executive VP, Entertainment Innovation & Technology Group. Don Eklund will be leaving his post as EVP/CTO of technology development at the end of March. Eklund has had a long history with SPE and has been in his current role since 2017, establishing the foundation of the studio’s technology development activities.

“This new role combines my years of experience in production, post and VFX; my work with the broader industry and organizations; and my work with Sony companies around the world over the past eight and a half years — along with my more recent endeavors into virtual production — to create a truly unique opportunity for technical innovation that only Sony can provide,” concludes Baggelaar, who will report directly to Ito.

Kevin Lau heads up advertising, immersive at Digital Domain

Visual effects studio Digital Domain has brought on Kevin Lau as executive creative director of advertising, games and new media. In this newly created position, Lau will oversee all short-form projects and act as a creative partner for agencies and brands.

Lau brings over 18 years of ad-based visual effects and commercial production experience, working on campaigns for brands such as Target, Visa and Sprint.

Most recently, he was the executive creative director and founding partner at Timber, an LA-based studio focused on ads (GMC, Winter Olympics) and music videos (Kendrick Lamar’s Humble). Prior to that, he held creative director positions at Mirada, Brand New School and Superfad. Throughout his career, his work has been honored with multiple awards including Clios, AICP Awards, MTV VMAs and a Cannes Gold Lion for Sprint’s “Now Network” campaign via Goodby.

Lau, who joins Digital Domain EPs Nicole Fina and John Canning as they continue to build the studio’s short-form business, will help unify the vision for the advertising, games and new media/experiential groups, promoting a consistent voice across campaigns.

Lau joins the team as the new media group prepares to unveil its biggest project to date: Time’s The March, a virtual reality recreation of the 1963 March on Washington for Jobs and Freedom. Digital Domain’s experience with digital humans will play a major role in the future of both groups as they continue to build on the photoreal cinematics and in-game characters previously created for Activision, Electronic Arts and Ubisoft.

Talking with 1917’s Oscar-nominated sound editing team

By Patrick Birk

Sam Mendes’ 1917 tells the harrowing story of Lance Corporals Will Schofield and Tom Blake, following the two young British soldiers on their perilous trek across no man’s land to deliver lifesaving orders to the Second Battalion of the Devonshire Regiment.

Oliver Tarney

The story is based on accounts of World War I by the director’s grandfather, Alfred Mendes. The production went to great lengths to create an immersive experience, placing the viewer alongside the protagonists in a painstakingly recreated world, woven together seamlessly, with no obvious cuts. The film’s sound department had to rise to the challenge of bringing this rarely portrayed sonic world to life.

We checked in with supervising sound editor Oliver Tarney and ADR/dialogue supervisor Rachael Tate, who worked out of London’s Twickenham Studios. Both Tarney and Tate are Oscar-nominated in the Sound Editing category. Their work was instrumental in transporting audiences to a largely forgotten time, helping to further humanize the monochrome faces of the trenches. I know that I will keep their techniques — from worldizing to recording more ambient Foley — in mind on the next project I work on.

Rachael Tate

A lot of the film is made up of quiet, intimate moments punctuated by extremely traumatic events. How did you decide on the most key sounds for those quiet moments?
Oliver Tarney: When Sam described how it was going to be filmed, it was expected that people would comment on how it was made from a technical perspective. But for Sam, it’s a story about the friendship between these two men and the courage and sacrifice that they show. Because of this, it was important to have those quieter moments when you aren’t just engaged in full-tilt action the whole time.

The other factor is that the film had no edits — or certainly no obvious edits (which actually meant many edits) — and was incredibly well-rehearsed. It would have been a dangerous thing to have had everything playing aggressively the whole way through. I think it would have been very fatiguing for the audience to watch something like that.

Rachael Tate: Also, you can’t rely on a cut in the normal way to inform pace and energy, so you are using things like music and sound to sort of ebb and flow the energy levels. So after the plane crash, for example, you’ll notice it goes very quiet, and also with the mine collapse, there’s a huge section of very little sound, and that’s on purpose so your ears can reacclimatize.

Absolutely, and I feel like that’s a good way to go — not to oversaturate the audience with the extreme end of the sound design. In other interviews, you said that you didn’t want it to seem overly processed.
Tarney: Well, we didn’t want the weapons to sound heroic in any way. We didn’t want it to seem like they were enjoying what they were doing. It’s very realistic; it’s brutal and harsh. Certainly, Schofield does shoot at people, but it’s out of necessity rather than enjoying his role there. In terms of dynamics, we broke the film up into a series of arcs, and we worked out that some would be five minutes, some would be nine minutes and so on.

In terms of the guns, we went more naturalistic in our recordings. We wanted the audience to feel everything from their perspective — that’s what Sam wanted with the entire film. Rather than having very direct recordings, we split our energies between that and very ambient recordings in natural spaces to make it feel more realistic. The distance that enemy fire was coming from is much more realistic than you would normally play in a film, and the same goes for the biplane recordings. We had microphones all across airfields to get that lovely phase-y kind of sound. For the dogfight with the planes, we sold the fact that you’re watching Blake and Schofield watching the dogfight rather than being drawn directly to the dogfight. I guess it was trying to mirror the visual, which would stick with the two leads.

Tate: We did the same with the crowd. We tried to keep it more realistic by using half actual territorial army guys, along with voice actors, rather than just being a crowdy-sounding crowd. When we put that into the mix, we also chose which bits to focus on — Sam described it as wanting it to be like a vignette, like an old photo. You have the brown edging that fades away in the corners. He wanted you to zoom in on them so much that the stuff around them is there, but at the level they would hear it. So, if there’s a crowd on the screen further back from them, in reality you wouldn’t really hear it. In most films you put something in everyone’s mouth, but we kept it pared right back so that you’re just listening to their voices and their breaths. This is similar to how it was done with the guns and effects.

You said you weren’t going for any Hollywood-type effects, but I did notice that there are some psychoacoustic cues, like when a bomb goes off in the bunker, and, I think, a tinnitus-type effect.
Tarney: There are a few areas where you have to go with a more conventional film language. When the plane’s very close — on the bridge perhaps — once he’s being fired upon, we start going into something that’s a little more conventional, and then we settle back into him. It was that thing that Sam mentioned, which was subjectivity, objectivity; you can flip between them a little bit, otherwise it becomes too linear.

Tate: It needed to pack a punch.

Foley plays a massive part in this production. Assuming you used period weaponry and vehicles?
Tarney: Sam was so passionate about this project. When you visited the sets, the detail was just beautiful. They set the bar in terms of what we had to achieve realism-wise. We had real World War I rifles and machine guns, both British and German, and biplanes. We also did wild track Foley at the first trench and the last trench: the muddy trench and then the chalk one at the end.

Tate: We even put Blakeys on the boots.

Tarney: Yes, we bought various boots with different hobnails and metal tips.

That’s what a Blakey is?
Tate: The metal things that they put in the bottom of their shoes so that they didn’t slip around.

Tarney: And we went over the various surfaces and found which worked the best. Some were real hobnail boots, and some had metal stuck into them. We still wanted each character to have a certain personality; you don’t want everything sounding the same. We also recorded them without the nails, so when we were in a quieter part of the film, it was more like a normal boot. If you’d had that clang, clang, clang all the way through the film…

Tate: It would throw your attention away from what they were saying.

Tarney: With everything we did on the Foley, it was important to keep focus on them the whole time. We would work in layers, and as we would build up to one of the bigger events, we’d start introducing the heavier, more detailed Foley and take away the more diffuse, mellow Foley.

You only hear webbing and that kind of stuff at certain times because it would be too annoying. We would start introducing that as they went into more dangerous areas. You want them to feel conspicuous, too — when they’re in no man’s land, you want the audience to think, “Wow, there are two guys, alone, with absolutely no idea what’s out there. Is there a sniper? What’s the danger?” So once you start building up that tension, you make them a little bit louder again, so you’re aware they are a target.

How much ADR did the film require? I’m sure there was a lot of crew noise in the background.
Tate: Yes, there was a lot of crew noise — there were only two lines of “technical” ADR, which is when a line needs to be redone because the original could not be used/cleaned sufficiently. My priority was to try and keep as much production as possible. Because we started a couple of weeks after shooting started, and as they were piecing it together, it was as if it was locked. It’s not the normal way.

With this, I had the time to go deep and spectrally remove all the crew feet from the mics because they had low-end thuds on their clip mics, which couldn’t be avoided. The recordist, Stuart Wilson, did a great job, giving me a few options with the clip mics, and he was always trying to get a boom in wherever he could.

He had multiple lavaliers on the actors?
Tate: Yes, he had up to three on both those guys most of the time, and we went with the one on their helmets. It was like a mini boom. But, occasionally, they would get wind on them and stuff like that. That’s when I used iZotope RX 7. It was great having the time to do it. Ordinarily people might say, “Oh no, let’s ADR all the breaths there,” but I could get the breaths out. When you hear them breathing, that’s what they were doing at the time. There’s so much performance in them, I would hate to get them standing in a studio in London, you know, in jeans, trying to recreate that feeling.
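(A technical aside: RX’s spectral repair is proprietary, but the basic move Tate describes, attenuating short low-frequency bursts in a time-frequency view while leaving dialogue intact, can be sketched with a short-time Fourier transform. A toy mono-audio version, assuming scipy, with invented thresholds:)

```python
import numpy as np
from scipy.signal import stft, istft

def duck_low_end(audio: np.ndarray, sr: int, cutoff_hz: float = 120.0,
                 ratio: float = 4.0) -> np.ndarray:
    """Attenuate transient low-frequency bursts (e.g., foot thuds on a clip
    mic) while leaving the band above the cutoff untouched."""
    f, _, Z = stft(audio, fs=sr, nperseg=2048)
    low = f < cutoff_hz
    mag = np.abs(Z[low])
    floor = np.median(mag, axis=1, keepdims=True)   # per-bin steady-state level
    burst = mag > ratio * floor                     # cells well above the floor
    Z[low] = np.where(burst, Z[low] * 0.1, Z[low])  # duck only those cells
    _, cleaned = istft(Z, fs=sr, nperseg=2048)
    return cleaned[: len(audio)]
```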

So even if there’s slight artifacting, the littlest bit, you’d still go with that over ADR?
Tate: Absolutely. I would hope there’s not too much there though.

Tarney: Film editor Lee Smith and Sam have such a great working relationship; they really were on the same page putting this thing together. We had a big decision to make early on: Do we risk being really progressive and organize Foley recording sessions whilst they were still filming? Because, if everything was going according to plan, they were going to be really hungry for sound since there was no cutting once they had chosen the takes. If it didn’t go to plan, then we’d be forever swapping out seven-minute takes, which would be a nightmare to redo. We took a gamble and budgeted to spend the resources front heavy, and it worked out.

Tate: Lee Smith used to be a sound guy, which didn’t hurt.

I saw how detailed they were with the planning. The model of the town for figuring out the trajectory of the flare for lighting, for example.
Tate: They also mapped out the trenches so they were long enough to cover the amount of dialogue the actors were going to say — so the trenches went on for 500 yards. Before that, they were on theater stages with cardboard boxes to represent trenches, walking through them again and again. Everything was very well-planned.

Apart from dialogue and breaths, were there any pleasant surprises from the production audio that you were able to use in the final cut?
Tate: In the woods, toward the end of the film, Schofield stumbles out of the river and hears singing, and the singing that you hear is the guy doing it live. That’s the take. We didn’t get him in to sing and then put it on; that’s just his clip mic, heavily affected. We actually took his recording out into the New Forest, which is south of London.

A worldizing-type technique?
Tate: Yes, we found a remote part, and we played it and recorded it from different distances, and we had that woven against the original with a few plugins on it for the reverbs.

Tarney: We don’t know if Schofield is concussed and if he’s hallucinating. So we really wanted it to feel sort of ethereal, sort of wafting in and out on the wind — is he actually hearing this or not?

Tate: Yeah, we played the first few lines out of sequence, so you can’t really catch if there’s a melody. Just little bits on the breeze so that you’re not even quite sure what you’re hearing at that point, and it gradually comes to a more normal-sounding tune.

Tarney: Basically, that’s the thing with the whole film; things are revealed to the audience as they’re revealed to the lead characters.

Tate: There are no establishing shots.

Were there any elements of the sound design you wouldn’t expect to be in there that worked for one reason or another?
Tarney: No, there’s nothing… we were pretty accurate. Even the first thing you hear in the film — the backgrounds that were recorded in April.

Tate: In the field.

Tarney: Rachael and I went to Ypres in Belgium to visit the World War I museum and immerse ourselves in that world a little bit.

Tate: We didn’t really know that much about World War I. It wasn’t taught in my school, so I really didn’t know anything before I started this; we needed to educate ourselves.

Can you talk about the loop groups and dialing down to the finest details in terms of the vocabulary used?
Tate: Oh, God, I’ve got so many books, and we got military guys for that sort of flat way they operate. You can’t really explain that fresh to a voice actor and get them to do it properly. But the voice actors helped those guys perform and get out of their shells, and the military guys helped the voice actors in showing them how it’s done.

I gave them all many sheets of key words they could use, or conversation starters, so that they could improvise but stay on the right track in terms of content. Things like slang, poems from a cheap newspaper that was handed out to the soldiers. There was an officer’s manual, so I could tell them the right equipment and stuff. We didn’t want to get anything wrong.

That reminds me of this series of color photographs taken in the early 1900s in Russia. Automatically, it brings you so much closer to life at that point in time. Do you feel like you were able to achieve that via the sound design of this film?
Tarney: I think the whole project did that. When you’ve watched a film every day for six months, day in and day out, you can’t help but think about that era more, and it’s slightly embarrassing that it’s one generation past your grandparents.

How much more worldizing did you do, apart from the nice moment with the song?
Tarney: The Foley that you hear in the trench at the beginning and in the trench at the end is a combination between worldizing and sound designer Mike Fentum’s work. We both went down about three weeks before we started because Stuart Wilson gave us a heads up that they were wrapping at that location, so we spoke to the producer, and he gave us access.

So, in terms of worldizing, it’s not quite worldizing in the conventional sense of taking a recording and then playing it in a space. We actually went to the space and recorded the feet in that space, and the Foley supervisor Hugo Adams went to Salisbury Plain (the chalk trench at the end), and those were the first recordings that we edited and gave to Lee Smith. And then, we would get the two Foley artists that we had — Andrea King and Sue Harding — to top that with a performed pass against a screen. The whole film is layered between real recordings and studio Foley, and it’s the blend of natural presence and the performed studio Foley, with all the nuance and detail that you get from that.

Tate: Similarly, the crowd that we recorded out on a field in the back lot of Shepperton, with a 50 array; we did as much as we could without a screen with them just acting and going through the motions. We had an authentic World War I stretcher, which we used with hilarious consequences. We got them to run up and down carrying their friends on stretchers and things like that and passing enormous tables to each other and stuff so that we had the energy of it. There is something about recording outside and that sort of natural slap that you get off the buildings. It was embedded with production quite seamlessly really, and you can’t really get the same from a studio. We had to do the odd individual line in there, but most of it was done out in a field.

When need be, were you using things like convolution reverbs, such as Audio Ease Altiverb, in the mix?
Tarney: Absolutely. As good as the recordings were, it’s only when you put it against picture that you really understand what it is you need to achieve. So we would definitely augment with a lot — Altiverb is a favorite. Re-recording mixer Mark Taylor and I, we would use that a lot to augment and just change perspective a little bit more.
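(Another aside: at its core, a convolution reverb like Altiverb convolves the dry signal with an impulse response captured in a real space. Stripped of every product control, that reduces to an FFT multiply. A minimal numpy sketch follows; the wet/dry mix is an invented parameter, not anything from Altiverb:)

```python
import numpy as np

def convolution_reverb(dry: np.ndarray, ir: np.ndarray, mix: float = 0.4) -> np.ndarray:
    """Convolve a dry mono recording with a room impulse response (IR)."""
    n = len(dry) + len(ir) - 1
    wet = np.fft.irfft(np.fft.rfft(dry, n) * np.fft.rfft(ir, n), n)
    wet *= np.max(np.abs(dry)) / (np.max(np.abs(wet)) + 1e-12)  # level match
    out = np.zeros(n)
    out[: len(dry)] = (1.0 - mix) * dry
    return out + mix * wet  # dry signal plus the space's tail
```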

Can you talk about the Atmos mix and what it brought to the film?
Tarney: I’ve worked on many films with Atmos, and it’s a great tool for us. Sam’s very performance-orientated and would like things to be more screen-focused. The minute you have to turn around, you’ve lost that connection with the lead characters. So, in general, we kept things a little more front-loaded than we might have done with another director, but I really liked the results. It’s actually all the more shocking when you hear the biplane going overhead when they’re in no man’s land.

Sam wanted to know all the way through, “Can I hear it in 5.1, 7.1 and Atmos?” We’d make sure that in the three mixes — other than the obvious — we had another plane coming over from behind. There’s not a wild difference in Atmos. The low end is nicer, and the discrete surrounds play really well, but it’s not a showy kind of mix in that sense. That would not have been true to everything we were trying to achieve, which was something real.

So Sam Mendes knows sound?
Tarney: He’s incredibly hungry to understand everything, in the best way possible. He’s very good at articulating what he wants and makes it his business to understand everything. He was fantastic. We would play him a section in 5.1, 7.1 and Atmos, and he would describe what he liked and disliked about each format, and we would then try to make each format have the same value as the other ones.


Patrick Birk is a musician and sound engineer at Silver Sound, a boutique sound house based in New York City.

Quantum to acquire Western Digital’s ActiveScale business  

Quantum has entered into an agreement with Western Digital Technologies, a subsidiary of Western Digital Corp., to acquire its ActiveScale object storage business. The addition of the ActiveScale product line and engineers brings object storage software and erasure coding technology to Quantum’s portfolio and helps the company to expand in the object storage market.

The acquisition will extend the company’s role in storing and managing video and unstructured data using a software-defined approach. The transaction is expected to close by March 31, 2020. Financial terms of the transaction were not disclosed.

What are the benefits of object storage software?
• Scalability: Users can store, manage and analyze billions of objects and exabytes of capacity.
• Durability: ActiveScale object storage offers up to 19 nines of data durability using patented erasure coding protection technologies (see the toy sketch below for the basic idea).
• Manageability at scale: Because object storage has a flat namespace (compared to a hierarchical file system structure), managing billions of objects and hundreds of petabytes of capacity is easier than using traditional network-attached storage. This, according to Quantum, reduces operational expenses.
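To give a feel for how erasure coding buys that durability, here is a toy single-parity example in Python. ActiveScale’s patented scheme is far more sophisticated (real codes survive multiple simultaneous shard losses across drives and sites), but the recovery idea is the same:

```python
from functools import reduce

def encode(data: bytes, k: int = 3) -> list[bytes]:
    """Split an object into k data shards plus one XOR parity shard."""
    data += b"\x00" * ((-len(data)) % k)          # pad to a multiple of k
    size = len(data) // k
    shards = [data[i * size:(i + 1) * size] for i in range(k)]
    parity = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*shards))
    return shards + [parity]

def rebuild(surviving: list[bytes]) -> bytes:
    """XOR of all surviving shards reproduces the single lost shard."""
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*surviving))

shards = encode(b"some video segment")
lost = shards.pop(1)                # lose any one shard...
assert rebuild(shards) == lost      # ...and rebuild it exactly
```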

Quantum has been offering object storage and selling and supporting the ActiveScale product line for over five years. Object storage can be used as an active-archive tier of storage — where StorNext file storage is used for high-performance ingest and processing of data, object storage acts as a lower-cost online content repository, and tape acts as the lowest-cost cold storage tier.

For M&E, object storage is used as a long-term content repository for video content — in movie and TV production, in sports video and even for large corporate video departments. Movie and TV production requires very high-performance ingest, editing, processing and rendering of video files, which is typically done with a file system like StorNext. Once content is finished, it is preserved in an object store, with StorNext data management handling the movement between the file and object tiers.
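In practice that movement is policy-driven. Purely as an illustration, with the class, field names and thresholds invented rather than taken from StorNext’s API, the tiering decision reduces to something like this:

```python
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    status: str              # "in_production" or "finished"
    days_since_access: int

def target_tier(asset: Asset) -> str:
    """Route each asset to the cheapest tier that still meets its access needs."""
    if asset.status == "in_production":
        return "file"        # high-performance file system for ingest/edit/render
    if asset.days_since_access > 180:
        return "tape"        # coldest, cheapest tier
    return "object"          # lower-cost online repository for finished content

print(target_tier(Asset("ep101_final.mov", "finished", 12)))  # -> "object"
```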

“Object storage software is an obvious fit with our strategy, our go-to-market focus and within our technology portfolio,” says Jamie Lerner, president/CEO of Quantum. “We are committed to the product, and to making ActiveScale customers successful, and we look forward to engaging with them to solve their most pressing business challenges around storing and managing unstructured data. With the addition of the engineers and scientists that developed the erasure-coded object store software, we can deliver on a robust technical roadmap, including new solutions like an object store built on a combination of disk and tape.”

Visible Studios produces, posts Dance Monkey music video

If you haven’t heard the song Dance Monkey by Tones and I, you soon will. Australia’s Visible Studios provided production and post on the video for the song, which has hit number one in more than 30 countries, gone seven times platinum and remained at the top of the charts in Australia for 22 weeks. The video has been viewed on YouTube more than half a billion times.

Visible Studios, a full production and post company, is run by producer Tim Whiting and director and editor Nick Kozakis. The company features a team of directors, scriptwriters, designers, motion graphic artists and editors working on films, TV commercials and music videos.

For Dance Monkey, Visible Studios worked directly with Tones and I to develop the idea for the video. The video, which was shot on Red cameras at the beginning of the song’s meteoric rise, was completed in less than a week and on a small budget.

“The Dance Monkey music video was made on an extremely quick turnaround,” says Whiting. “[Tones] was blowing up at the time, and they needed the music video out fast. The video was shot in one day, edited in two, with an extra day and a half for color and VFX.” Visible Studios called on Blackmagic’s DaVinci Resolve Studio for the edit, VFX and color.

Dance Monkey features the singer dressed as Old Tones, an elderly man whisked away by his friends to a golf course to dance and party. On the day of production, the sun was nowhere to be found, and each shot was done against a gray and dismal background. To fix this, the team brought in a sky image as a matte and used Resolve’s match move tool, keyer, lens blur and power windows to turn gray footage into brilliant sunshine.

“In post we decided to replace the overcast skies with a cloudy blue sky. We ended up doing this all in Resolve’s color page and keyed the grass and plants to make them more lush, and we were there,” says Whiting.
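Resolve’s qualifier, match move and power windows did the real work here, but the heart of a sky key is easy to sketch: select bright, low-saturation pixels and blend in the replacement plate. A toy numpy version with invented thresholds, nowhere near a production keyer:

```python
import numpy as np

def replace_sky(frame: np.ndarray, sky: np.ndarray,
                luma_min: float = 0.75, softness: float = 0.08) -> np.ndarray:
    """Composite a sky plate over bright, desaturated (gray-sky) pixels.
    frame and sky are float RGB images in [0, 1] with the same shape."""
    luma = frame @ np.array([0.2126, 0.7152, 0.0722])        # Rec. 709 luma
    sat = frame.max(axis=-1) - frame.min(axis=-1)            # crude saturation
    matte = np.clip((luma - luma_min) / softness, 0.0, 1.0)  # soft luma key
    matte *= sat < 0.12                                      # only gray pixels
    matte = matte[..., None]                                 # broadcast to RGB
    return (1.0 - matte) * frame + matte * sky
```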

Editor/directors Kozakis and Liam Kelly used Resolve for the entire editing process. “Being able to edit 6K raw footage smoothly on a 4K timeline, at a good quality debayer, means that we don’t have to mess around with proxies and that the footage gets out of the way of the editing process. The recent update for decompression and debayer on Nvidia cards has made this performance even better,” Kozakis says.

 

Missing Link, The Lion King among VES Award winners

The Visual Effects Society (VES), the industry’s global professional honorary society, held its 18th Annual VES Awards, the yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues.

Comedian Patton Oswalt served as host for the 9th time to the more than 1,000 guests gathered at the Beverly Hilton to celebrate VFX talent in 25 awards categories. The Lion King was named the photoreal feature winner, garnering three awards. Missing Link was named top animated film, winning two awards. The Mandalorian was named best photoreal episode and garnered two awards, with Game of Thrones and Stranger Things 3 also winning two awards each. Hennessy: The Seven Worlds topped the commercial field with two wins.

Andy Serkis presented the VES Award for Creative Excellence to visual effects supervisor Sheena Duggal. Joey King presented the VES Visionary Award to director-producer-screenwriter Roland Emmerich. VFX supervisor Pablo Helman presented the Lifetime Achievement Award to director/producer/screenwriter Martin Scorsese, who accepted via video from New York. Scorsese’s The Irishman also picked up two awards, including Outstanding Supporting Visual Effects in a Photoreal Feature.

Presenters also included: directors J.J. Abrams, Jon Favreau, Rian Johnson and Josh Cooley.

Winners of the 18th Annual VES Awards are as follows:

Outstanding Visual Effects in a Photoreal Feature
THE LION KING
Robert Legato, Tom Peitzman, Adam Valdez, Andrew R. Jones

Outstanding Supporting Visual Effects in a Photoreal Feature
THE IRISHMAN
Pablo Helman, Mitchell Ferm, Jill Brooks, Leandro Estebecorena, Jeff Brink

Outstanding Visual Effects in an Animated Feature
MISSING LINK
Brad Schiff, Travis Knight, Steve Emerson, Benoit Dubuc

Outstanding Visual Effects in a Photoreal Episode
THE MANDALORIAN; The Child
Richard Bluff, Abbigail Keller, Jason Porter, Hayden Jones, Roy K. Cancino

Outstanding Supporting Visual Effects in a Photoreal Episode
CHERNOBYL; 1:23:45
Max Dennison, Lindsay McFarlane, Clare Cheetham, Paul Jones, Claudius Christian Rauch

Outstanding Visual Effects in a Real-Time Project
Control
Janne Pulkkinen, Elmeri Raitanen, Matti Hämäläinen, James Tottman

Outstanding Visual Effects in a Commercial
Hennessy: The Seven Worlds
Carsten Keller, Selçuk Ergen, Kiril Mirkov, William Laban

Outstanding Visual Effects in a Special Venue Project
Star Wars: Rise of the Resistance
Jason Bayever, Patrick Kearney, Carol Norton, Bill George

Outstanding Animated Character in a Photoreal Feature
ALITA: BATTLE ANGEL; Alita
Michael Cozens, Mark Haenga, Olivier Lesaint, Dejan Momcilovic

Outstanding Animated Character in an Animated Feature
MISSING LINK; Susan
Rachelle Lambden, Brenda Baumgarten, Morgan Hay, Benoit Dubuc

Outstanding Animated Character in an Episode or Real-Time Project
STRANGER THINGS 3; Tom/Bruce Monster
Joseph Dubé-Arsenault, Antoine Barthod, Frederick Gagnon, Xavier Lafarge

Outstanding Animated Character in a Commercial
Cyberpunk 2077; Dex
Jonas Ekman, Jonas Skoog, Marek Madej, Grzegorz Chojnacki

Outstanding Created Environment in a Photoreal Feature
THE LION KING; The Pridelands
Marco Rolandi, Luca Bonatti, Jules Bodenstein, Filippo Preti

Outstanding Created Environment in an Animated Feature
TOY STORY 4; Antiques Mall
Hosuk Chang, Andrew Finley, Alison Leaf, Philip Shoebottom

Outstanding Created Environment in an Episode, Commercial, or Real-Time Project
GAME OF THRONES; The Iron Throne; Red Keep Plaza
Carlos Patrick DeLeon, Alonso Bocanegra Martinez, Marcela Silva, Benjamin Ross

Outstanding Virtual Cinematography in a CG Project
THE LION KING
Robert Legato, Caleb Deschanel, Ben Grossmann, AJ Sciutto

Outstanding Model in a Photoreal or Animated Project
THE MANDALORIAN; The Sin; The Razorcrest
Doug Chiang, Jay Machado, John Goodson, Landis Fields IV

Outstanding Effects Simulations in a Photoreal Feature
STAR WARS: THE RISE OF SKYWALKER
Don Wong, Thibault Gauriau, Goncalo Cababca, François-Maxence Desplanques

Outstanding Effects Simulations in an Animated Feature
FROZEN 2
Erin V. Ramos, Scott Townsend, Thomas Wickes, Rattanin Sirinaruemarn

Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project
STRANGER THINGS 3; Melting Tom/Bruce
Nathan Arbuckle, Christian Gaumond, James Dong, Aleksandr Starkov

Outstanding Compositing in a Feature
THE IRISHMAN
Nelson Sepulveda, Vincent Papaix, Benjamin O’Brien, Christopher Doerhoff

Outstanding Compositing in an Episode
GAME OF THRONES; The Long Night; Dragon Ground Battle
Mark Richardson, Darren Christie, Nathan Abbott, Owen Longstaff

Outstanding Compositing in a Commercial
Hennessy: The Seven Worlds
Rod Norman, Guillaume Weiss, Alexander Kulikov, Alessandro Granella

Outstanding Special (Practical) Effects in a Photoreal or Animated Project
THE DARK CRYSTAL: THE AGE OF RESISTANCE; She Knows All the Secrets
Sean Mathiesen, Jon Savage, Toby Froud, Phil Harvey

Outstanding Visual Effects in a Student Project
THE BEAUTY
Marc Angele, Aleksandra Todorovic, Pascal Schelbli, Noel Winzen

Review: Neat Video 5 noise reduction plugin

By Brady Betzel

One of the best (and most underrated) tricks in an online editor’s tool kit is to have good image restoration techniques. Removing digital video imperfections — from flicker to digital video noise — is not easy, and not easy to do well. That is, unless you have good noise reduction software like Neat Video.

While Neat Video might not be that well-known, once you see how simply (or intricately) Neat Video 5 works inside of apps like Blackmagic’s DaVinci Resolve, it will be hard to forget the company’s name.

(While the software was recently updated to 5.1.5 — with expanded GPU support as well as support for new versions of Resolve, Adobe and Nuke — nothing really changes for this review. You can check out a detailed list of the updates here.)

Neat Video 5 is a noise reduction plugin. In a Windows OS environment, Neat Video is compatible with apps like Adobe After Effects, Adobe Premiere Pro, DaVinci Resolve, Avid Media Composer, Vegas, Magix, Edius, Virtual Dub, and the OFX-compatible apps Nuke, Fusion, Scratch, HitFilm, Mamba, Natron, Flame, Baselight and DustBuster. In a macOS environment, Neat Video 5 is compatible with After Effects, Premiere, Final Cut Pro X, Motion 5, OFX, Resolve and Media Composer. In Linux, the software is compatible with OFX-compatible apps and Resolve.

Neat Video 5 comes in three flavors: Demo, Home and Pro. The Demo version works in up to 1280×720 resolution with a watermark. Home is literally made for the home user: It will process video up to 1920×1080 resolutions, it will use up to one GPU, and it is for non-commercial use. The cost is just $74.90 for most apps (Resolve is $89.90). The Pro version has no resolution restrictions, will work on two or more GPUs simultaneously, and can be used commercially. The Pro version starts at $129.90 per app ($159.90 for Resolve). Because Neat Video 5 for OFX works with so many apps, it only comes in Pro ($249.90) and Studio ($349.90) versions. The Studio version adds the ability for a floating license. You can see all of the pricing details here.

If there is one line you should take away from this review, it is this: Neat Video 5 is by far the easiest and best noise reduction software I have used in any application to date. And while this review is focusing on the Resolve version of Neat Video 5, all other apps work in much the same way. You can find Neat Video’s software-specific Quick Start Guides to help. Once you install and register your Neat Video 5 license, removing digital video noise is as easy as applying Neat Video 5 to a node in the color tab, clicking on “Prepare Noise Profile,” clicking on “Auto Profile,” and clicking “Apply.” Then, unless you want to fine-tune your noise reduction, you are done. Obviously, I have somewhat simplified how Neat Video 5 works, but essentially it can be done in as little as three steps per clip, and the results are typically amazing. If they aren’t amazing, you can jump back into Neat Video 5 and manually adjust specifics until the noise reduction looks correct. But I will say that in about 90% of cases, the Auto Profiling will do all of the noise reduction work necessary.

For tinkerers, or for those who need to go far beyond an Auto Profile, you can manually adjust your settings. But taking a step back, Neat Video needs an area of your image that has a uniform color and noise profile to process how it removes noise. The automatic profiling will do its best to find an area, but it doesn’t always work. What you need to keep in mind when building a good noise profile inside of Neat Video is that the area being processed needs to be as uniform as possible (think dark night sky or a wall painted in one color) — meaning no prominent features, a high noise level (something in the high four area is better), the largest possible sample area and no warnings from Neat Video.
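Neat Video’s profiler is proprietary, but what it looks for can be approximated: a patch with almost no image structure and a measurable noise floor. A rough scoring sketch, assuming scipy, with the blur size and metrics invented for illustration:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def score_patch(patch: np.ndarray) -> tuple[float, float]:
    """Return (noise_level, structure) for a candidate profiling patch.
    Good patches: a measurable noise_level and near-zero structure."""
    smooth = uniform_filter(patch.astype(float), size=9)  # heavy box blur
    noise_level = (patch - smooth).std()   # residual ~ sensor noise sigma
    gy, gx = np.gradient(smooth)           # gradients of the smoothed image
    structure = np.hypot(gy, gx).mean()    # real detail that would bias a profile
    return noise_level, structure
```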

So, if your automatic profile doesn’t do the job, you can find an area of your image that meets the above requirements and then build a profile. From there you can use one of the Neat Video 5 features, like “Profile Check.” Profile Check will highlight details that aren’t being affected by Neat Video, giving you a clear representation of what noise is being reduced and whether you need to adjust your profile to better reduce video noise.

At this point you might be wondering where you tweak advanced settings. When you load Neat Video, you will be in Beginner mode. To get into Advanced mode, go to the Tools menu, where you will see a lot of advanced functions that can help you fine-tune your noise profile. And if you still can’t get a good noise reduction profile, you can try the “Generic Profile,” which can help you build a profile even if your video doesn’t have a large enough area of uniform noise. There are also presets — for light flicker, moiré flicker, repeat-frame issues, dust and scratches (including scan lines), jitter of details, artifact removal and more — that can solve certain problems.

Neat Video 5 is faster than previous generations. As in previous versions, there is a tool inside of Neat Video’s preferences that will run your CPU and GPU through a benchmark to determine whether you should run on CPU only, GPU only or a combination of both. In Neat Video 5, if you have trouble with a clip, you can use up to four “Variants” of noise reduction in the new playback window to see how each profile works with your clip.

In terms of playback and rendering, noise reduction is never fast. However, inside of Neat Video the new playback window will typically play back your footage to preview the noise reduction before you jump back into Resolve. Inside of Resolve, even in just 1080p, my sequence would crawl to just a few frames of playback per second. It is one of the most processor- and GPU-intensive tasks you will run on your computer.

In my testing I applied Neat Video 5 to the first node in my color correction tree, followed by a basic color correction, in a one-minute timeline. I took those same clips and compared my Neat Video results to Resolve’s Temporal and Spatial noise reduction tools. In terms of visual results, Neat Video 5 was superior. If that’s not the case for you, jump into YCbCr viewer mode inside of Neat Video 5, isolate each channel and tweak it individually, so you only process the channels that need it. Not only did Neat Video 5 handle normal noise in the shadows well, but on clips with very tight lines it was able to keep a lot of the detail while removing the noise. Resolve’s noise reduction tools had a harder time removing noise while keeping detail. Temporal noise reduction really didn’t do much, and while Spatial noise reduction did work, it would heavily blur and distort the image — essentially not acceptable.

To get a good example of how Neat Video 5 slams a computer system, I exported 1080p MP4s. Resolve’s built-in Temporal noise reduction took 1:03, while the Spatial noise reduction took 1:05. The Neat Video 5 render of the same one-minute timeline took 3:51 — almost four times as long. I was curious how much longer a 4K render would take. Using 4K (UHD) media, I applied a simple color correction on one serial node and Neat Video 5 on the node before it. I exported a 4K (UHD) MP4, which took 52 seconds without Neat Video 5 applied and 16:27 with it applied — nearly 19 times the render time! So while Neat Video 5 is an amazing tool, there is a trade-off in high render times.

To find additional training on more advanced noise reduction techniques in Neat Video, check out the video tutorials. I find myself watching these just because of how much you can learn about noise reduction in general. They aren’t as exciting as watching Game of Thrones or The Handmaid’s Tale, but they will push your knowledge in noise reduction to the next level.

Summing Up
I’ve used Neat Video for a while, so when I was approached to review Version 5, I immediately said yes. Noise reduction is a post skill that not many possess.

If you are an online editor or colorist looking to separate yourself from the pack, learn all the noise reduction techniques you can and definitely check out Neat Video 5. Not only can Neat Video 5 work automatically, but you can fine-tune your noise reduction as much as you want.

And when demoing your color correction services, think about using Neat Video 5 to remove camera noise, flickering and chroma issues; color correcting your footage; and, finally, adding some grain back into your shot. Not only will your footage look better, but you’ll have a technical workflow that will definitely impress clients. Just don’t forget to account for the extra render time.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Oscar-nominated Jojo Rabbit editor Tom Eagles: blending comedy and drama

By Daniel Restuccio

As an editor, Tom Eagles has done it all. He started his career in New Zealand cutting promos before graduating to assistant editor then editor on television series such as Secret Agent Men and Spartacus. Eventually he connected with up-and-coming director Taika Waititi and has worked with him on the series What We Do in the Shadows and the critically acclaimed feature Hunt for the Wilderpeople. Their most recent feature collaboration, 20th Century Fox’s Jojo Rabbit, earned Eagles BAFTA and Oscar nominations as well as an ACE Eddie Award win.

Tom Eagles

We recently caught up with him to talk about the unique storytelling style of Taika films, specifically Jojo Rabbit.

(Warning: If you haven’t seen the film yet, there might be some spoilers ahead.)

How did your first conversation with Taika go?
Fairly early on, unprompted, he gave me a list of his top five favorite films. The kind of scope and variety of it was startling, but they were also my top five favorite films. We talked about Stalker, from filmmaker Andrei Tarkovsky, and I was a massive Tarkovsky fan at the time. He also talked about Annie Hall and Badlands.

At that point in time, there weren’t a lot of people doing the type of work that Taika does: that mix of comedy and drama. That was the moment I thought, “I’ve got to work with this guy. I don’t know if I’m going to find anyone else like this in New Zealand.”

How is Jojo different than your previous collaboration on Hunt for the Wilderpeople?
We had a lot more to work with on Jojo. There’s a lot more coverage in a typical scene, while Wilderpeople was three shots: a master and two singles. With Jojo, we just threw everything at it. Taika’s learned over the years that it’s never a bad thing to have another shot. Same goes for improv. It’s never a bad thing to have a different line. Jojo was a much bigger beast to work on.

Jojo is rooted in a moment in history, which people know well, and they’re used to a certain kind of storytelling around that moment. I think in the Czech Republic, where we shot, they make five World War II movies a year. They had a certain idea of how things should look, and we weren’t doing that. We were doing Taika’s take, so we weren’t doing desaturated, handheld, grim, kitchen sink realism. We were creating this whole other world. I think the challenge was to try and bring people along on that journey.

I saw an early version of the script, and the Hitler character wasn’t in the opening scene. How did that come about?
One of the great things about working with Taika is he always does pick-ups. Normally, it’s something that we figure out that we need during the process of the edit. He rewrote a bunch of different options for the ending of the movie, a few scenes dotted throughout and the opening of the film.

He shot three versions. In one, it was just Jojo on his own, trying to psych himself up. Then there were variations on how much Adolf we would have in the film. What we found when we screened the film up to that point was that people were on board with the film, but it sometimes took them a while to get there … to understand the tone of the film. The moment we put imaginary Adolf in that scene, it was like planting a flag and saying, “This is what this film is. It’s going to be a comedy about Hitler and Nazis, and you’re either with us or you’re walking out, but if you’re with us, you will find out it’s about a lot more than that.”

Some directors sit right at the editor’s elbow, overlooking every cut, and some go away and leave the editor to make a first cut. What was this experience like?
While I’ve experienced both, Taika’s definitely in the latter category. He’s interested in what you have to say and what you might bring to the edit. He also wants to know what people think, so we screen the film a lot. Across the board — it’s not just isolated to me, but anyone he works with — he just wants more ideas.

After the shooting finished, he gave me two weeks. He went and had a break and encouraged me to do what I wanted with the assembly, to cut scenes and to not be too precious about including everything. I did that, but I was still relatively cautious; there were some things I wanted him to see.

We experimented with various structures. We tried an archiving thing for the end of the film. There was a fantasy sequence in which Elsa is talking about the story of the Jews, and we see flights of fancy of what she thinks … a way for her to escape into fantasy. That was an idea of Taika’s. He just left me to it for a couple of weeks, and we looked at it and decided against it in the end. It was a fun process because when he comes back, he’s super fresh. You offer up one idea and he throws five back.

How long was the first cut?
I asked my assistant the other day, and he said it was about two hours and forty minutes, so I guess I have to go with that, which sounds long to me. That might have been the first compile that had all of the scenes in it, and what I showed Taika was probably half an hour shorter. We definitely had a lot to play with.

Do you think there’s going to be a director’s cut?
I think what you see is the director’s cut. There’s not a version of the film that has more stuff in it than we wanted in it. I think it is pretty much the perfect direction. I might have cut a little bit more because I think I just work that way. There were definitely things that we missed, but I wouldn’t put them back in because of what we gained by taking them out.

We didn’t lean that heavily on comedy once we transitioned into drama. The longer you’re away from Jojo and Elsa, that’s when we found that the story would flounder a little bit. It’s interesting because when I initially read the script, I was worried that we would get bored of that room, and that it would feel too much like a stage play. So we added all of this color and widened the world out. We had these scenes where Jojo goes out into the world, but actually the relationship between the two of them — that’s the story. Each scene in that relationship, the kind of gradual progression toward each other, is what’s moving the story forward.

This movie messes with your expectations, in terms of where you think it’s going or even how it’s saying it. How did you go about creating your own rhythms for that style of storytelling?
I was fortunate in that I already had Taika’s other films to lean on, so partly it was just trying to wrestle this genre into his world … into his kind of subgenre of Taika. It’s really just a sensibility a lot of the time. I was aware that I wanted a breathlessness to the pace of things, especially for the first half of the movie in order to match Jojo’s slightly ADD, overexcited character. That slows down a little bit when it needs to and when he’s starting to understand the world around him a little bit more.

Can you talk about the music?
Music also was important. The needle drops. Taika had a bunch of them already. He definitely had The Beatles and Bowie, and it was fleshing out a few more of those. I think I found the Roy Orbison piece. Temp music was also really important. It was quite hard to find stuff. Taika’s brief was: I don’t want it to sound like all the other movies in the genre. As much as we respected Schindler’s List, he didn’t want it to sound like Schindler’s List.

You edited on Avid Media Composer?
We cut on Avid, and it was the first time I really used ScriptSync. I had been wary of it, to be honest. I watch all the dailies through from head to tail and see the performances in context and feel how they affect me. Once that’s done, ScriptSync is great for comparing takes or swapping out a read on a line. Because we had so much improv on this film, we had to go through and enter all of that in manually. Sometimes we’d use PhraseFind to search on a particular word that I’d remembered an actor saying in an ad-lib. It’s a much faster and more efficient way of finding that stuff.

That said, I still periodically go back and watch dailies. As the film starts to solidify, so does what I’m looking for in the dailies, so I’ll always go back and see if there’s anything that I view differently with the new cut in mind.

You mentioned the difference between Wilderpeople and Jojo in terms of coverage. How much more coverage did you have? Were there multiple cameras?
There were two and sometimes three cameras (ARRI Alexa). Some scenes were single camera, so there was a lot more material overall. Some directors get a bit iffy about two cameras, but we just rolled it.

If we had the option, we would almost always lean on the A camera, and part of the trick was to try and make it look like his other movies. We wanted the coverage plan to feel simple; it should still feel like a master, couple of mediums and a couple of singles, all in that very flat framing approach of his. Often, the characters are interacting with each other perpendicular to the camera in these fixed static wides.

Again, one of the things Taika was concerned with was that it should feel like his other movies. Just because we have a dolly, we don’t have to use it every time. We had all of those shots, we had those options, and often it was about paring things back to try and stay in time.

Does he give you a lot of takes, and does he create different emotional variations within those takes?
We definitely had a lot of takes. And, yes, there would be a great deal of variety of performance, whether it’s him just trying to push an actor and get them to a specific place, or sometimes we just had options.

Was there an average — five takes, 10 takes?
It’s really hard to say. These days everyone just does rolling resets. You look at your bin and you think, “Ah, great, they did five takes, and there’s only three set-ups. How long is it going to take me?” But you open it up, and each take is like half an hour long, and they’re reframing on the fly.

With Scarlett Johansson, you do five takes max, probably. But with the kids it would be a lot of rolling resets and sometimes feeding them lines, and just picking up little lines here and there on the fly. Then with the comedians, it was a lot of improv, so it’s hard to quantify takes, but it was a ton of footage.

If you include the archive footage, I think we had 300 to 400 hours. I’m not sure how much of that was our material, but it would’ve been at least 100 hours.

I was impressed by the way you worked the “getting real” scenes: the killing of the rabbit and the hanging scene. How did you conceptualize and integrate those really important moments?
For the hanging scene, I was an advocate for having it as early in the movie as possible. It’s the moment in the film where we’ve had all this comedy and good times [regarding] Nazis, and then it drives home that this film is about Nazis, and this is what Nazis do.

I wanted to keep the rabbit scene fun to a degree because of where it sits in the movie. I know, obviously, it’s quite a freaky scene for a lot of people, but it’s kind of scary in a genre way for me.

Something about those woods always reminds me of Stand by Me. That was the movie in my mind: those older kids, the bullies, being dicks. Moments like that, and much more so the sequence where Jojo finds Elsa, I thought of as a mini horror film within the film. It was really useful to let the scares drive it because we were so much in Jojo’s point of view. It’s taking those genres and injecting a little bit of humor or a little bit of lightness into them to keep them in tone with Taika’s overall sensibility.

I read that you tried to steer clear of the sentimentality. How did you go about doing that?
It’s a question of taste with performances, moments that other people might like but where I feel we’re feeding the audience, demanding an emotional response from them. Take the scene where Jojo finds Rosie: we shot an option where you see Rosie hanging there, and it just felt like too much, like we were bludgeoning people over the head with the horror of the moment. It was enough to see the shoes. Every time we screened the movie and Jojo stands up and we see the shoes, everyone gasps. People have gotten the information they need.


Dan Restuccio is a writer/director with Realwork Entertainment and part of the Visual Arts faculty at California Lutheran University. He is a former Disney Imagineer. You can reach him at dansweb451@gmail.com.

VFX-heavy Skyworth OLED TV spot via The-Artery

The-Artery created a spot for Skyworth’s latest version of its W81|W81 Pro Wallpaper OLED TV, which debuted last month at the “See the Wonder” event at CES 2020.

Created using The-Artery‘s newly opened Resolve-based color room and expanded design capabilities — spearheaded by colorist Stephen Picano and design director Lauren Indovina — the commercial features a couple swimming through space-like waters, children battling origami dragons while floating in a paper boat and a traveler trekking through snowy tundra while glowing jellyfish float overhead. Publicis, Skyworth’s agency, wanted the ad to reflect “the wonder” of the company’s newest television model.

“The campaign, helmed by director Eli Sverdlov, was very director-led in a way that I’ve never seen before,” explains The-Artery’s EP/MD, Deborah Sullivan. “Of course, there was still ongoing dialogue with the client and agency, but the level of creative control that was entrusted is almost unheard of. Everything was open from start to finish, including the ideation phase, color grading and design — to name a few. Our team had a lot of fun jumping straight into the edit to develop and launch what we consider as a high-end conceptual throwback to the nineties.”

Sverdlov agrees: “Our flexible creative process was in a condensed schedule and required a very unique collaboration. We were practically creating the ideas and visuals while editing and sourcing footage.”

Due to the production’s long shooting schedule and tight deadlines, the visual effects were designed via Autodesk Flame in realtime, all under one roof, while filming took place in Serbia. Additional footage was carefully curated, color graded and cut to fit the tone and flow of the rest of the piece. Creature imagery, such as the jellyfish, was created in CG.

In addition to Flame and Resolve, The-Artery called on SideFX Houdini, Autodesk Maya, Maxon’s Redshift, Otoy’s Octane, Autodesk’s Arnold, Adobe After Effects and Maxon’s Cinema 4D.

Editor David Cea joins Chicago’s Optimus  

Chicago-based production and post house Optimus has added editor David Cea to its roster. With 15 years of experience in New York and Chicago, Cea brings a varied portfolio of commercial editing experience to Optimus.

Cea has cut spots for brands such as Bank of America, Chevrolet, Exxon, Jeep, Hallmark, McDonald’s, Microsoft and Target. He has partnered with many agencies, including BBDO, Commonwealth, DDB, Digitas, Hill Holliday, Leo Burnett, Mother and Saatchi & Saatchi.

“I grew up watching movies with my dad and knew I wanted to be a part of that magical process in some way,” explains Cea. “The combination of Goodfellas and Monty Python gave me all the fuel I needed to start my film journey. It wasn’t until I took an editing class in college that I discovered the part of filmmaking I wanted to pursue. The editor is the one who gets to shape the final product and bring out the true soul of the footage.”

After studying film at Long Island’s Hofstra University, Cea met Optimus editor and partner Angelo Valencia while working as his assistant at Whitehouse New York in 2005. Cea then moved on to hone his craft further at Cosmo Street in New York. Chicago became home for him in 2013 as he spent three years at Whitehouse. After heading back east for a couple of years, he returned to Chicago to put down roots.

While Avid Media Composer is Cea’s go-to choice for editing, he is also proficient in Adobe Premiere.

Framestore launches FPS preproduction services

VFX studio Framestore has launched FPS (Framestore Pre-production Services) for the global film and content production industries. An expansion of Framestore’s existing capability, FPS is available to clients in need of standalone preproduction support or an end-to-end production solution.

The move builds out and aligns the company’s previz, virtual production, techviz and postviz services with Framestore’s art department (which operates either as part of the Framestore workflow or as a standalone creative service), virtual production team and R&D unit, and integrates with the company’s VFX and animation teams. FPS builds on work on films such as Gravity and on the knowledge gained during the company’s eight-year London joint venture with visualization company The Third Floor. It is already working on feature film projects, both as part of an integrated offering and as a standalone visualization partner, with more projects slated in the coming months.

The new team is led by Alex Webster, who joins as FPS managing director after running The Third Floor London. He will report directly to Fiona Walkinshaw, Framestore’s global managing director, film.

“This work aligns Framestore’s singular VFX and animation craft with a granular understanding of the visualization industry,” says Webster. “It marries the company’s extraordinary legacy in VFX with established visualization and emergent virtual production processes, supported by bleeding-edge technology and dedicated R&D resources to inform the nimble approach our clients need. Consolidating our preproduction services represents a significant creative step forward.”

“Preproduction is a crucial stage for filmmakers,” says chief creative officer Tim Webber. “From mapping out environments to developing creatures and characters to helping plot action sequences, it provides unparalleled freedom in terms of seeing how a story unfolds or how characters interact with the worlds we create. Bringing together our technical innovation with an understanding of filmmaking, we want to offer a bespoke service for each film and each individual to help tell compelling, carefully crafted stories.”

“Our clients’ needs are as varied as the projects they bring to us, with some needing a start-to-finish service that begins with concept art and ends in post, while others want a bespoke, standalone solution to specific creative challenges, be that in early-stage concepting, through layout and visualization or in final animation and VFX,” says Framestore CEO William Sargent. “It makes sense to bring all these services in-house — even more so when you consider how our work in adjacent fields like AR, VR and MR has helped the likes of HBO, Marvel and Warner Bros. bring their IP to new, immersive platforms. What we’ll ultimately deliver goes well beyond previz and beyond visualization.”

Main Image: (L-R) Tim Webber, Fiona Walkinshaw and Alex Webster.

CAS Awards recognize GOT, Fleabag, Ford v Ferrari, more

The CAS Awards were held this past weekend, with the sound mixing team from Ford v Ferrari — Steven A. Morrow CAS, Paul Massey CAS, David Giammarco CAS, Tyson Lozensky, David Betancourt and Richard Duarte — taking home the Cinema Audio Society Award for Outstanding Sound Mixing Motion Picture – Live Action.

Game of Thrones – The Bells

Top honors for Motion Picture – Animated went to Toy Story 4 and the sound mixing team of Doc Kane CAS, Vince Caro CAS, Michael Semanick CAS, Nathan Nance, David Boucher and Scott Curtis. The CAS Award for Outstanding Sound Mixing Motion Picture – Documentary went to Making Waves: The Art of Cinematic Sound and the team of David J. Turner, Tom Myers, Dan Blanck and Frank Rinella.

Held in the Wilshire Grand Ballroom of the InterContinental Los Angeles Downtown, the awards were presented in seven categories for Outstanding Sound Mixing Motion Picture and Television and two Outstanding Product Awards. The evening saw CAS president Karol Urban pay tribute to recently retired CAS executive board member Peter R. Damski for his years of service to the organization. The contributions of re-recording mixer Tom Fleischman CAS were recognized as he received the CAS Career Achievement Award. Presenter Gary Bourgeois spoke to Fleischman’s commitment to excellence, demonstrated in a career that spans over 40 years, nearly 200 films and collaborations with dozens of notable directors.

James Mangold

James Mangold received the CAS Filmmaker Award in a presentation that included remarks by re-recording mixer Paul Massey CAS, who was joined by Harrison Ford. Mangold had even more to celebrate as he watched his sound team take top honors for Outstanding Achievement in Sound Mixing Motion Picture – Live Action.

Here is the complete list of winners:

MOTION PICTURE – LIVE ACTION

Ford v Ferrari

Ford v Ferrari team

Production Mixer – Steven A. Morrow CAS 

Re-recording Mixer – Paul Massey CAS 

Re-recording Mixer – David Giammarco CAS 

Scoring Mixer – Tyson Lozensky

ADR Mixer – David Betancourt 

Foley Mixer – Richard Duarte

MOTION PICTURE – ANIMATED 

Toy Story 4

Original Dialogue Mixer – Doc Kane CAS

Original Dialogue Mixer – Vince Caro CAS

Re-recording Mixer – Michael Semanick CAS 

Re-recording Mixer – Nathan Nance

Scoring Mixer – David Boucher

Foley Mixer – Scott Curtis

MOTION PICTURE – DOCUMENTARY

Making Waves: The Art of Cinematic Sound

Production Mixer – David J. Turner 

Re-recording Mixer – Tom Myers 

Scoring Mixer – Dan Blanck

ADR Mixer – Frank Rinella

TELEVISION SERIES – 1 HOUR

Game of Thrones: The Bells

Production Mixer – Ronan Hill CAS 

Production Mixer – Simon Kerr

Production Mixer – Daniel Crowley 

Re-recording Mixer – Onnalee Blank CAS 

Re-recording Mixer – Mathew Waters CAS 

Foley Mixer – Brett Voss CAS

TELEVISION SERIES – 1/2 HOUR 

TIE

Barry: ronny/lily

Production Mixer – Benjamin A. Patrick CAS 

Re-recording Mixer – Elmo Ponsdomenech CAS 

Re-recording Mixer – Jason “Frenchie” Gaya 

ADR Mixer – Aaron Hasson

Foley Mixer – John Sanacore CAS

Fleabag: Episode #2.6

Production Mixer – Christian Bourne 

Re-recording Mixer – David Drake 

ADR Mixer – James Gregory

TELEVISION MOVIE or LIMITED SERIES

Chernobyl: 1:23:45

Production Mixer – Vincent Piponnier 

Re-recording Mixer – Stuart Hilliker 

ADR Mixer – Gibran Farrah

Foley Mixer – Philip Clements

TELEVISION NON-FICTION, VARIETY or MUSIC SERIES or SPECIALS

David Bowie: Finding Fame

Production Mixer – Sean O’Neil 

Re-recording Mixer – Greg Gettens

OUTSTANDING PRODUCT – PRODUCTION

Sound Devices, LLC

Scorpio

OUTSTANDING PRODUCT – POST PRODUCTION 

iZotope

Dialogue Match

STUDENT RECOGNITION AWARD

Bo Pang

Chapman University

Main Image: Presenters Whit Norris and Elisha Cuthbert with award winners Onnalee Blank, Ronan Hill and Brett Voss at the CAS Awards. (Tyler Curtis/ABImages)

Filmic DoubleTake allows multicam workflows via iPhone 11

Filmic’s DoubleTake is a new iOS app designed to turn an Apple iPhone 11 into a multicam studio. Available now from the Apple App Store, DoubleTake enables iPhone 11 users to capture video from two cameras simultaneously from a single device to create a multi-angle viewing experience.

Filmic DoubleTake allows content creators to use the multiple cameras in the iPhone 11, iPhone 11 Pro and iPhone 11 Pro Max — as well as the iPhone XR, iPhone XS and iPhone XS Max — to create the effect of using multiple camera angles in a shot.

According to Filmic, DoubleTake was designed for content creators of any skill level and for multiple genres of content — from professional broadcast-style news interviews to YouTubers capturing multiple angles during live events, concerts or any situation that requires more than one perspective to capture the moment.

Key features include:
• Multicam: Enables users to capture two different focal lengths of the same subject at the same time. DoubleTake uses the Ultra Wide lens (iPhone 11 Pro Max, 11 Pro and 11 only) and the Tele lens to capture both an establishing shot and a punch-in shot on a subject simultaneously. Or they can use any other combination of front and rear lenses for multicam capture (see the sketch after this list).

• Camera Visualization: Similar to a director’s viewfinder, DoubleTake’s camera picker-view enables users to visualize all available cameras on the device. Users can employ this view to help decide how to frame a shot and which cameras to select.

• Shot/Reverse Shot: Enables users to capture all the organic and intimate interaction between two actors or between interviewer and interviewee. Traditionally, filmmakers must employ two cameras and place them in cumbersome “over the shoulder” locations. With DoubleTake, users can place one device between the actors, effectively placing the audience in the middle of the conversation.

• PiP or Discrete: The DoubleTake interface lets users see both camera feeds at the same time through a picture-in-picture (PiP) window. The PiP window can be moved around the screen, tapped to zoom in or swiped away if distracting; the second video will continue to record. Users can record the two feeds as discrete files or as a composite video that includes the PiP window animated as it is seen on screen.

• Split-Screen: DoubleTake can also use any two cameras to create a 50/50 split-screen effect that is saved as a single video. This allows for capturing engaging interviews or any scenario in which two sides of the story require equal weighting on screen.

• Focus and Exposure Controls: DoubleTake enables users to set and lock focus and exposure on both cameras during multicam capture with Filmic’s unified reticle. Users can tap anywhere to set an area of interest with the reticle, then tap again to lock or unlock. Filmic’s camera switcher moves effortlessly between the A and B cameras during a recording to adjust the focus and exposure for each, independently of one another.
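
Under the hood, this kind of simultaneous capture is enabled by Apple’s AVCaptureMultiCamSession API (iOS 13 and later), which is also why the feature is limited to recent iPhones. The Swift sketch below is a minimal illustration of that public API only, not Filmic’s actual implementation; the lens choices and the simplified error handling are assumptions made for the example.

import AVFoundation

// Minimal sketch: wire a back and a front camera into one
// AVCaptureMultiCamSession using explicit connections, the pattern
// Apple recommends for multicam capture.
func makeDualCameraSession() -> AVCaptureMultiCamSession? {
    // Multicam capture is only available on supported hardware
    // (e.g. the iPhone XR/XS and iPhone 11 families).
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }

    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    // One establishing angle from the back, one reverse angle from the front.
    let cameras: [(AVCaptureDevice.DeviceType, AVCaptureDevice.Position)] = [
        (.builtInWideAngleCamera, .back),
        (.builtInWideAngleCamera, .front),
    ]

    for (type, position) in cameras {
        guard let device = AVCaptureDevice.default(type, for: .video, position: position),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return nil }
        // Add input and output without implicit connections, then connect
        // them explicitly so each output receives exactly one camera.
        session.addInputWithNoConnections(input)

        let output = AVCaptureVideoDataOutput()
        guard session.canAddOutput(output) else { return nil }
        session.addOutputWithNoConnections(output)

        guard let port = input.ports(for: .video,
                                     sourceDeviceType: device.deviceType,
                                     sourceDevicePosition: device.position).first
        else { return nil }
        let connection = AVCaptureConnection(inputPorts: [port], output: output)
        guard session.canAddConnection(connection) else { return nil }
        session.addConnection(connection)
    }
    return session
}

On supported hardware, calling startRunning() on the returned session delivers frames from both cameras at once, with each AVCaptureVideoDataOutput receiving its own feed.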

Select video specifications include:
• Full-frame focus and exposure for smooth and easy automated focus and exposure adjustments
• Selectable broadcast frame rates: 24fps, 25fps and 30fps, depending on project requirements
• 1080p video at high-bit-rate encoding for maximum quality. (Note: 1080p is the maximum resolution Apple currently supports for multicam capture; the sketch after this list shows how an app can query that limit.)
• Composited PiP or separate discrete video files are recorded as H.264 MOV files and saved to DoubleTake’s internal library, which supports batch export directly to the camera roll.
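
That 1080p ceiling comes from the capture formats the hardware exposes: while an AVCaptureMultiCamSession is running, a device may only use formats flagged as multicam-capable. As a rough sketch (again an illustration under the stated assumptions, not DoubleTake’s code), an app can query those formats and find the largest available frame size:

import AVFoundation
import CoreMedia

// Sketch: list the capture formats a camera may use during multicam
// capture and return the largest frame size among them.
func maxMultiCamDimensions(position: AVCaptureDevice.Position = .back) -> CMVideoDimensions? {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: position) else { return nil }
    // Only formats with isMultiCamSupported == true are usable while an
    // AVCaptureMultiCamSession is running; this is what caps resolution.
    let dimensions = device.formats
        .filter { $0.isMultiCamSupported }
        .map { CMVideoFormatDescriptionGetDimensions($0.formatDescription) }
    return dimensions.max { Int($0.width) * Int($0.height) < Int($1.width) * Int($1.height) }
}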

Wylie Stateman on Once Upon a Time… in Hollywood‘s Oscar nod for sound

By Beth Marchant

To director Quentin Tarantino, sound and music are primal forces in the creation of his idiosyncratic films. Often using his personal music collection to jumpstart his initial writing process and later to set a film’s tone in the opening credits, Tarantino always gives his images a deep, multi-sensory well to swim in. According to his music supervisor Mary Ramos, his bold use of music is as much a character as each film’s set of quirky protagonists.

Wylie Stateman – Credit: Andrea Resnick

Less showy than those memorable and often nostalgic set-piece songs, the sound design that holds them together is just as critically important to Tarantino’s aesthetic. In Once Upon a Time… in Hollywood it even replaces the traditional composed score. That’s one of many reasons why the film’s supervising sound editor Wylie Stateman, a long-time Tarantino collaborator, relished his latest Oscar-nominated project with the director (he previously received nominations for Django Unchained and Inglourious Basterds and has a lifetime total of nine Oscar nominations).

Before joining team Tarantino, Stateman sound designed some of the most iconic films of the ‘80s and ‘90s, including Tron, Footloose, Ferris Bueller’s Day Off (among 15 films he made with John Hughes), Born on the Fourth of July and Jerry Maguire. He also worked for many years with Oliver Stone, winning a BAFTA for his sound work on JFK. He went on to cofound the Topanga, California-based sound studio Twentyfourseven.

We talked to Stateman about how he interpreted Tarantino’s sound vision for his latest film — about a star having trouble adapting to new roles in Hollywood and his stuntman — revealing just how closely the soundtrack is connected to every camera move and cut.

How does Tarantino’s style as a director influence the way you approach the sound design?
I believe that sound is a very important department within the process of making any film. So when I met Quentin many years ago, it was with the understanding that he wanted help, that he wanted somebody who could focus their time, experience and attention on this very specific department called sound.

I’ve been very fortunate, especially on Quentin’s films, to also have a great production sound mixer and great re-recording mixers. We have both sides of the process in tremendously skilled, tremendously experienced hands. Mark Ulano, our production sound mixer, won an Oscar for Titanic. He knows how to deal with dialogue, and he knows how to deal with a complex set, a set where there are a lot of moving parts.

On the other side of that, we have Mike Minkler doing the final re-recording mixing. Mike, who I worked with on JFK, is tremendously skilled with multiple Oscars to his credit. He’s just an amazing creative in terms of re-recording mixing.

The role I like to play as supervising sound editor and designer is to speak to the filmmaker in terms of sound. For this film, we realized we could drive the soundtrack without a composer by using the chosen songs and KHJ radio, selecting bits and pieces from the shows of infamous DJ “Humble Harve” and from clips of all the other DJs on KHJ radio who really defined 1969 in Los Angeles.

And as the film shows, most people heard them over the car radio in car-centric LA.
The DJs were powerful messengers of popular culture, of what was happening in the minds and in the streets of that time. That was Quentin’s idea: when he wrote the script, he wrote all of the KHJ radio segments into it. He listens a lot, and he’s both a real student of the filmmaking process and a real master.

On the student side, he’s constantly learning, constantly looking and constantly listening. On the master side, he applies that to the characters he wants to develop and to the situations he wants at the base of his story. So, basically, Quentin comes to me for a better understanding of his intention in terms of sound, and he has a tremendous understanding to begin with. That’s what makes it so exciting.

When talking to Quentin and his editor Fred Raskin, who are both really deeply knowledgeable filmmakers, it can be quite challenging to stay in front of them and/or to chase behind them. It’s usually a combination of the two. But Quentin is a very generous collaborator, meaning he knows what he wants, but then he’s able to stop, listen and evaluate other ideas.

How did you find all of the clips we hear on the various radios?
Quentin went through hundreds of hours of archival material. And he has a tremendous working knowledge of music to begin with, and he’s also a real student of that period.

Can you talk about how you approached the other elements of specific, Tarantino-esque sound, like Cliff crunching on a celery stick in that bar scene?
Quentin’s movies are bold in the sense of some of the subject matter that he tackles, but they’re highly detailed and also very much inside his actors’ heads. So when you talk about crunching on a piece of celery, I interpret everything that Quentin imparts to his characters as having some kind of potential vocabulary in terms of sound. And that vocabulary applies to the camera as well. If the camera hides behind something and then comes out and reveals something, or if the camera’s looking at a big, long shot — like Cliff Booth’s walk to George Spahn’s house down that open area in the Spahn Ranch — every one of those moves has a potential sound component, and every editorial cut could have a vocabulary of sound to accompany it.

We also use those [combinations] to alter time, whether it’s to jump forward or jump back or just crash in. He does a lot of very explosive editing moves, and all of that has an audio vocabulary. It’s been quite interesting to work with a filmmaker who sees picture and sound as sort of a romance and a dance. The sound can lead the picture, or it can lag the picture. The sound can establish a mood, or it can justify a mood or an action. So it’s this constant push-pull.

Robert Bresson, the father of the French New Wave, basically said, “When the ear leads the eye, the eye becomes impatient. When the eye leads the ear, the ear becomes impatient. Use those impatiences.” So what I’m saying is that sound and pictures are this wonderful choreographed dance. Stimulate peoples’ ears and their eye is looking for something; stimulate their eyes and their ears are looking for something, and using those together is a really intimate and very powerful tool that Quentin, I think, is a master at.

How does the sound design help define the characters of Rick Dalton (Leonardo DiCaprio) and Cliff Booth (Brad Pitt)?
This is essentially a buddy movie. Rick Dalton is the insecure actor who’s watching a certain period — when he had great success and comfort — transition into a new period. You’re going from the John Wayne/True Grit way of making movies to Butch Cassidy and the Sundance Kid or Easy Rider, and Rick is not really that comfortable making this transition. His character is full of that kind of anxiety.

The Cliff Booth character is a very internally disturbed character. He’s an unsuccessful crafts/below-the-line person who’s got personal issues and is kind of typical of a character that’s pretty well-known in the filmmaking process. Rick Dalton’s anxious world is about heightened senses. But when he forgets his line during the bar scene on the Lancer set, the world doesn’t become noisy. The world becomes quiet. We go to silence because that’s what’s inside his head. He can’t remember the line, and it’s completely silent. But you could play that same scene 180 degrees in the opposite direction and make him confused in a world of noise.

The year 1969 was very important in the history of filmmaking, and that’s another key to Rick’s and Cliff’s characters. It was the turning point in Hollywood when indie filmmaking was introduced. It was also the end of a great era of traditional studio fare and traditional acting; what followed was defined more by the looser, run-and-gun style of Easy Rider. In a way, the Peter Fonda/Dennis Hopper dynamic of Hopper’s film is somewhat similar to that of Rick Dalton and Cliff Booth.

I saw Easy Rider again recently and the ending hit me like a ton of bricks. The cultural panic, and the violence it invokes, is so palpable because you realize that clash of cultures never really went away; it’s still with us all these years later. Tarantino definitely taps into that tension in this film.
It’s funny that you say that because my wife and I went to the Cannes Film Festival with the team, and they were playing Easy Rider on the beach on a giant screen with a thousand seats in the sand. We walked up on it and we stood there for literally an hour and a half transfixed, just watching it. I hadn’t seen it in years.

What a great use of music and location photography! And then, of course, the story and the ending; it’s like, wow. It’s such a huge departure from True Grit and the generation that made that film. That’s what I love about Quentin, because he plays off the tension between those generations in so many ways in the film. We start out with Al Pacino, and they’re drinking whiskey sours, and then we go all the way through the gamut of what 1969 really felt like to the counterculture.

Was there anything unusual that you did in the edit to manipulate sound to make a scene work?
Sound design is a real design-level responsibility. We invent sound. We go to the libraries and we go to great lengths to record things in nature or wherever we can find it. In this case, we recorded all the cars. We apply a very methodical approach to sound.

Sound design, for me, is the art of shaping noise to suit the picture and enhance the story; great sound lives somewhere between the science of audio and the subjectivity of storytelling. The science part is really well-known, and it’s been perfected over many, many years by lots of talented artists and artisans. But the story part is what excites me, and it’s what excites Quentin. So it becomes what we don’t do that’s so interesting, like using silence instead of noise or creating a soundtrack without a composer. I don’t think you miss having score music. When we couldn’t figure out a song, we made sound design elements. So, yeah, we would make tension sounds.

Shaping noise is not something I could explain to you with an “eye of newt plus a tail of yak” secret recipe. It’s a feeling. It’s just working with audio, shaping sound effects and noise to become imperceptibly conjoined with music. You can’t tell where the sound design begins and ends and where it transfers into more traditional song or music. That is the beauty of Quentin’s films. In terms of sound, the audio has shapes that are very musical.

His deep-cut versions of songs are so interesting, too. Using “California Dreamin’” by the Mamas and the Papas would have been way too obvious, so he uses a José Feliciano cover of it and puts the actual Mamas and the Papas into the film as walk-on characters.
Yeah. I love his choice of music. From Sharon and Roman listening to “Hush” by Deep Purple in the convertible, their hair flying, to going straight into “Son of a Lovin’ Man” after they arrive at the Playboy Mansion. Talk about 1969 and setting it off! It’s not from the San Francisco catalog; it’s just this lovely way that Quentin imagines time and can relate to it as sound and music. The world as it relates to sound is very different from the world of imagery. And Quentin is a writer, a director and a producer, so he really understands the coalescing of these disciplines.

You haven’t done a lot of interviews in the past. Why not?
I don’t do what I do to call attention to either myself or my work. Over the first 35 years of my career, there’s very little record of any conversation I had outside of my team and directly with my filmmakers. But at this point in life, when we’re on the cusp of this huge streaming technology shift and everything is becoming more politically sensitive, with deepfakes in both image and audio, I think it’s time somebody in sound stepped up and pointed out, “Hey, we are invisible. We are transitory.” Meaning, when you stop the electricity going to the speakers, the sound disappears, which is kind of an amazing thing. You can pause the picture and study it. Sound only exists in real time. It’s just vibration in the air.

And to be clear, I don’t see motion picture sound as an art form. I see it, rather, as a form of art, and it takes a long time to become a sculptor in sound who can work in a very simple style. After all, it’s the simplest lines that just blow your mind!

What blew your mind about this film, either while you worked on it or when you saw the finished product?
I really love the whole look of the film. I love the costumes, and I have great respect for the team that Quentin consistently pulls together. When I work on Quentin’s films, I never turn around and find somebody that doesn’t have a great idea or deep experience in their craft. Everywhere you turn, you bump into extraordinary talent.

Dakota Fanning’s scene at the Spahn Ranch… I mean, wow! It knocks my socks off. That’s really great stuff. It’s a remarkable thing to work with a director who has that kind of love for filmmaking and who allows really talented people to get in the sandbox and play.


Beth Marchant is a veteran journalist focused on the production and post community and contributes to “The Envelope” section of the Los Angeles Times. Follow her on Twitter @bethmarchant.