Behind the Title: Matter Films president Matt Moore

NAME: Matt Moore

COMPANY: Phoenix and Los Angeles’ Matter Films
and OH Partners

CAN YOU DESCRIBE YOUR COMPANY?
Matter Films is a full-service production company that takes projects from script to screen — doing both pre-production and post in addition to producing content. We are joined by our sister company OH Partners, a full-service advertising agency.

WHAT’S YOUR JOB TITLE?
President of Matter Films and CCO of OH Partners

WHAT DOES THAT ENTAIL?
I’m lucky to be the only person in the company who gets to serve on both sides of the fence. Knowing that, I think that working with Matter and OH gives me a unique insight into how to meet our clients’ needs best. My number one job is to push both teams to be as innovative and outside of the box as possible. A lot of people do what we do, so I work on our points of differentiation.

Gila River Hotels and Casinos – Sports Partnership

I spend a lot of time finding talent and production partners. We want the most innovative and freshest directors, cinematographers and editors from all over the world. That talent must push all of our work to be the best. We then pair that partner with the right project and the right client.

The other part of my job is figuring out where the production industry is headed. We launched Matter Films because we saw a change within the production world — many production companies weren’t able to respond quickly enough to the need for social and digital work, so we started a company able to address that need and then some.

My job is to always be selling ideas and proposing different avenues we could pursue with Matter and with OH. I instill trust in our clients by using our work as a proof point that the team we’ve assembled is the right choice to get the job done.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
People assumed when we started Matter Films that we would keep everything in-house and have no outside partners, and that’s just not the case. Matter actually gives us even more resources to find those innovators from across the globe. It allows us to do more.

The variation in budget size that we accept at Matter Films would also surprise people. We’ll take on projects with anywhere from $1,000 to one million-plus budgets. We’ve staffed ourselves in such a way that even small projects can be profitable.

WHAT’S YOUR FAVORITE PART OF THE JOB?
It sounds so cliché, but I would have to say the people. I’m around people that I genuinely want to see every single day. I love when we all get together for our meetings, because while we do discuss upcoming projects, we also goof off and just hang out. These are the people I go into battle with every single day. I choose to go into the battle with people that I whole-heartedly care about and enjoy being with. It makes life better.

WHAT’S YOUR LEAST FAVORITE?
What’s tough is how fast this business changes. Every day there’s a new conference or event, and just when you think an idea you’ve had is cutting edge and brand new, you realize you have to keep going and push to be more innovative. Just when you get caught up, you’re already behind. The big challenge is how you’re going to constantly step up your game.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
I’m an early morning person. I can get more done if I start before everybody else.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I was actually pre-med for two years in college with the desire to be a surgeon. When I was an undergrad, I got an abysmal grade on one of our exams and the professor pulled me aside and told me that a score that low proved that I truly did not care about learning the material. He allowed me to withdraw from the class to find something I was more passionate about, and that was life changing.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I found out in college. I genuinely just loved making a product that either entertained or educated people. I started in the news business, so every night I would go home after work and people could tell me about the news of the day because of what I’d written, edited and put on TV.

People knew about what was going on because of the stories that we told. I have a great love for telling stories and having others engage with that story. If you’re good at the job, people’s lives will be different as a result of what you create.

Barbuda Ocean Club

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We just wrapped a large shoot in Maryland for Live Casino, and a different tourism project for a luxury property in Barbuda. We’re currently developing our work with Virgin, and we have a shoot for a technology company focused on developing autonomous driving and green energy upcoming as well. We’re all over the map with the range of work that we have in the pipeline.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
One of my favorite projects actually took place before Matter Films was officially around, but we had a lot of the same team. We did an environmentally sensitive project for Sedona, Arizona, called Sedona Secret 7. Our campaign told the millions of tourists who arrive there how to find other equally beautiful destinations in and around Sedona instead of just the ones everyone already knew.

It was one of those times when advertising wasn’t about selling something, but about saving something.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My phone, a pair of AirPods and a laptop. The Matter Films team gave me AirPods for my birthday, so those are extra special!

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
My usage on Instagram is off the charts; it’s embarrassing. While I do look at everyone’s vacation photos or what workout they did that day, I also use Instagram as a talent sourcing tool for a lot of work purposes: I follow directors, animation studios and tons of artists that I either get inspiration from or want to work with.

A good percentage of people I follow are creatives that I want to work with at some point. I also reach out to people all the time for potential collaborations.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I love outdoor adventures. Some days I’ll go on a crazy hike here in Arizona or rent a four-wheeler and explore the desert or mountains. I also love just hanging out with my kids — they’re a great age.

Alaina Zanotti rejoins Cartel as executive producer

Santa Monica-based editorial and post studio Cartel has named Alaina Zanotti as executive producer to help with business development and to oversee creative operations along with partner and executive producer Lauren Bleiweiss. Additionally, Cartel has bolstered its roster with the signing of comedic editor Kevin Zimmerman.

Kevin Zimmerman

With more than 15 years of experience, Zanotti joins Cartel after working for clients that include BBDO, Wieden+Kennedy, Deutsch, Google, Paramount and Disney. Zanotti most recently served as senior executive producer at Method Studios, where she oversaw business development for global VFX and post. Prior to that stint, she joined Cartel in 2016 to assist the newly established post and editorial house’s growth. Previously, Zanotti spent more than a decade driving operations and raising brand visibility for Method and Company 3.

Editor Zimmerman joins Cartel following a tenure as a freelance editor, during which his comedic timing and entrepreneurial spirit earned him commercial work for Avocados From Mexico and Planters that aired during 2019’s Super Bowl.

Throughout his two-decade career in editorial, Zimmerman has held positions at Spot Welders, NO6, Whitehouse Post and FilmCore, with recent work for Sprite, Kia, hotels.com, Microsoft and Miller Lite, and a PSA for Girls Who Code. Zimmerman has previously worked with Cartel partners Adam Robinson and Leo Scott.


Object Matrix and Arvato partner for managing digital archives

Object Matrix and Arvato Systems have partnered to help companies instantly access, manage, browse and edit clips from their digital archives.

Using Arvato’s production asset management platform, VPMS EditMate along with the media-focused object storage solution from Object Matrix, MatrixStore, the companies report that organizations can significantly reduce the time needed to manage media workflows, while making content easily discoverable. The integration makes it easy to unlock assets held in archive, enable creative collaboration and monetize archived assets.

MatrixStore is a media-focused private and hybrid cloud storage platform that provides instant access to all media assets. Built upon object-based storage technology, MatrixStore provides digital content governance through an integrated and automated storage platform supporting multiple media-based workflows while providing a secure and scalable solution.

VPMS EditMate is a toolkit built for managing and editing projects in a streamlined, intuitive and efficient manner, all from within Adobe Premiere Pro. From project creation and collecting media, to the export and storage of edited material, users benefit from a series of features designed to simplify the spectrum of tasks involved in a modern and collaborative editing environment.


Alkemy X adds Albert Mason as head of production

Albert Mason has joined VFX house Alkemy X as head of production. He comes to Alkemy X with over two decades of experience in visual effects and post production. He has worked on projects directed by such industry icons as Peter Jackson on the Lord of the Rings trilogy, Tim Burton on Alice in Wonderland and Robert Zemeckis on The Polar Express. In his new role at Alkemy X, he will use his experience in feature films to target the growing episodic space.

A large part of Alkemy X’s work has been for episodic visual effects, with credits that include Amazon Prime’s Emmy-winning original series, The Marvelous Mrs. Maisel, USA’s Mr. Robot, AMC’s Fear the Walking Dead, Netflix’s Maniac, NBC’s Blindspot and Starz’s Power.

Mason began his career at MTV’s on-air promos department, sharpening his production skills on top series promo campaigns and as a part of its newly launched MTV Animation Department. He took an opportunity to transition into VFX, stepping into a production role for Weta Digital and spending three years working globally on the Lord of the Rings trilogy. He then joined Sony Pictures Imageworks, where he contributed to features including Spider-Man 3 and Ghost Rider. He has also produced work for such top industry shops as Logan, Rising Sun Pictures and Greymatter VFX.

“[Albert’s] expertise in constructing advanced pipelines that embrace emerging technologies will be invaluable to our team as we continue to bolster our slate of VFX work,” says Alkemy X president/CEO Justin Wineburgh.


2019 HPA Award winners announced

The industry came together on November 21 in Los Angeles to celebrate its own at the 14th annual HPA Awards. Awards were given to individuals and teams working in 12 creative craft categories, recognizing outstanding contributions to color grading, sound, editing and visual effects for commercials, television and feature film.

Rob Legato receiving Lifetime Achievement Award from presenter Mike Kanfer. (Photo by Ryan Miller/Capture Imaging)

As was previously announced, renowned visual effects supervisor and creative Robert Legato, ASC, was honored with this year’s HPA Lifetime Achievement Award; Peter Jackson’s They Shall Not Grow Old was presented with the HPA Judges Award for Creativity and Innovation; acclaimed journalist Peter Caranicas was the recipient of the very first HPA Legacy Award; and special awards were presented for Engineering Excellence.

The winners of the 2019 HPA Awards are:

Outstanding Color Grading – Theatrical Feature

WINNER: “Spider-Man: Into the Spider-Verse”
Natasha Leonnet // Efilm

“First Man”
Natasha Leonnet // Efilm

“Roma”
Steven J. Scott // Technicolor

Natasha Leonnet (Photo by Ryan Miller/Capture Imaging)

“Green Book”
Walter Volpatto // FotoKem

“The Nutcracker and the Four Realms”
Tom Poole // Company 3

“Us”
Michael Hatzer // Technicolor

 

Outstanding Color Grading – Episodic or Non-theatrical Feature

WINNER: “Game of Thrones – Winterfell”
Joe Finley // Sim, Los Angeles

 “The Handmaid’s Tale – Liars”
Bill Ferwerda // Deluxe Toronto

“The Marvelous Mrs. Maisel – Vote for Kennedy, Vote for Kennedy”
Steven Bodner // Light Iron

“I Am the Night – Pilot”
Stefan Sonnenfeld // Company 3

“Gotham – Legend of the Dark Knight: The Trial of Jim Gordon”
Paul Westerbeck // Picture Shop

“The Man in The High Castle – Jahr Null”
Roy Vasich // Technicolor

 

Outstanding Color Grading – Commercial  

WINNER: Hennessy X.O. – “The Seven Worlds”
Stephen Nakamura // Company 3

Zara – “Woman Campaign Spring Summer 2019”
Tim Masick // Company 3

Tiffany & Co. – “Believe in Dreams: A Tiffany Holiday”
James Tillett // Moving Picture Company

Palms Casino – “Unstatus Quo”
Ricky Gausis // Moving Picture Company

Audi – “Cashew”
Tom Poole // Company 3

 

Outstanding Editing – Theatrical Feature

Once Upon a Time… in Hollywood

WINNER: “Once Upon a Time… in Hollywood”
Fred Raskin, ACE

“Green Book”
Patrick J. Don Vito, ACE

“Rolling Thunder Revue: A Bob Dylan Story by Martin Scorsese”
David Tedeschi, Damian Rodriguez

“The Other Side of the Wind”
Orson Welles, Bob Murawski, ACE

“A Star Is Born”
Jay Cassidy, ACE

 

Outstanding Editing – Episodic or Non-theatrical Feature (30 Minutes and Under)

VEEP

WINNER: “Veep – Pledge”
Roger Nygard, ACE

“Russian Doll – The Way Out”
Todd Downing

“Homecoming – Redwood”
Rosanne Tan, ACE

“Withorwithout”
Jake Shaver, Shannon Albrink // Therapy Studios

“Russian Doll – Ariadne”
Laura Weinberg

 

Outstanding Editing – Episodic or Non-theatrical Feature (Over 30 Minutes)

WINNER: “Stranger Things – Chapter Eight: The Battle of Starcourt”
Dean Zimmerman, ACE, Katheryn Naranjo

“Chernobyl – Vichnaya Pamyat”
Simon Smith, Jinx Godfrey // Sister Pictures

“Game of Thrones – The Iron Throne”
Katie Weiland, ACE

“Game of Thrones – The Long Night”
Tim Porter, ACE

“The Bodyguard – Episode One”
Steve Singleton

 

Outstanding Sound – Theatrical Feature

WINNER: “Godzilla: King of the Monsters”
Tim LeBlanc, Tom Ozanich, MPSE // Warner Bros.
Erik Aadahl, MPSE, Nancy Nugent, MPSE, Jason W. Jennings // E Squared

“Shazam!”
Michael Keller, Kevin O’Connell // Warner Bros.
Bill R. Dean, MPSE, Erick Ocampo, Kelly Oxford, MPSE // Technicolor

“Smallfoot”
Michael Babcock, David E. Fluhr, CAS, Jeff Sawyer, Chris Diebold, Harrison Meyle // Warner Bros.

“Roma”
Skip Lievsay, Sergio Diaz, Craig Henighan, Carlos Honc, Ruy Garcia, MPSE, Caleb Townsend

“Aquaman”
Tim LeBlanc // Warner Bros.
Peter Brown, Joe Dzuban, Stephen P. Robinson, MPSE, Eliot Connors, MPSE // Formosa Group

 

Outstanding Sound – Episodic or Non-theatrical Feature

WINNER: “The Haunting of Hill House – Two Storms”
Trevor Gates, MPSE, Jason Dotts, Jonathan Wales, Paul Knox, Walter Spencer // Formosa Group

“Chernobyl – 1:23:45”
Stefan Henrix, Stuart Hilliker, Joe Beal, Michael Maroussas, Harry Barnes // Boom Post

“Deadwood: The Movie”
John W. Cook II, Bill Freesh, Mandell Winter, MPSE, Daniel Colman, MPSE, Ben Cook, MPSE, Micha Liberman // NBC Universal

“Game of Thrones – The Bells”
Tim Kimmel, MPSE, Onnalee Blank, CAS, Mathew Waters, CAS, Paula Fairfield, David Klotz

“Homecoming – Protocol”
John W. Cook II, Bill Freesh, Kevin Buchholz, Jeff A. Pitts, Ben Zales, Polly McKinnon // NBC Universal

 

Outstanding Sound – Commercial 

WINNER: John Lewis & Partners – “Bohemian Rhapsody”
Mark Hills, Anthony Moore // Factory

Audi – “Life”
Doobie White // Therapy Studios

Leonard Cheshire Disability – “Together Unstoppable”
Mark Hills // Factory

New York Times – “The Truth Is Worth It: Fearlessness”
Aaron Reynolds // Wave Studios NY

John Lewis & Partners – “The Boy and the Piano”
Anthony Moore // Factory

 

Outstanding Visual Effects – Theatrical Feature

WINNER: “The Lion King”
Robert Legato
Andrew R. Jones
Adam Valdez, Elliot Newman, Audrey Ferrara // MPC Film
Tom Peitzman // T&C Productions

“Avengers: Endgame”
Matt Aitken, Marvyn Young, Sidney Kombo-Kintombo, Sean Walker, David Conley // Weta Digital

“Spider-Man: Far From Home”
Alexis Wajsbrot, Sylvain Degrotte, Nathan McConnel, Stephen Kennedy, Jonathan Opgenhaffen // Framestore

“Alita: Battle Angel”
Eric Saindon, Michael Cozens, Dejan Momcilovic, Mark Haenga, Kevin Sherwood // Weta Digital

“Pokemon Detective Pikachu”
Jonathan Fawkner, Carlos Monzon, Gavin Mckenzie, Fabio Zangla, Dale Newton // Framestore

 

Outstanding Visual Effects – Episodic (Under 13 Episodes) or Non-theatrical Feature

Game of Thrones

WINNER: “Game of Thrones – The Bells”
Steve Kullback, Joe Bauer, Ted Rae
Mohsen Mousavi // Scanline
Thomas Schelesny // Image Engine

“Game of Thrones – The Long Night”
Martin Hill, Nicky Muir, Mike Perry, Mark Richardson, Darren Christie // Weta Digital

“The Umbrella Academy – The White Violin”
Everett Burrell, Misato Shinohara, Chris White, Jeff Campbell, Sebastien Bergeron

“The Man in the High Castle – Jahr Null”
Lawson Deming, Cory Jamieson, Casi Blume, Nick Chamberlain, William Parker, Saber Jlassi, Chris Parks // Barnstorm VFX

“Chernobyl – 1:23:45”
Lindsay McFarlane
Max Dennison, Clare Cheetham, Steven Godfrey, Luke Letkey // DNEG

 

Outstanding Visual Effects – Episodic (Over 13 Episodes)

Team from The Orville – Outstanding VFX, Episodic, Over 13 Episodes (Photo by Ryan Miller/Capture Imaging)

WINNER: “The Orville – Identity: Part II”
Tommy Tran, Kevin Lingenfelser, Joseph Vincent Pike // FuseFX
Brandon Fayette, Brooke Noska // Twentieth Century FOX TV

“Hawaii Five-O – Ke iho mai nei ko luna”
Thomas Connors, Anthony Davis, Chad Schott, Gary Lopez, Adam Avitabile // Picture Shop

“9-1-1 – 7.1”
Jon Massey, Tony Pirzadeh, Brigitte Bourque, Gavin Whelan, Kwon Choi // FuseFX

“Star Trek: Discovery – Such Sweet Sorrow Part 2”
Jason Zimmerman, Ante Dekovic, Aleksandra Kochoska, Charles Collyer, Alexander Wood // CBS Television Studios

“The Flash – King Shark vs. Gorilla Grodd”
Armen V. Kevorkian, Joshua Spivack, Andranik Taranyan, Shirak Agresta, Jason Shulman // Encore VFX

The 2019 HPA Engineering Excellence Awards were presented to:

Adobe – Content-Aware Fill for Video in Adobe After Effects

Epic Games — Unreal Engine 4

Pixelworks — TrueCut Motion

Portrait Displays and LG Electronics — CalMan LUT based Auto-Calibration Integration with LG OLED TVs

Honorable Mentions were awarded to Ambidio for Ambidio Looking Glass; Grass Valley, for creative grading; and Netflix for Photon.


IDC goes bicoastal, adds Hollywood post facility 


New York’s International Digital Centre (IDC) has opened a new 6,800-square-foot digital post facility in Hollywood, with Rosanna Marino serving as COO. She will manage the day-to-day operations of the West Coast post house. IDC LA will focus on serving the entertainment, content creation, distribution and streaming industries.

Rosanna Marino

Marino will manage sales, marketing, engineering and the day-to-day operations for the Hollywood location, while IDC founder/CEO Marcy Gilbert will lead the company’s overall activities and New York headquarters.

IDC will provide finishing, color grading and editorial in Dolby Vision, 4K HDR and UHD, as well as global QC. IDC LA features 11 bays and a DI theater, which includes Dolby 7.1 Atmos audio mixing, dubbing and audio description. They are also providing subtitle and closed-caption timed-text creation and localization, ABS scripting and translations in over 40 languages.

To complete the end-to-end chain, they provide IMF and DCP creation, supplemental and all media fulfillment processing, including audio and timed text conforms for distribution. IDC is an existing Netflix Partner Program member — NP3 in New York and NPFP for the Americas and Canada.

IDC LA occupies the top two floors and rooftop deck in a vintage 1930s brick building on Santa Monica Boulevard.


Review: Nugen Audio’s VisLM2 loudness meter plugin

By Ron DiCesare

In 2010, President Obama signed the CALM Act (Commercial Advertisement Loudness Mitigation) regulating the audio levels of TV commercials. At that time, I had many “laypeople” complain to me about how commercials were often so much louder than the TV programs. Over the past 10 years, I have seen the rise of audio meter plugins to meet the requirements of the CALM Act, which has dramatically reduced those complaints.

A lot has changed since the 2010 FCC mandate of -24LKFS +/-2dB. LKFS was the scale name at the time, but we will get into this more later. Today, we have countless viewing options such as cable networks, a large variety of streaming services, the internet and movie theaters utilizing 7.1 or Dolby Atmos. Add to that new metering standards such as True Peak, and you have the likelihood of confusing and possibly even conflicting audio standards.

Nugen Audio has updated its VisLM for addressing today’s complex world of audio levels and audio metering. The VisLM2 is a Mac and Windows plugin compatible with Avid Pro Tools and any DAW that uses RTAS, AU, AAX, VST and VST3. It can also be installed as a standalone application for Windows and OSX. By using its many presets, Loudness History Mode and countless parameters to view and customize, the VisLM2 can help an audio mixer monitor a mix to see when their programs are in and out of audio level spec using a variety of features.

VisLM2

The Basics
The first thing I needed to see was how it handled the 2010 audio standard of -24LKFS, now known as LUFS. LKFS (Loudness K-weighted relative to Full Scale) was the term used in the United States. LUFS (Loudness Units relative to Full Scale) was the term used in Europe. The difference is in name only, and the audio level measurement is identical. Now all audio metering plugins use LUFS, including the VisLM2.

I work mostly on TV commercials, so it was pretty easy for me to fire up the VisLM2 and get my LUFS reading right away. Accessing the US audio standard dictated by the CALM Act is simple if you know the preset name for it: ITU-R BS.1770-4. I know, not a name that rolls off the tongue, but it is the current spec. The VisLM2 has four presets of ITU-R BS.1770 — revision 01, 02, 03 and the current revision 04. Accessing the presets is easy, once you realize that they are not in the preset section of the plugin as one might think. Presets are located in the options section of the meter.

While this was my first time using anything from Nugen Audio, I was immediately able to run my 30-second TV commercial and get my LUFS reading. The preset gave me a few important default readings to view while mixing. There are three numeric displays that show Short-Term, Loudness Range and Integrated, which is how the average loudness is determined for most audio level specs. There are two meters that show Momentary and Short-Term levels, which are helpful when trying to pinpoint any section that could be putting your mix out of audio spec. The difference is that Momentary is used for short bursts, such as an impact or gun shot, while Short-Term is used for the last three-second “window” of your mix. Knowing the difference between the two readings is important. Whether you work on short- or long-format mixes, knowing how to interpret both Momentary and Short-Term readings is very helpful in determining where trouble spots might be.

Have We Outgrown LUFS?
Most, if not all, deliverables now specify a True Peak reading. True Peak has slowly but firmly crept its way into audio spec and it can be confusing. For US TV broadcast, True Peak spec can range as high as -2dBTP and as low as -6dBTP, but I have seen it spec out even lower at -8dBTP for some of my clients. That means a TV network can reject or “bounce back” any TV programming or commercial that exceeds its LUFS spec, its True Peak spec or both.
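To make that dual requirement concrete, here is a minimal sketch of the pass/fail logic a network effectively applies — a hypothetical helper with made-up example numbers, not a network QC tool or anything from Nugen: a mix passes only if its Integrated reading lands within the LUFS target window and its True Peak stays under the ceiling.

```python
# Rough sketch of the dual pass/fail idea: a mix is bounced back if it misses
# the Integrated LUFS target window OR exceeds the True Peak ceiling.
# Hypothetical helper and defaults for illustration only.

def meets_delivery_spec(integrated_lufs, true_peak_dbtp,
                        target_lufs=-24.0, tolerance_db=2.0, max_dbtp=-2.0):
    lufs_ok = abs(integrated_lufs - target_lufs) <= tolerance_db
    peak_ok = true_peak_dbtp <= max_dbtp
    return lufs_ok and peak_ok

print(meets_delivery_spec(-23.5, -3.1))               # True: both within spec
print(meets_delivery_spec(-23.5, -1.2))               # False: True Peak too hot
print(meets_delivery_spec(-27.0, -3.1))               # False: too quiet for a -24 target
print(meets_delivery_spec(-23.5, -3.1, max_dbtp=-6))  # False against a stricter -6dBTP ceiling
```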

VisLM2

In most cases, LUFS and True Peak readings work well together. I find that -24LUFS Integrated gives a mixer plenty of headroom for staying below the True Peak maximum. However, a few factors can work against you. The higher the LUFS Integrated spec (say, for an internet project) and/or the lower the True Peak spec (say, for a major TV network), the more difficult you might find it to manage both readings. For anyone like me — who often has a client watching over my shoulder telling me to make the booms and impacts louder — you always want to make sure you are not going to have a problem keeping your mix within spec for both measurements. This is where the VisLM2 can help you work within both True Peak and LUFS standards simultaneously.

To do that using the VisLM2, let’s first understand the difference between True Peak and LUFS. Integrated LUFS is an average reading over the duration of the program material. Whether the program material is 15 seconds or two hours long, hitting -24LUFS Integrated, for example, is always the average reading over time. That means a 10-second loud segment in a two-hour program could be much louder than a 10-second loud segment in a 15-second commercial. That same loud 10 seconds can practically be averaged out of existence during a two-hour period with LUFS Integrated. Flawed logic? Possibly. Is that why TV networks are requiring True Peak? Well, maybe yes, maybe no.
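As a back-of-the-envelope illustration of that averaging effect — a rough sketch with made-up numbers, not the BS.1770 algorithm (K-weighting and gating are ignored) — the same 10 loud seconds barely move a two-hour program’s integrated reading but dominate a 15-second spot:

```python
# Back-of-the-envelope only: averages energy (not dB) across segments and
# converts back to a loudness figure. Real integrated LUFS also applies
# K-weighting and gating, which are omitted here.
import math

def integrated_estimate(segments):
    """segments: list of (duration_seconds, loudness_lufs) pairs."""
    total = sum(dur for dur, _ in segments)
    mean_energy = sum(dur * 10 ** (lufs / 10) for dur, lufs in segments) / total
    return 10 * math.log10(mean_energy)

two_hour_show = [(7190, -26.0), (10, -10.0)]  # 10 loud seconds in two hours
fifteen_spot  = [(5, -26.0), (10, -10.0)]     # the same 10 loud seconds in a 15-second spot

print(round(integrated_estimate(two_hour_show), 1))  # about -25.8: the burst barely moves the average
print(round(integrated_estimate(fifteen_spot), 1))   # about -11.7: the burst dominates the average
```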

True Peak is forever. Once the highest True Peak is detected, it will remain as the final True Peak reading for the entire length of the program material. That means the loud segment at the last five minutes of a two-hour program will dictate the True Peak reading of the entire mix. Let’s say you have a two-hour show with dialogue only. In the final minute of the show, a single loud gunshot is heard. That one-second gunshot will determine the other one hour, 59 minutes, and 59 seconds of the program’s True Peak audio level. Flawed logic? I can see it could be. Spotify’s recommended levels are -14LUFS and -2dBTP. That gives you a much smaller range for dynamics compared to others such as network TV.

VisLM2

Here’s where the VisLM2 really excels. For those new to Nugen Audio, the clear stand out for me is the detailed and large history graph display known as Loudness History Mode. It is a realtime updating and moving display of the mix levels. What it shows is up to you. There are multiple tabs to choose from, such as Integrated, True Peak, Short-Term, Momentary, Variance, Flags and Alerts, to name a few. Selecting any of these tabs will result in showing, or not showing, the corresponding line along the timeline of the history graph as the audio plays.

When any of the VisLM2’s presets are selected, there are a whole host of parameters that come along with it. All are customizable, but I like to start with the defaults. My thinking is that the default values were chosen for a reason, and I always want to know what that reason is before I start customizing anything.

For example, the target for the preset of ITU-R BS.1770-4 is -24LUFS Integrated and -2dBTP. By default, both will show on the history graph. The history graph will also show default over and under audio levels based on the alerts you have selected in the form of min and max LUFS. But, much to my surprise, the default alert max was not what I expected. It wasn’t -24LUFS, which seemed to be the logical choice to me. It was 4dB higher at -20LUFS, which is 2dB above the +/-2dB tolerance. That’s because these min and max alert values are not for Integrated or average loudness as I had originally thought. These values are for Short-Term loudness. The history graph lines, with their corresponding min and max alerts, are a visual cue to let the mixer know if he or she is in the right ballpark. Now this is not a hard and fast rule. Simply put, if your short-term value stays somewhere between -20 and -28LUFS throughout most of an entire project, then you have a good chance of meeting your target of -24LUFS for the overall integrated measurement. That is why the value range is often set up as a “green” zone on the loudness display.
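Here is a small sketch of that green-zone idea — hypothetical readings and a made-up helper, not how VisLM2 computes anything internally: given one Short-Term reading per second, flag anything outside the -28 to -20LUFS alert band around a -24LUFS target.

```python
# Hypothetical illustration of the Short-Term "green zone" alert band; the
# thresholds mirror a -24LUFS target with the default min/max alerts.

def green_zone_report(short_term_readings, low=-28.0, high=-20.0):
    """short_term_readings: one Short-Term LUFS value per second of the mix."""
    flagged = [(sec, val) for sec, val in enumerate(short_term_readings)
               if not (low <= val <= high)]
    fraction_ok = 1 - len(flagged) / len(short_term_readings)
    return fraction_ok, flagged

readings = [-24.5, -23.0, -26.0, -19.2, -22.8, -30.1, -24.0]  # made-up values
ok, trouble = green_zone_report(readings)
print(f"{ok:.0%} of the mix sits in the green zone")
print("check these seconds:", trouble)  # second 3 is too hot, second 5 too quiet
```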

VisLM2

The folks at Nugen point out that it isn’t practically possible to set up an alert or “red zone” for integrated loudness because this value is measured over the entire program. For that, you have to simply view the main reading of your Integrated loudness. Even so, I will know if I am getting there or not by viewing my history graph while working. Compare that to the impractical approach of running the entire mix before having any idea of where you are going to net out. The VisLM2 max and min alerts help keep you working within audio spec right from the start.

Another nice feature about the large history graph window is the Macro tab. Selecting the Macro feature will give you the ability to move back and forth anywhere along the duration of your mix displayed in the Loudness History Mode. That way you can check for problem spots long after they have happened. Easily accessing any part of the audio level display within the history graph is essential. Say you have a trouble spot somewhere within a 30-minute program; select the Macro feature and scroll through the history graph to spot any overages. If an overage turns out to be at, say, eight minutes in, then cue up your DAW to that same eight-minute mark to address changes in your mix.

Another helpful feature designed for this same purpose is the use of flags. Flags can be added anywhere in your history graph while the audio is running. Again, this can be helpful for spotting, or flagging, any problem spots. For example, you can flag a loud action scene in an otherwise quiet dialogue-driven program that you know will be tricky to balance properly. Once flagged, you will have the ability to quickly cue up your history graph to work with that section. Both the Macro and Flag functions are aided by tape-machine-like controls for cueing up the Loudness History Mode display to any problem spots you might want to view.

Presets, Presets, Presets
The VisLM2 comes with 34 presets for selecting what loudness spec you are working with. Here is where I need to rely on the knowledge of Nugen Audio to get me going in the right direction. I do not know all of the specs for all of the networks, formats and countries. I would venture a guess that very few audio mixers do either. So I was not surprised when I saw many presets that I was not familiar with. Common presets in addition to ITU-R BS.1770 are six versions of EBU R128 for European broadcast and two Netflix presets (stereo and 5.1), which we will dive into later on. The manual does its best to describe some of the presets, but it falls short. The descriptions lack any kind of real-world language, only techno-garble. I have no idea what AGCOM 219/9/CSP LU is and, after reading the manual, I still don’t! I hope a better source of what’s what regarding each preset will become available sometime soon.

MasterCheck

But why no preset for Internet audio level spec? Could mixing for AGCOM 219/9/CSP LU be even more popular than mixing for the Internet? Unlikely. So let’s follow Nugen’s logic here. I have always been in the -18LUFS range for Internet-only mixes. However, ask 10 different mixers and you will likely get 10 different answers. That is why there is not an Internet preset included with the VisLM2 as I had hoped. Even so, Nugen offers its MasterCheck plugin for other platforms such as Spotify and YouTube. MasterCheck is something I have been hoping for, and it would be the perfect companion to the VisLM2.

The folks at Nugen have pointed out a very important difference between broadcast TV and many Internet platforms: Most of the streaming services (YouTube, Spotify, Tidal, Apple Music, etc.) will perform their own loudness normalization after the audio is submitted. They do not expect audio engineers to mix to their standards. In contrast, Netflix and most TV networks will expect mixers to submit audio that already meets their loudness standards. VisLM2 is aimed more toward engineers who are mixing for platforms in the second category.

Streaming Services… the Wild West?
Streaming services are the new frontier, at least to me. I would call it the Wild West by comparison to broadcast TV. With so many streaming services popping up, particularly “off-brand” services, I would ask if we have gone back in time to the loudness wars of the late 2000s. Many streaming services do have an audio level spec, but I don’t know of any consensus between them like with network TV.

That aside, one of the most popular streaming services is Netflix. So let’s look at the VisLM2’s Netflix preset in detail. Netflix is slightly different from broadcast TV because its spec is based on dialogue. In addition to -2dBTP, Netflix has an LUFS spec of -27 +/- 2dB Integrated Dialogue. That means the dialogue level is averaged out over time, rather than using all program material like music and sound effects. Remember my gunshot example? Netflix’s spec is more forgiving of that mixing scenario. This can lead to more dynamic or more cinematic mixes, which I can see as a nice advantage when mixing.
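To show what dialogue gating changes in practice, here is a rough sketch — illustrative numbers and a hypothetical helper only, not Netflix’s or Nugen’s actual measurement: only the dialogue segments feed the average, so the lone gunshot never touches the Integrated Dialogue figure.

```python
# Illustration only: average the energy of dialogue-tagged segments and skip
# everything else. Real dialogue-gated measurement is considerably more involved.
import math

def dialogue_gated_lufs(segments):
    """segments: list of (duration_seconds, loudness_lufs, is_dialogue) tuples."""
    dialogue = [(dur, lufs) for dur, lufs, is_dlg in segments if is_dlg]
    total = sum(dur for dur, _ in dialogue)
    energy = sum(dur * 10 ** (lufs / 10) for dur, lufs in dialogue) / total
    return 10 * math.log10(energy)

mix = [
    (3000, -27.0, True),   # dialogue scenes
    (600, -20.0, False),   # loud action, ignored by the dialogue gate
    (1, -6.0, False),      # the lone gunshot, also ignored
]
print(round(dialogue_gated_lufs(mix), 1))  # -27.0: right on the Netflix target
```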

Netflix currently supports Dolby Atmos on selected titles, but word on the street is that Netflix deliverables will be requiring Atmos for all titles. I have not confirmed this, but I can only hope it will be backward-compatible for non-Atmos mixes. I was lucky enough to speak directly with Tomlinson Holman of THX fame (Tomlinson Holman eXperiment) about his 10.2 format that included height long before Atmos was available. In the case of 10.2, Holman said it was possible to deliver a single mono channel audio mix in 10.2 by simply leaving all other channels empty. I can only hope this is the same for Netflix’s Atmos deliverables so you can simply add or subtract the number of channels needed when you are outputting your final mix. Regardless, we can surely look to Nugen Audio to keep us updated with its Netflix preset in the VisLM2 should this become a reality.

True Peak within VisLM2

VisLM Updates
For anyone familiar with the original version of the VisLM, there are three updates that are worth looking at. First is the ability to resize and select what shows in the display. That helps with keeping the window active on your screen as you are working. It can be a small window so it doesn’t interfere with your other operations. Or you can choose to show only one value, such as Integrated, to keep things really small. On the flip side, you can expand the display to fill the screen when you really need to get the microscope out. This is very helpful with the history graph for spotting any trouble spots. The detail displayed in the Loudness History Mode is by far the most helpful thing I have experienced using the VisLM2.

Next is the ability to display both LUFS and True Peak meters simultaneously. Before, it was one or the other and now it is both. Simply select the + icon between the two meters. With the importance of True Peak, having that value visible at all times is extremely valuable.

Third is the ability to “punch in,” as I call it, to update your Integrated reading while you are working. Let’s say you have your overall Integrated reading, and you see one section that is making you go over. You can adjust your levels on your DAW as you normally would and then simply “punch in” that one section to calculate the new Integrated reading. Imagine how much time you save by not having to run a one-hour show every time you want to update your Integrated reading. In fact, this “punch in” feature is actually the VisLM2 constantly updating itself. This is just another example of how the VisLM2 helps keep you working within audio spec right from the start.

Multi-Channel Audio Mixing
The one area I can’t test the VisLM2 on is multi-channel audio, such as 5.1 and Dolby Atmos. I work mostly on TV commercials, Internet programming, jazz records and the occasional indie film. So my world is all good old-fashioned stereo. Even so, the VisLM2 can measure 5.1, 7.1, and 7.1.2, which is the channel count for Dolby Atmos bed tracks. For anyone who works in multi-channel audio, the VisLM2 will measure and display audio levels just as I have described it working in stereo.

Summing Up
With the changing landscape of TV networks, streaming services and music-only platforms, the resulting deliverables have opened up the floodgates of audio specs like never before. Long gone are the days of -24LUFS being the one and only number you need to know.

To help manage today’s complicated and varied deliverables, along with the audio specs that go with them, Nugen Audio’s VisLM2 absolutely delivers.


Ron DiCesare is a NYC-based freelance audio mixer and sound designer. His work can be heard on national TV campaigns, Vice and the Viceland TV network. He is also featured in the doc “Sing You A Brand New Song” talking about the making of Coleman Mellett’s record album, “Life Goes On.”


Report: Apple intros 16-inch MacBook Pro, previews new Mac Pro, display

By Pat Birk

At a New York City press event, Apple announced that it will begin shipping a new 16-inch MacBook Pro this week. This new offering will feature an updated 16-inch Retina display with a pixel density of 226ppi; 9th-generation Intel processors featuring up to 8 cores and up to 64GB of DDR4 memory; vastly expanded SSDs ranging from 512GB to a whopping 8TB; upgraded discrete AMD Radeon Pro 5000M series graphics; completely redesigned speakers and internal microphones; and an overhauled keyboard dubbed, of course, the “Magic Keyboard.”

The MacBook Pro’s new Magic Keyboard.

These MacBooks also feature a new cooling system, with wider vents and a 35 percent larger heatsink, along with a 100-watt-hour battery (which the company stressed is the maximum capacity allowed by the Federal Aviation Administration), contributing to an additional hour of battery life while web browsing or playing back video.

I had the opportunity to do a brief hands-on demo, and for the first time since Apple introduced the Touch Bar to the MacBook Pro, I have found myself wanting a new Mac. The keyboard felt great, offering far more give and far less plastic-y clicks than the divisive Butterfly keyboard. The Mac team has reintroduced a physical escape key, along with an inverted T-style cluster of arrow keys, both features that will be helpful for coders. Apple also previewed its upcoming Mac Pro tower and Pro Display XDR.

Sound Offerings
As an audio guy, I was naturally drawn to the workstation’s sound offerings and was happy when the company dedicated a good portion of the presentation to touting its enhanced speaker and microphone arrays. The six-speaker system features dual-opposed woofer drivers, which offer enhanced bass while canceling out problematic distortion-causing frequencies. When compared side by side with high-end offerings from other manufacturers, the MacBook offered a far more complete sonic experience than the competition, and I believe Apple is right in saying that they’ve achieved an extra half octave of bass range with this revision.

The all-new MacBook Pro features a 16-inch Retina display.

It’s really impressive for a laptop, but I honestly don’t see it replacing a good pair of headphones or a half-decent Bluetooth speaker for most users. I can see it being useful in the occasional pitch meeting, or showing an idea or video to a friend with no other option, but I feel it’s more of a nice touch than a major selling point.

The three-microphone array was impressive, as well, and I can see it offering legitimate functionality for working creatives. When A/B’d with competing internal microphones, there was really no comparison. The MacBook’s mics deliver crisp, clean recordings with very little hiss and no noticeable digital artifacting, both of which were clearly present in competing PCs. I could realistically see this working for a small podcast, or on-the-go musicians recording demos. We live in a world where Steve Lacy recorded and produced a beat for Kendrick Lamar on an iPhone. When Apple claims that the signal-to-noise ratio rivals or even surpasses that of digital mics like the Blue Yeti, they may very well be right. However, in an A/B comparison, I found the Blue to have more body and room ambience, while the MacBook sounded a bit thin and sterile.

Demos
The rest of the demo featured creative professionals — coders, animators, colorists and composers — pushing the spec’d out Mac and MacBook Pros to their limits. A coder demonstrated testing a program in realtime on eight emulations of iOS and iPad OS at once.

A video editor demonstrated the new Mac Pro (not the MacBook) running a project with six 8K video sources playing at once through an animation layer, with no rendering at all. We were also treated to a brief Blackmagic Da Vinci Resolve demo on a Pro Display XDR. A VFX artist demonstrated making realtime lighting changes to an animation comprised of eight million polygons on the Mac Pro, again with no need for rendering.

The Mac Pro and Pro Display XDR, which Apple bills as the world’s best pro display, will be available in December.

Composers showed us a Logic X session running a track produced for Lizzo by Oak Felder. The song had over 200 tracks, replete with plugins and instruments — Felder was able to accomplish this on a MacBook Pro. Also on the MacBook, they had a session loaded running multiple instances of MIDI instruments using sample libraries from Cinesamples, Spitfire Audio and Orchestral Tools. The result could easily have fooled me into believing it had been recorded with a live orchestra, and the fact that all of these massive, processor-intensive sample libraries could operate at the same time without making the MacBook Pro break a sweat had me floored.

Summing Up
Apple has delivered a very solid upgrade in the new 16-inch MacBook Pro, especially as a replacement for the earlier iterations of the Touch Bar MacBook Pros. They have begun taking orders, with prices starting at $2,399 for the 2.6GHz 6-core model, and $2,799 for the 2.3GHz 8-core model.
As for the new Mac Pro and Pro Display XDR, they’re coming in December, but company representatives remained tight-lipped on an exact date.


Pat Birk is a musician, sound engineer and post pro at Silver Sound, a boutique sound house based in New York City.


postPerspective’s ‘SMPTE 2019 Live’ interview coverage

postPerspective was the official production team for SMPTE during its most recent conference in downtown Los Angeles this year. Taking place once again at the Bonaventure Hotel, the conference featured events and sessions all week. (You can watch those interviews here.)

These sessions ranged from “Machine Learning & AI in Content Creation” to “UHD, HDR, 4K, High Frame Rate” to “Mission Critical: Project Artemis, Imaging from the Moon and Deep Space Imaging.” The latter featured two NASA employees and a live talk with astronauts on the International Space Station. It was very cool.

postPerspective’s coverage was also cool and included many sit-down interviews with those presenting at the show (including former astronaut and One More Orbit director Terry Virts as well as Todd Douglas Miller, the director of the Apollo 11 doc), SMPTE executives and long-standing members of the organization.

In addition to the sessions, manufacturers had the opportunity to show their tools on the exhibit floor, where one of our crews roamed with camera and mic in hand reporting on the newest tech.

Whether you missed the conference or experienced it firsthand, these exclusive interviews will provide a ton of information about SMPTE, standards, and the future of our industry, as well as just incredibly smart people talking about the merger of technology and creativity.

Enjoy our coverage!

Blog: Making post deliverables simple and secure

By Morgan Swift

Post producers don’t have it easy. With an ever-increasing number of platforms for distribution and target languages to cater to, getting one’s content to the global market can be challenging to say the least. To top it all, given the current competitive landscape, producers are always under pressure to reduce costs and meet tight deadlines.

Having been in the creative services business for two decades, we’ve all seen it before — post coordinators and supervisors getting burnt out working late nights, often juggling multiple projects and being pushed to the breaking point. You can see it in their eyes. What adds to the stress is dealing with multiple vendors to get various kinds of post finishing work done — from color grading to master QC to localization.

Morgan Swift

Localization is not the least of these challenges. Different platforms specify different deliverables, including access services like closed captions (CC) and audio description (AD), along with as-broadcast scripts (ABS) and combined continuity spotting lists (CCSL). Each of these deliverables requires specialized teams and tools to execute. Needless to say, they also have a significant impact on the budget — usually at least tens of thousands of dollars (much more for a major release).

It is therefore extremely critical to plan post deliverables well in advance to ensure that you are in complete control of turnaround time (TAT), expected spend and potential cost saving opportunities. Let’s look at a few ways of streamlining the process of creating access services deliverables. To do this, we need to understand the various factors at play.

First of all, we need to consider the amount of effort involved in creating these deliverables. There is typically a lot of overlap, as deliverables like as-broadcast scripts and combined continuity spotting lists are often required for creating closed captions and audio description. This means that it is cheaper to combine the creation of all these deliverables instead of getting them done separately.

The second factor to think about is security. Given that pre-release content is extremely vulnerable to piracy, the days of getting an extra DVD with visible timecode for closed captions should be over. Even the days of sending a non-studio-approved link just to create the deliverables should be over.
Why? Because today, there exist tailor-made solutions that have been designed to facilitate secure localization operations. They enable easy creation of a folder that can be used to send and receive files securely, even by external vendors. One such solution is Clear Media ERP, which was built from the ground up by Prime Focus Technologies to address these challenges.

There is no additional cost to send and receive videos or post deliverable files if you already have a system like this set up for a show. You can keep your pre-release content completely safe, leveraging the software’s advanced security features which include multi-factor authentication, Okta integration, bulk watermarking, burnt-in watermarks for downloads, secure script and document distribution and more.

With the right tech stack, you can get one beautifully organized and secure location to store all of your Access Services deliverables. Which means your team can finally sit back and focus on what matters the most — creating incredible content.


Morgan Swift is director of account management at Prime Focus Technologies in Los Angeles.

SMPTE 2019 Live: Gala Award Winners

postPerspective was invited by SMPTE to host the exclusive coverage of their 2019 Awards Gala. (Watch here!)

The annual event was hosted by Kasha Patel (a digital storyteller at NASA Earth Observatory by day and a science comedian by night!), and presenters included Steve Wozniak. Among this year’s honorees — Netflix’s Anne Aaron, Gary J. Sullivan, Michelle Munson and Sky’s Cristina Gomila Torres. Honorary Membership was bestowed on Roderick Snell (Snell & Wilcox) and Paul Kellar (Quantel).

If you missed this year’s SMPTE Awards Gala, or even if you were there, check out our backstage interviews with some of our industry’s luminaries. We hope you enjoy watching these interviews as much as we enjoyed shooting them.

Oh, and a big shout out to the team from AlphaDogs who shot and edited all of our 2019 SMPTE Live coverage!

James Norris joins Nomad in London as editor, partner

Nomad in London has added James Norris as editor and partner. A self-taught, natural editor, James started out running for the likes of Working Title, Partizan and Tomboy Films. He then moved to Whitehouse Post as an assistant where he refined his craft and rose through the ranks to become an editor.

Over the past 15 years, he’s worked across commercials, music videos, features and television. Norris edited Ikea’s Fly Robot Fly spot and Asda’s Get Possessed piece, and has recently cut a new project for Nike. Working within television and film, he also cut an episode of the BAFTA-nominated drama Our World War and feature film We Are Monster.

“I was attracted to Nomad for their vision for the future and their dedication to the craft of editing,” says Norris. “They have a wonderful history but are also so forward-thinking and want to create new, exciting things. The New York and LA offices have seen incredible success over the last few years, and now there’s Tokyo and London too. On top of this, Nomad feels like home already. They’re really lovely people — it really does feel like a family.”

Norris will be cutting on Avid Media Composer at Nomad.

 

Production and post boutique Destro opens in LA

Industry veterans Drew Neujahr, Sean McAllen, and Shane McAllen have partnered to form Destro, a live-action and post production boutique based in Los Angeles. Destro has already developed and produced an original documentary series, Seed, which profiles artists and innovators across a range of disciplines. In addition, the team has recently worked on projects for Google, Nintendo and Michelin.

Destro’s primary focus will be producing, directing, and post on live-action projects. However, with the partners’ extensive background in motion and VFX, the team is adept at executing mixed-media pipelines when the occasion calls.

With the launch of original studio projects like Seed, Destro sees an opportunity not only to showcase its own voice but to present a case study to forge symbiotic relationships with brands that have real stories to tell about their teams, products, users, and core values.

“Great ideas don’t always happen at conception,” says Neujahr. “When the weather changes during production or the client rethinks the concept in post, being able to improvise and adjust brings about the best work.”

Neujahr and the McAllen brothers bring a combined 45 years of experience spanning commercial and film production, post production and entertainment branding/marketing.

Neujahr’s experience includes features and marketing as both a producer and a creative. He has directed short films, commercials and the documentary series Western State. As a producer, head of production and executive producer at top motion graphics and visual effects studios in LA, he oversaw spots for Ford, Burger King, Walmart, Nickelodeon, FX and History.

Sean McAllen is a seasoned film and commercial editor who has crafted both short-form and long-form work for Ford, Chevy, Nissan, Toyota, Red Bull, Google and Samsung. He also co-wrote and edited the Emmy-nominated documentary feature Houston We Have a Problem. McAllen got his start co-founding a Tokyo/Los Angeles-based production company, where he directed commercials, broadcast documentaries and entertainment marketing content.

Shane McAllen is a veteran of the film and commercial industry. His feature editing credits include contributions to Iron Man 3 and Captain America: The Winter Soldier. On the commercial side, he has worked on campaigns for BMW, Apple and Nintendo. He is also an accomplished writer, producer and director who has worked on a bevy of projects for Google AR and two product reveals for the Nintendo Switch.

“We all got into this crazy world because we love telling stories,” concludes Sean McAllen. “And we share a mutual respect for each other’s craft. Ultimately, our strength is our approachability. We’re the ones who pick up the phone, answer the emails, make the coffee, and do the work.”

Main Image: (L-R) Sean McAllen, Drew Neujahr, and Shane McAllen

Dell intros new 4K monitors for creators

Dell Technologies is offering a new 4K monitor developed with creatives in mind. The Dell UltraSharp 27 4K PremierColor (UP2720Q) is a 27-inch 4K monitor with built-in colorimeter and Thunderbolt 3 for content creators who require color-critical performance and a fast connection to any dock or PC.

Creatives get optimal color performance by calibrating the UltraSharp 27 4K PremierColor monitor with the built-in colorimeter, and they can save time by scheduling automated color checks and calibrations with the Dell Calibration Assistant. This monitor works seamlessly with CalMan software (sold separately) to perform a variety of tasks, including calibrations with the built-in or an external colorimeter. An included shading hood snaps firmly to the monitor via magnets to reduce unwanted glare and reflections.

The UltraSharp 27 4K PremierColor monitor shows images in accurate color and sharp detail with 3840×2160 Ultra HD 4K resolution and a high pixel density of 163ppi. It features a high contrast ratio of 1,300:1. Each monitor is factory-calibrated for accurate color right out of the box. Plus, it supports a wide color coverage that includes 100% Adobe RGB, 80% BT.2020 and 98% DCI-P3.

Thunderbolt 3 offers speeds of up to 40Gbps, creating one compact port for a fast connection to devices. With Thunderbolt 3, users can connect a laptop to the monitor and charge up to 90W from a single cable while simultaneously transferring video and data signals. They can also daisy-chain up to two 4K monitors with Thunderbolt 3 for greater multitasking capabilities.

Terminator: Dark Fate director Tim Miller

By Iain Blair

He said he’d be back, and he meant it. Thirty-five years after he first arrived to menace the world in the 1984 classic The Terminator, Arnold Schwarzenegger has returned as the implacable killing machine in Terminator: Dark Fate, the latest installment of the long-running franchise.

And he’s not alone in his return. Terminator: Dark Fate also reunites the film’s producer and co-writer James Cameron with original franchise star Linda Hamilton for the first time in 28 years in a new sequel that picks up where Terminator 2: Judgment Day left off.

When the film begins, more than two decades have passed since Sarah Connor (Hamilton) prevented Judgment Day, changed the future and re-wrote the fate of the human race. Now, Dani Ramos (Natalia Reyes) is living a simple life in Mexico City with her brother (Diego Boneta) and father when a highly advanced and deadly new Terminator — a Rev-9 (Gabriel Luna) — travels back through time to hunt and kill her. Dani’s survival depends on her joining forces with two warriors: Grace (Mackenzie Davis), an enhanced super-soldier from the future, and a battle-hardened Sarah Connor. As the Rev-9 ruthlessly destroys everything and everyone in its path on the hunt for Dani, the three are led to a T-800 (Schwarzenegger) from Sarah’s past that might be their last best hope.

To helm all the on-screen mayhem, black humor and visual effects, Cameron handpicked Tim Miller, whose credits include the global blockbuster Deadpool, one of the highest grossing R-rated films of all time (it grossed close to $800 million). Miller then assembled a close-knit team of collaborators that included director of photography Ken Seng (Deadpool, Project X), editor Julian Clarke (Deadpool, District 9) and visual effects supervisor Eric Barba (The Curious Case of Benjamin Button, Oblivion).

Tim Miller on set

I recently talked to Miller about making the film, its cutting-edge VFX, the workflow and his love of editing and post.

How daunting was it when James Cameron picked you to direct this?
I think there’s something wrong with me because I don’t really feel fear as normal people do. It just manifests as a sense of responsibility, and with this I knew I’d never measure up to Jim’s movies but felt I could do a good job. Jim was never going to tell this story, and I wanted to see it, so it just became more about the weight of that sense of responsibility, but not in a debilitating way. I felt pretty confident I could carry this off. But later, the big anxiety was not to let down Linda Hamilton. Before I knew her, it wasn’t a thing, but later, once I got to know her I really felt I couldn’t mess it up (laughs).

This is still Cameron’s baby even though he handed over the directing to you. How hands-on was he?
He was busy with Avatar, but he was there for a lot of the early meetings and was very involved with the writing and ideas, which was very helpful thematically. But he wasn’t overbearing on all that. Then later when we shot, he wanted to write a few of the key scenes, which he did, and then in the edit he was in and out, but he never came into my edit room. He’d give notes and let us get on with it.

What sort of film did you set out to make?
A continuation of Sarah’s story. To me, it was never John’s story. It was always about a mother’s love for her son, and I felt there was a real opportunity here. That story hadn’t been told, partly because the other sequels never had Linda. Once she wanted to come back, it was always the best possible story. No one else could be her or Arnold’s character.

Any surprises working with them?
Before we shot, people were telling me, “You got to be ready, we can’t mess around. When Arnold walks on set you’d better be rolling!” Sure enough, when he walked on he’d go, “And…” (Laughs) He really likes to joke around. With Linda — and the other actors — it was a love-fest. They’re both such nice, down-to-earth people, and I like a collegial atmosphere. I’m not a screamer. I’m very prepared, and I feel if you just show up on time, you’re already ahead of the game as a director.

What were the main technical challenges in pulling it all together?
They were all different for each big action set piece, and fitting it all into a schedule was tough, as we had a crazy amount of VFX. The C-5 plane sequence was far and away the biggest challenge to pull off; [SFX supervisor] Neil Corbould and his team designed and constructed all the effects rigs for the movie. The C-5 set was incredible, with two revolving sets, one vertical and one horizontal. It was so big you could put a bus in it, and it was able to rotate 360 degrees and tilt in either direction at the same time.

You just can’t simulate that reality of zero gravity on the actors. And then after we got it all in camera, which took weeks, our VFX guy Eric Barba finished it off. The other big one was the whole underwater scene, where the Humvee falls over the top of a dam and goes underwater as it’s swept down a river. For that, we put the Humvee on a giant scissor lift that could take it all the way under, so the water rushes in and fills it up. It’s really safe to do, but it feels frighteningly realistic for the actors.

This is only my second movie, so I’m still learning, but the advantage is I’m really willing to listen to any advice from the smart people around me on set on how best to do all this stuff.

How early on did you start integrating post and all the VFX?
Right from the start. I use previz a lot, as I come from that environment and I’m very comfortable with it, and that becomes the template for all of production to work from. Sometimes it’s too much of a template and treated like a bible, but I’m like, “Please keep thinking. Is there a better idea?” But it’s great to get everyone on the same page, so very early on you see what’s VFX, what’s live-action only, what’s a combination, and you can really plan your shoot. We did over 45 minutes of previz, along with storyboards. We did tons of postviz. My director’s cut had no blue/green at all. It was all postviz for every shot.

Tim Miller and Linda Hamilton

DP Ken Seng, who did Deadpool with you, shot it. Talk about how you collaborated on the look.
We didn’t really have time to plan shot lists that much since we moved so much and packed so much into every day. A lot of it was just instinctive run-and-gun, as the shoot was pretty grueling. We shot in Madrid and [other parts of] Spain, which doubled for Mexico. Then we did studio work in Budapest. The script was in flux a lot, and Jim wrote a few scenes that came in late, and I was constantly re-writing and tweaking dialogue and adjusting to the locations because there’s the location you think you’ll get and then the one you actually get.

Where did you post?
All at Blur, my company where we did Deadpool. The edit bays weren’t big enough for this though, so we spilled over into another building next door. That became Terminator HQ with the main edit bay and several assistant bays, plus all the VFX and compositing post teams. Blur also helped out with postviz and previz.

Do you like the post process?
I love post! I was an animator and VFX guy first, so it’s very natural to me, and I had a lot of the same team from Deadpool, which was great.

Talk about editing with Julian Clarke, who cut Deadpool. How did that work?
It was the same setup. He’d be back here in LA cutting while we shot. He’s so fast; he’d be just one day behind me — I’ve never met anyone who works as hard. Then after the shoot, we’d edit all day, and then I’d deal with VFX reviews for hours.

Can you talk about how Adobe Creative Cloud helped the post and VFX teams achieve their creative and technical goals?
I’m a big fan, and that started back on Deadpool as David Fincher was working closely with Adobe to make Premiere something that could beat Avid. We’re good friends — we’re doing our animated Netflix show Love, Death & Robots together — and he was like, “Dude, you gotta use this tool,” so we used it on Deadpool. It was still a little rocky on that one, but overall it was a great experience, and we knew we’d use it on this one. Adobe really helped refine it and the workflow, and it was a huge leap.

What were the big editing challenges?
(Laughs) We just shot too much movie. We had many discussions about cutting one or more of the action scenes, but in the end, we just took out some of the action from all of them, instead of cutting a particular set piece. But it’s tricky cutting stuff and still making it seamless, especially in a very heavily choreographed sequence like the C-5.

VFX plays a big role. How many were there?
Over 2,500 — a huge amount. The VFX on this were so huge it became a bit of a problem, to be honest.

L-R: Writer Iain Blair and director Tim Miller

How did you work with VFX supervisor Eric Barba?
He did a great job and oversaw all the vendors, including ILM, who did most of them. We tried to have them do all the character-based stuff, to keep it in one place, but in the end, we also had Digital Domain, Method, Blur, UPP, Cantina, and some others. We also brought on Jeff White from ILM since it was more than Eric could handle.

Talk about the importance of sound and music.
Tom Holkenborg, who scored Deadpool, did another great job. We also reteamed with sound designer and mixer Craig Henighan, and we did the mix at Fox. They’re both crucial in a film like this, but I’m the first to admit music’s not my strength. Luckily, Julian Clarke is excellent with that and very focused. He worked hard at pulling it all together. I love sound design, and we talked about all the spotting; Julian managed a lot of that for me too because I was so busy with the VFX.

Where did you do the DI and how important is it to you?
It’s huge, and we did it at Company 3 with Tim Stipan, who did Deadpool. I like to do a lot of reframing, adding camera shake and so on. It has a subtle but important effect on the overall film.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Color Chat: Light Iron’s Corinne Bogdanowicz

Corinne Bogdanowicz, a colorist at Light Iron, joined the post house in 2010 after working as a colorist and digital compositor for Post Logic/Prime Focus, Pacific Title and DreamWorks Animation.

Bogdanowicz, who comes from a family of colorists/color scientists (sister and father), has an impressive credit list, including the features 42, Flight, Hell or High Water, Allied and Wonder. On the episodic side, she has colored all five seasons of Amazon’s Emmy-winning series Transparent, as well as many other shows, including FX’s Baskets and Boomerang for BET. Her most recent work includes Netflix’s Dolemite is My Name and HBO’s Mrs. Fletcher.

HBO’s Mrs. Fletcher

We reached out to find out more…

NAME: Corinne Bogdanowicz

COMPANY: Light Iron

CAN YOU DESCRIBE YOUR COMPANY?
Light Iron is a post production company owned by Panavision. We have studios in New York and Los Angeles.

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I think that most people would be surprised that we are the last stop for all visuals on a project. We are where all of the final VFX come together, and we also manage the different color spaces for final distribution.

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Yes, I am very often doing work that crosses over into visual effects. Beauty work, paint outs and VFX integration are all commonplace in the DI suite these days.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The collaboration between myself and the creatives on a project is my favorite aspect of color correction. There is always a moment when we start color where I get “the look,” and everyone is excited that their vision is coming to fruition.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Maybe farming? (laughs) I’m not sure. I love being outdoors and working with animals.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I have an art background, and when I moved to Los Angeles years ago I worked in VFX. I quickly was introduced to the world of color and found it was a great fit. I love the combination of art and technology, as well as constantly being introduced to new ideas by industry creatives.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Where’d You Go, Bernadette?, Sextuplets, Truth Be Told, Transparent, Mrs. Fletcher and Dolemite is My Name.

Transparent

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
This is a hard question because I feel like I leave a little piece of myself in everything that I work on.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My phone, the coffee maker and FilmLight Baselight.

WHAT DO YOU DO TO DE-STRESS FROM THE PRESSURES OF THE JOB?
I have two small children at home, so I think I de-stress when I get to work (laughs)!

Review: Lenovo Yoga A940 all-in-one workstation

By Brady Betzel

While more and more creators are looking for alternatives to the iMac, iMac Pro and Mac Pro, there are few options with high-quality built-in monitors: the Microsoft Surface Studio, HP Envy and Dell 7000 among them. There are even fewer choices if you want touch and pen capabilities. It’s with that need in mind that I decided to review the Lenovo Yoga A940, a 27-inch UHD, pen- and touch-capable Intel Core i7 computer with an AMD Radeon RX 560 GPU.

While I haven’t done a lot of all-in-one system reviews like the Yoga A940, I have had my eyes on the Microsoft Surface Studio 2 for a long time. The only problem is the hefty price tag of around $3,500. The Lenovo’s most appealing feature — in addition to the tech specs I will go over — is its price point: It’s available from $2,200 and up. (I saw Best Buy selling a similar system to the one I reviewed for around $2,299. The insides of the Yoga and the Surface Studio 2 aren’t that far off from each other either, at least not enough to make up for the $1,300 disparity.)

Here are the parts inside the Lenovo Yoga A940:
Intel Core i7-8700 3.2GHz processor (up to 4.6GHz with Turbo Boost), six cores (12 threads) and 12MB cache
27-inch 4K UHD IPS multitouch display with 100% Adobe RGB coverage
16GB DDR4 2666MHz (SODIMM) memory
1TB 5400 RPM drive plus 256GB PCIe SSD
AMD Radeon RX 560 4GB graphics processor
25-degree monitor tilt angle
Dolby Atmos speakers
Dimensions: 25 inches by 18.3 inches by 9.6 inches
Weight: 32.2 pounds
802.11AC and Bluetooth 4.2 connectivity
Side-panel inputs: Intel Thunderbolt, USB 3.1, 3-in-1 card reader and audio jack
Rear-panel inputs: AC-in, RJ45, HDMI and four USB 3.0
Bluetooth active pen (appears to be the Lenovo Active Pen 2)
QI wireless charging platform

Digging In
Right off the bat, I just happened to put my Android Galaxy phone on the odd little flat platform located on the right side of the all-in-one workstation, just under the monitor, and I saw my phone begin to charge wirelessly. QI wireless charging is an amazing little addition to the Yoga; it really comes through in a pinch when I need my phone charged and don’t have the cable or charging dock around.

Other than that nifty feature, why would you choose a Lenovo Yoga A940 over any other all-in-one system? Well, as mentioned, the price point is very attractive, but you are also getting a near-professional-level system in a very tiny footprint — including Thunderbolt 3 and USB connections, an HDMI port, a network port and an SD card reader. While it would be incredible to have an Intel i9 processor inside the Yoga, the i7 clocks in at 3.2GHz with six cores. Not a beast, but enough to get the job done inside Adobe Premiere and Blackmagic’s DaVinci Resolve, though likely with transcoded files rather than Red raw or the like.

The Lenovo Yoga A940 is outfitted with a front-facing Dolby Atmos audio speaker as well as Dolby Vision technology in the IPS display. The audio could use a little more low end, but it is good. The monitor is surprisingly great — the whites are white and the blacks are black; something not everyone can get right. It has 100% Adobe RGB color coverage and is Pantone-validated. The HDR is technically Dolby Vision and looks great at about 350 nits (not the brightest, but it won’t burn your eyes out either). The Lenovo BT active pen works well. I use Wacom tablets and laptop tablets daily, so this pen had a lot to live up to. While I still prefer the Wacom pen, the Lenovo pen, with 4,096 levels of sensitivity, will do just fine. I actually found myself using the touchscreen with my fingers way more than the pen.

One feature that sets the A940 apart from the other all-in-one machines is the USB Content Creation dial. With the little time I had with the system, I only used it to adjust speaker volume when playing Spotify, but in time I can see myself customizing the dials to work in Premiere and Resolve. The dial has good action and resistance. To customize the dial, you can jump into the Lenovo Dial Customization Assistant.

Besides the Intel i7, there is an AMD Radeon RX 560 with 4GB of memory, two 3W and two 5W speakers, 32GB of DDR4 2666MHz memory, a 1TB 5400 RPM hard drive for storage, and a 256GB PCIe SSD. I wish the 1TB drive was also an SSD, but obviously Lenovo has to keep that price point somehow.

Real-World Testing
I use Premiere Pro, After Effects and Resolve all the time and can understand the horsepower of a machine through these apps. Whether editing and/or color correcting, the Lenovo A940 is a good medium ground — it won’t be running much more than 4K Red raw footage in real time without cutting the debayering quality down to half if not one-eighth. This system would make a good “offline” edit system, where you transcode your high-res media to a mezzanine codec like DNxHR or ProRes for your editing and then up-res your footage back to the highest resolution you have. Or, if you are in Resolve, maybe you could use optimized media for 80% of the workflow until you color. You will really want a system with a higher-end GPU if you want to fluidly cut and color in Premiere and Resolve. That being said, you can make it work with some debayer tweaking and/or transcoding.
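If you want to try the offline approach described above, below is a minimal sketch of batch-creating DNxHR proxies. It assumes ffmpeg as the transcode tool (the review doesn't name one), source files that ffmpeg can actually decode (Red raw would first need a debayer pass in REDCINE-X or Resolve), and hypothetical folder names.

```python
# Hypothetical proxy-transcode helper, not from the review.
# Assumes ffmpeg is installed and the sources are in a format ffmpeg can decode;
# R3D files would need to be debayered to a readable format first.
import subprocess
from pathlib import Path

SOURCE_DIR = Path("camera_originals")   # hypothetical folder of camera files
PROXY_DIR = Path("proxies_dnxhr")
PROXY_DIR.mkdir(exist_ok=True)

for clip in sorted(SOURCE_DIR.glob("*.mov")):
    out = PROXY_DIR / f"{clip.stem}_proxy.mov"
    subprocess.run([
        "ffmpeg", "-y", "-i", str(clip),
        "-vf", "scale=1920:-2",                     # downscale to HD for the offline
        "-c:v", "dnxhd", "-profile:v", "dnxhr_sq",  # DNxHR SQ mezzanine codec
        "-pix_fmt", "yuv422p",
        "-c:a", "pcm_s16le",                        # uncompressed audio for a clean conform
        str(out),
    ], check=True)
```

Once the cut is locked, you would relink back to the camera originals for the finish rather than delivering from the proxies.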

In my testing I downloaded some footage from Red’s sample library, which you can find here. I also used some BRAW clips to test inside of Resolve, which can be downloaded here. I grabbed 4K, 6K, and 8K Red raw R3D files and the UHD-sized Blackmagic raw (BRAW) files to test with.

Adobe Premiere
Using the same Red clips as above, I created two one-minute-long UHD (3840×2160) sequences. I also clicked “Set to Frame Size” for all the clips. Sequence 1 contained these clips with a simple contrast, brightness and color cast applied. Sequence 2 contained the same clips with the same color correction applied, but also a 110% resize, 100 sharpen and 20 Gaussian Blur. I then exported them to various codecs via Adobe Media Encoder using the OpenCL renderer for processing. Here are my results:

QuickTime (.mov) H.264, No Audio, UHD, 23.98 Maximum Render Quality, 10 Mb/s:
Color Correction Only: 24:07
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 26:11

DNxHR HQX 10 bit UHD
Color Correction Only: 25:42
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 27:03

ProRes HQ
Color Correction Only: 24:48
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 25:34

As you can see, the export times are pretty long. And let me tell you, once the sequence with the Gaussian Blur and Resize kicked in, so did the fans. While it wasn’t like a jet taking off, the sound of the fans definitely made me and my wife glance at the system. It was also throwing some heat out the back. Because of the way Premiere works, it relies heavily on the CPU over the GPU. Not that it doesn’t embrace the GPU, but, as you will see later, Resolve takes more advantage of the GPUs. Either way, Premiere really taxed the Lenovo A940 when using 4K, 6K and 8K Red raw files. Playback in real time wasn’t possible except for the 4K files. I probably wouldn’t recommend this system for someone working with lots of higher-than-4K raw files; it seems to be simply too much for it to handle. But if you transcode the files down to ProRes, you will be in business.

Blackmagic Resolve 16 Studio
Resolve seemed to take better advantage of the AMD Radeon RX 560 GPU in combination with the CPU, as well as the onboard Intel GPU. In this test I added in Resolve’s amazing built-in spatial noise reduction, so other than the Red R3D footage, this test and the Premiere test weren’t exactly comparing apples to apples. Overall the export times will be significantly higher (or, in theory, they should be). I also added in some BRAW footage to test for fun, and that footage was way easier to work and color with. Both sequences were UHD (3840×2160) 23.98. I will definitely be looking into working with more BRAW footage. Here are my results:

Playback: 4K realtime playback at half-res premium debayer quality, 6K no realtime playback, 8K no realtime playback

H.264 no audio, UHD, 23.98fps, force sizing and debayering to highest quality
Export 1 (Native Renderer)
Export 2 (AMD Renderer)
Export 3 (Intel QuickSync)

Color Only
Export 1: 3:46
Export 2: 4:35
Export 3: 4:01

Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur
Export 1: 36:51
Export 2: 37:21
Export 3: 37:13

BRAW 4K (4608×2592) Playback and Export Tests

Playback: Full-res would play at about 22fps; half-res plays at realtime

H.264 No Audio, UHD, 23.98 fps, Force Sizing and Debayering to highest quality
Color Only
Export 1: 1:26
Export 2: 1:31
Export 3: 1:29
Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur
Export 1: 36:30
Export 2: 36:24
Export 3: 36:22

DNxHR 10 bit:
Color Correction Only: 3:42
Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur: 39:03

One takeaway from the Resolve exports is that the color-only export was much more efficient than in Premiere, taking roughly four times realtime for the intensive Red R3D files and about one and a half times realtime for BRAW.
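Those multiples are easy to sanity-check. Here is a quick back-of-the-envelope calculation, assuming the Resolve timelines are the same one-minute UHD sequences described in the Premiere test:

```python
# Back-of-envelope realtime-factor check (assumes one-minute source sequences).
def realtime_factor(export_seconds: float, sequence_seconds: float = 60) -> float:
    """Seconds of rendering per second of timeline."""
    return export_seconds / sequence_seconds

# Resolve color-only exports, Native renderer (Export 1 above)
print(round(realtime_factor(3 * 60 + 46), 2))  # Red R3D: ~3.77x realtime
print(round(realtime_factor(1 * 60 + 26), 2))  # BRAW:    ~1.43x realtime
```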

Summing Up
In the end, the Lenovo A940 is a sleek-looking all-in-one touchscreen- and pen-compatible system. While it isn’t jam-packed with the latest high-end AMD GPUs or Intel i9 processors, the A940 is a mid-level system with an incredibly good-looking IPS Dolby Vision monitor and Dolby Atmos speakers. It has some other features — like an IR camera, the QI wireless charger and the USB dial — that you might not necessarily be looking for but will love to find.

The power adapter is like a large laptop power brick, so you will need somewhere to stash that, but overall the monitor has a really nice 25-degree tilt that is comfortable when using just the touchscreen or pen, or when using the wireless keyboard and mouse.

Because the Lenovo A940 starts at around $2,299, I think it really deserves a look when you’re searching for a new system. If you are working primarily in HD video and/or graphics, this is the all-in-one system for you. Check out more at their website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Bonfire adds Jason Mayo as managing director/partner

Jason Mayo has joined digital production company Bonfire in New York as managing director and partner. Industry veteran Mayo will be working with Bonfire’s new leadership lineup, which includes founder/Flame artist Brendan O’Neil, CD Aron Baxter, executive producer Dave Dimeola and partner Peter Corbett. Bonfire’s offerings include VFX, design, CG, animation, color, finishing and live action.

Mayo comes to Bonfire after several years building Postal, the digital arm of the production company Humble. Prior to that he spent 14 years at Click 3X, where he worked closely with Corbett as his partner. While there he also worked with Dimeola, who cut his teeth at Click as a young designer/compositor. Dimeola later went on to create The Brigade, where he developed the network and technology that now forms the remote, cloud-based backbone referred to as the Bonfire Platform.

Mayo says a number of factors convinced him that Bonfire was the right fit for him. “This really was what I’d been looking for,” he says. “The chance to be part of a creative and innovative operation like Bonfire in an ownership role gets me excited, as it allows me to make a real difference and genuinely effect change. And when you’re working closely with a tight group of people who are focused on a single vision, it’s much easier for that vision to be fully aligned. That’s harder to do in a larger company.”

O’Neil says that having Mayo join as partner/MD is a major move for the company. “Jason’s arrival is the missing link for us at Bonfire,” he says. “While each of us has specific areas to focus on, we needed someone who could both handle the day to day of running the company while keeping an eye on our brand and our mission and introducing our model to new opportunities. And that’s exactly his strong suit.”

For the most part, Mayo’s familiarity with his new partners means he’s arriving with a head start. Indeed, his connection to Dimeola, who built the Bonfire Platform — the company’s proprietary remote talent network, nicknamed the “secret sauce” — continued as Mayo tapped Dimeola’s network for overflow and outsourced work while at Postal. Their relationship, he says, was founded on trust.

“Dave came from the artist side, so I knew the work I’d be getting would be top quality and done right,” Mayo explains. “I never actually questioned how it was done, but now that he’s pulled back the curtain, I was blown away by the capabilities of the Platform and how it dramatically differentiates us.

“What separates our system is that we can go to top-level people around the world but have them working on the Bonfire Platform, which gives us total control over the process,” he continues. “They work on our cloud servers with our licenses and use our cloud rendering. The Platform lets us know everything they’re doing, so it’s much easier to track costs and make sure you’re only paying for the work you actually need. More importantly, it’s a way for us to feel connected – it’s like they’re working in a suite down the hall, except they could be anywhere in the world.”

Mayo stresses that while the cloud-based Platform is a huge advantage for Bonfire, it’s just one part of its profile. “We’re not a company riding on the backs of freelancers,” he points out. “We have great, proven talent in our core team who work directly with clients. What I’ve been telling my longtime client contacts is that Bonfire represents a huge step forward in terms of the services and level of work I can offer them.”

Corbett believes he and Mayo will continue to explore new ways of working now that he’s at Bonfire. “In the 14 years Jason and I built Click 3X, we were constantly innovating across both video and digital, integrating live action, post production, VFX and digital engagements in unique ways,” he observes. “I’m greatly looking forward to continuing on that path with him here.”

Technicolor Post opens in Wales 

Technicolor has opened a new facility in Cardiff, Wales, within Wolf Studios. This expansion of the company’s post production footprint in the UK is a result of the growing demand for more high-quality content across streaming platforms and the need to post these projects, as well as the growth of production in Wales.

The facility is connected to all of Technicolor’s locations worldwide through the Technicolor Production Network, giving creatives easy access to their projects no matter where they are shooting or posting.

The facility, an extension of Technicolor’s London operations, supports all Welsh productions and features a multi-purpose, state-of-the-art suite as well as space for VFX and front-end services including dailies. Technicolor Wales is working on Bad Wolf Production’s upcoming fantasy epic His Dark Materials, providing picture and sound services for the BBC/HBO show. Technicolor London’s recent credits include The Two Popes, The Souvenir, Chernobyl, Black Mirror, Gentleman Jack and The Spanish Princess.

Within this new Cardiff facility, Technicolor is offering 2K digital cinema projection, FilmLight Baselight color grading, realtime 4K HDR remote review, 4K OLED video monitoring, 5.1/7.1 sound, ADR recording/source connect, Avid Pro Tools sound mixing, dailies processing and Pulse cloud storage.

Bad Wolf Studios in Cardiff offers 125,000 square feet of stage space with five stages. There is flexible office space, as well as auxiliary rooms and costume and props storage.

Behind the Title: C&I Studios founder Joshua Miller

While he might run the company, founder/CEO Joshua Miller is happiest creating. He also says there is no job too small: “Nothing is beneath you.”

NAME: Joshua Otis Miller

COMPANY: C&I Studios

CAN YOU DESCRIBE YOUR COMPANY?
C&I Studios is a production company and advertising agency. We are located in New York City, Los Angeles, and Fort Lauderdale.

WHAT’S YOUR JOB TITLE?
Founder and CEO

WHAT DOES THAT ENTAIL?
Well, my job is a little weird. While I own and run the company, my passion has always been filmmaking… since I was four years old. I also run the video and film team at the studio, so my job means a lot of things. One day, I can be shooting on a mountain and the next day writing scripts and concepts, or editing, creating feature films or TV shows or managing post production. Since I’m the CEO, I spend a ton of time bringing in new business and adding technology to the company. Every day feels brand new to me, and that is the best part.

Black Violin

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I think the thing that surprises most people is that when I’m on set working, I’m not sitting back drinking a mojito. I’m carrying the tripods and the sandbags and setting up the shots. I’m also the one signing everyone’s checks. One of our core beliefs at our company is “nothing is beneath you,” and that means you can do anything — including cleaning toilets — that helps the company grow, and it requires you to drop your ego. In the creative industry that’s a big deal.

WHAT’S YOUR FAVORITE PART OF THE JOB?
My favorite part of the job is working with my team. I got so sick of the freelance game — it’s so individualized, and everyone is out for themselves. I wanted to start C&I to work with people consistently, dream together, build together and create together. That is by far better than anything else.

WHAT’S YOUR LEAST FAVORITE?
My least favorite part of the job is firing people. That just sucks.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
Between 4am and 5am. If you aren’t waking up earlier than everyone else, you aren’t doing it right.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I would be doing the exact same thing. I could be working at McDonald’s, but I’d be filming with my iPhone or Razer phone and editing. It’s not about the money; you can’t take this thing from me. It’s a part of me, and something I certainly didn’t choose. So, no matter where you put me, this is what will come out. And since Blackmagic DaVinci Resolve is free, this is something I could actually do… I could be working at McDonald’s and shooting for fun on my phone and editing in Resolve’s new cut page, which is magic. That actually sounds awesome. Well, except the McDonald’s part (laughs).

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
Again, I don’t feel like I chose it. It’s something that I always felt drawn to. I’ve been interested in cameras since I was very young… tearing apart my parents’ VHS tapes to see how they worked. I was completely perplexed by the idea that a camera does something and then it goes on this tape, and I see what’s on that tape in this VHS player and on TV. That was something I had to learn and figure out. But the main reason I wanted to really dig into this field is that I remember being in my grandmother’s house watching those VHS tapes with my brothers and my family, everyone just sitting around, laughing, watching old memories. I can’t shake that feeling. People feel warm, vulnerable, close… that is the power you have with a camera and the ability to tell a story. It’s absolutely incredible.

Black Violin

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Right now, I’m working on an incredible music video with Black Violin. We are shooting it in Los Angeles and Miami, and I’m really excited about it.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Probably something I’m most proud of is our latest film Christmas Eve. We just poured everything into that film. It’s just magic. We have done a lot of amazing stuff, but that one is really close to me right now.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Camera, computer, speakers (for music — I can’t live without music). Those three things are a must for me to breathe.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I’m not really into social media, not a big fan of what it has turned us into (off of my soapbox now), but I do follow a ton of film companies and directors. I love following Shane Hurlbut, Blackmagic Design, SmallHD, Red Digital Cinema and Panavision, to name a few.

YOU MENTIONED LOVING MUSIC. DO YOU LISTEN WHILE YOU WORK?
Music is everything. It’s the oil to my car. Without that, I’m toast. Of course, I don’t listen to music when I’m editing, but when I’m on set I love to listen to music. Love the new Chance record. When I’m writing, it’s always either Bon Iver or Michael Giacchino. I love scores and composers.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
To de-stress, I love the moments in the studio when the staff and I just sit around and get to laugh and hang out. I have a beautiful family and two wonderful kids, so when I’m not stressing about work, I’m giving horsey-back rides to my son while my daughter tries to explain TikTok to me.

Quick Chat: Element’s Matthew O’Rourke on Vivian partnership

Recently, Boston-based production and post company Element launched Element Austin — a partnership with production studio Vivian. Element now represents a select directorial roster out of Austin.

We recently reached out to Element executive producer Matthew O’Rourke, who led the charge to get this partnership off the ground.

Can you talk a bit about your partnership with Vivian? How did that come about, and why was this important for Element to do?
I’ve had a relationship with Vivian’s co-owner, Buttons Pham, for almost 10 years. She was my go-to Texas-based resource while I was an executive producer at MMB working on Toyota. She is incredibly resourceful and a great human being. When I joined Element, she became a valued production service partner for our projects in the south (mostly based out of Texas and Atlanta). Our relationship with Vivian was always important to Element since it expands the production support we can offer for our directors and our clients.

Blue Cross Blue Shield

Expanding on that thought, what does Vivian offer that you guys don’t?
They let us have boots on the ground in Austin. They have a strong reputation there and deep resources to handle all levels of work.

How will this partnership work?
Buttons and her business partner Tim Hoppock have become additional executive producers for Element and lead the Element Austin office.

How does the Boston market differ from Austin?
Austin is a growing, vibrant market with tons of amazingly creative people and companies. Lots of production resources are coming in from Los Angeles, but are also developing locally.

Can you point to any recent jobs that resulted from this partnership?
Vivian has been a production services partner for several years, helping us with campaigns for Blue Cross Blue Shield, Subway and more. Since our launch a few weeks ago, we have entered into discussions with several agencies on upcoming work out of the Austin market.

What trends are you seeing overall for this part of the market?
Creative agencies are looking for reliable resources. Having a physical presence in Austin allows us to better support local clients, but also bring in projects from outside that market and produce efficient, quality work.

Good Company adds director Daniel Iglesias Jr.

Filmmaker Daniel Iglesias Jr., whose reel spans narrative storytelling to avant-garde fashion films with creativity and an eccentric visual style, has signed with full-service creative studio Good Company.

Iglesias’ career started while he was attending Chapman University’s renowned film school, where he earned a BFA in screen acting. At the same time, Iglesias and his friend Zack Sekuler began crafting images for his friends in the alt-rock band The Neighbourhood. Iglesias’ career took off after directing his first music video for the band’s breakout hit “Sweater Weather,” which reached over 310 million views. He continues working behind the camera for The Neighbourhood and other artists like X Ambassadors and AlunaGeorge.

Iglesias uses elements of surrealism and a blend of avant-garde and commercial compositions, often stemming from innovative camera techniques. His work includes projects for clients like Ralph Lauren, Steve Madden, Skyy Vodka and Chrysler and the Vogue film Death Head Sphinx.

One of his most celebrated projects was a two-minute promo for Margaux the Agency. Designed as a “living magazine,” Margaux Vol 1 merges creative blocking, camera movement and effects to create a kinetic visual catalog that is both classic and contemporary. The piece took home Best Picture at the London Fashion Film Festival, along with awards from the Los Angeles Film Festival, the International Fashion Film Awards and Promofest in Spain.

Iglesias’ first project since joining Good Company was Ikea’s Kama Sutra commercial for Ogilvy NY, a tongue-in-cheek exploration of the boudoir. Now he is working on a project for Paper Magazine and Tiffany.

“We all see the world through our own lens; through film, I can unscrew my lens and pop it onto other people and, in effect, change their point of view or even the depth of culture,” he says. “That’s why the medium excites me — I want to show people my lens.”

We reached out to Iglesias to learn a bit more about how he works.

How do you go about picking the people you work with?
I do have a couple DPs and PDs I like to work with on the regular, depending on the job, and sometimes it makes sense to work with someone new. If it’s someone new that I haven’t worked with before, I typically look at three things to get a sense of how right they are for the project: image quality, taste and versatility. Then it’s a phone call or meeting to discuss the project in person so we can feel out chemistry and execution strategy.

Do you trust your people completely in terms of what to shoot on, or do you like to get involved in that process as well?
I’m a pretty hands-on and involved director, but I think it’s important to know what you don’t know and delegate/trust accordingly. I think it’s my job as a director to communicate, as detailed and effectively as possible, an accurate explanation of the vision (because nobody sees the vision of the project better than I do). Then I must understand that the DPs/PDs/etc. have a greater knowledge of their field than I do, so I must trust them to execute (because nobody understands how to execute in their fields better than they do).

Since Good Company also provides post, how involved do you get in that process?
I would say I edit 90% of my work. If I’m not editing it myself, then I still oversee the creative in post. It’s great to have such a strong post workflow with Good Company.

The editors of Ad Astra: John Axelrad and Lee Haugen

By Amy Leland

The new Brad Pitt film Ad Astra follows astronaut Roy McBride (Pitt) as he journeys deep into space in search of his father, astronaut Clifford McBride (Tommy Lee Jones). The elder McBride disappeared years before, and his experiments in space might now be endangering all life on Earth. Much of the film features Pitt’s character alone in space with his thoughts, creating a happy challenge for the film’s editing team, who have a long history of collaboration with each other and the film’s director James Gray.

L-R: Lee Haugen and John Axelrad

Co-editors John Axelrad, ACE, and Lee Haugen share credits on three previous films — Haugen served as Axelrad’s apprentice editor on Two Lovers, and the two co-edited The Lost City of Z and Papillon. Ad Astra’s director, James Gray, was also at the helm of Two Lovers and The Lost City of Z. A lot can be said for long-time collaborations.

When I had the opportunity to speak with Axelrad and Haugen, I was eager to find out more about how this shared history influenced their editing process and the creation of this fascinating story.

What led you both to film editing?
John Axelrad: I went to film school at USC and graduated in 1990. Like everyone else, I wanted to be a director. Everyone that goes to film school wants that. Then I focused on studying cinematography, but then I realized several years into film school that I don’t like being on the set.

Not long ago, I spoke to Fred Raskin about editing Once Upon a Time… in Hollywood. He originally thought he was going to be a director, but then he figured out he could tell stories in an air-conditioned room.
Axelrad: That’s exactly it. Air conditioning plays a big role in my life; I can tell you that much. I get a lot of enjoyment out of putting a movie together and of being in my own head creatively and really working with the elements that make the magic. In some ways, there are a lot of parallels with the writer when you’re an editor; the difference is I’m not dealing with a blank page and words — I’m dealing with images, sound and music, and how it all comes together. A lot of people say the first draft is the script, the second draft is the shoot, and the third draft is the edit.

L-R: John and Lee at the Papillon premiere.

I started off as an assistant editor, working for some top editors for about 10 years in the ’90s, including Anne V. Coates. I was an assistant on Out of Sight when Anne Coates was nominated for the Oscar. Those 10 years of experience really prepped me for dealing with what it’s like to be the lead editor in charge of a department — dealing with the politics, the personalities and the creative content and learning how to solve problems. I started cutting on my own in the late ‘90s, and in the early 2000s, I started editing feature films.

When did you meet your frequent collaborator James Gray?
Axelrad: I had done a few horror features, and then I hooked up with James on We Own the Night, and that went very well. Then we did Two Lovers after that. That’s where Lee Haugen came in — and I’ll let him tell his side of the story — but suffice it to say that I’ve done five films for James Gray, and Lee Haugen rose up through the ranks and became my co-editor on the Lost City of Z. Then we edited the movie Papillon together, so it was just natural that we would do Ad Astra together as a team.

What about you, Lee? How did you wind your way to where we are now?
Lee Haugen: Growing up in Wisconsin, any time I had a school project, like writing a story or an article, I would turn it into a short video or short film instead. Back then I had to shoot on VHS tape and edit tape to tape by pushing play, hitting record and timing it. It took forever, but that was when I really found out that I loved editing.

So I went to school with a focus on wanting to be an editor. After graduating from Wisconsin, I moved to California and found my way into reality television. That was the mid-2000s and it was the boom of reality television; there were a lot of jobs that offered me the chance to get in the hours needed for becoming a member of the Editors Guild as well as more experience on Avid Media Composer.

After about a year of that, I realized working the night shift as an assistant editor on reality television shows was not my real passion. I really wanted to move toward features. I was listening to a podcast by Patrick Don Vito (editor of Green Book, among other things), and he mentioned John Axelrad. I met John on an interview for We Own the Night when I first moved out here, but I didn’t get the job. But a year or two later, I called him, and he said, “You know what? We’re starting another James Gray movie next week. Why don’t you come in for an interview?” I started working with John the day I came in. I could not have been more fortunate to find this group of people that gave me my first experience in feature films.

Then I had the opportunity to work on a lower-budget feature called Dope, and that was my first feature editing job by myself. The success of the film at Sundance really helped launch my career. Then things came back around. John was finishing up Krampus, and he needed somebody to go out to Northern Ireland to edit the assembly of The Lost City of Z with James Gray. So, it worked out perfectly, and from there, we’ve been collaborating.

Axelrad: Ad Astra is my third time co-editing with Lee, and I find our working as a team to be a naturally fluid and creative process. It’s a collaboration entailing many months of sharing perspectives, ideas and insights on how best to approach the material, and one that ultimately benefits the final edit. Lee wouldn’t be where he is if he weren’t a talent in his own right. He proved himself, and here we are together.

How has your collaborative process changed and grown from when you were first working together (John, Lee and James) to now, on Ad Astra?
Axelrad: This is my fifth film with James. He’s a marvelous filmmaker, and one of the reasons he’s so good is that he really understands the subtlety and power of editing. He’s very neoclassical in his approach, and he challenges the viewer since we’re all accustomed to faster cutting and faster pacing. But with James, it’s so much more of a methodical approach. James is very performance-driven. It’s all about the character, it’s all about the narrative and the story, and we really understand his instincts. Additionally, you need to develop a shorthand and truly understand what the director wants.

Working with Lee, it was just a natural process to have the two of us cutting. I would work on a scene, and then I could say, “Hey Lee, why don’t you take a stab at it?” Or vice versa. When James was in the editing room working with us, he would often work intensely with one of us and then switch rooms and work with the other. I think we each really touched almost everything in the film.

Haugen: I agree with John. Our way of working is very collaborative — that includes John and I, but also our assistant editors and additional editors. It’s a process that we feel benefits the film as a whole; when we have different perspectives, it can help us explore different options that can raise the film to another level. And when James comes in, he’s extremely meticulous. And as John said, he and I both touched every single scene, and I think we’ve even touched every frame of the film.

Axelrad: To add to what Lee said, about involving our whole editing team, I love mentoring, and I love having my crew feel very involved. Not just technical stuff, but creatively. We worked with a terrific guy, Scott Morris, who is our first assistant editor. Ultimately, he got bumped up during the course of the film and got an additional editor credit on Ad Astra.

We involve everyone, even down to the post assistant. We want to hear their ideas and make them feel like a welcome part of a collaborative environment. They obviously have to focus on their primary tasks, but I think it just makes for a much happier editing room when everyone feels part of a team.

How did you manage an edit that was so collaborative? Did you have screenings of dailies or screenings of cuts?
Axelrad: During dailies it was just James, and we would send edits for him to look at. But James doesn’t really start until he’s in the room. He really wants to explore every frame of film and try all the infinite combinations, especially when you’re dealing with drama and dealing with nuance and subtlety and subtext. Those are the scenes that take the longest. When I put together the lunar rover chase, it was almost easier in some ways than some of the intense drama scenes in the film.

Haugen: As the dailies came in, John and I would each take a scene and do a first cut. And then, once we had something to present, we would call everybody in to watch the scene. We would get everybody’s feedback and see what was working, what wasn’t working. If there were any problems that we could address before moving to the next scene, we would. We liked to get the outside point of view, because once you get further and deeper into the process of editing a film, you do start to lose perspective. To be able to bring somebody else in to watch a scene and to give you feedback is extremely helpful.

One thing that John established with me on Two Lovers — my first editing job on a feature — was allowing me to come and sit in the room during the editing. After my work was done, I was welcome to sit in the back of the room and just observe the interaction between John and James. We continued that process with this film, just to give those people experience and to learn and to observe how an edit room works. That helped me become an editor.

John, you talked about how the action scenes are often easier to cut than the dramatic scenes. It seems like that would be even more true with Ad Astra, because so much of this film is about isolation. How does that complicate the process of structuring a scene when it’s so much about a person alone with his own thoughts?
Axelrad: That was the biggest challenge, but one we were prepared for. To James’ credit, he’s not precious about his written words; he’s not precious about the script. Some directors might say, “Oh no, we need to mold it to fit the script,” but he allows the actors to work within a space. The script is a guide for them, and they bring so much to it that it changes the story. That’s why I always say that we serve the ego of the movie. The movie, in a way, informs us what it wants to be, and what it needs to be. And in the case of this, Brad gave us such amazing nuanced performances. I believe you can sometimes shape the best performance around what is not said through the more nuanced cues of facial expressions and gestures.

So, as an editor, when you can craft something that transcends what is written and what is photographed and achieve a compelling synergy of sound, music and performance — to create heightened emotions in a film — that’s what we’re aiming for. In the case of his isolation, we discovered early on that having voiceover and really getting more interior was important. That wasn’t initially part of the cut, but James had written voiceover, and we began to incorporate that, and it really helped make this film into more of an existential journey.

The further he goes out into space, the deeper we go into his soul, and it’s really a dive into the subconscious. That sequence where he dives underwater in the cooling liquid of the rocket, he emerges and climbs up the rocket, and it’s almost like a dream. Like how in our dreams we have superhuman strength as a way to conquer our demons and our fears. The intent really was to make the film very hypnotic. Some people get it and appreciate it.

As an editor, sound often determines the rhythm of the edit, but one of the things that was fascinating with this film is how deafeningly quiet space likely is. How do you work with the material when it’s mostly silent?
Haugen: Early on, James established that he wanted to make the film as realistic as possible. Sound, or lack of sound, is a huge part of space travel. So the hard part is when you have, for example, the lunar rover chase on the moon, and you play it completely silent; it’s disarming and different and eerie, which was very interesting at first.

But then we started to explore how we could make this sound more realistic or find a way to amplify the action beats through sound. One way was, when things were hitting him or things were vibrating off of his suit, he could feel the impacts and he could hear the vibrations of different things going on.

Axelrad: It was very much part of our rhythm, of how we cut it together, because we knew James wanted to be as realistic as possible. We did what we could with the soundscapes that were allowable for a big studio film like this. And, as Lee mentioned, playing it from Roy’s perspective — being in the space suit with him. It was really just to get into his head and hear things how he would hear things.

Thanks to Max Richter’s beautiful score, we were able to hone the rhythms to induce a transcendental state. We had Gary Rydstrom and Tom Johnson mix the movie for us at Skywalker, and they were the ultimate creators of the balance of the rhythms of the sounds.

Did you work with music in the cut?
Axelrad: James loves to temp with classical music. In previous films, we used a lot of Puccini. In this film, there was a lot of Wagner. But Max Richter came in fairly early in the process and developed such beautiful themes, and we began to incorporate his themes. That really set the mood.

When you’re working with your composer and sound designer, you feed off each other. So things that they would do would inspire us, and we would change the edits. I always tell the composers when I work with them, “Hey, if you come up with something, and you think musically it’s very powerful, let me know, and I am more than willing to pitch changing the edit to accommodate.” Max’s music editor, Katrina Schiller, worked in-house with us and was hugely helpful, since Max worked out of London.

We tend not to want to cut with music because initially you want the edit not to have music as a Band-Aid to cover up a problem. But once we feel the picture is working, and the rhythm is going, sometimes the music will just fit perfectly, even as temp music. And if the rhythms match up to what we’re doing, then we know that we’ve done it right.

What is next for the two of you?
Axelrad: I’m working on a lower-budget movie right now, a Lionsgate feature film. The title is under wraps, but it stars Janelle Monáe, and it’s kind of a socio-political thriller.

What about you Lee?
Haugen: I jumped onto another film as well. It’s an independent film starring Zoe Saldana. It’s called Keyhole Garden, and it’s this very intimate drama that takes place on the border between Mexico and America. So it’s a very timely story to tell.


Amy Leland is a film director and editor. Her short film, Echoes, is now available on Amazon Video. She also has a feature documentary in post, a feature screenplay in development, and a new doc in pre-production. She is an editor for CBS Sports Network and recently edited the feature “Sundown.” You can follow Amy on social media on Twitter at @amy-leland and Instagram at @la_directora.

Review: Boxx’s Apexx A3 AMD Ryzen workstation

By Mike McCarthy

Boxx’s Apexx A3 is based on AMD’s newest Ryzen CPUs and the X570 chipset. Boxx has taken these elements and added liquid CPU cooling, professional GPUs and a compact, solid case to create an optimal third-generation Ryzen system configured for pros. It can support dual GPUs and two 3.5-inch hard drives, as well as the three M.2 slots on the board and anything that can fit into its five PCIe slots. The system I am reviewing came with AMD’s top CPU, the 12-core 3900X running at 3.8GHz, as well as 64GB of DDR4-2666 RAM and a Quadro RTX 4000 GPU. I also tested it with a 40GbE network card and a variety of other GPUs.

I have been curious about AMD’s CPU reboot with Ryzen architecture, but I haven’t used an AMD-based system since the 64-bit Opterons in the HP xw9300s that I had in 2006. That was also around the same time that I last used a system from Boxx, in the form of its HD Pro RT editing systems, based on those same AMD Opteron CPUs. At the time, Boxx systems were relatively unique in that they had large internal storage arrays with eight or 10 separate disks, and those arrays came in a variety of forms.

The three different locations that I worked during that time period had Boxx workstations with IDE-, SATA- and SCSI-based storage arrays. All three types of storage experienced various issues at the locations where I worked with them, but that might have been more a result of unreliable hard drives and relatively new PCI RAID controllers available at that time more than a reflection on Boxx.

Regardless, and for whatever reason, Boxx focused more on processing performance than storage over the next decade, marketing more toward 3D animation and VFX artists (among other users) who do lots of processing on small amounts of data, instead of video editors who do small amounts of processing on large amounts of data. At this point, most large data sets are stored on network appliances or external arrays, although my projects have recently been leaning the other way, using older server chassis with lots of internal drive slots.

Out of the Box
The Apexx system shipped from Boxx in a reasonably sized carton with good foam protection. Compared to the servers I have been using recently, it is tiny and feather-light at 25 pounds. The compact case is basically designed upside down from conventional layouts, with the power supply at the bottom and the card slots at the top. To save space, it fits the 750W power supply directly over the CPU, which is liquid-cooled with a radiator at the front of the case. There are two SATA hard drive bays at the top of the case. The system is based on the X570 Aorus Ultra motherboard, which has three full-length and two x1 PCIe slots, as well as three M.2 slots.

The system has no shortage of USB ports, with four USB 3.0 ports up front next to the headphone and mic connectors, and 10 on the back panel. Of those, three are USB 3.1 Gen2, including one Type-C port. All the rest are Type-A: three more USB 3.0 ports and four USB 2.0 ports. The white USB 3.0 port allows you to update the BIOS from a USB stick if desired, which might come in handy when AMD’s fix to the Zen2 boost frequency issue becomes available. There are also 5.1 analog audio and SPDIF connectors on the board, as well as HDMI out and Wi-Fi antenna ports.

I hooked up my 8K monitor and connected the system to my network for initial configuration and setup. The simplest test I run is Maxon’s Cinebench 15, which returned an OpenGL (GPU) score of 207 and a multi-core CPU score of 3169. Both of those values are the highest results I have ever gotten with that tool, including from dual-socket workstations, although I have not tested the newest generation of Intel Xeons. AMD’s CPUs are well-suited for that particular test, and this is the first true Nvidia Quadro card I have tested from the Turing-based RTX generation.

As this is an AMD X570 board, it supports PCIe 4.0, but that is of little benefit to current GPUs. The one case where the extra bandwidth could currently make a difference is NVMe SSDs playing back high-resolution frames. This system only came with a PCIe 3.0 SSD, but I am hoping to get a newer PCIe 4.0 one to run benchmarks on for a future article. In the meantime, this one is doing just fine for most uses, with over 3GB/sec of read and over 2GB/sec of write bandwidth. This is more than fast enough for uncompressed 4K work.
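As a rough check on that claim, here is the back-of-the-envelope math under my own assumptions about frame size and bit depth (the review doesn't specify a format):

```python
# Back-of-envelope bandwidth for uncompressed 4K playback (my assumptions,
# not figures from the review): 4K DCI, 10-bit RGB, 24fps.
width, height = 4096, 2160
bits_per_pixel = 3 * 10          # RGB, 10 bits per channel
fps = 24

bytes_per_frame = width * height * bits_per_pixel / 8
mb_per_second = bytes_per_frame * fps / 1e6
print(f"{mb_per_second:.0f} MB/s")  # ~796 MB/s, comfortably under 2-3GB/sec
```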

Using Adobe Tools
Next I installed both the 2018 and 2019 versions of Adobe Premiere Pro and Media Encoder so I could run tests with the same applications I had used for previous benchmarks on other systems, for more accurate comparisons. I have a standard set of sequences I export in AME, which are based on raw camera footage from Red Monstro, Sony Venice and ARRI Alexa LF cameras, exported to HEVC at 8K and 4K, testing both 8-bit and deep color render paths. Most of these renders were also completed faster than on any other system I have tested, and this is “only” a single-socket consumer-level architecture (compared to Threadripper and Epyc).

I did further tests after adding a Mellanox 40GbE network card, and swapping out the Quadro RTX 4000 for more powerful GPUs. I tested a GeForce RTX 2080 TI, a Quadro RTX 6000, an older Quadro P6000 and an AMD Radeon Pro WX 8200. The 2080TI and RTX6000 did allow 8K playback in realtime from RedCineX, but the max resolution, full-frame 8K files were right at the edge of smooth (around 23fps). Any smaller frame sizes were fine at 24p. The more powerful GeForce card didn’t improve my AME export times much if at all and got a 25% lower OpenGL score in Cinebench, revealing that Quadro drivers still make a difference for some 3D applications and that Adobe users don’t benefit much from investing in a GPU beyond a GeForce 2070. The AMD card did much better than in my earlier tests, showing that AMD drivers and software support have improved significantly since then.

Real-World Use
Where the system really stood out is when I started to do some real work with it. The 40GbE connection to my main workstation allowed me to seamlessly open projects that are stored on my internal 40TB array. I am working on a large feature film at the moment, so I used it to export a number of reels and guide tracks. These are 4K sequences of 7K anamorphic Red footage with layers of GPU effects, titles, labels and notes, with over 20 layers of audio as well. Rendering out a 4K DNxHR file of a 20-minute reel takes 140 minutes on my 16-core dual-socket workstation, but this “consumer-level” AMD system kicks one out in under 90 minutes. My watermarked DNxHD guides render out 20% faster than before as well, even over the network. This is probably due to the higher overall CPU frequency, as I have discovered that Premiere doesn’t multi-thread very well.
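
Converting those reel-render times into a speedup factor (my own arithmetic from the numbers quoted above) makes the gap clearer:

```python
# Speedup of the single-socket AMD system over the dual-socket workstation,
# based on the 20-minute-reel render times quoted above.
dual_socket_minutes = 140
apexx_minutes = 90                # "under 90 minutes" -- rounded to 90 here

speedup = dual_socket_minutes / apexx_minutes
saved = 1 - apexx_minutes / dual_socket_minutes
print(f"{speedup:.2f}x faster ({saved:.0%} less render time)")  # ~1.56x, ~36%
```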

For AME Render times, lower is better and for Cinebench scores, higher is better.
Comparison system details:
Dell Precision 7910 with the GeForce 2080 TI
Supermicro X9DRi with Quadro P6000
HP Z4 10-core workstation with GeForce 2080TI
Razer Blade 15 with GeForce 2080 TI Max-Q

I also did some test exports in Blackmagic DaVinci Resolve. I am less familiar with that program, so my testing was much more limited, but it exported nearly as fast as Premiere, and the Nvidia cards were only slightly faster than the AMD GPUs in that app. (But I have few previous Resolve tests to use as a point of comparison to other systems.)

As an AMD system, it has a few limitations compared to a similar Intel model. First of all, there is no support for the hardware encoding available in Intel’s Quick Sync integrated graphics. That only matters if you have software that uses that particular functionality, such as my Adobe apps, but the system seems fast enough to accomplish those encode and decode tasks on its own. It also lacks a Thunderbolt port, as until recently that was an exclusively Intel technology. Now that Thunderbolt 3 is being incorporated into USB 4.0, it will be both more important to have and available in a wider variety of products. It might be possible to add a USB 4.0 card to this system when the time comes, which would alleviate this issue.

When I first received the system, it reported the CPU as an 800MHz chip, which was the result of a BIOS configuration issue. After fixing that, the only other problem I had was a conflict between my P6000 GPU and my 8K display, which usually work great together. But it won’t boot with that combo, which is a pretty obscure corner case. All other GPU and monitor combinations worked fine, and I tested a bunch. I worked with Boxx technical support on that and a few other minor issues, and they were very helpful, sending me spare parts to confirm that the issues weren’t caused by my own added hardware.

In the End
The system performed very well for me, and the configuration I received would meet the needs of most users. Even editing 8K footage no longer requires stepping up to a dual-socket system. The biggest variation will come with matching a GPU to your needs, as Boxx offers GeForce, Quadro and AMD options. Editors will probably be able to save some money, while those doing true 3D rendering might want to invest in an even more powerful GPU than the Quadro RTX 4000 that this system came with.

All of those options are available on the Boxx website, with the online configuration tool. The test model Boxx sent me retails for about $4,500. There are cheaper solutions available if you are a DIY person, but Boxx has assembled a well-balanced solution in a solid package, built and supported for you. They also sell much higher-end systems if you are in the market for that, but with recent advances, these mid-level systems probably meet the needs of most users. If you are interested in purchasing a system from them, using the code MIKEPOST at checkout will give you a discount.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Charlieuniformtango names company vets as new partners

Charlieuniformtango principal/CEO Lola Lott has named three of the full-service studio’s most veteran artists as new partners — editors Deedle LaCour and James Rayburn, and Flame artist Joey Waldrip. This is the first time in the company’s almost 25-year history that the partnership has expanded. All three will continue with their current jobs but have received the expanded titles of senior editor/partner and senior Flame artist/partner, respectively. Lott, who retains majority ownership of Charlieuniformtango, will remain principal/CEO, and Jack Waldrip will remain senior editor/co-owner.

“Deedle, Joey and James came to me and Jack with a solid business plan about buying into the company with their futures in mind,” explains Lott. “All have been with Charlieuniformtango almost from the beginning: Deedle for 20 years, Joey for 19 years and James for 18. Jack and I were very impressed and touched that they were interested and willing to come to us with funding and plans for continuing and growing their futures with us.”

So why now, after all these years? “Now is the right time because, while Jack and I still have a passion for this business, we also have employees/talent — who have been with us for over 18 years — with a passion to be partners in this company,” says Lott. “While still young, they have invested in and built their careers within the Tango culture and have the client bonds, maturity and understanding of the business to be able to take Tango to a greater level for the next 20 years. That was mine and Jack’s dream, and they came to us at the perfect time.”

Charlieuniformtango is a full-service creative studio that produces, directs, shoots, edits, mixes, animates and provides motion graphics, color grading, visual effects and finishing for commercials, short films, full-length feature films, documentaries, music videos and digital content.

Main Image: (L-R) Joey Waldrip, James Rayburn, Jack Waldrip, Lola Lott and Deedle LaCour

Review: Samsung’s 970 EVO Plus 500GB NVMe M.2 SSD

By Brady Betzel

It seems that SSDs are dropping in price by the hour. (This might be a slight exaggeration, but you understand what I mean.) Over the last year or so, prices have fallen significantly, including on high-speed NVMe SSDs. One of those is the highly touted Samsung 970 EVO Plus NVMe line.

In this review, I am going to go over Samsung’s 500GB version of the 970 EVO Plus NVMe M.2 SSD drive. The Samsung 970 EVO Plus NVMe M.2 SSD comes in four sizes — 250GB, 500GB, 1TB and 2TB — and retails (according to www.samsung.com) for $74.99, $119.99, $229.99 and $479.99, respectively. For what it’s worth, I really didn’t see much of a price difference on the other sites I visited, namely Amazon.com and Best Buy.

On paper, the EVO Plus line of drives can achieve speeds of up to 3,500MB/s read and 3,300MB/s write. Keep in mind that the lower the storage size, the lower the read/write speeds will be. For instance, the EVO Plus 250GB SSD can still get up to 3,500MB/s in sequential reads, while the sequential write speed dwindles down to a max of 2,300MB/s. Comparatively, the “standard” EVO line gets 3,400MB/s to 3,500MB/s sequential reads and 1,500MB/s sequential writes on the 250GB EVO SSD. The standard EVO’s 500GB version costs just $89.99, but if you need more storage, you will have to pay more.

There is another SSD to compare the 970 EVO Plus to, and that is the 970 Pro, which only comes in 512GB and 1TB sizes — costing around $169.99 and $349.99, respectively. While the Pro version has similar read speeds to the Plus (up to 3,500MB/s) and actually slower write speeds (up to 2,700MB/s), the real ticket to admission for the Samsung 970 Pro is the Terabytes Written (TBW) rating. Samsung warranties the 970 line of drives for five years or the rated Terabytes Written, whichever comes first. Among the 500GB-class 970 drives, the “standard” and Plus cover 300TBW, while the Pro covers a whopping 600TBW.
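
To make those endurance ratings more concrete, here is what they work out to as average daily writes over the five-year warranty window — my own arithmetic, not Samsung’s published figures:

```python
# Average writes per day that would use up the TBW rating exactly at the
# five-year mark, for the 500GB-class 970 drives discussed above.
def gb_per_day(tbw_terabytes, years=5):
    return tbw_terabytes * 1000 / (years * 365)

print(f"970 EVO / EVO Plus (300 TBW): {gb_per_day(300):.0f} GB/day")  # ~164 GB/day
print(f"970 Pro (600 TBW): {gb_per_day(600):.0f} GB/day")             # ~329 GB/day
```

Unless you are rewriting caches and optimized media all day, every day, even the 300TBW rating is hard to exhaust before the five years are up.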

Samsung says its use of the latest V-NAND technology, in addition to its Phoenix controller, makes the EVO Plus the fastest and most power-efficient of its EVO NVMe drives. Essentially, V-NAND is a way to stack memory vertically instead of the previous method of laying it out in a planar way. Stacking vertically allows for more memory in the same space, in addition to longer life spans. You can read more about the Phoenix controller here.

If you are like me and want both a good warranty (or, really, faith in the product) and blazing speeds, check out the Samsung 970 EVO Plus line of drives. It offers a great price point with almost all of the features of the Pro line. The 970 line of NVMe M.2 SSDs uses the 2280 form factor (meaning 22mm x 80mm) with an M key-style interface. It’s important to understand which interface your SSD is compatible with: either M key or B key. Cards in the Samsung 970 EVO line are all M key. Most newer motherboards will have at least one, if not two, M.2 ports to plug drives into. You can also find PCIe adapters for $20 or $30 on Amazon that will give you essentially the same read/write speeds. External USB 3.1 Gen 2 USB-C enclosures can also be found, which offer an easier way to swap drives when needed without having to open your case.

One really amazing way to use these newly lower-priced drives: When color correcting, editing and/or performing VFX miracles in apps like Adobe Premiere Pro or Blackmagic Resolve, use NVMe drives just for cache, still stores, renders and/or optimized media. With the low cost of these NVMe M.2 drives, you might be able to include the price of one when charging a client and throw it on the shelf when done, complete with the project and media. Not only will you have a super-fast way to access the media, but if you use an external enclosure, you can easily swap another drive into the system when the next job starts.

Summing Up
In the end, the price points of the Samsung 970 EVO Plus NVMe M.2 drives are right in the sweet spot. There are, of course, competing drives that run a little bit cheaper, like the Western Digital Black SN750 NVMe SSDs (at around $99 for the 500GB model), but they come with a slightly slower read/write speed. So for my money, the Samsung 970 line of NVMe drives is a great combination of speed and value that can take your computer to the next level.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Foundry updates Nuke to version 12.0

Foundry has released Nuke 12.0, which introduces the next cycle of releases for the Nuke family. The release brings improved interactivity and performance across the Nuke family, from additional GPU-enabled nodes for cleanup to a rebuilt playback engine in Nuke Studio and Hiero. Nuke 12.0 also integrates GPU-accelerated tools from Cara VR for camera solving, stitching and corrections, along with updates to the latest industry standards.

OpenEXR

New features of Nuke 12.0 include:
• UI interactivity and script loading – This release includes a variety of optimizations throughout the software to improve performance, especially when working at scale. One key improvement offers a much smoother experience, with noticeably better UI interactivity and reduced loading times when working in large scripts.
• Read and write performance – Nuke 12.0 includes focused improvements to OpenEXR read and write performance, including optimizations for several popular compression types (Zip1, Zip16, PIZ, DWAA, DWAB), improving render times and interactivity in scripts (see the short render-script sketch after this list). Red and Sony camera formats also see additional GPU support.
• Inpaint and EdgeExtend – These GPU-accelerated nodes provide faster and more intuitive workflows for common tasks, with fine detail controls and contextual paint strokes.
• Grid Warp Tracker – Extending the Smart Vector toolset in NukeX, this node uses Smart Vectors to drive grids for match moving, warping and morphing images.
• Cara VR node integration – The majority of Cara VR’s nodes are now integrated into NukeX, including a suite of GPU-enabled tools for VR and stereo workflows and tools that enhance traditional camera solving and cleanup workflows.
• Nuke Studio, Hiero and HieroPlayer Playback – The timeline-based tools in the Nuke family see dramatic improvements in playback stability and performance as a result of a rebuilt playback engine optimized for the heavy I/O demands of color-managed workflows with multichannel EXRs.
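
For pipeline folks who render from a script, the snippet below is a minimal, hypothetical sketch of how one of those compression types might be set on a Write node via Nuke’s Python API. The knob names (“file_type,” “compression,” “channels”) are standard Write-node knobs, but the file paths and frame range are placeholders, and the exact compression option labels can vary between Nuke versions.

```python
# Hypothetical sketch: render a multichannel EXR with a specific compression
# type from Nuke's Python API. Paths and frame ranges are placeholders.
import nuke

read = nuke.nodes.Read(file="plates/shot010.####.exr", first=1001, last=1100)

write = nuke.nodes.Write()
write.setInput(0, read)
write["file"].setValue("renders/shot010_comp.####.exr")
write["file_type"].setValue("exr")
write["channels"].setValue("all")                   # keep every channel/layer
write["compression"].setValue("Zip (1 scanline)")   # or DWAA/DWAB for smaller files

nuke.execute(write, 1001, 1100)                     # render frames 1001-1100
```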

Uppercut ups Tyler Horton to editor

After spending two years as an assistant at New York-based editorial house Uppercut, Tyler Horton has been promoted to editor. This is the first internal talent promotion for Uppercut.

Horton first joined Uppercut in 2017 after a stint as an assistant editor at Whitehouse Post. Stepping up as editor, he’s cut notable projects, such as a recent Nike campaign, “Letters to Heroes,” a series launched in conjunction with the US Open that highlights young athletes meeting their role models, including Serena Williams and Naomi Osaka. He has also cut campaigns for brands such as Asics, Hypebeast, Volvo and MOMA.

“From the beginning, Uppercut was always intentionally a boutique studio that embraced a collaboration of visions and styles — never just a one-person shop,” says Uppercut EP Julia Williams. “Tyler took initiative from day one to be as hands-on as possible with every project, and we’ve been proud to see him really grow and refine his own voice.”

Horton’s love of film was sparked by watching sports reels and highlight videos. He went on to study film editing, then hit the road to tour with his band for four years before returning to his passion for film.

Cinelab London adds sound mastering supervisor and colorist

Cinelab London, which provides a wide range of film and digital restoration services, has added two new creatives to its staff — sound mastering supervisor Jason Stevens and senior colorist Mike Davis.

Stevens brings with him over 20 years of experience in sound and film archive restoration. Prior to his new role, he was part of the archive and restoration team at Pinewood Studios. Having spent his whole career there, Stevens worked on many big films, including the recent Yesterday, Rocketman and Judy. His clients have included the BFI, Arrow Films, Studio Canal and Fabulous Films.

During his career, Stevens has also been involved in short films, commercials and broadcast documentaries, recently completing a three-year project for Adam Matthew, the award-winning digital publisher of unique primary source collections from archives around the world.

“We have seen Jason’s enviable skills and talents put to their best use over the six years we have worked together,” says Adrian Bull, co-founder and CEO of Cinelab London. “Now we’re thrilled to have him join our growing in-house team. Talents like Jason’s are rare. He brings a wealth of creative and technical knowledge, so we feel lucky to be able to welcome him to our film family.”

Colorist Mike Davis also joins from Pinewood Studios (following its recent closure) where he spent five years grading feature films and episodic TV productions and specializing in archive and restoration. He has graded over 100 restoration titles for clients such as BFI, Studio Canal and Arrow Films on projects such as A Fish Called Wanda, Rita, Sue & Bob Too and Waterworld.

Davis has worked with the world’s leading DPs, handling dailies and grading major feature films including Mission Impossible, Star Wars: Rogue One and Annihilation. He enjoys working on a variety of content including short films, commercials, broadcast documentaries and Independent DI projects. He recently worked on Adewale Akinnuoye-Agbaje’s Farming, which won Best British Film at the Edinburgh Film Festival in June.

Davis started his career at Ascent Media, assisting on film rushes, learning how to grade and operate equipment. By 2010, he segued into production, spending time on set and on location working on stereoscopic 3D projects and operating 3D rigs. Returning to grading film and TV at Company 3, Davis then strengthened his talents working in long format film at Pinewood Studios.

Main Image: (L-R) Stevens and Davis

Pace Pictures and ShockBox VFX formalize partnership

Hollywood post house Pace Pictures and bicoastal visual effects, animation and motion graphics specialist ShockBox VFX have formed a strategic alliance for film and television projects. The two specialist companies provide studios and producers with integrated services encompassing all aspects of post in order to finish any project efficiently, cost-effectively and with greater creative control.

The agreement formalizes a successful collaborative partnership that has been evolving over many years. Pace Pictures and ShockBox collaborated informally in 2015 on the independent feature November Rule. Since then, they have teamed up on numerous projects, including, most recently, the Hulu series Veronica Mars, Lionsgate’s 3 From Hell and Universal Pictures’ Grand-Daddy Day Care and Undercover Brother 2. Pace provided services including creative editorial, color grading, editorial finishing and sound mixing. ShockBox contributed visual effects, animation and main title design.

“We offer complementary services, and our staff have developed a close working rapport,” says Pace Pictures president Heath Ryan. “We want to keep building on that. A formal alliance benefits both companies and our clients.”

“In today’s world of shrinking budgets and delivery schedules, the time for creativity in the post process can often suffer,” adds ShockBox founder and director Steven Addair. “Through our partnership with Pace, producers and studios of all sizes will be able to maximize our integrated VFX pipeline for both quality and volume.”

As part of the agreement, ShockBox will move its West Coast operations to a new facility that Pace plans to open later this fall. The two companies have also set up an encrypted, high-speed data connection between Pace Pictures Hollywood and ShockBox New York, allowing them to exchange project data quickly and securely.

FotoKem expands post services to Santa Monica

FotoKem is now offering its video post services in Santa Monica. This provides an accessible location for those working on the west side of LA, as well as access to the talent from its Burbank and Hollywood studios.

Designed to support an entire pipeline of services, the FotoKem Santa Monica facility is housed just off the 10 freeway, above FotoKem’s mixing and recording studio Margarita Mix. For many projects, color grading, sound mixing and visual effects reviews often take place in multiple locations around town. This facility offers showrunners and filmmakers a new west side post production option. Additionally, the secure fiber network connecting all FotoKem-owned locations ensures feature film and episodic finishing work can take place in realtime among sites.

FotoKem Santa Monica features a DI color grading theater, episodic and commercial color suite, editorial conform bay and a visual effects team — all tied to the comprehensive offerings at FotoKem’s main Burbank campus, Keep Me Posted’s episodic finishing facility and Margarita Mix Hollywood’s episodic grading suites. FotoKem’s entire roster of colorists is available to collaborate with filmmakers to ensure their vision is supported throughout the process. Recent projects include Shazam!, Vice, Aquaman, The Dirt, Little and Good Trouble.

Review: Accusonus Era 4 Pro audio repair plugins

By Brady Betzel

With each passing year, it seems that the job title of “editor” changes. The editor is no longer just responsible for shaping the story of the show but also for certain aspects of finishing, including color correction and audio mixing.

In the past, when I was offline editing more often, I learned just how important sending a properly mixed and leveled offline cut was. Whether it was a rough cut, fine cut or locked cut — the mantra to always put my best foot forward was constantly repeating in my head. I am definitely a “video” editor but, as I said, with editors becoming responsible for so many aspects of finishing, you have to know everything. For me this means finding ways to take my cuts from the middle of the road to polished with just a few clicks.

On the audio side, that means using tools like the Accusonus Era 4 Pro audio repair plugins. Accusonus advertises the Era 4 plugins as one-button solutions, and they really are that easy — but you can also fine-tune the audio if you like. The Era 4 Pro plugins work not only with a typical DAW like Pro Tools 12.x and higher, but also within nonlinear editors like Adobe Premiere Pro CC 2017 or higher, FCP X 10.4 or higher and Avid Media Composer 2018.12.

Digging In
Accusonus’ Era 4 Pro Bundle will cost you $499 for the eight plugins included in its audio repair offering. This includes De-Esser Pro, De-Esser, Era-D, Noise Remover, Reverb Remover, Voice Leveler, Plosive Remover and De-Clipper. There is also an Era 4 (non-pro) bundle for $149 that includes everything mentioned previously except for De-Esser Pro and Era-D. I will go over a few of the plugins in this review and why the Pro bundle might warrant the additional $350.

I installed the Era 4 Pro Bundle on a Wacom MobileStudio Pro tablet that is a few years old but can still run Premiere. I did this intentionally to see just how lightly the plugins would run. To my surprise, my system was able to toggle each plugin off and on without any issue, and playback was seamless with all of the plugins applied. Granted, I wasn’t playing back anything but video, though sometimes when I do an audio pass I turn off video monitoring to be extra sure I am concentrating on the audio only.

De-Esser
First up is the De-Esser, which tackles harsh sounds resulting from “s,” “z,” “ch,” “j” and “sh.” So if you run into someone who has some ear-piercing “s” pronunciations, apply the De-Esser plugin and choose from narrow, normal or broad. Once you find which mode helps remove the harsh sounds (otherwise known as sibilance), you can enable “intense” to add more processing power (but doing this can potentially require rendering). In addition, there is an output gain setting, as well as a “Diff” mode that plays only the parts De-Esser is affecting. If you want to just try the “one button” approach, the Processing dial is really all you need to touch. In realtime, you can hear the sibilance diminish. I personally like a little reality in my work, so I might dial the processing to the “perfect” amount and then back it off 5% or 10%.

De-Esser Pro
Next up is De-Esser Pro. This one is for the editor who wants the one-touch processing but also the ability to dive into the specific audio spectrum being affected and see how the falloff is being performed. In addition, there are presets such as male vocals, female speech, etc. to jump immediately to where you need help. I personally find the De-Esser Pro more useful than the De-Esser. I can really shape the plugin. However, if you don’t want to be bothered with the more intricate settings, the De-Esser is still a great solution. Is it worth the extra $350? I’m not sure, but combining it with the Era-D might make you want to shell out the cash for the Era 4 Pro bundle.

Era-D
Speaking of the Era-D, it’s the only plugin not described by its own title, funnily enough, but it is a joint de-noise and de-reverberation plugin. However, Era-D goes way beyond simple hum or hiss removal. With Era-D, you get “regions” (I love saying that because of the audio mixers who constantly talk in regions and not timecode) that can not only be split at certain frequencies — with a different percentage of processing applied to each region — but also have individual frequency cutoff levels.

Something I had never heard of before is the ability to use two mics to fix a suboptimal recording on one of the two mics, which can be done in the Era-D plugin. There is a signal path window that you can use to mix the amount of de-noise and de-reverb. It’s possible to only use one or the other, and you can even run the plugin in parallel or cascade. If that isn’t enough, there is an advanced window with artifact control and more. Era-D is really the reason for that extra $350 between the standard Era 4 bundle and the Era 4 Bundle Pro — and it is definitely worth it if you find yourself removing tons of noise and reverb.

Noise Remover
My second favorite plugin in the Era 4 Bundle Pro is the Noise Remover. Not only is the noise removal pretty high-quality (again, I dial it back to avoid robot sounds), but it is painless. Dial in the amount of processing and you are 80% done. If you need to go further, there are five buttons that let you focus where the processing occurs: all frequencies (flat), high frequencies, low frequencies, high and low frequencies and mid frequencies. I love clicking the power button to hear the differences — with and without the noise removal — but also dialing the knob around to really get the noise removed without going overboard. Whether removing noise in video or audio, there is a fine art to noise reduction, and the Era 4 Noise Remover makes it easy … even for an online editor.

Reverb Remover
The Reverb Remover operates very much like the Noise Remover, but instead of noise, it removes echo. Have you ever gotten a line of ADR clearly recorded on an iPhone in a bathtub? I’ve worked on my fair share of reality, documentary, stage and scripted shows, and at some point, someone will send you this — and then the producers will wonder why it doesn’t match the professionally recorded interviews. With Era 4 Noise Remover, Reverb Remover and Era-D, you will get much closer to matching the audio between different recording devices than without plugins. Dial that Reverb Remover processing knob to taste and then level out your audio, and you will be surprised at how much better it will sound.

Voice Leveler
To level out your audio, Accusonus has also included the Voice Leveler, which does just what it says: It levels your audio so you won’t get one line blasting in your ears while the next one is barely audible because the speaker backed away from the mic. Much like the De-Esser, you get a waveform visual of what is being affected in your audio. In addition, there are two modes: tight and normal, helping to normalize your dialog. Think of the tight mode as being much more pronounced than a normal interview conversation; Accusonus describes tight as a more focused “radio” sound. The Emphasis button helps to address issues when the speaker turns away from a microphone and introduces tonal problems, and breath control is a simple way to tame audible breaths in the recording.

De-Clipper and Plosive Remover
The final two plugins in the Era 4 Bundle Pro are the Plosive Remover and De-Clipper. De-Clipper is an interesting little plugin that tries to restore lost audio due to clipping. If you recorded audio at high gain and it came out horribly, then it’s probably been clipped. De-Clipper tries to salvage this clipped audio by recreating overly saturated audio segments. While it’s always better to monitor your audio recording on set and re-record if possible, sometimes it is just too late. That’s when you should try De-Clipper. There are two modes: normal/standard use and one for trickier cases that take a little more processing power.

The final plugin, Plosive Remover, focuses on artifacts typically caused by “p” and “b” sounds. This can happen if no pop screen is used and/or if the person being recorded is too close to the microphone. There are two modes: normal and extreme. Subtle pops will easily be repaired in normal mode, but extreme pops will definitely need the extreme mode. Much like De-Esser, Plosive Remover has an audio waveform display to show what is being affected, while the “Diff” mode plays back only what is being removed. However, if you just want to stick to that “one button” mantra, the Processing dial is really all you need to mess with. The Plosive Remover is another amazing plugin that, when you need it, really does a great job quickly and easily.

Summing Up
In the end, I downloaded all of the Accusonus audio demos found on the Era 4 website, along with installers. This is the same place you can download the installers if you want to take part in the 14-day trial. I purposely limited my audio editing time to under one minute per clip and plugin to see what I could do. Check out my work with the Accusonus Era 4 Pro audio repair plugins on YouTube and see if anything jumps out at you. In my opinion, the Noise Remover, Reverb Remover and Era-D are worth the price of admission, but each plugin from Accusonus does great work.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

AJA adds HDR Image Analyzer 12G and more at IBC

AJA will soon offer the new HDR Image Analyzer 12G, bringing 12G-SDI connectivity to its realtime HDR monitoring and analysis platform developed in partnership with Colorfront. The new product streamlines 4K/Ultra HD HDR monitoring and analysis workflows by supporting the latest high-bandwidth 12G-SDI connectivity. The HDR Image Analyzer 12G will be available this fall for $19,995.

HDR Image Analyzer 12G offers waveform, histogram and vectorscope monitoring and analysis of 4K/Ultra HD/2K/HD, HDR and WCG content for broadcast and OTT production, post, QC and mastering. It also features HDR-capable monitor outputs that not only go beyond HD resolutions and offer color accuracy but also make it possible to configure layouts to place the preferred tool where needed.

“Since its release, HDR Image Analyzer has powered HDR monitoring and analysis for a number of feature and episodic projects around the world. In listening to our customers and the industry, it became clear that a 12G version would streamline that work, so we developed the HDR Image Analyzer 12G,” says Nick Rashby, president of AJA.

AJA’s video I/O technology integrates with HDR analysis tools from Colorfront in a compact 1-RU chassis to bring HDR Image Analyzer 12G users a comprehensive toolset to monitor and analyze HDR formats, including PQ (Perceptual Quantizer) and hybrid log gamma (HLG). Additional feature highlights include:

● Up to 4K/Ultra HD 60p over 12G-SDI inputs, with loop-through outputs
● Ultra HD UI for native resolution picture display over DisplayPort
● Remote configuration, updates, logging and screenshot transfers via an integrated web UI
● Remote Desktop support
● Support for display referred SDR (Rec.709), HDR ST 2084/PQ and HLG analysis
● Support for scene referred ARRI, Canon, Panasonic, Red and Sony camera color spaces
● Display and color processing lookup table (LUT) support
● Nit levels and phase metering
● False color mode to easily spot pixels out of gamut or brightness
● Advanced out-of-gamut and out-of-brightness detection with error tolerance
● Data analyzer with pixel picker
● Line mode to focus a region of interest onto a single horizontal or vertical line
● File-based error logging with timecode
● Reference still store

At IBC 2019, AJA also showed new products and updates designed to advance broadcast, production, post and pro AV workflows. On the stand were the Kumo 6464-12G for routing and the newly shipping Corvid 44 12G developer I/O models. AJA has also introduced the FS-Mini utility frame sync Mini-Converter and three new OpenGear-compatible cards: OG-FS-Mini, OG-ROI-DVI and OG-ROI-HDMI. Additionally, the company previewed Desktop Software updates for Kona, Io and T-Tap; Ultra HD support for IPR Mini-Converter receivers; and FS4 frame synchronizer enhancements.

Behind the Title: Chapeau CD Lauren Mayer-Beug

This creative director loves the ideation process at the start of a project when anything is possible, and saving some of those ideas for future use.

COMPANY: LA’s Chapeau Studios

CAN YOU DESCRIBE YOUR COMPANY?
Chapeau provides visual effects, editorial, design, photography and story development, along with experience in web development and software and app engineering.

WHAT’S YOUR JOB TITLE?
Creative Director

WHAT DOES THAT ENTAIL?
It often entails seeing a job through from start to finish. I look at it like making a painting or a sculpture.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Perhaps just how hands-on the process actually is. And how analog I am, considering we work in such a tech-driven environment.

Beats

WHAT’S YOUR FAVORITE PART OF THE JOB?
Thinking. I’m always thinking big picture to small details. I love the ideation process at the start of a project when anything is possible. Saving some of those ideas for future use, learning about what you want to do through that process. I always learn more about myself through every ideation session.

WHAT’S YOUR LEAST FAVORITE?
Letting go of the details that didn’t get addressed. Not everything is going to be perfect, and since it’s a learning process, there is inevitably something that will catch your eye.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
My mind goes to so many buckets. A published children’s book author with a kick-ass coffee shop. A coffee bean buyer so I could travel the world.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I always skewed in this direction. My thinking has always been in the mindset of idea coaxer and gatherer. I was put in that position in my mid-20s and realized I liked it (with lots to learn, of course), and I’ve run with it ever since.

IS THERE A PROJECT YOU ARE MOST PROUD OF?
That’s hard to say. Every project is really so different. A lot of what I’m most proud of is behind the scenes… the process that will go into what I see as bigger things. With Chapeau, I will always love the Facebook projects, all the pieces that came together — both on the engineering side and the fun creative elements.

Facebook

What I’m most excited about is our future stuff. There’s a ton on the sticky board that we aim to accomplish in the very near future. Thinking about how much is actually being set in motion is mind-blowing, humbling and — dare I say — makes me outright giddy. That is why I’m here, to tell these new stories — stories that take part in forming the new landscape of narrative.

WHAT TOOLS DO YOU USE DAY TO DAY?
Anything Adobe. My most effective tool is good old pen and paper. It’s the clearest way to convey ideas and work out the knots.

WHERE DO YOU FIND INSPIRATION?
I’m always looking for inspiration and find it everywhere, as many other creatives do. However, nature is where I’ve always found my greatest inspiration. I’m constantly taking photos of interesting moments to save for later. Oftentimes I will refer back to those moments in my work. When I need a reset I hike, run or bike. Movement helps.

I’m always going outside to look at how the light interacts with the environment. Something I’ve become known for at work is going out of my way to see a sunset (or sunrise). They know me to be the first one on the roof for a particularly enchanting magic hour. I’m always staring at the clouds — the subtle color combinations and my fascination with how colors look the way they do only by context. All that said, I often have my nose in a graphic design book.

The overall mood realized from gathering and creating the ever-popular Pinterest board is so helpful. Seeing the mood color wise and texturally never gets old. Suddenly, you have a fully formed example of where your mind is at. Something you could never have talked your way through.

Then, of course, there are people. People/peers and what they are capable of will always amaze me.

Mavericks VFX provides effects for Hulu’s The Handmaid’s Tale

By Randi Altman

Season 3 episodes of Hulu’s The Handmaid’s Tale are available for streaming, and if you had any illusions that things would lighten up a bit for June (Elizabeth Moss) and the ladies of Gilead, I’m sorry to say you will be disappointed. What’s not disappointing is that, in addition to the amazing acting and storylines, the show’s visual effects once again play a heavy role.

Brendan Taylor

Toronto’s Mavericks VFX has created visual effects for all three seasons of the show, based on Margaret Atwood’s dystopian view of the not-too-distant future. Its work has earned two Emmy nominations.

We recently reached out to Maverick’s founder and visual effects supervisor, Brendan Taylor, to talk about the new season and his workflow.

How early did you get involved in each season? What sort of input did you have regarding the shots?
The Handmaid’s Tale production is great because they involve us as early as possible. Back in Season 2, when we had to do the Fenway Park scene, for example, we were in talks in August but didn’t shoot until November. For this season, they called us in August for the big fire sequence in Episode 1, and the scene was shot in December.

There’s a lot of nice leadup and planning that goes into it. Our opinions are sought out, and we’re able to provide input on the best methodology to use to achieve a shot. Showrunner Bruce Miller, along with the directors, has an idea of how they’d like to see it, and they’re great at taking in our recommendations. It’s very collaborative, and we all approach the process with “what’s best for the show” in mind.

What are some things that the showrunners asked of you in terms of VFX? How did they describe what they wanted?
Each person has a different approach. Bruce speaks in story terms, providing a broader sense of what he’s looking for. He gave us the overarching direction of where he wants to go with the season. Mike Barker, who directed a lot of the big episodes, speaks in more specific terms. He really gets into the details, determining the moods of the scene and communicating how each part should feel.

What types of effects did you provide? Can you give examples?
Some standout effects were the CG smoke in the burning fire sequence and the aftermath of the house being burned down. For the smoke, we had to make it snake around corners in a believable yet magical way. We had a lot of fire going on set, and we couldn’t have any actors or stunt people near it due to its size, so we had to line up multiple shots and composite them together to make everything look realistic. We then had to recreate the whole house in 3D in order to create the aftermath of the fire, with the house completely burned down.

We also went to Washington, and since we obviously couldn’t destroy the Lincoln Memorial, we recreated it all in 3D. That was a lot of back and forth between Bruce, the director and our team. Different parts of Lincoln being chipped away means different things, and Bruce definitely wanted the head to be off. It was really fun because we got to provide a lot of suggestions. On top of that, we also had to create CGI handmaids and all the details that came with it. We had to get the robes right and did cloth simulation to match what was shot on set. There were about a hundred handmaids on set, but we had to make it look like there were thousands.

Were you able to reuse assets from last season for this one?
We were able to use a handmaids asset from last season, but it needed a lot of upgrades for this season. Because there were closer shots of the handmaids, we had to tweak it and make sure little things like the textures, shaders and different cloth simulations were right for this season.

Were you on set? How did that help?
Yes, I was on set, especially for the fire sequences. We spent a lot of time talking about what’s possible and testing different ways to make it happen. We want it to be as perfect as possible, so I had to make sure it was all done properly from the start. We sent another visual effects supervisor, Leo Bovell, down to Washington to supervise out there as well.

Can you talk about a scene or scenes where being on set played a part in doing something either practical or knowing you could do it in CG?
The fire sequence with the smoke going around the corner took a lot of on-set collaboration. We had tried doing it practically, but the smoke was moving too fast for what we wanted, and there was no way we could physically slow it down.

Having the special effects coordinator, John MacGillivray, there to give us real smoke that we could then match to was invaluable. In most cases on this show, very few audibles were called. They want to go into the show knowing exactly what to expect, so we were prepared and ready.

Can you talk about turnaround time? Typically, series have short ones. How did that affect how you worked?
The average turnaround time was eight weeks. We began discussions in August, before shooting, and had to deliver by January. We worked with Mike to simplify things without diminishing the impact. We just wanted to make sure we had the chance to do it well given the time we had. Mike was very receptive in asking what we needed to do to make it the best it could be in the timeframe that we had. Take the fire sequence, for example. We could have done full-CGI fire but that would have taken six months. So we did our research and testing to find the most efficient way to merge practical effects with CGI and presented the best version in a shorter period of time.

What tools were used?
We used Foundry Nuke for compositing. We used Autodesk Maya to build all the 3D houses, including the burned-down house, and to destroy the Lincoln Memorial. Then we used Side Effects Houdini to do all the simulations, which can range from the smoke and fire to crowd and cloth.

Is there a shot that you are most proud of or that was very challenging?
The shot where we reveal the crowd over June when we’re in Washington was incredibly challenging. The actual Lincoln Memorial, where we shot, is an active public park, so we couldn’t prevent people from visiting the site. The most we could do was hold them off for a few minutes. We ended up having to clean out all of the tourists, which is difficult with a moving camera and moving people. We had to reconstruct about 50% of the plate. Then, in order to get the CG people to be standing there, we had to create a replica of the ground they’re standing on in CG. There were some models we got from the US Geological Survey, but they didn’t completely line up, so we had to make a lot of decisions on the fly.

The cloth simulation in that scene was perfect. We had to match the dampening and the movement of all the robes. Stephen Wagner, who is our effects lead on it, nailed it. It looked perfect, and it was really exciting to see it all come together. It looked seamless, and when you saw it in the show, nobody believed that the foreground handmaids were all CG. We’re very proud.

What other projects are you working on?
We’re working on a movie called Queen & Slim by Melina Matsoukas with Universal. It’s really great. We’re also doing YouTube Premium’s Impulse and Netflix’s series Madam C.J. Walker.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Colorist Chat: Technicolor’s Doug Delaney

Industry veteran Doug Delaney started his career in VFX before the days of digital, learning his craft from the top film timers and color scientists as well as effects supervisors.

Today he is a leading colorist and finisher at Technicolor, working on major movies including the recent Captain Marvel. We spoke to him to find out more about how he works.

NAME: Doug Delaney

TITLE: Senior Colorist

IN ADDITION TO CAPTAIN MARVEL, CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We have just wrapped on Showtime’s The Loudest Voice, which documented Fox News’ Roger Ailes and starred Russell Crowe, Naomi Watts and Sienna Miller.

I also just had the immense pleasure of working with DP Cameron Duncan on Nat Geo’s thriller The Hot Zone. For that show we actually worked together early on to establish two looks — one for laboratory scenes taking place in Washington, DC, and another for scenes in central Africa. These looks were then exported as LUTs for dailies so that the creative intent was established from the beginning of shooting and carried through to finishing.

And earlier this year I worked on Love, Death & Robots, which just received two Emmy nominations, so big congrats to that team!

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Yes, these days I tend to think of “colorists” as finishing artists — meaning that our suites are typically the last stop for a project and where everything comes together.

The technology we have access to in our suites continues to develop, and therefore our capabilities have expanded — there is more we can do in our suites that previously would have needed to be handled by others. A perfect example is visual effects. Sometimes we get certain shots in from VFX vendors that are well-executed but need to be a bit more nuanced — say it’s a driving scene against a greenscreen, and the lighting outside the car feels off for the time of day it’s supposed to be in the scene. Whereas we used to have to kick it back to VFX to fix, I can now go in and use the alpha channels and mattes to color-correct that imbalance.

And what’s important about this new ability is that in today’s demanding schedules and deadlines, it allows us to work collaboratively in real time with the creative rather than in an iterative workflow that takes time we often don’t have.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The look development. That aspect can take on various conversations depending on the project. Sometimes it’s talking with filmmakers in preproduction, sometimes just when it gets to post, but ultimately, being part of the creative journey and how to deliver the best-looking show is what I love.

That and when the final playback happens in our room, when the filmmakers see for the first time all of the pieces of the puzzle come together with sound … it’s awesome.

ANY SUGGESTIONS FOR GETTING THE MOST OUT OF A PROJECT FROM A COLOR PERSPECTIVE?
Understanding that each project has a different relationship with the filmmaker, there needs to be transparency and agreement to the process amongst the director, DP, execs, etc. Whether a clear vision is established early on or they are open to further developing the look, a willingness to engage in an open dialogue is key.

Personally I love when I’m able to help develop the color pipeline in preproduction, as I find it often makes the post experience more seamless. For example, what aired on Strange Angel Season 2 was not far removed from dailies because we had established a LUT in advance and had worked with wardrobe, make-up and others to carry the look through. It doesn’t need to be complicated, but open communication and planning really can go a long way in creating a stunning visual identity and a seamless experience.

HOW DO YOU PREFER THE DP OR DIRECTOR TO DESCRIBE THE LOOK THEY WANT? PHYSICAL EXAMPLES, FILMS TO EMULATE, ETC.?
Physical examples — photo books, style sheets with examples of tones they like and things like that. But ultimately my role is to correctly interpret what it is that they like in what they are showing me and to discern if what they are looking for is a literal representation, or more of an inspiration to start from and massage. Again, the open communication and ability to develop strong working relationships — in which I’m able to discern when there is a direct ask versus a need versus an opportunity to do more and push the boundaries — is key to a successful project.

WHAT SYSTEM DO YOU WORK ON?
Baselight. I love the flexibility of the system and the support that the FilmLight team provides us, as we are constantly pushing the capabilities of the platform, and they continue to deliver.

WHERE CAN PEOPLE FIND YOU ON SOCIAL MEDIA?
@colorist_douglasdelaney

Behind the Title: One Thousand Birds sound designer Torin Geller

Initially interested in working in a music studio, once this sound pro got a taste of audio post, there was no turning back.

NAME: Torin Geller

COMPANY: NYC’s One Thousand Birds (OTB)

CAN YOU DESCRIBE YOUR COMPANY?
OTB is a bi-coastal audio post house specializing in sound design and mixing for commercials, TV and film. We also create interactive audio experiences and installations.

One Thousand Birds

WHAT’S YOUR JOB TITLE?
Sound and Interactive Designer

WHAT DOES THAT ENTAIL?
I work on every part of our sound projects: dialogue edit, sound design and mix, as well as help direct and build our interactive installation work.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Operating a scissor lift!

WHAT’S YOUR FAVORITE PART OF THE JOB?
Working with my friends. The atmosphere at OTB is like no other place I’ve worked; many of the people working here are old friends. I think it helps us a lot in terms of being creative since we’re not afraid to take risks and everyone here has each other’s backs.

WHAT’S YOUR LEAST FAVORITE?
Unexpected overtime.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
In the morning, right after my first cup of coffee.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Making ambient music in the woods.

JBL spot with Aaron Judge

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I went to school for music technology hoping to work in a music studio, but fell into working in audio post after getting an internship at OTB during school. I still haven’t left!

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Recently, we worked on a great mini doc for Royal Caribbean that featured chef Paxx Caraballo Moll, whose story is really inspiring. We also recently did sound design and Foley for an M&Ms spot, and that was a lot of fun.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
We designed and built a two-story tall interactive chandelier at a hospital in Kansas City — didn’t see that one coming. It consists of a 20-foot-long spiral of glowing orbs that reacts to the movements of people walking by and also incorporates reactive sound. Plus, I got to work on the design of the actual structure with my sister who’s an artist and landscape architect, which was really cool.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
– headphones
– music streaming
– synthesizers

Hospital installation

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I love following animators on Instagram. I find that kind of work especially inspiring. Movement and sound are so integral to each other, and I love seeing how that interplay can be explored in abstract, interesting ways in animation that aren’t necessarily possible in film.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I’ve recently started rock climbing and it’s an amazing way to de-stress. I’ve never been one to exercise, but rock climbing feels very different. It’s intensely challenging but totally non-competitive and has a surprisingly relaxed pace to it. Each climb is a puzzle with a very clear end, which makes it super satisfying. And nothing helps you sleep better than being physically exhausted.

Nvidia and Asus offer first laptop with Quadro RTX 6000 GPU

In another addition to the Nvidia RTX Studio line of laptops, the Nvidia Quadro RTX 6000 GPU will power the Asus ProArt StudioBook One, making it the first laptop to offer that GPU in a mobile solution, so creatives can run complex workloads regardless of location.

The Quadro RTX 6000 within the ProArt StudioBook One provides creatives with a high-end experience similar to that of a deskside workstation. The ProArt StudioBook One is able to handle massive datasets and accelerate compute-intensive workflows, such as creating 3D animations, rendering photoreal product designs, editing 8K video, visualizing volumetric geophysical datasets and conducting walk-throughs of photoreal building designs in VR.

RTX Studio systems, which integrate Nvidia Quadro RTX or GeForce RTX GPUs, offer advanced features — like realtime raytracing, AI and 8K Red video acceleration — to creative and technical professionals.

The Asus ProArt StudioBook One combines performance and portability with the power of Quadro RTX 6000 and features of the new Nvidia “ACE” reference design system, including:
• 24GB of ultra-fast GPU memory to tackle large scenes, models, datasets and complex multi-app workflows.
• Nvidia Turing architecture RT Cores and Tensor Cores to deliver realtime raytracing, advanced shading and AI-enhanced tools to accelerate professional workflows.
• Advanced thermal cooling solution featuring ultra-thin titanium vapor chambers.
• Enhanced Nvidia Optimus technology for seamless switching between the discrete and integrated graphics based on application use with no need to restart applications or reboot the system.
• Slim 300W high-density, high-efficiency power adapter for charging and power at half the size of traditional 300W power adapters.
• Professional 4K 120Hz Pantone-validated display with 100% Adobe RGB color coverage, color accuracy and factory calibration.

In other Nvidia-related news, Acer announced its latest additions to the ConceptD series of laptops, including the ConceptD Pro models featuring Quadro GPUs.

In addition to the Asus ProArt StudioBook One, Nvidia announced 11 additional RTX Studio laptops and desktops from Acer, Asus, HP and MSI, bringing the total number of RTX Studio systems to 39.

Nigel Bennett upped to managing director at UK’s Molinare

Molinare has promoted Nigel Bennett to the role of managing director. He joined the studio earlier this year from Pinewood Studios, where over a 20-year period he worked his way up from re-recording mixer to group director of creative services, a position that he held since 2014.

Bennett’s responsibilities include growing revenue across feature film, TV drama, feature documentaries and reality TV. Over the coming months he will work with the existing senior team at Molinare to implement a new business growth and investment plan with the full support of Molinare’s shareholders, Saphir Capital and Next Wave Partners.

Bennett replaces Julie Parmenter, who has left the company after seven years. While at Molinare, Parmenter was integral to maintaining the successful Molinare brand, as well as to the subsequent acquisition of Hackenbacker and the expansion into Hoxton.

Company 3 buys Sixteen19, offering full-service post in NYC

Company 3 has acquired Sixteen19, a creative editorial, production and post company based in New York City. The deal includes Sixteen19’s visual effects wing, PowerHouse VFX, and a mobile dailies operation with international reach.

The acquisition helps Company 3 further serve NYC’s booming post market for feature film and episodic TV. As part of the acquisition, industry veterans and Sixteen19 co-founders Jonathan Hoffman and Pete Conlin, along with their longtime collaborator, EVP of business development and strategy Alastair Binks, will join Company 3’s leadership team.

“With Sixteen19 under the Company 3 umbrella, we significantly expand what we bring to the production community, addressing a real unmet need in the industry,” says Company 3 president Stefan Sonnenfeld. “This infusion of talent and infrastructure will allow us to provide a complete suite of services for clients, from the start of production through the creative editing process to visual effects, final color, finishing and mastering. We’ve worked in tandem with Sixteen19 many times over the years, so we know that they have always provided strong client relationships, a best-in-class team and a deeply creative environment. We’re excited to bring that company’s vision into the fold at Company 3.”

Sonnenfeld will continue to serve as president of Company 3 and will oversee operations of Sixteen19. As a subsidiary of Deluxe, Company 3 is part of a broad portfolio of post services. Bringing together the complementary services and geographic reach of Company 3, Sixteen19 and PowerHouse VFX will expand Company 3’s overall portfolio of post offerings and open new markets in the US and internationally.

Sixteen19’s New York location includes 60 large editorial suites, two 4K digital cinema grading theaters and a number of comfortable open spaces and common areas. Sixteen19’s mobile dailies services are a natural complement to Company 3’s existing offerings in that arena. PowerHouse VFX includes dedicated teams of experienced supervisors, producers and artists in 2D and 3D visual effects and compositing.

“The New York film community initially recognized the potential for a Company 3 and Sixteen19 partnership,” says Sixteen19’s Hoffman. “It’s not just the fact that a significant majority of the projects we work on are finished at Company 3, it’s more that our fundamental vision about post has always been aligned with Stefan’s. We value innovation; we’ve built terrific creative teams; and above all else, we both put clients first, always.”

Sixteen19 and PowerHouse VFX will retain their company names.

MovieLabs, film studios release ‘future of media creation’ white paper

MovieLabs (Motion Pictures Laboratories), a nonprofit technology research lab that works jointly with member studios Sony, Warner Bros., Disney, Universal and Paramount, has published a new white paper presenting an industry vision for the future of media creation technology by 2030.

The paper, co-authored by MovieLabs and technologists from Hollywood studios, paints a bold picture of future technology and discusses the need for the industry to work together now on innovative new software, hardware and production workflows to support and enable new ways to create content over the next 10 years. The white paper is available today for free download on the MovieLabs website.

The 2030 Vision paper lays out key principles that will form the foundation of this technological future, with examples and a discussion of the broader implications of each. The key principles envision a future in which:

1. All assets are created or ingested straight to the cloud and do not need to move.
2. Applications come to the media.
3. Propagation and distribution of assets is a “publish” function.
4. Archives are deep libraries with access policies matching speed, availability and security to the economics of the cloud.
5. Preservation of digital assets includes the future means to access and edit them.
6. Every individual on a project is identified and verified and their access permissions are efficiently and consistently managed.
7. All media creation happens in a highly secure environment that adapts rapidly to changing threats.
8. Individual media elements are referenced, tracked, interrelated and accessed using a universal linking system.
9. Media workflows are non-destructive and dynamically created using common interfaces, underlying data formats and metadata.
10. Workflows are designed around realtime iteration and feedback.

Rich Berger

“The next 10 years will bring significant opportunities, but there are still major challenges and inherent inefficiencies in our production and distribution workflows that threaten to limit our future ability to innovate,” says Richard Berger, CEO of MovieLabs. “We have been working closely with studio technology leaders and strategizing how to integrate new technologies that empower filmmakers to create ever more compelling content with more speed and efficiency. By laying out these principles publicly, we hope to catalyze an industry dialog and fuel innovation, encouraging companies and organizations to help us deliver on these ideas.”

The publication of the paper will be supported with a panel discussion at the IBC Conference in Amsterdam. The panel, “Hollywood’s Vision for the Future of Production in 2030,” will include senior technology leaders from the five major Hollywood motion picture studios. It will take place on Sunday, September 15 at 2:15pm in the Forum room of the RAI. postPerspective’s Randi Altman will moderate the panel, made up of Sony’s Bill Baggelaar, Disney’s Shadi Almassizadeh, Universal’s Michael Wise and Paramount’s Anthony Guarino. More details can be found here.

“Sony Pictures Entertainment has a deep appreciation for the role that current and future technologies play in content creation,” says CTO of Sony Pictures Don Eklund. “As a subsidiary of a technology-focused company, we benefit from the power of Sony R&D and Sony’s product groups. The MovieLabs 2030 document represents the contribution of multiple studios to forecast and embrace the impact that cloud, machine learning and a range of hardware and software will have on our industry. We consider this a living document that will evolve over time and provide appreciated insight.”

According to Wise, SVP/CTO at Universal Pictures, “With film production experiencing unprecedented growth, and new innovative forms of storytelling capturing our audiences’ attention, we’re proud to be collaborating across the industry to envision new technological paradigms for our filmmakers so we can efficiently deliver worldwide audiences compelling entertainment.”

For those not familiar with MovieLabs, its stated goal is “to enable member studios to work together to evaluate new technologies and improve quality and security, helping the industry deliver next-generation experiences for consumers, reduce costs and improve efficiency through industry automation, and derive and share the appropriate data necessary to protect and market the creative assets that are the core capital of our industry.”

Digital Arts expands team, adds Nutmeg Creative talent

Digital Arts, an independently owned New York-based post house, has added several former Nutmeg Creative talent and production staff members to its roster — senior producer Lauren Boyle, sound designer/mixers Brian Beatrice and Frank Verderosa, colorist Gary Scarpulla, finishing editor/technical engineer Mark Spano and director of production Brian Donnelly.

“Growth of talent, technology, and services has always been part of the long-term strategy for Digital Arts, and we’re fortunate to welcome some extraordinary new talent to our staff,” says Digital Arts owner Axel Ericson. “Whether it’s long-form content for film and television, or working with today’s leading agencies and brands creating dynamic content, we have the talent and technology to make all of our clients’ work engaging, and our enhanced services bring their creative vision to fruition.”

Brian Donnelly, Lauren Boyle and Mark Spano.

As part of this expansion, Digital Arts will unveil additional infrastructure featuring an ADR stage/mix room. The current facility boasts several state-of-the-art audio suites, a 4K finishing theater/mixing dubstage, four color/finishing suites and expansive editorial and production space, which is spread over four floors.

The former Nutmeg team has hit the ground running, working with their long-time ad agency, network, animation and film studio clients. Gary Scarpulla worked on color for HBO’s Veep and Los Espookys, while Frank Verderosa has been working with agency Ogilvy on several Ikea campaigns. Beatrice mixed spots for Tom Ford’s cosmetics line.

In addition, Digital Arts’ in-house theater/mixing stage has proven to be a valuable resource for some of the most popular TV productions, including recent commentary recording sessions for the legendary HBO series Game of Thrones and the final season of Veep.

Especially noteworthy is the collaboration of colorist Axel Ericson and finishing editor Mark Spano with Oscar-nominated directors Karim Amer and Jehane Noujaim to bring the Netflix documentary The Great Hack to fruition.

Digital Arts also recently expanded its offerings to include production services. The company has already delivered projects for agencies Area 23, FCB Health and TCA.

“Digital Arts’ existing infrastructure was ideally suited to leverage itself into end-to-end production,” Donnelly says. “Now we can deliver from shoot to post.”

Tools employed across post include Avid Pro Tools with D-Control ES and S3 control surfaces for audio post, and Avid Media Composer, Adobe Premiere and Blackmagic DaVinci Resolve for editing. Color grading is via Resolve.

Main Image: (L-R) Frank Verderosa, Brian Beatrice and Gary Scarpulla

 

Cabin adds two editors, promotes another

LA-based editorial studio Cabin Editing Company has grown its editing staff with the addition of Greg Scruton and Debbie Berman. The studio has also promoted Scott Butzer to editor. The trio will work on commercials, music videos, branded content and other short-form projects.

Scruton, who joins Cabin from Arcade Edit, has worked on dozens of high-profile commercials and music videos throughout his career, including Pepsi’s 2019 Grammys spot Okurrr, starring Cardi B; Palms Casino Resort’s star-filled Unstatus Quo; and Kendrick Lamar’s iconic Humble music video, for which he earned an AICE Award. Scruton has worked with high-profile ad agencies and directors, including Anomaly; Wieden + Kennedy; 72andSunny; Goodby, Silverstein & Partners; Dave Meyers; and Nadia Lee Cohen. He uses Avid Media Composer and Adobe Premiere.

Feature film editor Berman joins Cabin on the heels of her successful run with Marvel Studios, having recently served as an editor on Spider-Man: Homecoming, Black Panther and Captain Marvel. Her work extends across mediums, with experience editing everything from PSAs and documentaries to animated features. Now expanding her commercial portfolio with Cabin, Berman is currently at work on a Toyota campaign through Saatchi & Saatchi. She will continue to work in features as well. She mostly uses Media Composer but can also work on Premiere.

Cabin’s Butzer was recently promoted to editor after joining the company in 2017 and honing his talent across many platforms, including commercials, music videos and documentaries. His strengths include narrative and automotive work. Recent credits include Every Day Is Your Day for Gatorade celebrating the 2019 Women’s World Cup, The Professor for Mercedes Benz and Vince Staples’ Fun! music video. Butzer has worked with ad agencies and directors, including TBWA\Chiat\Day; Wieden + Kennedy; Goodby, Silverstein & Partners; Team One; Marcus Sonderland; Ryan Booth; and Rachel McDonald. Butzer previously held editorial positions at Final Cut and Whitehouse Post. He studied film at the University of Colorado at Boulder. He also uses Media Composer and Premiere.

London’s Cheat expands with color and finishing suites

London-based color and finishing house Cheat has expanded, adding three new grading and finishing suites, a production studio and a client lounge/bar space. Cheat now has four large broadcast color suites and services two other color suites at Jam VFX and No.8 in Fitzrovia and Soho, respectively. Cheat has a creative partnership with these studios.

Located in the Arthaus building in Hackney, all four of Cheat’s color suites have calibrated projection or broadcast monitoring and are equipped with cutting-edge hardware for HDR and 8K work. Cheat was the first color company to complete a TV series in 8K, grading Netflix’s The End of The F***ing World in 2017. Having invested in improved storage and network infrastructure during this period, the facility is well-equipped to take on 8K and HDR projects.

Cheat uses Autodesk Flame for finishing and Blackmagic DaVinci Resolve for color grading.

The new HDR grading suite offers HDR mastering above 2,000 nits with a Flanders Scientific XM310K reference monitor that can master up to 3,000 nits. Cheat is also now a full-fledged Dolby Vision-certified mastering facility.

“Improving client experience was, of course, a key consideration in shaping the design of the renovation,” says Toby Tomkins, founder of Cheat. “The new color suite is our largest yet and comfortably seats up to 10 people. We designed it from the ground up with a raised client platform and a custom-built bias wall. This allows everyone to look at the same single monitor while grading and maintaining the spacious and relaxed feel of our other suites. The new lounge and bar area also offer a relaxing area for clients to feel at home.”

Dick Wolf’s television empire: his production and post brain trust

By Iain Blair

The TV landscape is full of scripted police procedurals and true crime dramas these days, but the indisputable and legendary king of that crowded field is Emmy-winning creator/producer Dick Wolf, whose name has become synonymous with high-quality drama.

Arthur Forney

Since it burst onto the scene back in 1990, his Law & Order show has spawned six dramas and four international spinoffs, while his “Chicago” franchise gave birth to another four series: the hugely popular Chicago Med, Chicago Fire and Chicago P.D., plus Chicago Justice, which was cancelled after one season.

Then there are his “FBI” shows, as well as the more documentary-style Cold Justice. If you’ve seen Cold Justice — and you should — you know that this is the real deal, focusing on real crimes. It’s all the more fascinating and addictive because of it.

Produced by Wolf and Magical Elves, the real-life crime series follows veteran prosecutor Kelly Siegler, who gets help from seasoned detectives as they dig into small-town murder cases that have lingered for years without answers or justice for the victims. Together with local law enforcement from across the country, the Cold Justice team has successfully helped bring about 45 arrests and 20 convictions. No case is too cold for Siegler, as the new season delves into new unsolved homicides while also bringing updates to previous cases. No wonder Wolf calls it “doing God’s work.” Cold Justice airs on true crime network Oxygen.

I recently spoke with Emmy-winning Arthur Forney, executive producer of all Wolf Entertainment’s scripted series (he’s also directed many episodes), about posting those shows. I also spoke with Cold Justice showrunner Liz Cook and EP/head of post Scott Patch.

Chicago Fire

Dick Wolf has said that, as head of post, you are “one of the irreplaceable pieces of the Wolf Films hierarchy.” How many shows do you oversee?
Arthur Forney: I oversee all of Wolf Entertainment’s scripted series, including Law & Order: Special Victims Unit, Chicago Fire, Chicago P.D., Chicago Med, FBI and FBI: Most Wanted.

Where is all the post done?
Forney: We do it all at NBCUniversal StudioPost in LA.

How involved is Dick Wolf?
Forney: Very involved, and we talk all the time.

How does the post pipeline work?
Forney: All film is shot on location and then sent back to the editing room and streamed into the lab. From there we do all our color corrections, and then the material is downloaded into Avid Media Composer.

What are the biggest challenges of the post process on the shows?
Forney: Delivering high-quality programming with a shortened post schedule.

Chicago Med

What are the editing challenges involved?
Forney: Trying to find the right way of telling the story, finding the right performances, shaping the show and creating intensity that results in high-quality television.

What about VFX? Who does them?
Forney: All of our visual effects are done by Spy Post in Santa Monica. All of the action is enhanced and done by them.

Where do you do the color grading?
Forney: Coloring/grading is all done at NBCUniversal StudioPost.

Now let’s talk to Cook and Patch about Cold Justice:

Liz and Scott, I recently saw the finale to Season 5 of Cold Justice. That was a long season.
Liz Cook: Yes, we did 26 episodes, so it was a lot of very long days and hard work.

It seems that there’s more focus than ever on drug-related cases now.
Cook: I don’t think that was the intention going in, but as we’ve gone on, you can’t help but recognize the huge drug problem in America now. Meth and opioids pop up in a lot of cases, and it’s obviously a crisis. Even if they aren’t the driving force, they’re definitely part of many cases.

L-R: Kelly Siegler, Dick Wolf, Scott Patch and Liz Cook. Photo by Evans Vestal Ward

How do you go about finding cases for the show?
Cook: We have a case-finding team, and they get the cases various ways, including cold-calling. We have a team dedicated to that, calling every day, and we get most of them that way. A lot come through agencies and sheriff’s departments that have worked with us before and want to help us again. And we get some from family members and some from hits on the Facebook page we have.

I assume you need to work very closely with local law agencies as you need access to their files?
Cook: Exactly. That’s the first part of the whole puzzle. They have to invite us in. The second part is getting the family involved. I don’t think we’d ever take on a case that the family didn’t want us to do.

What’s involved for you, and do you like being a showrunner?
Cook: It’s a tough job and pretty demanding, but I love it. We go through a lot of steps and stuff to get a case approved, and to get the police and family on board, and then we get the case read by one of our legal readers to evaluate it and see if there’s a possibility that we can solve it. At that point we pitch it to the network, and once they approve it and everyone’s on board, then if there are certain things like DNA and evidence that might need testing, we get all that going, along with ballistics that need researching, and stuff like phone records and so on. And it actually moves really fast – we usually get all these people on board within three weeks.

How long does it take to shoot each show?
Cook: It varies, as each show is different, but around seven or eight days, sometimes longer. We have a case coming up with cadaver dogs, and that stuff will happen before we even get to the location, so it all depends. And some cases will have 40 witnesses, while others might have over 100. So it’s flexible.

Cold Justice

Where do you post, and what’s the schedule like?
Scott Patch: We do it all at the Magical Elves offices here in Hollywood — the editing, sound and color correction. The online editor and colorist is Pepe Serventi. We have it all on one floor, and it’s really convenient to have all the post in-house. The schedule is roughly two months from raw footage to getting it all locked and ready to air, which is quite a long time.

Dailies come back to us, and the story team and editors do the initial pass, whittling all the footage down. It takes us a couple of weeks just to look at all the footage, as we usually have about 180 hours of it, and it takes a while to turn all that into something the editors can deal with. Then it goes through about three network passes with notes.

What about dealing with all the legal aspects?
Patch: That makes it a different kind of show from most of the others, so we have legal people making sure all the content is fine, and then sometimes we’ll also get notes from local law agencies, as well as internal notes from our own producers. That’s why it takes two months from start to finish.

Cook: We vet it through local law, and they see the cuts before it airs to make sure there are no problems. The biggest priority for us is that we don’t hurt the case at all with our show, so we always check it all with the local D.A. and police. And we don’t sensationalize anything.

Cold Justice

Patch: That’s another big part of editing and post – making sure we keep it authentic. That can be a challenge, but these are real cases with real people being accused of murder.

Cook: Our instinct is to make it dramatic, but you can’t do that. You have to protect the case, which might go to trial.

Talk about editing. You have several editors, I assume because of the time factor. How does that work?
Patch: Some of these cases have been cold for 25 or 30 years, so when the field team gets there, they really stand back and let the cops talk about the case, and we end up with a ton of stuff that you couldn’t fit into the time slot however hard you tried. So we have to decide what needs to be in, what doesn’t.

Cook: On day one, our “war room” day, we meet with the local law and everyone involved in the case, and that’s eight hours of footage right there.

Patch: And that gets cut down to just four or five minutes. We have a pretty small but tight team, with 10 editors who split up the episodes. Once in a while they’ll cross over, but we like to have each team and the producers stay with each episode as long as they can, as it’s so complicated. When you see the finished show, it doesn’t seem that complicated, but there are so many ways you could handle the footage that it really helps for each team to really take ownership of that particular episode.

How involved is Dick Wolf in post?
Cook: He loves the whole post process, and he watches all the cuts and has input.

Patch: He’s very supportive and obviously so experienced, and if we’re having a problem with something, he’ll give notes. And for the most part, the network gives us a lot of flexibility to make the show.

What about VFX on the show?
Patch: We have some, but nothing too fancy, and we use an outside VFX/graphics company, LOM Design. We have a lot of legal documents on the show, and that stuff gets animated, and we’ll also have some 3D crime scene VFX. The only other outside vendor is our composer, Robert ToTeras.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Quick Chat: Bonfire Labs’ Mary Mathaisell

Over the course of nearly 30 years, San Francisco’s Bonfire Labs has embraced change, evolving from an editorial and post house into a design and creative content studio that leverages the best aspects of the agency and production company models without adhering to either one.

This hybrid model has worked well for product launches for Google, Facebook, Salesforce, Logitech and many others.

The latest change is in the company’s ownership, with the last of the original founders stepping down and a new management partnership taking over — led by executive producer Mary Mathaisell, managing director Jim Bartel and head of strategy and creative Chris Weldon.

We spoke with Mathaisell to get a better sense of Bonfire Labs’ past, present and future.

Can you give us some history of Bonfire Labs? When did you join the company? How/why did you first get into producing?
I’ve been with Bonfire Labs for seven years. I started here as head of production. After being at several large digital agencies working on campaigns and content for brands like Target, Gap, LG and PayPal, I wanted to build something more sustainable than just another campaign and was thrilled that Bonfire was interested in growing into a full-service creative company with integrated production.

Prior to working at AKQA and Publicis, I worked in VFX and production as well as design for products and interfaces, but my primary focus and love has always been commercial production.

The studio has evolved from a traditional post studio to creative strategy and content company. What were the factors that drove those changes?
Bonfire Labs has always been smart about staying small and strategic about the kind of work and clients to focus on. We have been able to change based on both the kind of work we want to be doing and what the market needs. With a giant need for content, especially video content, we have decided to staff and service clients as experts across all phases of creative development, production and finishing. Instead of going to an agency, a production company and post houses, our clients can work directly with us on everything from concept to finishing.

Silicon Valley is clearly a big client base for you. What are they generally coming to you for? Are the content needs in high tech different from other business sectors?
Our clients usually have a new product, feature or brand that they want the world to know about. We work on product launches, brand awareness campaigns, product education, event content and social content. Most of our work is for technology companies, but every company these days has a technology component. I would say that speed to market is one key differentiator for our clients. We are often building stories as we are in production, so we get a lot done with our clients through creative collaboration and by not following the traditional rules of an agency or a production company.

Any specific trends that you’re seeing recently from your clients? New areas that Bonfire is looking to explore, either new markets for your talents or technology you’re looking to explore further?
Rapid brand prototyping is a new service we are offering, and it has been met with much excitement. Because we have experience across so many technology brands and work closely with our clients, we can develop a language and brand voice faster than most traditional agencies. Technology brands are evolving so quickly that we often start working on content creation before a brand has defined itself or transitioned to its next phase. Rapid brand prototyping allows brands to test content and grow the brand simultaneously.

Blade Shadow

Can you talk about some projects that you have done recently that challenged you and the team?
We rolled out a launch film for a new start-up client called Blade Shadow. We are working with Salesforce to develop trailblazer stories and anthem films for its .org branch, which focuses on NGOs, education and philanthropy.

The company is undergoing a transition with some of the original partners. Can you talk about that a bit as well?
The original founders have passed the torch to the group of people who have been managing and producing the work over the past five to 15 years. We have six new owners, three managing partners and three associate partners. Jim Bartel is the managing director; Chris Weldon is the head of strategy and creative, and I’m the executive producer in charge of content development and production. The three of us make up the management team.

Sheila Smith (head of production), Robbie Proctor (head of editorial) and Phil Spitler (creative technology lead) are associate partners, as they contribute to and lead so much of our work and process and have each been part of the company for over 10 years.

 

Review: Dell UltraSharp 27 4K InfinityEdge monitor

By Sophia Kyriacou

The Dell UltraSharp U2718Q monitor did not disappoint. Getting started requires minimal effort. You are up and running in no time — from taking it out of the box to switching it on. The stand, the standard Dell mount, is simple to assemble and intuitive, so you barely need to look at any instructions. But if you do, there is a step-by-step guide to help you set up within minutes.

The monitor comes in a well-designed package, which ensures it gets to you safely and securely. The Dell stand is easily adjustable without fuss and stays in place to your liking, with a swivel of 45 degrees to the left or right, a 90-degree pivot clockwise and counterclockwise, and 130mm of height adjustment. This adjustability means it will certainly meet all your comfort and workflow needs, with the pivot being incredibly useful when working in portrait formats.

The InfinityEdge display not only makes the screen look attractive but, more importantly, gives you extra surface area. When working with more than one monitor, having the ultra-thin edge makes the viewing experience less of a distraction, especially when monitors are butted up together. For me, the InfinityEdge is what makes it … in addition to the image quality and resolution, of course!

The Dell UltraSharp U2718Q has a flicker-free screen, making it comfortable on the eyes. It also has a 3840×2160 resolution and boasts a color depth of 1.07 billion colors. The anti-glare coating works very well and meets all the needs of work environments with multiple and varied lighting conditions.

There are several connectors to choose from: one DP (v 1.2), one mDP (v 1.2), one HDMI (v 2.0), one USB 3.0 port (upstream), four USB 3.0 ports (including two USB 3.0 BC 1.2) with charging capability at 2A (max), and an audio line out. You are certainly not going to be short of inputs. I found the on-screen navigation incredibly easy to use. The overall casing design is minimal and subtle, with tones of black and dark silver. With the addition of the InfinityEdge, this monitor looks attractive. There is also a matching keyboard and mouse available.

Summing Up
Personally, I like to set my main monitor at a comfortable distance, with the second monitor butted up to my left at an angle of -35 degrees. Being left-handed, this setup works for me ergonomically, keeping my browser, timeline and editing window on that side, so I’m free to focus on the larger-scale composition in front of me.

The two Dell UltraSharp U2718Q monitors I use are great, as they give me the breathing space to focus on creating without having to constantly move windows around, breaking my flow. And thanks to InfinityEdge, the overall experience feels seamless. I have both monitors set up exactly the same, so the color matches and the image quality is consistent across both.


Sophia Kyriacou is an award-winning conceptual creative motion designer and animator with over 22 years of experience in the broadcast design industry. She splits her time between working at the BBC in London and taking on freelance jobs. She is a full voting member at BAFTA and is currently working on a script for a 3D animated short film.

Post vet Chris Peterson joins NYC’s Chimney North

Chimney’s New York studio has hired Chris Peterson as its new EP of longform entertainment, building on the company’s longform credits, which include The Dead Don’t Die, Atomic Blonde, Chappaquiddick, The Wife and Her.

Chimney is a full-service company working in feature films, television, commercials, digital media, live events and business-to-business communications. The studio has offices in 11 cities and eight countries worldwide.

In his new role, Peterson will be using his expertise in film finance, tax credit maximization and technical workflows to grow Chimney’s feature film and television capabilities. He brings over 20 years of experience in production and post, including stints at Mechanism Digital and Post Factory NY.

Peterson’s resume is diverse and spans the television, film, technology, advertising, music, video game and XR industries. Projects include the Academy Award-winning feature Spotlight, the Academy Award-winning documentary OJ: Made in America, and the Grammy-nominated Roger Waters: The Wall. For E! Entertainment’s travel series Wild On, he produced shows in Argentina, Brazil, Trinidad and across the United States. He was also a post producer on Howard Stern on Demand.

“Chimney combines the best of both worlds: a boutique feel and global resources,” says Peterson. “Add to that the company’s expertise in financing and tax credits, and you have a unique resource for producers and filmmakers.”

For the past eight years, Peterson has been board secretary of the Post New York Alliance, which was co-founded by Chimney North America CEO Marcelo Gandola. The PNYA is a trade association that lobbied for and helped pass the first post-only tax credit, which was recently extended for two years. Peterson is also a member of SMPTE.

Behind the Title: Amazon senior post exec Frank Salinas

NAME: Frank Salinas

COMPANY: Amazon Studios

CAN YOU DESCRIBE YOUR COMPANY?
We’re Amazon.com… Look us up. A small e-commerce bookstore turned global marketplace, cloud services provider, content maker and broadcaster.

WHAT’S YOUR JOB TITLE?
Senior Post Production Executive

WHAT DOES THAT ENTAIL?
My core responsibility is to support and shepherd our series, specials and/or episodes in partnership with our production company from preproduction to delivery.

From the early stages of conceptualizing and planning our productions through color grading, mixing, QC, mastering, publishing and broadcast/launch, it’s my responsibility to ensure that our timelines are met and our commitments to our customers are kept.

Our customers expect the highest standards for quality. I work closely with all the other departments to ensure that our content is ready for distribution on time, under budget and to the utmost standards. That means we are shooting at the highest quality, localizing (whether subtitles or dubbing) in all the languages we are distributing to, and upholding that quality throughout the process.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I make it a point to get involved in the post production process before cameras are chosen or scripts are ever finalized to ensure we have a clear runway and a set workflow for success.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Being on set, or the moment leading up to going on set, when you have a plan and a strategy in motion and get to watch it be executed. It almost never plays out as you predicted, but having the knowledge and the confidence to adjust, and being fluid in that moment, is my favorite part of the job.

WHAT’S YOUR LEAST FAVORITE?
My least favorite part of the job would have to be the extraneous meetings that go into making a series. It’s part of the process, but I’m not a big fan of meetings.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
My most productive part of the day would likely be my 90-minute drive into the office. This is when I can create my to-do list for the day, and then the two to three hours I have in the morning before anyone arrives. This allows me to tackle the list without interruption. That and the few times I have the opportunity to run in the morning. It’s those times that allow me to clear my head and process my thoughts linearly.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
If I wasn’t a post executive, I’d likely be a real estate agent or TV/film agent. I get a lot of joy whenever I’m able to make someone happy by being able to pair them with something or someone that fits them perfectly — whatever it is that they are looking for. Finding that perfect marriage between that person and that thing they are needing or wanting brings me a lot of happiness.

WHY DID YOU CHOOSE THIS PROFESSION?
I’ve enjoyed television and the film medium for as long as I can remember. From the moment I saw my first episode of The Twilight Zone and realized that you could really leave your audience asking “Is this real?” or “What if?” I thought there was something so powerful about that.

Lorena

CAN YOU NAME A RECENT PROJECT YOU HAVE WORKED ON?
The documentary Lorena; Last One Laughing Mexico; This Is Football, premiering early August; Gymkhana; the Jonas Brothers film Chasing Happiness; the live Prime Day concert 2019; the series Carnival Row (launching 8/31); and the All or Nothing series, just to name a few.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I have a few, but most of them stem from my time at 25/7 Productions. Ultimate Beastmaster, The Briefcase and Strong all hold a special place in my heart, not only because I was able to work on them with people whom I consider my family but because we created something that positively changed people’s lives and expanded their way of thinking.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
I’m going to list four since I’m a techy through and through…

My phone. It’s my safety blanket and my window to the world.

My laptop, which is just a larger window or blanket.

My car. Although it’s basic in nature and not pretentious at all, it lets me be mobile while still giving me a safe place to work. For the amount of time I spend in it, it’s really become my mobile office.

My headphones. Whether I’m running in my neighborhood or traveling on a plane, the joy I get from listening to music and podcasts is absolute. I love music.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Instagram and Facebook are the two I find myself on, and I tend to follow things that I’m passionate about. My sports teams — the Dodgers, Lakers and Kings — and I love architecture and food so I tend to follow those publications that showcase great photos of both.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
I love music… almost all of it. Classic rock, reggae, pop, hip-hop, rap, house, country, jazz, Latin, punk. Everything but Phish or the Grateful Dead. I just don’t get them.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
My love for running, cooking or eating great food, traveling and being with my family helps to remind me that it’s only TV. I constantly need to be reminded that what we are doing, while important, is also just entertainment.

HPA’s 2019 Engineering Excellence Award winners  

The Hollywood Professional Association (HPA) Awards Committee has announced the winners of the 2019 HPA Engineering Excellence Award. They were selected by a judging panel after a session held at IMAX on June 22. Honors will be bestowed on November 21 at the 14th annual HPA Awards gala at the Skirball Cultural Center in Los Angeles.

The HPA Awards were founded in 2005 to recognize creative artistry and innovation in the professional media content industry. A coveted honor, the Engineering Excellence Award rewards outstanding technical and creative ingenuity in media, content production, finishing, distribution and archive.

“Every year, it is an absolute pleasure and a privilege to witness the innovative work that is brewing in our industry,” says HPA Awards Engineering Committee chair Joachim Zell. “Judging by the number of entries, which was our largest ever, there is genuine excitement within our community to push our capabilities to the next level. It was a close race and shows us that the future is being plotted by the brilliance that we see in the Engineering Excellence Awards. Congratulations to the winners, and all the entrants, for impressive and inspiring work.”

Adobe After Effects

Here are the winners:
Adobe – Content-Aware Fill for Video in Adobe After Effects
Content-Aware Fill for video uses intelligent algorithms to automatically remove unwanted objects, such as boom mics or distracting signs, from video. Using optical flow technology, Content-Aware Fill references adjacent frames before or after the object and fills the area automatically, making it look as if the object was never there.
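Adobe has not published its implementation here, but to illustrate the general idea the citation describes (pulling clean pixels from neighboring frames via optical flow), below is a minimal Python/OpenCV sketch. The function name, the Farneback flow settings and the single-neighbor strategy are assumptions made for the example, not Adobe’s method.

import cv2
import numpy as np

def fill_from_neighbor(current_bgr, neighbor_bgr, hole_mask):
    """Fill the masked region of current_bgr with pixels warped in from
    neighbor_bgr using dense optical flow.
    hole_mask: uint8 array, 255 where the unwanted object was removed."""
    cur_gray = cv2.cvtColor(current_bgr, cv2.COLOR_BGR2GRAY)
    nbr_gray = cv2.cvtColor(neighbor_bgr, cv2.COLOR_BGR2GRAY)

    # Dense flow mapping current-frame coordinates to neighbor-frame coordinates
    flow = cv2.calcOpticalFlowFarneback(cur_gray, nbr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    h, w = cur_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)

    # Pull neighbor pixels into the current frame's geometry
    warped = cv2.remap(neighbor_bgr, map_x, map_y, cv2.INTER_LINEAR)

    # Keep original pixels outside the hole, warped pixels inside it
    mask3 = cv2.merge([hole_mask] * 3) > 0
    return np.where(mask3, warped, current_bgr)

A production tool does far more, of course (multi-frame blending, spatial fallback when nothing is visible in neighboring frames, temporal consistency), but the pull-from-adjacent-frames principle is the same.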

Epic Games — Unreal Engine 4
Unreal Engine is a flexible and scalable realtime visualization platform enabling animation, simulation, performance capture and photorealistic renders at unprecedented speeds. Filmmakers, broadcasters and beyond use Unreal Engine to scout virtual locations and sets, complete previsualization, achieve in-camera final-pixel VFX on set, deliver immersive live mixed reality broadcasts, edit CG characters and more in realtime. Unreal Engine dramatically streamlines content creation and virtual production, affording creators greater flexibility and freedom to achieve their visions.

Pixelworks — TrueCut Motion
TrueCut Motion is a cinematic video tool for finely tuning motion appearance. It draws on Pixelworks’ 20 years of experience in video processing, together with a new motion appearance model and motion dataset. Used as a part of the creative process, TrueCut Motion enables filmmakers to explore a broader range of motion appearances than previously possible.

Portrait Displays and LG Electronics — CalMan LUT based Auto-Calibration Integration with LG OLED TVs
OLED televisions are commonly used in Hollywood as client viewing monitors, SDR BT.709 reference monitors and QC monitors for consumer deliverables, including broadcast, optical media and OTT. To be used in these professional settings, highly accurate color calibration is essential. Portrait Displays and LG Electronics partnered to bring 1D and 3D LUT-based, hardware-level CalMan AutoCal to 2018 and newer LG OLED televisions.
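As a rough illustration of what a 3D LUT-based calibration actually stores, the Python sketch below applies a 3D LUT to normalized RGB values with trilinear interpolation. The 17-point identity LUT and the function name are assumptions made for the example; in a real CalMan AutoCal pass, measured corrections are written into the display’s internal 1D and 3D LUTs rather than built in software like this.

import numpy as np

def apply_3d_lut(rgb, lut):
    """rgb: (..., 3) floats in [0, 1]; lut: (N, N, N, 3) table indexed [r][g][b]."""
    n = lut.shape[0]
    pos = np.clip(rgb, 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo  # fractional position inside the lattice cell

    out = np.zeros_like(rgb, dtype=float)
    # Accumulate the 8 surrounding lattice points, weighted trilinearly
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                r = np.where(dr, hi[..., 0], lo[..., 0])
                g = np.where(dg, hi[..., 1], lo[..., 1])
                b = np.where(db, hi[..., 2], lo[..., 2])
                w = (np.where(dr, f[..., 0], 1 - f[..., 0])
                     * np.where(dg, f[..., 1], 1 - f[..., 1])
                     * np.where(db, f[..., 2], 1 - f[..., 2]))
                out += w[..., None] * lut[r, g, b]
    return out

# Identity 17-point LUT: output equals input until a calibration writes corrections into it
axis = np.linspace(0, 1, 17)
lut = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)
print(apply_3d_lut(np.array([0.25, 0.5, 0.75]), lut))  # ~[0.25 0.5 0.75]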

Honorable mentions were awarded to Ambidio for Ambidio Looking Glass, Grass Valley for Creative Grading and Netflix for Photon.

In addition to the honors for excellence in engineering, the HPA Awards will recognize excellence in 12 craft categories, including color grading, editing, sound and visual effects. The recipients of the Judges Award for Creativity and Innovation and the Lifetime Achievement Award will be announced in the coming weeks.
Tickets for the 14th annual HPA Awards will be available for purchase later this summer.