Colorist Chat: Nice Shoes’ Maria Carretero on Super Bowl ads, more

This New York-based colorist, who worked on four Super Bowl spots this year, talks workflow, inspiration and more.

Name: Maria Carretero

Company: Nice Shoes

What kind of services does Nice Shoes offer?
Nice Shoes is a creative studio with color, editorial, animation, VFX, AR and VR services. It’s a full-service studio with offices in NYC, Chicago, Boston, Minneapolis, and Toronto, as well as remote locations throughout North America.

Michelob Ultra’s Jimmy Works It Out

As a colorist, what would surprise people the most about what falls under that title?
I think people are surprised when they discover that there is a visual language in every single visual story that connects your emotions through all the imagery that we’ve collected in our brains. This work gives us the ability to nudge the audience emotionally over the course of a piece. Color grading is rooted in a very artistic base – core, emotional aspects that have been studied in art and color theory that make you explore cinematography in such an interesting way.

What system do you work on?
We use FilmLight Baselight as our primary system, but the team is also versed in Blackmagic Resolve.

Are you sometimes asked to do more than just color on projects?
Sometimes. If you have a solid relationship with the DP or the director, they end up consulting you about palettes, optics and references, so you become an active part of the creativity in the film, which is very cool. I love when I can get involved in projects from the beginning.

What’s your favorite part of the job?
My favorite moment is when you land on the final look and you see that the whole film is making visual sense and you feel that the story, the look and the client are all aligned — that’s magic!

Any least favorites?
No, I love coloring. Sometimes the situation becomes difficult because there are technical issues or disagreements, but it’s part of the work to push through those moments and make things work.

If you didn’t have this job, what would you be doing instead?
I would probably be a visual artist… Always struggling to keep the lights on. I’m kidding! I have so much respect for visual artists, I think they should be treated better by our society because without art there is no progress.

How early did you know this would be your path?
I was a visual artist for seven years. I was part of Nives Fernandez’s roster (http://www.nfgaleria.com/), and all that I wanted at that time was to try to tell my stories as an artist. I was freelancing in VFX to earn some money to survive, and I landed on the VFX side; from there, color was a very easy switch. When I landed at Deluxe Spain 16 years ago and started to explore color, I quickly fell in love.

It’s why I like to say that color chose me.

Avocados From Mexico: Shopping Network

You recently worked on a number of Super Bowl spots. Can you talk a bit about your work on them, and any challenges relating to deadlines?
This year I worked on four Super Bowl spots: Michelob Ultra Pure Gold: 6 for 6-Pack; Michelob Ultra: Jimmy Works It Out; Walmart: United Towns; and Avocados From Mexico: Shopping Network.

Working on these kinds of projects is definitely a really interesting experience. The deadlines are tight and the pressure is enormous, but at the same time, the amount of talent and creativity involved is gigantic, so if you survive (laughs), you will always be a better professional. As a colorist, I love to be challenged. I love dealing with difficult situations where all your resources and energy are put to the test.

Any suggestions for getting the most out of a project from a color perspective?
Thousands! Technical understanding, artistic involvement, there are so many… But definitely trying to create something new, special, different, embracing the challenges and pushing beyond the boundaries is the key to delivering good work.

How do you prefer to work with the DP or director?
I like working with both. Debating with any kind of artist is the best. It’s really great to be surrounded by someone that uses a common “language.” As I mentioned earlier, I love when there’s the opportunity to get the conversation going at the beginning of a project so that there’s more opportunity for collaboration, debate and creativity.

How do you like getting feedback in terms of the look? Photos, films, etc.?
Every single bit of information is useful. I love when they verbalize what they’re going for using stories, feelings — when you can really feel they’re expressing personality with the film.

Where do you find inspiration? Art? Photography?
I find inspiration in living! There are so many things that surround us that can be a source of inspiration. Art, landscapes, the light that you remember from your childhood, a painting, watching someone that grabs your attention on a train. New York is teeming with more than enough life and creativity to keep any artist going.

Name three pieces of technology you can’t live without.
The Tracker, Spotify and FaceTime.

This industry comes with tight deadlines. How do you de-stress from it all?
I have a sense of humor and lots of red wine (smiles).

London’s Molinare launches new ADR suite

Molinare has officially opened a new ADR suite in its Soho studio in anticipation of increased ADR output and to complement last month’s CAS award-winning ADR work on Fleabag. Other recent ADR credits for the company include Good Omens, The Capture and Strike Back. Molinare sister company Hackenbacker also picked up some award love with a BAFTA TV Craft award and an AMPS award for Killing Eve.

Molinare and Hackenbacker’s audio setup includes nine mixing theaters, three of which have Dolby 5.1/7.1 Theatrical or Commercials & Trailers Certification, and one has full Dolby Atmos home entertainment mix capability.

Molinare works on high-end TV dramas, feature films, feature documentaries and TV reality programming. Recent audio credits include BBC One’s Dracula, The War of the Worlds from Mammoth Screen and Worzel Gummidge. Hackenbacker has recently worked on HBO’s Avenue 5 for returning director Armando Iannucci and Carnival Film’s Downton Abbey and has contributed to the latest season of Peaky Blinders.

Sohonet intros ClearView Pivot for 4K remote post

Sohonet is now offering ClearView Pivot, a solution for realtime remote editing, color grading, live screening and finishing reviews at full cinema quality. The new solution will provide connectivity and collaboration services for productions around the world.

ClearView Pivot offers 4K HDR with 12-bit color depth and 4:4:4 chroma sampling for full-color-quality video streaming with ultra-low latency over Sohonet’s private media network, which avoids the extreme compression forced by the contention and latency of public internet connections.
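
For a sense of scale, here is a rough back-of-the-envelope calculation (ours, not Sohonet’s figures) of what an uncompressed stream at those specs would demand, sketched in Python and assuming 24fps playback:

width, height = 3840, 2160   # 4K UHD
channels = 3                 # 4:4:4 keeps full resolution in all three channels
bit_depth = 12
fps = 24                     # assumed playback rate

bits_per_frame = width * height * channels * bit_depth
print(f"{bits_per_frame * fps / 1e9:.1f} Gb/s uncompressed")  # ~7.2 Gb/s

JPEG 2000 brings that figure down dramatically while preserving color fidelity, but the raw number shows why a contended public internet link, with its variable latency, can’t carry this kind of stream without extreme compression.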

“Studios around the world need a realtime 4K collaboration tool that can process video at lossless color fidelity using the industry-standard JPEG 2000 codec between two locations across a network like ours. Avoiding the headache of the current ‘equipment only’ approach is the only scalable solution,” explains Sohonet CEO Chuck Parker.

Sohonet says its integrated solution is approved by ISE (Independent Security Evaluators) — the industry’s gold standard for security. Sohonet’s solution provides an encrypted stream between each endpoint and an auditable usage trail for every session. The Soho Media Network (SMN) connection offers ultra-low latency (measured in milliseconds), and the company says that unlike equipment-only solutions that require the user to navigate firewall and security issues and perform a “solution check” before each session, ClearView Pivot works immediately. As a point-to-multipoint solution, the user can also pivot easily from one endpoint to the next to collaborate with multiple people at the click of a button, or even stream to multiple destinations at the same time.

Sohonet has been working closely with productions on lots and on locations over the past few years in the ongoing development of ClearView Pivot. In those real-world settings, ClearView Pivot has been put through its paces with trials across multiple departments, and the color technologies have been fully inspected and approved by experts across the industry.

Sundance Videos: Editor to Editor

Our own Brady Betzel headed out to Park City this year to talk to a few editors whose films were being screened at the Sundance Film Festival.

As an editor himself, Betzel wanted to know about the all-important workflow, but also about how they got their start in the business and how important it is to find a balance between work life and personal life.

Among those he sat down with were Scare Me editor Patrick Lawrence, Boys State editors Jeff Gilbert and Connor Hall, Save Yourselves! editor Sofi Marshall, Aggie editor Gil Seltzer, Miss Juneteenth editor Courtney Ware, Black Bear editor Matthew L. Weiss, Spree editor Benjamin Moses Smith and Dinner in America writer/director/editor Adam Carter Rehmeier.

Click here to see them all.

MPI restores The Wizard of Oz in 4K HDR

By Barry Goch

The classic Victor Fleming-directed film The Wizard of Oz, which was released by MGM in 1939 and won two of its six Academy Award nominations, has been beautifully restored by Burbank’s Warner Bros. Motion Picture Imaging (MPI).

Bob Bailey

To share its workflow on the film, MPI invited a group of journalists to learn about the 4K UHD HDR restoration of this classic film. The tour guide for our high-tech restoration journey was MPI’s VP of operations and sales Bob Bailey, who walked us through the entire restoration process — from the original camera negative to final color.

The Wizard of Oz, which starred Judy Garland, was shot on a Technicolor three-strip camera system. According to Bailey, it ran three black and white negatives simultaneously. “That is why it is known as three-strip Technicolor. The magazine on top of the camera was triple the width of a normal black and white camera because it contained each roll of negative to capture your red, green and blue records,” explained Bailey.

“When shooting in Technicolor, you weren’t just getting the camera. You would rent a package that included the camera, a camera crew with three assistants, the film, the processing and a Technicolor color consultant.”

George Feltenstein, SVP of theatrical catalog marketing for Warner Bros. Home Entertainment, spoke about why the film was chosen for restoration. “The Wizard of Oz is among the crown jewels that we hold,” he said. “We wanted to embrace the new 4K HDR technology, but nobody’s ever released a film that old using this technology. HDR, or high dynamic range, has a color range that is wider than anything that’s come before it. There are colors [in The Wizard of Oz] that were never reproducible before, so what better film to represent that color?”

Feltenstein went on to explain that this is the oldest film to get released in the 4K format. He hopes that this is just the beginning and that many of the films in Warner Bros.’ classic library will also be released on 4K HDR and worked on at MPI under Bailey’s direction.

The Process
MPI scanned each of the three-strip Technicolor nitrate film negatives at 8K 16-bit, composited them together and then applied a new color grain. The film was rescanned with the Lasergraphics Director 10K scanner. “We have just under 15 petabytes of storage here,” said Bailey. “That’s working storage, because we’re working on 8K movies since [some places in the world] are now broadcasting 8K.”
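
Conceptually, the composite step stacks the three scanned records as the red, green and blue channels of a single frame. A minimal Python sketch of the idea (file names are hypothetical, and a real restoration also inverts the negatives and registers the three strips, which is omitted here):

import numpy as np
import imageio.v3 as iio

# Each record is a black-and-white scan exposed through a red, green or
# blue filter; stacking them rebuilds the color frame.
red   = iio.imread("frame_0001_red_record.tif")    # 16-bit grayscale scans
green = iio.imread("frame_0001_green_record.tif")
blue  = iio.imread("frame_0001_blue_record.tif")

rgb = np.dstack([red, green, blue])  # (H, W, 3) 16-bit color frame
iio.imwrite("frame_0001_rgb.tif", rgb)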

Steven Anastasi

Our first stop was to look at the Lasergraphics Director. We then moved on to MPI’s climate-controlled vault, where we were introduced to Steven Anastasi, VP of technical operations at Warner Bros. Anastasi explained that the original negative vault has climate-controlled conditions with 25% humidity at 35 degrees Fahrenheit, which is the combination required for keeping these precious assets safe for future generations. He said there are 2 million assets in the building, including picture and sound.

It was amazing to see film reels for 2001: A Space Odyssey sitting on a shelf right in front of me. In addition to the feature reels, MPI also stores millions of negatives captured throughout the years by Warner productions. “We also have a very large library,” reported Anastasi. “So the original negatives from the set, a lot of unit photography, head shots in some cases and so forth. There are 10 million of these.”

Finally, we were led into the color bay to view the film. Janet Wilson, senior digital colorist at MPI, has overseen every remaster of The Wizard of Oz for the past 20 years. Wilson used a FilmLight Baselight X system for the color grade. The grading suite housed multiple screens: a Dolby Pulsar for the Dolby Vision pass, a Sony X300 and a Panasonic EZ1000 OLED 4K HDR.

“We have every 4K monitor manufactured, and we run the film through all of them,” said Bailey. “We painstakingly go through the process from a post perspective to make sure that our consumers get the best quality product that’s available out in the marketplace.”

“We want the consumer experience on all monitors to be something that’s taken into account,” added Feltenstein. “So we’ve changed our workflow by having a consumer or prosumer monitor in these color correction suites so the colorist has an idea of what people are going to see at home, and that’s helped us make a better product.”

Our first view of the feature was a side-by-side comparison of the scanned black-and-white negative and the sepia color-corrected footage. The first part of the film, which takes place in Kansas, was shot in black and white, and then a sepia look was applied to it. The reveal scene, when Dorothy passes through the door going into Oz, was originally shot in color. For this new release, the team generated a matte so Wilson could add the sepia look to the inside of the house as Dorothy transitioned into Oz.

“So this is an example of some of the stuff that we could do in this version of the restoration,” explained Wilson. “With this version, you can see that the part of the image where she’s supposed to be in the monochrome house is not actually black and white. It was really a color image. So the trick was always to get the interior of the house to look sepia and the exterior to look like all of the colors that it’s supposed to. Our visual effects team here at MPI — Mike Moser and Richie Hiltzik — was able to draw a matte for me so that I could color inside of the house independently of the exterior and make them look right, which was always a really tricky thing to do.”
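
The matte technique Wilson describes is standard compositing: a grayscale matte decides, pixel by pixel, whether the sepia grade or the full-color grade shows through. A rough illustration of the general idea (not MPI’s actual pipeline; images are assumed to be floats in [0, 1]):

import numpy as np

def sepia(rgb):
    # Map luminance through a warm tint to approximate a sepia look.
    lum = rgb @ np.array([0.2126, 0.7152, 0.0722])
    tint = np.array([1.0, 0.85, 0.6])  # warm, brownish tone
    return np.clip(lum[..., None] * tint, 0.0, 1.0)

def comp_with_matte(color_frame, matte):
    # matte is 1.0 inside the house (sepia) and 0.0 outside (full color).
    m = matte[..., None]
    return m * sepia(color_frame) + (1.0 - m) * color_frame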

Wilson referred back to the Technicolor three-strip, explaining that because you’ve got three different pieces of film — the different records — they’re receiving the light in different ways. “So sometimes one will be a little brighter than the other. One will be a little darker than the other, which means that the Technicolor is not a consistent color. It goes a little red, and then it goes a little green, and then it goes a little blue, and then it goes a little red again. So if you stop on any given frame, it’s going to look a little different than the frames around it, which is one of the tricky parts of color correcting Technicolor. When that’s being projected by a film projector, it’s less noticeable than when you’re looking at it on a video monitor, so it takes a lot of little individual corrections to smooth those kinds of things out.”
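
One common way to smooth that kind of strip-to-strip flicker is to track each frame’s average red, green and blue levels and gently pull them toward a moving average across the shot. A simplified sketch of that general idea (Baselight’s actual tools are far more sophisticated):

import numpy as np

def smooth_flicker(frames, window=5):
    # frames: list of float RGB arrays (H, W, 3) from one shot.
    means = np.array([f.reshape(-1, 3).mean(axis=0) for f in frames])  # (N, 3)
    kernel = np.ones(window) / window
    target = np.column_stack(
        [np.convolve(means[:, c], kernel, mode="same") for c in range(3)]
    )  # moving average per channel (shot edges are approximated)
    gains = target / np.maximum(means, 1e-8)
    return [np.clip(f * g, 0.0, 1.0) for f, g in zip(frames, gains)]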

Wilson reported seeing new things with the 8K scan and 4K display. “The amount of detail that went into this film really shows up.” She said that one of the most remarkable things about the restoration was the amazing detail visible on the characters. For the first time in many generations, maybe ever, you can actually see the detail of the freckles on Dorothy’s face.

In terms of leveraging the expanded dynamic range of HDR, I asked Wilson if she tried to map the HDR to a kind of sweet spot, so that it’s spectacular yet not overpowering.

“I ended up isolating the very brightest parts of the picture,” she replied. “In this case, it’s mostly the sparkles on their shoes and curving those off so I could run those in, because this movie is not supposed to have modern-day animation levels of brightness. It’s supposed to be much more contained. I wanted to take advantage of brightness and the ability to show the contrast we get from this format, because you can really see the darker parts of the picture. You can really see detail within the Wicked Witch’s dress. I don’t want it to look like it’s not the same film. I want it to replicate that experience of the way this film should look if it was projected on a good print on a good projector.”

Dorothy’s ruby slippers also presented a challenge to Wilson. “They are so red and so bright. They’re so light-reflective, but there were times when they were just a little too distracting. So I had to isolate the slippers and bring them down a little bit so that they weren’t the first and only thing you saw in the image.”

If you are wondering if audio was part of this most recent restoration, the answer is no, but it had been remastered for a previous version. “As early as 1929, MGM began recording its film music using multiple microphones. Those microphone angles allowed the mixer to get the most balanced monophonic mix, and they were preserved,” explained Feltenstein. “Twenty years ago, we created a 5.1 surround mix that was organically made from the original elements that were created in 1939. It is full-frequency, lossless audio, and a beautiful restoration job was made to create that track so you can improve upon what I consider to be close to perfection without anything that would be disingenuous to the production.”

In all, it was an amazing experience to go behind the scenes and see how the wizards of MPI created a new version of this masterpiece for today and preserved it for future generations.

This restored version of The Wizard of Oz is a must-see visual extravaganza, and there is no better way to see it than in UHD, HDR, Dolby Vision or HDR10+. What I saw in person took my breath away, and I hope every movie fan out there can have the opportunity to see this classic film in its never-before-seen glory.

The 4K version of The Wizard of Oz is currently available via an Ultra HD Blu-ray Combo Pack and digital.


Barry Goch is a finishing artist at LA’s The Foundation as well as a UCLA Extension instructor in post production. You can follow him on Twitter at @Gochya.

Behind the Title: Harbor sound editor/mixer Tony Volante

“As re-recording mixer, I take all the final edited elements and blend them together to create the final soundscape.”

Name: Tony Volante

Company: Harbor

Can you describe what Harbor does?
Harbor was founded in 2012 to serve the feature film, episodic and advertising industries. Harbor brings together production and post production under one roof — what we like to call “a unified process allowing for total creative control.”

Since then, Harbor has grown into a global company with locations in New York, Los Angeles and London. Harbor hones every detail throughout the moving-image-making process: live-action, dailies, creative and offline editorial, design, animation, visual effects, CG, sound and picture finishing.

What’s your job title?
Supervising Sound Editor/Re-Recording Mixer

What does that entail?
I supervise the sound editorial crew for motion pictures and TV series along with being the re-recording mixer on many of my projects. I put together the appropriate crew and schedule along with helping to finalize a budget through the bidding process. As re-recording mixer, I take all the final edited elements and blend them together to create the final soundscape.

What would surprise people the most about what falls under that title?
How almost all the sound that someone hears in a movie has been replaced by a sound editor.

What’s your favorite part of the job?
Creatively collaborating with co-workers and hearing it all come together in the final mix.

What is your most productive time of day?
Whenever I can turn off my emails and concentrate on mixing.

If you didn’t have this job, what would you be doing instead?
Fishing!

When did you know this would be your path?
I played drums in a rock band and got interested in sound at around 18 years old. I was always interested in the “sound” of an album along with the musicality. I found myself buying records based on who had produced and engineered them.

Can you name some recent projects?
Fosse/Verdon (FX) and Boys State, which just won the Grand Jury Prize at Sundance.

How has the industry changed since you began working?
Technology has improved workflows immensely and has helped us with the creative process. It has also opened up the door to accelerating schedules to the point of sacrificing artistic expression and detail.

Name three pieces of technology you can’t live without.
Avid Pro Tools, my iPhone and my car’s navigation system.

How do you de-stress from it all?
I stand in the middle of a flowing stream fishing with my fly rod. If I catch something, that’s a bonus!

Sony adds 4K HDR reference monitors to Trimaster range

Sony is offering a new set of high-grade 4K HDR monitors as part of its Trimaster range. The PVM-X2400 (24-inch) and the PVM-X1800 (18.4-inch) professional 4K HDR monitors were demo’d at the BSC Expo 2020 in London. They will be available in the US starting in July.

The monitors provide ultra-high definition with a resolution of 3840×2160 pixels and an all-white luminance of 1,000 cd/m². For optimum film production, their wide color gamut matches the BVM-HX310 Trimaster HX master monitor. This means both monitors feature accurate color reproduction and greyscale, which helps filmmakers make critical imaging decisions and deploy faithful color matching throughout the workflow.

The monitors, which are small and portable, are designed to expand footprints in 4K HDR production, including applications such as on-set monitoring, nonlinear video editing, studio wall monitoring and rack-mount monitoring in OB trucks or machine rooms.

The monitors also feature new Black Detail High/Mid/Low, which helps maintain accurate color reproduction by reducing the brightness of the backlight to reproduce the correct colors and gradations in low-luminance areas. Another new function, Dynamic Contrast Drive, changes backlight luminance to adapt to each scene or frame when transferring images from the PVM-X2400/X1800 to an existing Sony OLED monitor. This functionality allows filmmakers to check the highlight and low-light balance of content with both bright and dark scenes.

Other features include:
• Dynamic contrast ratio of 1,000,000:1 by Dynamic Contrast Drive, a new backlight driving system that dynamically changes the backlight luminance to adapt for each frame of a scene.
• 4K/HD waveform and vectorscope displays with HDR scales.
• Quad View display and User 3D LUT functionality.
• 12G/6G/3G/HD-SDI with auto configuration.

An online editor’s first time at Sundance

By Brady Betzel

I’ve always wanted to attend the Sundance Film Festival, and my trip last month did not disappoint. Not only is it an iconic industry (and pop-culture) event, but the energy surrounding it is palpable.

Once I got to Park City and walked Main Street — with the sponsored stores (Canon and Lyft among others) and movie theaters, like the Egyptian — I started to feel an excitement and energy that I haven’t felt since I was making videos in high school and college… when there were no thoughts of limits and what I should or shouldn’t do.

A certain indescribable nervousness and love started to bubble up. Sitting in the luxurious Park City Burger King with Steve Hullfish (Art of the Cut) and Joe Herman (Cinemontage) before my second Sundance 2020 screening, Dinner in America, I was thinking about how lucky I was to be in a place packed with creatives. It sounds cliché and trite, but it really is reinvigorating to surround yourself with positive energy — especially if you can get caught up in cynicism like me.

It brought me back to my college classes, taught by Daniel Restuccio (another postPerspective writer), at California Lutheran University, where we would cut out pictures from magazines, draw pictures, blow up balloons, eat doughnuts and do whatever we could to get our ideas out in the open.

While Sundance occasionally felt like an amalgamation of the thirsty-hipster Coachella crowd mixed with a high school video production class (but with million-dollar budgets), it still had me excited to create. Sundance 2020 in Park City was a beautiful resurgence of ideas and discussions about how we as an artistic community can offer accessibility to everyone and anyone who wants to tell their own story on screen.

Inclusiveness Panel
After arriving in Park City, my first stop was a panel hosted by Adobe called “Empowering Every Voice in Film and the World.” Maybe it was a combination of the excitement of Sundance and the discussion about accessibility, but it really got me thinking. The panel was expertly hosted by Adobe’s Meagan Keane and included producer/director Yance Ford (Disclosure: Trans Lives on Screen; Oscar-nominated for Strong Island); editor Eileen Meyer (Crip Camp); editor Stacy Goldate (Disclosure: Trans Lives on Screen); and director Crystal Kayiza (See You Next Time).

I walked away feeling inspired and driven to increase my efforts in accessibility. Eileen said one of her biggest opportunities came from the Karen Schmeer Film Editing Fellowship, a year-long fellowship for emerging documentary editors.

Yance drove home the idea of inclusivity and re-emphasized the idea of access to equipment. But it’s not simply about access — you also have to make a great story and figure out things like distribution. I was really struck by all the speakers on stage, but Yance really spoke to me. He feels like the voice we need representing marginalized groups, and we need to see more content from these creatives. The more content we see, the better.

Crystal spoke about the community needing to tell stories that don’t necessarily have standard plot points and stakes. The idea is to encourage people to create their stories, and for those in power to help and support these stories and trust the filmmakers, regardless of whether they identify with the ideas and themes.

Rebuilding Paradise

Screenings
One screening I attended was Rebuilding Paradise, directed by Ron Howard. He was at the premiere, along with some of the people who lost everything in the Paradise, California fires. In the first half of November 2018, there were several fires that raged out of control in California. One surrounded the city of Simi Valley and worked its way toward the Pacific Coast. (It was way too close for my comfort in Simi Valley. We eventually evacuated but were fine.)

Another fire was in the town of Paradise, which burnt almost the entire city to the ground. Watching Rebuilding Paradise filled me with great sadness for those who lost family members and their homes. Some of the “found footage” was absolutely breathtaking. One shot in particular was of a father racing out of what appears to be hell, surrounded by flames, in his car with his child asking if they were going to die. Absolutely incredible and heart-wrenching.

Dinner in America

Another film I saw was Dinner in America, as referenced earlier in this piece. I love a good dark comedy/drama, so when I got a ticket to Adam Carter Rehmeier’s Dinner in America I was all geared up. Little did I know it would start off with a disgruntled 20-something throwing a chair through a window and lighting the front sidewalk on fire. Kudos to composer John Swihart, who took a pretty awesome opening credit montage and dropped the heat with his soundtrack.

Dinner in America is a mid-‘90s Napoleon Dynamite cross-pollinated with the song “F*** Authority” by Pennywise. Coincidentally, Swihart composed the soundtrack for Napoleon Dynamite. Seriously, the soundtrack to Dinner in America is worth the ticket price alone, in my opinion. It adds so much to the attitude of one of the main characters. The parallel editing mixed with the fierce anti-authoritarian love story, lived by Kyle Gallner and Emily Skeggs, makes for a movie you probably won’t forget.

Adam Rehmeier

During the Q&A at the end, writer/director/editor Rehmeier described how he essentially combined two ideas that led to Dinner in America. As I watched the first 20 minutes, it felt like two separate movies, but once it came together it really paid off. Much like the cult phenomenon Napoleon Dynamite, Dinner in America will resonate with a wide audience. It’s worth watching when it comes to a theater (or streaming platform) near you. In the meantime, check out my video interview with him.

Adobe Productions
During Sundance, Adobe announced an upcoming feature for Premiere called “Productions.” While in Park City, I got a small demo of the new Productions at Adobe’s Sundance Production House. It took about 15 minutes before I realized that Adobe has added the one feature that has set Avid Media Composer apart for over 20 years — bin locking. Heads up, Avid: Adobe is about to release a multi-user workflow that is much easier to understand and use than previous iterations of Premiere.

The only thing that caught me off guard was the nomenclature — Productions and Projects. Productions is the title, but really a “Production” is a project, and what they call a “project” is a bin. If you’re familiar with Media Composer, you can create a project and inside have folders and bins. Bins are what house media links, sequences, graphics and everything else. In the new Productions update, a “Production” will house all of your “Projects,” much as a Media Composer project houses bins.

Additionally, you will be able to lock “Projects.” This means that in a multi-user environment (which can be something like a SAN or even an Avid Nexis), a project and media can live on the shared server and be accessed by multiple users. These users can be named and identified inside of the Premiere Preferences. And much like Blackmagic’s DaVinci Resolve, you can update the “projects” when you want to — individually or all projects at once. On its face, Productions looks like the answer to what every editor has said is one of the only reasons Avid is still such a powerhouse in “Hollywood” — the ability to work relatively flawlessly among tons of editors simultaneously. If what I saw works the way it should, Adobe is looking to take a piece of the multi-user environment pie Avid has controlled for so long.
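
For the curious, bin/project locking on shared storage is typically built on something like an advisory lock file: the first editor to create it owns the project, and everyone else gets read-only access until it is released. A generic sketch of that pattern (emphatically not Adobe’s implementation; all names are illustrative):

import getpass
import os
from pathlib import Path

def try_lock(project_dir: Path) -> bool:
    lock = project_dir / ".lock"
    try:
        # O_EXCL makes create-if-absent atomic, so two editors can't both win.
        fd = os.open(lock, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        print(f"Project locked by {lock.read_text().strip()}")
        return False
    with os.fdopen(fd, "w") as f:
        f.write(getpass.getuser())  # record who holds the lock
    return True

def unlock(project_dir: Path) -> None:
    (project_dir / ".lock").unlink(missing_ok=True)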

Summing Up
In the end, the Sundance Film Festival 2020 in Park City was likely a once-in-a-lifetime experience for me. From seeing celebrities, meeting other journalists, getting some free beanies and hand warmers (it was definitely not 70 degrees like California), to attending parties hosted by Canon and Light Iron — Sundance can really reinvigorate your filmmaking energy.

It’s hard to keep going when you get burnt out by just how hard it is to succeed and break through the barriers in film and multimedia creation. But seeing indie films and meeting like-minded creatives, you can get excited to create your own story. And you realize that there are good people out there, and sometimes you just have to fly to Utah to find them.

Walking down Main Street, I found a coffee shop named Atticus Coffee and Tea House. My oldest son’s name is Atticus, so I naturally had to stop in and get him something. I ended up getting him a hat and myself a coffee. It was good. But what I really did was sit out front pretending to shoot b-roll and eavesdropping on some conversations. It really is true that being around thoughtful energy is contagious. And while some parts of Sundance feel like a hipster-popularity contest, there are others who are there to improve and absorb culture from all around.

The 2020 Sundance Film Festival’s theme in my eyes was to uplift other people’s stories. As Harper Lee wrote in “To Kill a Mockingbird” when Atticus Finch is talking with Scout: “First of all, if you learn a simple trick, Scout, you’ll get along a lot better with all kinds of folks. You never really understand a person until you consider things from his point of view . . . until you climb into his skin and walk around in it.”


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Alkemy X adds all-female design collective Mighty Oak

Alkemy X has added animation and design collective Mighty Oak to its roster for US commercial representation. Mighty Oak has used its expertise in handmade animation techniques and design combined with live action for brands and networks, including General Electric, Netflix, Luna Bar, HBO, Samsung, NBC, Airbnb, Conde Nast, Adult Swim and The New York Times.

Led by CEO/EP Jess Peterson, head of creative talent Emily Collins and CD Michaela Olsen, the collective has garnered over 3 billion online views. Mighty Oak’s first original short film, Under Covers, premiered at the 2019 Sundance Film Festival. Helmed by Olsen, the quirky stop-motion short uses handmade puppets and forced-perspective sets to offer a glimpse into the unsuspecting lives and secrets that rest below the surface of a small town.

“I was immediately struck by the extreme care that Mighty Oak takes on each and every frame of their work,” notes Alkemy X EP Eve Ehrich. “Their handmade style and fresh approach really make for dynamic, memorable animation, regardless of the concept.”

Mighty Oak’s Peterson adds, “We are passionate about collaborating with our clients from the earliest stages, working together to craft original character designs and creating work that is memorable and fun.”

Post house DigitalFilm Tree names Nancy Jundi COO

DigitalFilm Tree (DFT) has named Nancy Jundi as chief operating officer. She brings a wealth of experience to her new role, after more than 20 years working with entertainment and technology companies.

Jundi has been an outside consultant to DFT since 2014 and will now be based in DFT’s Los Angeles headquarters, where she joins founder and CEO Ramy Katrib in pioneering new offerings for DFT.

Jundi began her career in investment banking and asset protection before segueing into the entertainment industry. Her experience includes leading sales and marketing at Runway during its acquisition by The Post Group (TPG). She then joined that team as director of marketing and communications to unify the end-to-end post facilities into TPG’s singular brand narrative. She later co-founded Mode HQ (acquired by Pacific Post) before transitioning into technology and SaaS companies.

Since 2012, Jundi has served as a consultant to companies in industries as varied as financial technology, healthcare, eCommerce, and entertainment, with brands such as LAbite, Traffic Zoom and GoBoon. Most recently, she served as interim senior management for CareerArc.

“Nancy is simply one of the smartest and most creative thinkers I know,” says Katrib. “She is that rare interdisciplinary talent: a creative and technological thinker who can exist in both arenas with clarity and tenacity.”

Marriage Story director Noah Baumbach

By Iain Blair

Writer/director Noah Baumbach first made a name for himself with The Squid and the Whale, his 2005 semi-autobiographical, bittersweet story about his childhood and his parents’ divorce. It launched his career, scoring him an Oscar nomination for Best Original Screenplay.

Noah Baumbach

His latest film, Marriage Story, is also about the disintegration of a marriage — and the ugly mechanics of divorce. Detailed and emotionally complex, the film stars Scarlett Johansson and Adam Driver as the doomed couple.

In all, Marriage Story scooped up six Oscar nominations — Best Picture, Best Actress, Best Actor, Best Supporting Actress, Best Original Screenplay and Best Original Score. Laura Dern walked away with a statue for her supporting role.

The film co-stars Dern, Alan Alda and Ray Liotta. The behind-the-scenes team includes director of photography Robbie Ryan, editor Jennifer Lame and composer Randy Newman.

Just a few days before the Oscars, Baumbach — whose credits also include The Meyerowitz Stories, Frances Ha and Margot at the Wedding — talked to me about making the film and his workflow.

What sort of film did you set out to make?
It’s obviously about a marriage and divorce, but I never really think about a project in specific terms, like a genre or a tone. In the past, I may have started a project thinking it was a comedy but then it morphs into something else. With this, I just tried to tell the story as I initially conceived it, and then as I discovered it along the way. While I didn’t think about tone in any general sense, I became aware as I worked on it that it had all these different tones and genre elements. It had this flexibility, and I just stayed open to all those and followed them.

I heard that you were discussing this with Adam Driver and Scarlett Johansson as you wrote the script. Is that true?
Yes, but it wasn’t daily. I’d reached out to both of them before I began writing it, and luckily they were both enthusiastic and wanted to do it, so I had them as an inspiration and guide as I wrote. Periodically, we’d get together and discuss it and I’d show them some pages to keep them in the loop. They were very generous with conversations about their own lives, their characters. My hope was that when I gave them the finished script it would feel both new and familiar.

What did they bring to the roles?
They were so prepared and helped push for the truth in every scene. Their involvement from the very start did influence how I wrote their roles. Nicole has that long monologue and I don’t know if I’d have written it without Scarlett’s input and knowing it was her. Adam singing “Being Alive” came out of some conversations with him. They’re very specific elements that come from knowing them as people.

You reunited with Irish DP Robbie Ryan, who shot The Meyerowitz Stories. Talk about how you collaborated on the look and why you shot on film.
I grew up with film and feel it’s just the right medium for me. We shot The Meyerowitz Stories on Super 16, and we shot this on 35mm, and we had to deal with all these office spaces and white rooms, so we knew there’d be all these variations on white. So there was a lot of discussion about shades and the palette, along with the production and costume designers, and also how we were going to shoot these confined spaces, because it was what the story required.

You shot on location in New York and LA. How tough was the shoot?
It was challenging, but mainly because of the sheer length of many of the scenes. There’s a lot of choreography in them, and some are quite emotional, so everyone had to really be up for the day, every day. There was no taking it easy one day. Every day felt important for the movie.

Where did you do the post?
All in New York. I have an office in the Village where I cut my last two films, and we edited there again. We mixed on the Warner stage, where I’ve mixed most of my movies. We recorded the music and orchestra in LA.

Do you like the post process?
I really love it. It’s the most fun and the most civilized part of the whole process. You go to work and work on the film all day, have dinner and go home. Writing is always a big challenge, as you’re making it up as you go along, and it can be quite agonizing. Shooting can be fun, but it’s also very stressful trying to get everything you need. I love working with the actors and crew, but you need a high level of energy and endurance to get through it. So then post is where you can finally relax, and while problems and challenges always arise, you can take time to solve them. I love editing, the whole rhythm of it, the logic of it.

Talk about editing with Jennifer Lame. How did that work?
We work so well together, and our process really starts in the script stage. I’ll give her an early draft to get her feedback and, basically, we start editing the script. We’ll go through it and take out anything we know we’re not going to use. Then during the shoot she’ll sometimes come to the set, and we’ll also talk twice a day. We’ll discuss the day’s work before I start, and then at lunch we’ll go over the previous day’s dailies. So by the time we sit down to edit, we’re really in sync about the whole movie. I don’t work off an assembly, so she’ll put together stuff for herself to let me know a scene is working the way we designed it. If there’s a problem, she’ll let me know what we need.

What were the big editing challenges?
Besides the general challenges of getting a scene right, I think for some of the longer ones it was all about finding the right rhythm and pacing. And it was particularly true of this film that the pace of something early on could really affect something later. Then you have to fix the earlier bit first, and sometimes it’s the scene right before. For instance, the scene where Charlie and Nicole have a big argument that turns into a very emotional fight is really informed by the courtroom scene right before it. So we couldn’t get it right until we’d got the courtroom scene right.

A lot of directors do test screenings. Do you?
No, I have people I show it to and get feedback, but I’ve never felt the need for testing.

VFX play a role. What was involved?
The Artery did them. For instance, when Adam cuts his arm we used VFX in addition to the practical effects, and then there’s always cleanup.

Talk about the importance of sound to you as a filmmaker, as it often gets overlooked in this kind of film.
I’m glad you said that because that’s so true, and this doesn’t have obvious sound effects. But the sound design is quite intricate, and Chris Scarabosio (working out of Skywalker Sound), who did Star Wars, did the sound design and mix; he was terrific.

A lot of it was taking the real-world environments in New York and LA and building on that, and maybe taking some sounds out and playing around with all the elements. We spent a lot of time on it, as both the sound and image should be unnoticed in this. If you start thinking, “That’s a cool shot or sound effect,” it takes you out of the movie. Both have to be emotionally correct at all times.

Where did you do the DI and how important is it to you?
We did it at New York’s Harbor Post with colorist Marcy Robinson, who’s done several of my films. It’s very important, but we didn’t do anything too extreme, as there’s not a lot of leeway for changing the look that much. I’m very happy with the look and the way it all turned out.

Congratulations on all the Oscar noms. How important is that for a film like this?
It’s a great honor. We’re all still the kids who grew up watching movies and the Oscars, so it’s a very cool thing. I’m thrilled.

What’s next?
I don’t know. I just started writing, but nothing specific yet.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

DejaEdit collaborative editing platform available worldwide

DejaSoft has expanded the availability of its DejaEdit collaborative editing solution for Avid Media Composer, Avid Nexis and EditShare workflows. Already well-established in Scandinavia and parts of Europe, the software-defined network solution is now accessible across the UK, Europe, Latin America, Middle East, Asia, Africa, China and North America.

DejaEdit allows editors to transfer media files and timelines automatically and securely around the world without having to be online continuously. It effectively acts as a media file synchronizer for multiple remote Avid systems.

DejaEdit allows multi-site post facilities to work as one, enabling multiple remote editors to work together, allowing media exchanges with VFX houses and letting editors easily migrate between office and home or mobile-based editing installations throughout the lifecycle of a project.

DejaEdit is available in two applications: Client and Nexus. The Client version works directly with Media Composer, whereas the Nexus variant further enables synchronization with projects stored on Nexis or EditShare storage systems.

DejaSoft and the DejaEdit platform are a collaboration between CEO Clas Hakeröd and CTO Nikolai Waldman, both editors and post pros and founders of the Sweden-based boutique post facility Can Film.

The tool is already being used by Oscar-nominated editor Yorgos Mavropsaridis, ACE, of The Favourite, The Lobster and recently Suicide Tourist; Scandinavian producer Daniel Lägersten, who has produced TV series such as Riverside and The Spiral; editor Rickard Krantz, who used it on The Perfect Patient (aka Quick), which has been nominated for Sweden’s Guldbagge Award (similar to a BAFTA) for editing; and post producer Anna Knochenhauer, known for her work on Euphoria featuring Alicia Vikander, The 100-Year-Old Man Who Climbed Out the Window and Disappeared, Lilya 4-Ever and Together.

Bill Baggelaar promoted at Sony Pictures, Sony Innovation Studios

Post industry veteran Bill Baggelaar has been promoted to executive VP and CTO, technology development at Sony Pictures and executive VP and general manager of Sony Innovation Studios. Prior to joining Sony Pictures almost nine years ago, he spent 13 years at Warner Bros. as VP of technology/motion picture imaging and head of technology/feature animation. His new role will start in earnest on April 1.

“I am excited for this new challenge that combines roles as both CTO of Sony Pictures and GM of Sony Innovation Studios,” says Baggelaar. “The CTO’s office works both inside the studio and with the industry to develop key standards and technologies that can be adopted across the various lines of business. Sony Innovation Studios is developing groundbreaking tools, methods and techniques for realtime volumetric virtual production — or as we like to say, ‘the future of movie magic’ — with a level of fidelity and quality that is best in class. With the technicians, engineers and artisans at Sony Innovation Studios combined with our studio technology team, we will be able to bring new experiences and technologies to all areas of production and delivery.”

Baggelaar’s promotion is part of a larger announcement by Sony, which involves a new team established within Sony Pictures — the Entertainment Innovation & Technology Group, Sony Pictures Entertainment, which encompasses the following departments: Sony Innovation Studios (SIS), Technology Development, IP Acceleration and Branded Integration.

The group is headed by Yasuhiro Ito, executive VP, Entertainment Innovation & Technology Group. Don Eklund will be leaving his post as EVP/CTO of technology development at the end of March. Eklund has had a long history with SPE and has been in his current role since 2017, establishing the foundation of the studio’s technology development activities.

“This new role combines my years of experience in production, post and VFX; my work with the broader industry and organizations; and my work with Sony companies around the world over the past eight and a half years — along with my more recent endeavors into virtual production — to create a truly unique opportunity for technical innovation that only Sony can provide,” concludes Baggelaar, who will report directly to Ito.

Kevin Lau heads up advertising, immersive at Digital Domain

Visual effects studio Digital Domain has brought on Kevin Lau as executive creative director of advertising, games and new media. In this newly created position, Lau will oversee all short-form projects and act as a creative partner for agencies and brands.

Lau brings over 18 years of ad-based visual effects and commercial production experience, working on campaigns for brands such as Target, Visa and Sprint.

Most recently, he was the executive creative director and founding partner at Timber, an LA-based studio focused on ads (GMC, Winter Olympics) and music videos (Kendrick Lamar’s Humble). Prior to that, he held creative director positions at Mirada, Brand New School and Superfad. Throughout his career, his work has been honored with multiple awards, including Clios, AICP Awards, MTV VMAs and a Cannes Gold Lion for Sprint’s “Now Network” campaign via Goodby.

Lau, who joins Digital Domain EPs Nicole Fina and John Canning as they continue to build the studio’s short-form business, will help unify the vision for the advertising, games and new media/experiential groups, promoting a consistent voice across campaigns.

Lau joins the team as the new media group prepares to unveil its biggest project to date: Time’s The March, a virtual reality recreation of the 1963 March on Washington for Jobs and Freedom. Digital Domain’s experience with digital humans will play a major role in the future of both groups as they continue to build on the photoreal cinematics and in-game characters previously created for Activision, Electronic Arts and Ubisoft.

Talking with 1917’s Oscar-nominated sound editing team

By Patrick Birk

Sam Mendes’ 1917 tells the harrowing story of Lance Corporals Will Schofield and Tom Blake, following the two young British soldiers on their perilous trek across no man’s land to deliver lifesaving orders to the Second Battalion of the Devonshire Regiment.

Oliver Tarney

The story is based on accounts of World War I by the director’s grandfather, Alfred Mendes. The production went to great lengths to create an immersive experience, placing the viewer alongside the protagonists in a painstakingly recreated world, woven together seamlessly, with no obvious cuts. The film’s sound department had to rise to the challenge of bringing this rarely portrayed sonic world to life.

We checked in with supervising sound editor Oliver Tarney and ADR/dialogue supervisor Rachael Tate, who worked out of London’s Shepperton Studios. Both Tarney and Tate are Oscar-nominated in the Sound Editing category. Their work was instrumental in transporting audiences to a largely forgotten time, helping to further humanize the monochrome faces of the trenches. I know that I will keep their techniques — from worldizing to recording more ambient Foley — in mind on the next project I work on.

Rachael Tate

A lot of the film is made up of quiet, intimate moments punctuated by extremely traumatic events. How did you decide on the most key sounds for those quiet moments?
Oliver Tarney: When Sam described how it was going to be filmed, it was expected that people would comment on how it was made from a technical perspective. But for Sam, it’s a story about the friendship between these two men and the courage and sacrifice that they show. Because of this, it was important to have those quieter moments when you aren’t just engaged in full-tilt action the whole time.

The other factor is that the film had no edits — or certainly no obvious edits (which actually meant many edits) — and was incredibly well-rehearsed. It would have been a dangerous thing to have had everything playing aggressively the whole way through. I think it would have been very fatiguing for the audience to watch something like that.

Rachael Tate: Also, you can’t rely on a cut in the normal way to inform pace and energy, so you are using things like music and sound to sort of ebb and flow the energy levels. So after the plane crash, for example, you’ll notice it goes very quiet, and also with the mine collapse, there’s a huge section of very little sound, and that’s on purpose so your ears can reacclimatize.

Absolutely, and I feel like that’s a good way to go — not to oversaturate the audience with the extreme end of the sound design. In other interviews, you said that you didn’t want it to seem overly processed.
Tarney: Well, we didn’t want the weapons to sound heroic in any way. We didn’t want it to seem like they were enjoying what they were doing. It’s very realistic; it’s brutal and harsh. Certainly, Schofield does shoot at people, but it’s out of necessity rather than enjoying his role there. In terms of dynamics, we broke the film up into a series of arcs, and we worked out that some would be five minutes, some would be nine minutes and so on.

In terms of the guns, we went more naturalistic in our recordings. We wanted the audience to feel everything from their perspective — that’s what Sam wanted with the entire film. Rather than having very direct recordings, we split our energies between that and very ambient recordings in natural spaces to make it feel more realistic. The distance that enemy fire was coming from is much more realistic than you would normally play in a film, and the same goes for the biplane recordings. We had microphones all across airfields to get that lovely phase-y kind of sound. For the dogfight with the planes, we sold the fact that you’re watching Blake and Schofield watching the dogfight rather than being drawn directly to the dogfight. I guess it was trying to mirror the visual, which would stick with the two leads.

Tate: We did the same with the crowd. We tried to keep it more realistic by using half actual Territorial Army guys, along with voice actors, rather than just a crowdy-sounding crowd. When we put that into the mix, we also chose which bits to focus on — Sam described it as wanting it to be like a vignette, like an old photo. You have the brown edging that fades away in the corners. He wanted you to zoom in on them so much that the stuff around them is there, but at the level they would hear it. So, if there’s a crowd on the screen further back from them, in reality you wouldn’t really hear it. In most films you put something in everyone’s mouth, but we kept it pared right back so that you’re just listening to their voices and their breaths. This is similar to how it was done with the guns and effects.

You said you weren’t going for any Hollywood-type effects, but I did notice that there are some psychoacoustic cues, like when a bomb goes off in the bunker, and I think a tinnitus-type effect.
Tarney: There are a few areas where you have to go with a more conventional film language. When the plane’s very close — on the bridge perhaps — once he’s being fired upon, we start going into something that’s a little more conventional, and then we settle back into him. It was that thing that Sam mentioned, which was subjectivity, objectivity; you can flip between them a little bit, otherwise it becomes too linear.

Tate: It needed to pack a punch.

Foley plays a massive part in this production. Assuming you used period weaponry and vehicles?
Tarney: Sam was so passionate about this project. When you visited the sets, the detail was just beautiful. They set the bar in terms of what we had to achieve realism-wise. We had real World War I rifles and machine guns, both British and German, and biplanes. We also did wild track Foley at the first trench and the last trench: the muddy trench and then the chalk one at the end.

Tate: We even put Blakeys on the boots.

Tarney: Yes, we bought various boots with different hobnails and metal tips.

That’s what a Blakey is?
Tate: The metal things that they put in the bottom of their shoes so that they didn’t slip around.

Tarney: And we went over the various surfaces and found which worked the best. Some were real hobnail boots, and some had metal stuck into them. We still wanted each character to have a certain personality; you don’t want everything sounding the same. We also recorded them without the nails, so when we were in a quieter part of the film, it was more like a normal boot. If you’d had that clang, clang, clang all the way through the film…

Tate: It would throw your attention away from what they were saying.

Tarney: With everything we did on the Foley, it was important to keep focus on them the whole time. We would work in layers, and as we would build up to one of the bigger events, we’d start introducing the heavier, more detailed Foley and take away the more diffuse, mellow Foley.

You only hear webbing and that kind of stuff at certain times because it would be too annoying. We would start introducing that as they went into more dangerous areas. You want them to feel conspicuous, too — when they’re in no man’s land, you want the audience to think, “Wow, there are two guys, alone, with absolutely no idea what’s out there. Is there a sniper? What’s the danger?” So once you start building up that tension, you make them a little bit louder again, so you’re aware they are a target.

How much ADR did the film require? I’m sure there was a lot of crew noise in the background.
Tate: Yes, there was a lot of crew noise — there were only two lines of “technical” ADR, which is when a line needs to be redone because the original could not be used/cleaned sufficiently. My priority was to try and keep as much production as possible. Because we started a couple of weeks after shooting started, and as they were piecing it together, it was as if it was locked. It’s not the normal way.

With this, I had the time to go deep and spectrally remove all the crew feet from the mics because they had low-end thuds on their clip mics, which couldn’t be avoided. The recordist, Stuart Wilson, did a great job, giving me a few options with the clip mics, and he was always trying to get a boom in wherever he could.

He had multiple lavaliers on the actors?
Tate: Yes, he had up to three on both those guys most of the time, and we went with the one on their helmets. It was like a mini boom. But, occasionally, they would get wind on them and stuff like that. That’s when I used iZotope RX 7. It was great having the time to do it. Ordinarily people might say, “Oh no, let’s ADR all the breaths there,” but I could get the breaths out. When you hear them breathing, that’s what they were doing at the time. There’s so much performance in them, I would hate to get them standing in a studio in London, you know, in jeans, trying to recreate that feeling.

So even if there’s slight artifacting, the littlest bit, you’d still go with that over ADR?
Tate: Absolutely. I would hope there’s not too much there though.

Tarney: Film editor Lee Smith and Sam have such a great working relationship; they really were on the same page putting this thing together. We had a big decision to make early on: Do we risk being really progressive and organize Foley recording sessions whilst they were still filming? Because, if everything was going according to plan, they were going to be really hungry for sound since there was no cutting once they had chosen the takes. If it didn’t go to plan, then we’d be forever swapping out seven-minute takes, which would be a nightmare to redo. We took a gamble and budgeted to spend the resources front heavy, and it worked out.

Tate: Lee Smith used to be a sound guy, which didn’t hurt.

I saw how detailed they were with the planning. The model of the town for figuring out the trajectory of the flare for lighting, for example.
Tate: They also mapped out the trenches so they were long enough to cover the amount of dialogue the actors were going to say — so the trenches went on for 500 yards. Before that, they were on theater stages with cardboard boxes to represent trenches, walking through them again and again. Everything was very well-planned.

Apart from dialogue and breaths, were there any pleasant surprises from the production audio that you were able to use in the final cut?
Tate: In the woods, toward the end of the film, Schofield stumbles out of the river and hears singing, and the singing that you hear is the guy doing it live. That’s the take. We didn’t get him in to sing and then put it on; that’s just his clip mic, heavily affected. We actually took his recording out into the New Forest, which is south of London.

A worldizing-type technique?
Tate: Yes, we found a remote part, and we played it and recorded it from different distances, and we had that woven against the original with a few plugins on it for the reverbs.

Tarney: We don’t know if Schofield is concussed and if he’s hallucinating. So we really wanted it to feel sort of ethereal, sort of wafting in and out on the wind — is he actually hearing this or not?

Tate: Yeah, we played the first few lines out of sequence, so you can’t really catch if there’s a melody. Just little bits on the breeze so that you’re not even quite sure what you’re hearing at that point, and it gradually comes to a more normal-sounding tune.

Tarney: Basically, that’s the thing with the whole film; things are revealed to the audience as they’re revealed to the lead characters.

Tate: There are no establishing shots.

Were there any elements of the sound design you wouldn’t expect to be in there that worked for one reason or another?
Tarney: No, there’s nothing… we were pretty accurate. Even the first thing you hear in the film — the backgrounds that were recorded in April.

Tate: In the field.

Tarney: Rachael and I went to Ypres in Belgium to visit the World War I museum and immerse ourselves in that world a little bit.

Tate: We didn’t really know that much about World War I. It wasn’t taught in my school, so I really didn’t know anything before I started this; we needed to educate ourselves.

Can you talk about the loop groups and dialing down to the finest details in terms of the vocabulary used?
Tate: Oh, God, I’ve got so many books, and we got military guys for that sort of flat way they operate. You can’t really explain that fresh to a voice actor and get them to do it properly. But the voice actors helped those guys perform and get out of their shells, and the military guys helped the voice actors in showing them how it’s done.

I gave them all many sheets of key words they could use, or conversation starters, so that they could improvise but stay on the right track in terms of content. Things like slang, poems from a cheap newspaper that was handed out to the soldiers. There was an officer’s manual, so I could tell them the right equipment and stuff. We didn’t want to get anything wrong.

That reminds me of this series of color photographs taken in the early 1900s in Russia. Automatically, it brings you so much closer to life at that point in time. Do you feel like you were able to achieve that via the sound design of this film?
Tarney: I think the whole project did that. When you’ve watched a film every day for six months, day in and day out, you can’t help but think about that era more, and it’s slightly embarrassing that it’s one generation past your grandparents.

How much more worldizing did you do, apart from the nice moment with the song?
Tarney: The Foley that you hear in the trench at the beginning and in the trench at the end is a combination between worldizing and sound designer Mike Fentum’s work. We both went down about three weeks before we started because Stuart Wilson gave us a heads up that they were wrapping at that location, so we spoke to the producer, and he gave us access.

So, in terms of worldizing, it’s not quite worldizing in the conventional sense of taking a recording and then playing it in a space. We actually went to the space and recorded the feet in that space, and the Foley supervisor Hugo Adams went to Salisbury Plain (the chalk trench at the end), and those were the first recordings that we edited and gave to Lee Smith. And then, we would get the two Foley artists that we had — Andrea King and Sue Harding — to top that with a performed pass against a screen. The whole film is layered between real recordings and studio Foley, and it’s the blend of natural presence and the performed studio Foley, with all the nuance and detail that you get from that.

Tate: Similarly, the crowd that we recorded out on a field in the back lot of Shepperton, with a 50 array; we did as much as we could without a screen with them just acting and going through the motions. We had an authentic World War I stretcher, which we used with hilarious consequences. We got them to run up and down carrying their friends on stretchers and things like that and passing enormous tables to each other and stuff so that we had the energy of it. There is something about recording outside and that sort of natural slap that you get off the buildings. It was embedded with production quite seamlessly really, and you can’t really get the same from a studio. We had to do the odd individual line in there, but most of it was done out in a field.

When need be, were you using things like convolution reverbs, such as Audio Ease Altiverb, in the mix?
Tarney: Absolutely. As good as the recordings were, it’s only when you put it against picture that you really understand what it is you need to achieve. So we would definitely augment with a lot — Altiverb is a favorite. Re-recording mixer Mark Taylor and I, we would use that a lot to augment and just change perspective a little bit more.
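What a convolution reverb like Altiverb does is conceptually simple: the dry signal is convolved with an impulse response captured in (or modeled on) a real space, a studio cousin of the worldizing described earlier. Below is a minimal Python sketch of the idea; the impulse response here is synthetic decaying noise standing in for a real room capture, and this illustrates the general technique, not Altiverb’s implementation.

    import numpy as np
    from scipy.signal import fftconvolve

    sr = 48000                            # sample rate (Hz)
    dry = np.random.randn(sr * 2)         # stand-in for a dry dialogue/Foley track

    # Synthetic impulse response: ~1.5 s of exponentially decaying noise.
    # A real convolution reverb loads an IR recorded in an actual space.
    t = np.arange(int(sr * 1.5)) / sr
    ir = np.random.randn(t.size) * np.exp(-4.0 * t)

    wet = fftconvolve(dry, ir)[: dry.size]   # convolve, trim to original length
    wet /= np.max(np.abs(wet))               # normalize to avoid clipping

    # Blend to taste; perspective changes like those described above are often
    # just adjustments to this dry/wet balance.
    mix = 0.7 * dry / np.max(np.abs(dry)) + 0.3 * wet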

Can you talk about the Atmos mix and what it brought to the film?
Tarney: I’ve worked on many films with Atmos, and it’s a great tool for us. Sam’s very performance-orientated and would like things to be more screen-focused. The minute you have to turn around, you’ve lost that connection with the lead characters. So, in general, we kept things a little more front-loaded than we might have done with another director, but I really liked the results. It’s actually all the more shocking when you hear the biplane going overhead when they’re in no man’s land.

Sam wanted to know all the way through, “Can I hear it in 5.1, 7.1 and Atmos?” We’d make sure that in the three mixes — other than the obvious — we had another plane coming over from behind. There’s not a wild difference in Atmos. The low end is nicer, and the discrete surrounds play really well, but it’s not a showy kind of mix in that sense. That would not have been true to everything we were trying to achieve, which was something real.

So Sam Mendes knows sound?
Tarney: He’s incredibly hungry to understand everything, in the best way possible. He’s very good at articulating what he wants and makes it his business to understand everything. He was fantastic. We would play him a section in 5.1, 7.1 and Atmos, and he would describe what he liked and disliked about each format, and we would then try to make each format have the same value as the other ones.


Patrick Birk is a musician and sound engineer at Silver Sound, a boutique sound house based in New York City.

Quantum to acquire Western Digital’s ActiveScale business  

Quantum has entered into an agreement with Western Digital Technologies, a subsidiary of Western Digital Corp., to acquire its ActiveScale object storage business. The addition of the ActiveScale product line and engineers brings object storage software and erasure coding technology to Quantum’s portfolio and helps the company to expand in the object storage market.

The acquisition will extend the company’s role in storing and managing video and unstructured data using a software-defined approach. The transaction is expected to close by March 31, 2020. Financial terms of the transaction were not disclosed.

What are the benefits of object storage software?
• Scalability: Allows users to store, manage and analyze billions of objects and exabytes of capacity.
• Durability: ActiveScale object storage offers up to 19 nines of data durability using patented erasure coding protection technologies. (For where figures like that come from, see the sketch after this list.)
• Easy to Manage at Scale: Because object storage has a flat namespace (compared to a hierarchical file system structure), managing billions of objects and hundreds of petabytes of capacity is easier than using traditional network attached storage. This, according to Quantum, reduces operational expenses.
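Durability figures like “19 nines” fall out of a simple combinatorial argument: with k data shards and m parity shards, an object survives unless more than m of its n = k + m shards are lost. Here is a back-of-the-envelope Python sketch, where the shard counts, the 1% failure probability and the no-repair assumption are all illustrative rather than ActiveScale’s actual parameters.

    import math

    def loss_probability(k: int, m: int, p: float) -> float:
        """P(data loss) for a k+m erasure code: data is lost only if more
        than m of the n = k + m shards fail, assuming independent shard
        failures with probability p over the period and no repair (a big
        simplification)."""
        n = k + m
        return sum(math.comb(n, f) * p**f * (1 - p)**(n - f)
                   for f in range(m + 1, n + 1))

    # Illustrative only: 18 data + 5 parity shards, 1% per-shard failure rate.
    p_loss = loss_probability(18, 5, 0.01)
    print(f"loss probability ~ {p_loss:.2e} (~{-math.log10(p_loss):.0f} nines)")

Real systems rebuild lost shards quickly, so the relevant p is the failure probability over a short repair window rather than a year, which is how shipping products push toward figures like 19 nines.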

Quantum has been offering object storage and selling and supporting the ActiveScale product line for over five years. Object storage can be used as an active-archive tier of storage — where StorNext file storage is used for high-performance ingest and processing of data, object storage acts as a lower-cost online content repository, and tape acts as the lowest-cost cold storage tier.

For M&E, object storage is used as a long-term content repository for video content — in movie and TV production, in sports video, and even for large corporate video departments. Those working in movie and TV production require very high-performance ingest, editing, processing and rendering of their video files, which is typically done with a file system like StorNext. Once content is finished, it is preserved in an object store, with StorNext data management handling the data movement between file and object tiers.
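In practice, the “preserve finished content in an object store” step is usually a scripted upload over an S3-style API. Here is a hypothetical boto3 sketch; the endpoint, bucket, key and credentials are placeholders, and whether a given ActiveScale deployment exposes an S3-compatible endpoint should be confirmed against its documentation.

    import boto3

    # Placeholder endpoint and credentials for an on-prem S3-compatible store.
    s3 = boto3.client(
        "s3",
        endpoint_url="https://objectstore.example.com",  # hypothetical
        aws_access_key_id="ACCESS_KEY",
        aws_secret_access_key="SECRET_KEY",
    )

    # Archive a finished master into a long-term content repository bucket.
    with open("final_master_uhd.mov", "rb") as f:
        s3.put_object(
            Bucket="finished-masters",                   # hypothetical bucket
            Key="2020/project-x/final_master_uhd.mov",
            Body=f,
            Metadata={"project": "project-x", "stage": "archive"},
        )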

“Object storage software is an obvious fit with our strategy, our go-to-market focus and within our technology portfolio,” says Jamie Lerner, president/CEO of Quantum. “We are committed to the product, and to making ActiveScale customers successful, and we look forward to engaging with them to solve their most pressing business challenges around storing and managing unstructured data. With the addition of the engineers and scientists that developed the erasure-coded object store software, we can deliver on a robust technical roadmap, including new solutions like an object store built on a combination of disk and tape.”

Visible Studios produces, posts Dance Monkey music video

If you haven’t heard about the Dance Monkey song by Tones and I, you soon will.  Australia’s Visible Studios provided production and post on the video to go with the song that has hit number one in more than 30 countries, went seven times platinum and remained at the top of the charts in Australia for 22 weeks. The video has been viewed on YouTube more than half a billion times.

Visible Studios, a full production and post company, is run by producer Tim Whiting and director and editor Nick Kozakis. The company features a team of directors, scriptwriters, designers, motion graphic artists and editors working on films, TV commercials and music videos.

For Dance Monkey, Visible Studios worked directly with Tones and I to develop the idea for the video. The video, which was shot on Red cameras at the beginning of the song’s meteoric rise, was completed in less than a week and on a small budget.

“The Dance Monkey music video was made on an extremely quick turnaround,” says Whiting. “[Tones] was blowing up at the time, and they needed the music video out fast. The video was shot in one day, edited in two, with an extra day and a half for color and VFX.” Visible Studios called on Blackmagic’s DaVinci Resolve Studio for edit, VFX and color.

Dance Monkey features the singer dressed as Old Tones, an elderly man whisked away by his friends to a golf course to dance and party. On the day of production, the sun was nowhere to be found, and each shot was done against a gray and dismal background. To fix this, the team brought in a sky image as a matte and used Resolve’s match move tool, keyer, lens blur and power windows to turn gray footage to brilliant sunshine.

“In post we decided to replace the overcast skies with a cloudy blue sky. We ended up doing this all in Resolve’s color page and keyed the grass and plants to make them more lush, and we were there,” says Whiting.
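That gray-sky fix boils down to key-plus-matte compositing: build a matte that selects the washed-out sky, then cross-fade to a sky plate through it. Here is a toy luminance-key version in Python/numpy, conceptual only and not the Resolve color-page node math the team actually used:

    import numpy as np

    def replace_sky(frame: np.ndarray, sky: np.ndarray,
                    lo: float = 0.7, hi: float = 0.9) -> np.ndarray:
        """Toy luminance key: bright, washed-out pixels are assumed to be
        overcast sky and are cross-faded to the sky plate. A soft-edged
        matte avoids hard fringing at the key boundary."""
        luma = frame @ np.array([0.2126, 0.7152, 0.0722])   # Rec. 709 luma
        matte = np.clip((luma - lo) / (hi - lo), 0.0, 1.0)  # soft key
        return frame * (1 - matte[..., None]) + sky * matte[..., None]

    h, w = 1080, 1920
    frame = np.full((h, w, 3), 0.82)    # stand-in: flat gray overcast shot
    sky = np.zeros((h, w, 3))
    sky[..., 2] = 0.9                   # stand-in: blue sky plate
    out = replace_sky(frame, sky)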

Editor/directors Kozakis and Liam Kelly used Resolve for the entire editing process. “Being able to edit 6K raw footage smoothly on a 4K timeline, at a good quality debayer, means that we don’t have to mess around with proxies and that the footage gets out of the way of the editing process. The recent update for decompression and debayer on Nvidia cards has made this performance even better,” Kozakis says.

 

Missing Link, The Lion King among VES Award winners

The Visual Effects Society (VES), the industry’s global professional honorary society, held its 18th Annual VES Awards, the yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues.

Comedian Patton Oswalt served as host for the ninth time, welcoming the more than 1,000 guests gathered at the Beverly Hilton to celebrate VFX talent in 25 award categories. The Lion King was named the photoreal feature winner, garnering three awards. Missing Link was named top animated film, winning two awards. The Mandalorian was named best photoreal episode and garnered two awards, with Game of Thrones and Stranger Things 3 also winning two awards each. Hennessy: The Seven Worlds topped the commercial field with two wins.

Andy Serkis presented the VES Award for Creative Excellence to visual effects supervisor Sheena Duggal. Joey King presented the VES Visionary Award to director-producer-screenwriter Roland Emmerich. VFX supervisor Pablo Helman presented the Lifetime Achievement Award to director/producer/screenwriter Martin Scorsese, who accepted via video from New York. Scorsese’s The Irishman also picked up two awards, including Outstanding Supporting Visual Effects in a Photoreal Feature.

Presenters also included: directors J.J. Abrams, Jon Favreau, Rian Johnson and Josh Cooley.

Winners of the 18th Annual VES Awards are as follows:

Outstanding Visual Effects in a Photoreal Feature

THE LION KING

Robert Legato

Tom Peitzman

Adam Valdez

Andrew R. Jones

 

Outstanding Supporting Visual Effects in a Photoreal Feature

THE IRISHMAN

Pablo Helman

Mitchell Ferm

Jill Brooks

Leandro Estebecorena

Jeff Brink

 

Outstanding Visual Effects in an Animated Feature

MISSING LINK

Brad Schiff

Travis Knight

Steve Emerson

Benoit Dubuc

 

Outstanding Visual Effects in a Photoreal Episode

THE MANDALORIAN; The Child

Richard Bluff

Abbigail Keller

Jason Porter

Hayden Jones

Roy K. Cancino

 

Outstanding Supporting Visual Effects in a Photoreal Episode

CHERNOBYL; 1:23:45

Max Dennison

Lindsay McFarlane

Clare Cheetham

Paul Jones

Claudius Christian Rauch

 

Outstanding Visual Effects in a Real-Time Project

Control

Janne Pulkkinen

Elmeri Raitanen

Matti Hämäläinen

James Tottman

 

Outstanding Visual Effects in a Commercial

Hennessy: The Seven Worlds

Carsten Keller

Selçuk Ergen

Kiril Mirkov

William Laban

 

Outstanding Visual Effects in a Special Venue Project

Star Wars: Rise of the Resistance

Jason Bayever

Patrick Kearney

Carol Norton

Bill George

 

Outstanding Animated Character in a Photoreal Feature

ALITA: BATTLE ANGEL; Alita

Michael Cozens

Mark Haenga

Olivier Lesaint

Dejan Momcilovic

 

Outstanding Animated Character in an Animated Feature

MISSING LINK; Susan

Rachelle Lambden

Brenda Baumgarten

Morgan Hay

Benoit Dubuc

 

Outstanding Animated Character in an Episode or Real-Time Project

STRANGER THINGS 3; Tom/Bruce Monster

Joseph Dubé-Arsenault

Antoine Barthod

Frederick Gagnon

Xavier Lafarge

 

Outstanding Animated Character in a Commercial

Cyberpunk 2077; Dex

Jonas Ekman

Jonas Skoog

Marek Madej

Grzegorz Chojnacki

 

Outstanding Created Environment in a Photoreal Feature

THE LION KING; The Pridelands

Marco Rolandi

Luca Bonatti

Jules Bodenstein

Filippo Preti

 

Outstanding Created Environment in an Animated Feature

TOY STORY 4; Antiques Mall

Hosuk Chang

Andrew Finley

Alison Leaf

Philip Shoebottom

 

Outstanding Created Environment in an Episode, Commercial, or Real-Time Project

GAME OF THRONES; The Iron Throne; Red Keep Plaza

Carlos Patrick DeLeon

Alonso Bocanegra Martinez

Marcela Silva

Benjamin Ross

 

Outstanding Virtual Cinematography in a CG Project

THE LION KING

Robert Legato

Caleb Deschanel

Ben Grossmann

AJ Sciutto

 

Outstanding Model in a Photoreal or Animated Project

THE MANDALORIAN; The Sin; The Razorcrest

Doug Chiang

Jay Machado

John Goodson

Landis Fields IV

 

Outstanding Effects Simulations in a Photoreal Feature

STAR WARS: THE RISE OF SKYWALKER

Don Wong

Thibault Gauriau

Goncalo Cababca

François-Maxence Desplanques

 

Outstanding Effects Simulations in an Animated Feature

FROZEN 2

Erin V. Ramos

Scott Townsend

Thomas Wickes

Rattanin Sirinaruemarn

 

Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project

STRANGER THINGS 3; Melting Tom/Bruce

Nathan Arbuckle

Christian Gaumond

James Dong

Aleksandr Starkov

 

Outstanding Compositing in a Feature

THE IRISHMAN

Nelson Sepulveda

Vincent Papaix

Benjamin O’Brien

Christopher Doerhoff

 

Outstanding Compositing in an Episode

GAME OF THRONES; The Long Night; Dragon Ground Battle

Mark Richardson

Darren Christie

Nathan Abbott

Owen Longstaff

 

Outstanding Compositing in a Commercial

Hennessy: The Seven Worlds

Rod Norman

Guillaume Weiss

Alexander Kulikov

Alessandro Granella

 

Outstanding Special (Practical) Effects in a Photoreal or Animated Project

THE DARK CRYSTAL: THE AGE OF RESISTANCE; She Knows All the Secrets

Sean Mathiesen

Jon Savage

Toby Froud

Phil Harvey

 

Outstanding Visual Effects in a Student Project

THE BEAUTY

Marc Angele

Aleksandra Todorovic

Pascal Schelbli

Noel Winzen

 

Review: Neat Video 5 noise reduction plugin

By Brady Betzel

One of the best (and most underrated) skills in an online editor’s tool kit is good image restoration. Removing digital video imperfections — from flicker to digital video noise — is not easy, and it is not easy to do well. That is, unless you have good noise reduction software like Neat Video.

While Neat Video might not be that well-known, once you see how simply (or intricately) Neat Video 5 works inside of apps like Blackmagic’s DaVinci Resolve, it will be hard to forget the company’s name.

(While the software was recently updated to 5.1.5 — with expanded GPU support as well as support for new versions of Resolve, Adobe and Nuke — nothing really changes for this review. You can check out a detailed list of the updates here.)

Neat Video 5 is a noise reduction plugin. In a Windows OS environment, Neat Video is compatible with apps like Adobe After Effects, Adobe Premiere Pro, DaVinci Resolve, Avid Media Composer, Vegas, Magix, Edius, VirtualDub, and the OFX-compatible apps Nuke, Fusion, Scratch, HitFilm, Mamba, Natron, Flame, Baselight and DustBuster. In a macOS environment, Neat Video 5 is compatible with After Effects, Premiere, Final Cut Pro X, Motion 5, OFX, Resolve and Media Composer. In Linux, the software is compatible with OFX-compatible apps and Resolve.

Neat Video 5 comes in three flavors: Demo, Home and Pro. The Demo version works in up to 1280×720 resolution with a watermark. Home is literally made for the home user: It will process video up to 1920×1080 resolutions, it will use up to one GPU, and it is for non-commercial use. The cost is just $74.90 for most apps (Resolve is $89.90). The Pro version has no resolution restrictions, will work on two or more GPUs simultaneously, and can be used commercially. The Pro version starts at $129.90 per app ($159.90 for Resolve). Because Neat Video 5 for OFX works with so many apps, it only comes in Pro ($249.90) and Studio ($349.90) versions. The Studio version adds the ability for a floating license. You can see all of the pricing details here.

If there is one line you should take away from this review, it is this: Neat Video 5 is by far the easiest and best noise reduction software I have used in any application to date. And while this review is focusing on the Resolve version of Neat Video 5, all other apps work in much the same way. You can find Neat Video’s software-specific Quick Start Guides to help. Once you install and register your Neat Video 5 license, removing digital video noise is as easy as applying Neat Video 5 to a node in the color tab, clicking on “Prepare Noise Profile,” clicking on “Auto Profile,” and clicking “Apply.” Then, unless you want to fine-tune your noise reduction, you are done. Obviously, I have somewhat simplified how Neat Video 5 works, but essentially it can be done in as little as three steps per clip, and the results are typically amazing. If they aren’t amazing, you can jump back into Neat Video 5 and manually adjust specifics until the noise reduction looks correct. But I will say that in about 90% of cases, the Auto Profiling will do all of the noise reduction work necessary.

For tinkerers, or for those who need to go far beyond an Auto Profile, you can manually adjust your settings. But taking a step back, Neat Video needs an area of your image that has a uniform color and noise profile to process how it removes noise. The automatic profiling will do its best to find an area, but it doesn’t always work. What you need to keep in mind when building a good noise profile inside of Neat Video is that the area being processed needs to be as uniform as possible (think dark night sky or a wall painted in one color) — meaning no prominent features, a high noise level (something in the high four area is better), the largest possible sample area and no warnings from Neat Video.
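That “uniform area” requirement is easy to picture in code: scan the frame in blocks and take the one whose pixel values vary least, since a flat wall or night sky wins. The sketch below is a conceptual analogue only, not Neat Video’s actual profiling algorithm.

    import numpy as np

    def most_uniform_block(luma: np.ndarray, block: int = 64) -> tuple:
        """Return (y, x) of the block with the lowest variance, the kind of
        featureless area a noise profile wants. Bigger uniform blocks give
        the profiler more samples to work with."""
        best, best_pos = np.inf, (0, 0)
        h, w = luma.shape
        for y in range(0, h - block + 1, block):
            for x in range(0, w - block + 1, block):
                v = luma[y:y + block, x:x + block].var()
                if v < best:
                    best, best_pos = v, (y, x)
        return best_pos

    frame = np.random.rand(1080, 1920).astype(np.float32)  # stand-in luma plane
    print(most_uniform_block(frame))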

So, if your automatic profile doesn’t do the job, you can find an area of your image that meets the above requirements and then build a profile. From there you can use one of the Neat Video 5 features, like “Profile Check.” Profile Check will highlight details that aren’t being affected by Neat Video, giving you a clear representation of what noise is being reduced and whether you need to adjust your profile to better reduce video noise.

At this point you might be wondering where you tweak advanced settings. When you load Neat Video, you will be in Beginner mode. To get into Advanced mode, go to the Tools menu, where you will see a lot of advanced functions that can help you fine-tune your noise profile. And if you still can’t get a good noise reduction profile, you can try out the “Generic Profile,” which can help you build a profile even if your video doesn’t have a large enough area of uniform noise. There are also presets — such as light flicker, moire flicker, repeat-frame issues, dust and scratch filters (including scan lines), jitter of details, an artifact removal filter and more — that can solve certain problems.

Neat Video 5 is faster than previous generations. As in previous versions, there is a tool inside of Neat Video’s preferences that will run your CPU and GPU through a benchmark to determine whether you should run on CPU only, GPU only or a combination of both. In Neat Video 5, if you have trouble with a clip, you can use up to four “Variants” of noise reduction in the new playback window to see how each profile works with your clip.

In terms of playback and rendering, noise reduction is never fast. However, inside of Neat Video the new playback window will typically play back your footage to preview the noise reduction before you jump back into Resolve. Inside of Resolve, even in just 1080p, my sequence would crawl to just a few frames of playback per second. It is one of the most processor- and GPU-intensive tasks you will run on your computer.

In my testing, I applied Neat Video 5 to the first node in my color correction tree, followed by a basic color correction, in a one-minute timeline. I took those same clips and compared my Neat Video results to Resolve’s Temporal and Spatial noise reduction tools. In terms of visual results, Neat Video 5 was superior. If that’s not the case for you, jump into YCbCr viewer mode inside of Neat Video 5, isolate each channel and tweak the channels individually, so you don’t alter the overall noise reduction more than necessary. Not only did Neat Video 5 handle normal noise in the shadows well, but on clips with very tight lines it was able to keep a lot of the detail while removing the noise. Resolve’s noise reduction tools had a harder time removing noise while keeping detail: Temporal noise reduction really didn’t do much, and while Spatial noise reduction did work, it heavily blurred and distorted the image — essentially not acceptable.
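The detail gap described above falls out of the math: temporal reduction averages the same pixel across frames, which cancels noise without touching edges (as long as nothing moves), while spatial reduction averages neighboring pixels, which smears the edges themselves. Here is a toy numpy comparison under the big assumption of a perfectly static shot:

    import numpy as np

    rng = np.random.default_rng(0)

    # Static test pattern: a hard vertical edge, plus fresh noise per frame.
    clean = np.zeros((64, 64))
    clean[:, 32:] = 1.0
    frames = [clean + rng.normal(0, 0.2, clean.shape) for _ in range(8)]

    # Temporal NR: average across frames. Noise drops ~1/sqrt(8); edge intact.
    temporal = np.mean(frames, axis=0)

    # Spatial NR: 3x3 box blur in one frame. Noise drops, edge gets smeared.
    # (np.roll wraps at the borders, which is fine for a toy example.)
    f = frames[0]
    spatial = sum(np.roll(np.roll(f, dy, 0), dx, 1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0

    for name, img in [("temporal", temporal), ("spatial", spatial)]:
        rms = np.sqrt(np.mean((img - clean) ** 2))
        print(f"{name}: RMS error vs clean = {rms:.3f}")   # temporal wins

Real shots move, of course, which is why production tools pair temporal filtering with motion estimation rather than a plain average.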

To get a good example of how Neat Video 5 slams a computer system, I exported 1080p MP4s. Resolve’s built-in Temporal noise reduction took 1:03, while the Spatial noise reduction took 1:05. The Neat Video 5 render of the same one-minute timeline took 3:51 — almost four times as long. I was curious how much longer a 4K render would take. Using 4K (UHD) media, I applied a simple color correction and, on a preceding serial node, Neat Video 5. I exported a 4K (UHD) MP4, which took 52 seconds without Neat Video 5 applied and 16:27 with Neat Video applied — nearly 19 times the render time! So while Neat Video 5 is an amazing tool, the trade-off is high render times.
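Reduced to simple multipliers, the timings above are easy to sanity-check:

    # Render-time multipliers from the timings reported in this review.
    hd_base, hd_neat = 63, 3 * 60 + 51        # 1:03 vs. 3:51 at 1080p
    uhd_base, uhd_neat = 52, 16 * 60 + 27     # 0:52 vs. 16:27 at UHD

    print(f"1080p: {hd_neat / hd_base:.1f}x render time")   # ~3.7x
    print(f"UHD:   {uhd_neat / uhd_base:.1f}x render time") # ~19.0x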

To find additional training on more advanced noise reduction techniques in Neat Video, check out the video tutorials. I find myself watching these just because of how much you can learn about noise reduction in general. They aren’t as exciting as watching Game of Thrones or The Handmaid’s Tale, but they will push your knowledge of noise reduction to the next level.

Summing Up
I’ve used Neat Video for a while, so when I was approached to review Version 5 I immediately said yes. Noise reduction is a post skill that not many possess.

If you are an online editor or colorist looking to separate yourself from the pack, learn all the noise reduction techniques you can and definitely check out Neat Video 5. Not only can Neat Video 5 work automatically, but you can fine-tune your noise reduction as much as you want.

And when demoing your color correction services, think about using Neat Video 5 to remove camera noise, flickering and chroma issues; color correcting your footage; and, finally, adding some grain back into your shot. Not only will your footage look better, but you’ll have a technical workflow that will definitely impress clients. Just don’t forget to account for the extra render time.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Oscar-nominated Jojo Rabbit editor Tom Eagles: blending comedy and drama

By Daniel Restuccio

As an editor, Tom Eagles has done it all. He started his career in New Zealand cutting promos before graduating to assistant editor then editor on television series such as Secret Agent Men and Spartacus. Eventually he connected with up-and-coming director Taika Waititi and has worked with him on the series What We Do in the Shadows and the critically acclaimed feature Hunt for the Wilderpeople. Their most recent feature collaboration, 20th Century Fox’s Jojo Rabbit, earned Eagles BAFTA and Oscar nominations as well as an ACE Eddie Award win.

Tom Eagles

We recently caught up with him to talk about the unique storytelling style of Taika films, specifically Jojo Rabbit.

(Warning: If you haven’t seen the film yet, there might be some spoilers ahead.)

How did your first conversation with Taika go?
Fairly early on, unprompted, he gave me a list of his top five favorite films. The kind of scope and variety of it was startling, but they were also my top five favorite films. We talked about Stalker, from filmmaker Andrei Tarkovsky, and I was a massive Tarkovsky fan at the time. He also talked about Annie Hall and Badlands.

At that point in time, there weren’t a lot of people doing the type of work that Taika does: that mix of comedy and drama. That was the moment I thought, “I’ve got to work with this guy. I don’t know if I’m going to find anyone else like this in New Zealand.”

How is Jojo different than your previous collaboration on Hunt for the Wilderpeople?
We had a lot more to work with on Jojo. There’s a lot more coverage in a typical scene, while the Wilderpeople was three shots: a master and two singles. With Jojo, we just threw everything at it. Taika’s learned over the years that it’s never a bad thing to have another shot. Same goes for improv. It’s never a bad thing to have a different line. Jojo was a much bigger beast to work on.

Jojo is rooted in a moment in history, which people know well, and they’re used to a certain kind of storytelling around that moment. I think in the Czech Republic, where we shot, they make five World War II movies a year. They had a certain idea of how things should look, and we weren’t doing that. We were doing Taika’s take, so we weren’t doing desaturated, handheld, grim, kitchen sink realism. We were creating this whole other world. I think the challenge was to try and bring people along on that journey.

I saw an early version of the script, and the Hitler character wasn’t in the opening scene. How did that come about?
One of the great things about working with Taika is he always does pick-ups. Normally, it’s something that we figure out that we need during the process of the edit. He rewrote a bunch of different options for the ending of the movie, a few scenes dotted throughout and the opening of the film.

He shot three versions. In one, it was just Jojo on his own, trying to psych himself up. Then there were variations on how much Adolf we would have in the film. What we found when we screened the film up to that point was that people were on board with the film, but it sometimes took them a while to get there … to understand the tone of the film. The moment we put imaginary Adolf in that scene, it was like planting a flag and saying, “This is what this film is. It’s going to be a comedy about Hitler and Nazis, and you’re either with us or you’re walking out, but if you’re with us, you will find out it’s about a lot more than that.”

Some directors sit right at the editor’s elbow, overlooking every cut, and some go away and leave the editor to make a first cut. What was this experience like?
While I’ve experienced both, Taika’s definitely in the latter category. He’s interested in what you have to say and what you might bring to the edit. He also wants to know what people think, so we screen the film a lot. Across the board — it’s not just isolated to me, but anyone he works with — he just wants more ideas.

After the shooting finished, he gave me two weeks. He went and had a break and encouraged me to do what I wanted with the assembly, to cut scenes and to not be too precious about including everything. I did that, but I was still relatively cautious; there were some things I wanted him to see.

We experimented with various structures. We tried an archiving thing for the end of the film. There was a fantasy sequence in which Elsa is talking about the story of the Jews, and we see flights of fancy of what she thinks … a way for her to escape into fantasy. That was an idea of Taika’s. He just left me to it for a couple of weeks, and we looked at it and decided against it in the end. It was a fun process because when he comes back, he’s super fresh. You offer up one idea and he throws five back.

How long was the first cut?
I asked my assistant the other day, and he said it was about two hours and forty minutes, so I guess I have to go with that, which sounds long to me. That might have been the first compile that had all of the scenes in it, and what I showed Taika was probably half an hour shorter. We definitely had a lot to play with.

Do you think there’s going to be a director’s cut?
I think what you see is the director’s cut. There’s not a version of the film that has more stuff in it than we wanted in it. I think it is pretty much the perfect direction. I might have cut a little bit more because I think I just work that way. There were definitely things that we missed, but I wouldn’t put them back in because of what we gained by taking them out.

We didn’t lean that heavily on comedy once we transitioned into drama. The longer you’re away from Jojo and Elsa, that’s when we found that the story would flounder a little bit. It’s interesting because when I initially read the script, I was worried that we would get bored of that room, and that it would feel too much like a stage play. So we added all of this color and widened the world out. We had these scenes where Jojo goes out into the world, but actually the relationship between the two of them — that’s the story. Each scene in that relationship, the kind of gradual progression toward each other, is what’s moving the story forward.

This movie messes with your expectations, in terms of where you think it’s going or even how it’s saying it. How did you go about creating your own rhythms for that style of storytelling?
I was fortunate in that I already had Taika’s other films to lean on, so partly it was just trying to wrestle this genre into his world … into his kind of subgenre of Taika. It’s really just a sensibility a lot of the time. I was aware that I wanted a breathlessness to the pace of things, especially for the first half of the movie in order to match Jojo’s slightly ADD, overexcited character. That slows down a little bit when it needs to and when he’s starting to understand the world around him a little bit more.

Can you talk about the music?
Music also was important. The needle drops. Taika had a bunch of them already. He definitely had The Beatles and Bowie, and it was fleshing out a few more of those. I think I found the Roy Orbison piece. Temp music was also really important. It was quite hard to find stuff. Taika’s brief was: I don’t want it to sound like all the other movies in the genre. As much as we respected Schindler’s List, he didn’t want it to sound like Schindler’s List.

You edited on Avid Media Composer?
We cut on Avid, and it was the first time I really used ScriptSync. I had been wary of it, to be honest. I watch all the dailies through from head to tail and see the performances in context and feel how they affect me. Once that’s done, ScriptSync is great for comparing takes or swapping out a read on a line. Because we had so much improv on this film, we had to go through and enter all of that in manually. Sometimes we’d use PhraseFind to search on a particular word that I’d remembered an actor saying in an ad-lib. It’s a much faster and more efficient way of finding that stuff.

That said, I still periodically go back and watch dailies. As the film starts to solidify, so does what I’m looking for in the dailies, so I’ll always go back and see if there’s anything that I view differently with the new cut in mind.

You mentioned the difference between Wilderpeople and Jojo in terms of coverage. How much more coverage did you have? Were there multiple cameras?
There were two and sometimes three cameras (ARRI Alexa). Some scenes were single camera, so there was a lot more material mastered. Some directors get a bit iffy about two cameras, but we just rolled it.

If we had the option, we would almost always lean on the A camera, and part of the trick was to try and make it look like his other movies. We wanted the coverage plan to feel simple; it should still feel like a master, couple of mediums and a couple of singles, all in that very flat framing approach of his. Often, the characters are interacting with each other perpendicular to the camera in these fixed static wides.

Again, one of the things Taika was concerned with was that it should feel like his other movies. Just because we have a dolly, we don’t have to use it every time. We had all of those shots, we had those options, and often it was about paring things back to try and stay in time.

Does he give you a lot of takes, and does he create different emotional variations within those takes?
We definitely had a lot of takes. And, yes, there would be a great deal of variety of performance, whether it’s him just trying to push an actor and get them to a specific place, or sometimes we just had options.

Was there an average — five takes, 10 takes?
It’s really hard to say. These days everyone just does rolling resets. You look at your bin and you think, “Ah, great, they did five takes, and there’s only three set-ups. How long is it going to take me?” But you open it up, and each take is like half an hour long, and they’re reframing on the fly.

With Scarlett Johansson, you do five takes max, probably. But with the kids it would be a lot of rolling resets and sometimes feeding them lines, and just picking up little lines here and there on the fly. Then with the comedians, it was a lot of improv, so it’s hard to quantify takes, but it was a ton of footage.

If you include the archive footage, I think we had 300 to 400 hours. I’m not sure how much of that was our material, but it would’ve been at least 100 hours.

I was impressed by the way you worked the “getting real” scenes: the killing of the rabbit and the hanging scene. How did you conceptualize and integrate those really important moments?
For the hanging scene, I was an advocate for having it as early in the movie as possible. It’s the moment in the film where we’ve had all this comedy and good times [regarding] Nazis, and then it drives home that this film is about Nazis, and this is what Nazis do.

I wanted to keep the rabbit scene fun to a degree because of where it sits in the movie. I know, obviously, it’s quite a freaky scene for a lot of people, but it’s kind of scary in a genre way for me.

Something about those woods always reminds me of Stand by Me. That was the movie that was in my mind, and just the idea of those older kids, the bullies, being dicks. Moments like that and, much more so, the moment when Jojo finds Elsa. I thought of that sequence as a mini horror film within the film. That was really useful to let the scares drive it because we were so much in Jojo’s point of view. It’s taking those genres and interjecting a little bit of humor or a little bit of lightness into them to keep them in tone with Taika’s overall sensibility.

I read that you tried to steer clear of the sentimentality. How did you go about doing that?
It’s a question of taste with the performance(s) and things that other people might like. I will often feel I’m feeding the audience or demanding of the audience an emotional response. The scene where Jojo finds Rosie. We shot an option seeing Rosie hanging there. It just felt too much. It felt like it was really bludgeoning people over the head with the horror of the moment. It was enough to see the shoes. Every time we screened the movie and Jojo stands up, we see the shoes and everyone gasps. I think people have gotten the information that they need.


Dan Restuccio is a writer/director with Realwork Entertainment and part of the Visual Arts faculty at California Lutheran University. He is a former Disney Imagineer. You can reach him at dansweb451@gmail.com.

VFX-heavy Skyworth OLED TV spot via The-Artery

The-Artery created a spot for Skyworth’s latest version of its W81|W81 Pro Wallpaper OLED TV, which debuted last month at the “See the Wonder” event at CES 2020.

Created using The-Artery‘s newly opened Resolve-based color room and expanded design capabilities — spearheaded by colorist Stephen Picano and design director Lauren Indovina — the commercial features a couple swimming through space-like waters, children battling origami dragons while floating in a paper boat and a traveler trekking through snowy tundras while glowing jellyfish float overhead. Publicis, Skyworth’s agency, wanted the ad to reflect “the wonder” of the company’s newest television model.

“The campaign, helmed by director Eli Sverdlov, was very director-led in a way that I’ve never seen before,” explains The-Artery’s EP/MD, Deborah Sullivan. “Of course, there was still ongoing dialogue with the client and agency, but the level of creative control that was entrusted is almost unheard of. Everything was open from start to finish, including the ideation phase, color grading and design — to name a few. Our team had a lot of fun jumping straight into the edit to develop and launch what we consider as a high-end conceptual throwback to the nineties.”

Sverdlov agrees: “Our flexible creative process was in a condensed schedule and required a very unique collaboration. We were practically creating the ideas and visuals while editing and sourcing footage.”

Due to the production’s long shooting schedule and tight deadlines, the visual effects were designed via Autodesk Flame in realtime, all under one roof, while filming took place in Serbia. Additional footage was carefully curated as well as color graded and cut to fit the tone and flow of the rest of the piece. The creature imagery such as the jellyfish was done via CG.

In addition to Flame and Resolve, The-Artery called on SideFX Houdini, Autodesk Maya, Maxon’s Redshift, Otoy’s Octane, Autodesk’s Arnold, Adobe After Effects and Maxon’s Cinema 4D.

Editor David Cea joins Chicago’s Optimus  

Chicago-based production and post house Optimus has added editor David Cea to its roster. With 15 years of experience in New York and Chicago, Cea brings a varied portfolio of commercial editing experience to Optimus.

Cea has cut spots for brands such as Bank of America, Chevrolet, Exxon, Jeep, Hallmark, McDonald’s, Microsoft and Target. He has partnered with many agencies, including BBDO, Commonwealth, DDB, Digitas, Hill Holliday, Leo Burnett, Mother and Saatchi & Saatchi.

“I grew up watching movies with my dad and knew I wanted to be a part of that magical process in some way,” explains Cea. “The combination of Goodfellas and Monty Python gave me all the fuel I needed to start my film journey. It wasn’t until I took an editing class in college that I discovered the part of filmmaking I wanted to pursue. The editor is the one who gets to shape the final product and bring out the true soul of the footage.”

After studying film at Long Island’s Hofstra University, Cea met Optimus editor and partner Angelo Valencia while working as his assistant at Whitehouse New York in 2005. Cea then moved on to hone his craft further at Cosmo Street in New York. Chicago became home for him in 2013 as he spent three years at Whitehouse. After heading back east for a couple of years, he returned to Chicago to put down roots.

While Avid Media Composer is Cea’s go-to choice for editing, he is also proficient in Adobe Premiere.

Framestore launches FPS preproduction services

VFX studio Framestore has launched FPS (Framestore Pre-production Services) for the global film and content production industries. An expansion of Framestore’s existing capability, FPS is available to clients in need of standalone preproduction support or an end-to-end production solution.

The move builds out and aligns the company’s previz, virtual production, techviz and postviz services with Framestore’s art department (which operates either as part of the Framestore workflow or as a stand-alone creative service), virtual production team and R&D unit, and integrates with the company’s VFX and animation teams. The move builds on work on films such as Gravity and the knowledge gained during the company’s eight-year London joint venture with visualization company The Third Floor. FPS is working on feature film projects as part of an integrated offering and as a standalone visualization partner, with more projects slated in the coming months.

The new team is led by Alex Webster, who joins as FPS managing director after running The Third Floor London. He will report directly to Fiona Walkinshaw, Framestore’s global managing director, film.

“This work aligns Framestore’s singular VFX and animation craft with a granular understanding of the visualization industry,” says Webster. “It marries the company’s extraordinary legacy in VFX with established visualization and emergent virtual production processes, supported by bleeding-edge technology and dedicated R&D resource to inform the nimble approach which our clients need. Consolidating our preproduction services represents a significant creative step forward.”

“Preproduction is a crucial stage for filmmakers,” says chief creative officer Tim Webber. “From mapping out environments to developing creatures and characters to helping plot action sequences, it provides unparalleled freedom in terms of seeing how a story unfolds or how characters interact with the worlds we create. Bringing together our technical innovation with an understanding of filmmaking, we want to offer a bespoke service for each film and each individual to help tell compelling, carefully crafted stories.”

“Our clients’ needs are as varied as the projects they bring to us, with some needing a start-to-finish service that begins with concept art and ends in post while others want a bespoke, standalone solution to specific creative challenges, be that in early stage concepting, through layout and visualization or in final animation and VFX,” says Framestore CEO William Sargent. “It makes sense to bring all these services in-house — even more so when you consider how our work in adjacent fields like AR, VR and MR has helped the likes of HBO, Marvel and Warner Bros. bring their IP to new, immersive platforms. What we’ll ultimately deliver goes well beyond previz and beyond visualization.”

Main Image: (L-R) Tim Webber, Fiona Walkinshaw and Alex Webster.

CAS Awards recognize GOT, Fleabag, Ford v Ferrari, more

The CAS Awards were held this past weekend, with the sound mixing team from Ford v Ferrari — Steven A. Morrow CAS, Paul Massey CAS, David Giammarco CAS, Tyson Lozensky, David Betancourt and Richard Duarte — taking home the Cinema Audio Society Award for Outstanding Sound Mixing Motion Picture – Live Action.

Game of Thrones – The Bells

Top honors for Motion Picture – Animated went to Toy Story 4 and the sound mixing team of Doc Kane CAS, Vince Caro CAS, Michael Semanick CAS, Nathan Nance, David Boucher and Scott Curtis. The CAS Award for Outstanding Sound Mixing Motion Picture – Documentary went to Making Waves: The Art of Cinematic Sound and the team of David J. Turner, Tom Myers, Dan Blanck and Frank Rinella.

Held in the Wilshire Grand Ballroom of the InterContinental Los Angeles Downtown, the awards were presented in seven categories for Outstanding Sound Mixing Motion Picture and Television and two Outstanding Product Awards. The evening saw CAS president Karol Urban pay tribute to recently retired CAS executive board member Peter R. Damski for his years of service to the organization. The contributions of re-recording mixer Tom Fleischman, CAS, were recognized as he received the CAS Career Achievement Award. Presenter Gary Bourgeois spoke to Fleischman’s commitment to excellence demonstrated in a career that spans over 40 years,  nearly 200 films and collaborations with dozens of notable directors.  

James Mangold

James Mangold received the CAS Filmmaker Award in a presentation that included remarks  by re-recording mixer Paul Massey, CAS, who was joined in the presentation by Harrison Ford. Mangold had even more to celebrate as he watched his sound team take top honors for Outstanding Achievement in Sound Mixing Motion Picture – Live Action. 

Here is the complete list of winners:

MOTION PICTURE – LIVE ACTION

Ford v Ferrari

Ford v Ferrari team

Production Mixer – Steven A. Morrow CAS 

Re-recording Mixer – Paul Massey CAS 

Re-recording Mixer – David Giammarco CAS 

Scoring Mixer – Tyson Lozensky

ADR Mixer – David Betancourt 

Foley Mixer – Richard Duarte

MOTION PICTURE – ANIMATED 

Toy Story 4

Original Dialogue Mixer – Doc Kane CAS

Original Dialogue Mixer – Vince Caro CAS

Re-recording Mixer – Michael Semanick CAS 

Re-recording Mixer – Nathan Nance

Scoring Mixer – David Boucher

Foley Mixer – Scott Curtis

 

MOTION PICTURE – DOCUMENTARY

Making Waves: The Art of Cinematic Sound

Production Mixer – David J. Turner 

Re-recording Mixer – Tom Myers 

Scoring Mixer – Dan Blanck

ADR Mixer – Frank Rinella

 

TELEVISION SERIES – 1 HOUR

Game of Thrones: The Bells

Production Mixer – Ronan Hill CAS 

Production Mixer – Simon Kerr 

Production Mixer – Daniel Crowley 

Re-recording Mixer – Onnalee Blank CAS 

Re-recording Mixer – Mathew Waters CAS 

Foley Mixer – Brett Voss CAS

TELEVISION SERIES – 1/2 HOUR 

TIE

Barry: ronny/lily

Production Mixer – Benjamin A. Patrick CAS 

Re-recording Mixer – Elmo Ponsdomenech CAS 

Re-recording Mixer – Jason “Frenchie” Gaya 

ADR Mixer – Aaron Hasson

Foley Mixer – John Sanacore CAS

 

Fleabag: Episode #2.6

Production Mixer – Christian Bourne 

Re-recording Mixer – David Drake 

ADR Mixer – James Gregory

 

TELEVISION MOVIE or LIMITED SERIES

Chernobyl: 1:23:45

Production Mixer – Vincent Piponnier 

Re-recording Mixer – Stuart Hilliker 

ADR Mixer – Gibran Farrah

Foley Mixer – Philip Clements

 

TELEVISION NON-FICTION, VARIETY or MUSIC SERIES or SPECIALS

David Bowie: Finding Fame

Production Mixer – Sean O’Neil 

Re-recording Mixer – Greg Gettens

 

OUTSTANDING PRODUCT – PRODUCTION

Sound Devices, LLC

Scorpio

 

OUTSTANDING PRODUCT – POST PRODUCTION 

iZotope

Dialogue Match

 

STUDENT RECOGNITION AWARD

Bo Pang

Chapman University

 

Main Image: Presenters Whit Norris, Elisha Cuthbert, Award winners Onnalee Blank, Ronan Hill and Brett Voss at the CAS Awards. (Tyler Curtis/ABImages) 

 

 

Filmic DoubleTake allows multicam workflows via iPhone 11

Filmic’s DoubleTake is a new iOS app designed to turn an Apple iPhone 11 into a multicam studio. Available now from the Apple App Store, Filmic DoubleTake enables iPhone 11 users to capture video from two cameras, simultaneously from a single device, to create a multi-angle viewing experience.

Filmic DoubleTake allows content creators to use the multiple cameras in the iPhone 11, iPhone 11 Pro and iPhone 11 Pro Max — as well as the iPhone XR, iPhone XS and iPhone XS Max — to create the effect of using multiple camera angles in a shot.

According to Filmic, DoubleTake was designed for content creators of any skill level and for multiple genres of content — from professional broadcast-style news interviews to YouTubers capturing multiple angles during live events, concerts or any situation that requires more than one perspective to capture the moment.

Key features include:
• Multicam: Enables users to capture two different focal lengths of the same subject at the same time. DoubleTake uses the Ultra Wide lens (iPhone 11 Pro Max, 11 Pro and 11 only) and the Tele lens to capture both an establishing shot and a punch-in shot on a subject simultaneously. Or they can use any other combination of front and rear lenses for unrivaled multicam capture.

• Camera Visualization: Similar to a director’s viewfinder, DoubleTake’s camera picker-view enables users to visualize all available cameras on the device. Users can employ this view to help decide how to frame a shot and which cameras to select.

• Shot/Reverse Shot: Enables users to capture all the organic and intimate interaction between two actors or between interviewer and interviewee. Traditionally, filmmakers must employ two cameras and place them in cumbersome “over the shoulder” locations. With DoubleTake, users can place one device between the actors, effectively placing the audience in the middle of the conversation.

• PiP or Discrete: The DoubleTake interface allows users to see both cameras of the video capture at the same time through the use of a picture-in-picture (PiP) window. The PiP window can be moved around the screen, tapped to zoom in or swiped away if distracting; the second video will continue to record. With DoubleTake, users can record videos as discrete files or as a composite video that includes the PiP window animated as it is seen on screen (see the frame-math sketch after this feature list).

• Split-Screen: DoubleTake can also use any two cameras to create a 50/50 split-screen effect that is saved as a single video. This allows for capturing engaging interviews or any scenario in which two sides of the story require equal weighting on screen.

• Focus and Exposure Controls: DoubleTake enables users to set and lock focus and exposure on both cameras during multicam capture with Filmic’s unified reticle. Users can tap anywhere to set an area of interest with the reticle, then tap again to lock or unlock. Filmic’s camera switcher effortlessly moves between A and B cameras during a recording to adjust the focus and exposure for each, independently of one another.
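To make the PiP-versus-discrete distinction concrete, here is a toy Python/numpy sketch of the two composites. Frames are treated as plain arrays, and none of this is DoubleTake’s actual rendering code, which runs on Apple’s capture pipeline.

    import numpy as np

    def composite_pip(a: np.ndarray, b: np.ndarray, scale: float = 0.3,
                      margin: int = 16) -> np.ndarray:
        """Burn a downscaled copy of frame b into a corner of frame a
        (nearest-neighbor scaling keeps the sketch dependency-free)."""
        h, w = a.shape[:2]
        ph, pw = int(h * scale), int(w * scale)
        ys = np.arange(ph) * b.shape[0] // ph
        xs = np.arange(pw) * b.shape[1] // pw
        out = a.copy()
        out[margin:margin + ph, w - margin - pw:w - margin] = b[ys][:, xs]
        return out

    def split_screen(a: np.ndarray, b: np.ndarray) -> np.ndarray:
        """50/50 split: left half of frame a, right half of frame b."""
        w = a.shape[1]
        return np.concatenate([a[:, :w // 2], b[:, w // 2:]], axis=1)

    front = np.zeros((1080, 1920, 3), np.uint8)     # stand-in camera A frame
    rear = np.full((1080, 1920, 3), 255, np.uint8)  # stand-in camera B frame
    pip, split = composite_pip(front, rear), split_screen(front, rear)

Recording discrete files instead simply skips this composite and writes each stream to its own file, which is why that mode leaves more room to re-frame in post.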

Select video specifications include:
• Full-frame focus and exposure for smooth and easy automated focus and exposure adjustments
• Selectable broadcast frame rates:  24fps, 25fps and 30fps depending on project requirements
• 1080p video at high-bit-rate encoding for maximum quality. (Note: 1080p video is the maximum resolution supported by Apple today for multicam capture.)
• Composited PiP or separate discrete video files recorded as H.264 .mov files are saved to DoubleTake’s internal library, which supports batch export directly to the camera roll.

Wylie Stateman on Once Upon a Time… in Hollywood‘s Oscar nod for sound

By Beth Marchant

To director Quentin Tarantino, sound and music are primal forces in the creation of his idiosyncratic films. Often using his personal music collection to jumpstart his initial writing process and later to set a film’s tone in the opening credits, Tarantino always gives his images a deep, multi-sensory well to swim in. According to his music supervisor Mary Ramos, his bold use of music is as much a character as each film’s set of quirky protagonists.

Wylie Stateman – Credit: Andrea Resnick

Less showy than those memorable and often nostalgic set-piece songs, the sound design that holds them together is just as critically important to Tarantino’s aesthetic. In Once Upon a Time… in Hollywood it even replaces the traditional composed score. That’s one of many reasons why the film’s supervising sound editor Wylie Stateman, a long-time Tarantino collaborator, relished his latest Oscar-nominated project with the director (he previously received nominations for Django Unchained and Inglourious Basterds and has a lifetime total of nine Oscar nominations).

Before joining team Tarantino, Stateman sound designed some of the most iconic films of the ‘80s and ‘90s, including Tron, Footloose, Ferris Bueller’s Day Off (among 15 films he made with John Hughes), Born on the Fourth of July and Jerry Maguire. He also worked for many years with Oliver Stone, winning a BAFTA for his sound work on JFK. He went on to cofound the Topanga, California-based sound studio Twentyfourseven.

We talked to Stateman about how he interpreted Tarantino’s sound vision for his latest film — about a star struggling to adapt to a changing Hollywood and his stuntman — revealing just how closely the soundtrack is connected to every camera move and cut.

How does Tarantino’s style as a director influence the way you approach the sound design?
I believe that sound is a very important department within the process of making any film. And so, when I met Quentin many years ago, I was meeting him under the guise that he wanted help and he wanted somebody who could focus their time, experience and attention on this very specific department called sound.

I’ve been very fortunate, especially on Quentin’s films, to also have a great production sound mixer and great rerecording mixers. We have both sides of the process in really tremendously skilled hands and tremendously experienced hands. Mark Ulano, our production sound mixer, won an Oscar for Titanic. He knows how to deal with dialogue. He knows how to deal with a complex set, a set where there are a lot of moving parts.

On the other side of that, we have Mike Minkler doing the final re-recording mixing. Mike, who I worked with on JFK, is tremendously skilled with multiple Oscars to his credit. He’s just an amazing creative in terms of re-recording mixing.

The role that I like to play as supervising sound editor and designer is to speak to the filmmaker in terms of sound. For this film, we realized we could drive the soundtrack without a composer by using the chosen songs and KHJ radio, selecting bits and pieces from the shows of the infamous DJ “Humble Harve” and from clips of all the other DJs on KHJ who really defined 1969 in Los Angeles.

And as the film shows, most people heard them over the car radio in car-centric LA.
The DJs were powerful messengers of what was happening in the minds, in the streets and in the popular culture of that time. That was Quentin’s idea. When he wrote the script, he had written all of the KHJ radio segments into it. He listens a lot, and he’s a real student of the filmmaking process and a real master.

On the student side, he’s constantly learning, constantly looking and constantly listening. On the master side, he then applies that to the characters he wants to develop and the situations he wants at the base of his story. So, basically, Quentin comes to me for a better understanding of his intention in terms of sound, and he has a tremendous understanding to begin with. That’s what makes it so exciting.

When talking to Quentin and his editor Fred Raskin, who are both really deeply knowledgeable filmmakers, it can be quite challenging to stay in front of them and/or to chase behind them. It’s usually a combination of the two. But Quentin is a very generous collaborator, meaning he knows what he wants, but then he’s able to stop, listen and evaluate other ideas.

How did you find all of the clips we hear on the various radios?
Quentin went through hundreds of hours of archival material. And he has a tremendous working knowledge of music to begin with, and he’s also a real student of that period.

Can you talk about how you approached the other elements of specific, Tarantino-esque sound, like Cliff crunching on a celery stick in that bar scene?
Quentin’s movies are bold in the sense of some of the subject matter that he tackles, but they’re highly detailed and also very much inside his actors’ heads. So when you talk about crunching on a piece of celery, I interpret everything that Quentin imparts on his characters as having some kind of potential vocabulary in terms of sound. And that vocabulary… it applies to the camera. If the camera hides behind something and then comes out and reveals something or if the camera’s looking at a big, long shot — like Cliff Booth’s walk to George Spahn’s house down that open area in the Spahn Ranch — every one of those moves has a potential sound component and every editorial cut could have a vocabulary of sound to accompany it.

We also use those [combinations] to alter time, whether it’s to jump forward or jump back or just crash in. He does a lot of very explosive editing moves and all of that has an audio vocabulary. It’s been quite interesting to work with a filmmaker that sees picture and sound as sort of a romance and a dance. And the sound could lead the picture, or it could lag the picture. The sound can establish a mood, or it can justify a mood or an action. So it’s this constant push-pull.

Robert Bresson, a forerunner of the French New Wave, basically said, “When the ear leads the eye, the eye becomes impatient. When the eye leads the ear, the ear becomes impatient. Use those impatiences.” So what I’m saying is that sound and pictures are this wonderful choreographed dance. Stimulate peoples’ ears and their eye is looking for something; stimulate their eyes and their ears are looking for something, and using those together is a really intimate and very powerful tool that Quentin, I think, is a master at.

How does the sound design help define the characters of Rick Dalton (Leonardo DiCaprio) and Cliff Booth (Brad Pitt)?
This is essentially a buddy movie. Rick Dalton is the insecure actor who’s watching a certain period — when he had great success and comfort — transition into a new one. You’re going from the John Wayne/True Grit way of making movies to Butch Cassidy and the Sundance Kid or Easy Rider, and Rick is not really that comfortable making this transition. His character is full of that kind of anxiety.

The Cliff Booth character is a very internally disturbed character. He’s an unsuccessful crafts/below-the-line person who’s got personal issues and is kind of typical of a character that’s pretty well-known in the filmmaking process. Rick Dalton’s anxious world is about heightened senses. But when he forgets his line during the bar scene on the Lancer set, the world doesn’t become noisy. The world becomes quiet. We go to silence because that’s what’s inside his head. He can’t remember the line and it’s completely silent. But you could play that same scene 180 degrees in the opposite direction and make him confused in a world of noise.

The year 1969 was very important in the history of filmmaking, and that’s another key to Rick’s and Cliff’s characters. If you look at 1969, it was the turning point in Hollywood when indie filmmaking was introduced. It was also the end of a great era of traditional studio fare and traditional acting, one that gave way to the looser, run-and-gun style of Easy Rider. In a way, the Peter Fonda/Dennis Hopper dynamic of Hopper’s film is somewhat similar to that of Rick Dalton and Cliff Booth.

I saw Easy Rider again recently and the ending hit me like a ton of bricks. The cultural panic, and the violence it invokes, is so palpable because you realize that clash of cultures never really went away; it’s still with us all these years later. Tarantino definitely taps into that tension in this film.
It’s funny that you say that because my wife and I went to the Cannes Film Festival with the team, and they were playing Easy Rider on the beach on a giant screen with a thousand seats in the sand. We walked up on it and we stood there for literally an hour and a half transfixed, just watching it. I hadn’t seen it in years.

What a great use of music and location photography! And then, of course, the story and the ending; it’s like, wow. It’s such a huge departure from True Grit and the generation that made that film. That’s what I love about Quentin, because he plays off the tension between those generations in so many ways in the film. We start out with Al Pacino, and they’re drinking whiskey sours, and then we go all the way through the gamut of what 1969 really felt like to the counterculture.

Was there anything unusual that you did in the edit to manipulate sound to make a scene work?
Sound design is a real design-level responsibility. We invent sound. We go to the libraries and we go to great lengths to record things in nature or wherever we can find them. In this case, we recorded all the cars. We apply a very methodical approach to sound.

Sound design, for me, is the art of shaping noise to suit the picture and enhance the story, and great sound lives somewhere between the science of audio and the subjectivity of storytelling. The science part is really well-known, and it’s been perfected over many, many years by lots of talented artists and artisans. But the story part is what excites me, and it’s what excites Quentin. So it becomes what we don’t do that’s so interesting, like using silence instead of noise or creating a soundtrack without a composer. I don’t think you miss having score music. When we couldn’t figure out a song, we made sound design elements. So, yeah, we would make tension sounds.

Shaping noise is not something I could explain to you with an “eye of newt plus a tail of yak” secret recipe. It’s a feeling. It’s just working with audio, shaping sound effects and noise to become imperceptibly conjoined with music. You can’t tell where the sound design is beginning and ending and where it transfers into more traditional song or music. That is the beauty of Quentin’s films. In terms of sound, the audio has shapes that are very musical.

His deep-cut versions of songs are so interesting, too. Using “California Dreaming” by the Mamas and Papas would have been way too obvious, so he uses a José Feliciano cover of it and puts the actual Mamas and Papas into the film as walk-on characters.
Yeah. I love his choice of music. From Sharon and Roman listening to “Hush” by Deep Purple in the convertible, their hair flying, to going straight into “Son of a Lovin’ Man” after they arrive at the Playboy Mansion. Talk about 1969 and setting it off! It’s not from the San Francisco catalog; it’s just this lovely way that Quentin imagines time and can relate to it as sound and music. The world as it relates to sound is very different than the world of imagery. And the type of director that Quentin is, he’s a writer, he’s a director, and he’s a producer, so he really understands the coalescing of these disciplines.

You haven’t done a lot of interviews in the past. Why not?
I don’t do what I do to call attention to either myself or my work. Over the first 35 years of my career, there’s very little record of any conversation that I had outside of my team and directly with my filmmakers. But at this point in life, when we’re at the cusp of this huge streaming technology shift and everything is becoming more politically sensitive, with deep fakes in both image and audio, I think it’s time sound should have somebody step up and point out, “Hey, we are invisible. We are transitory.” Meaning, when you stop the electricity going to the speakers, the sound disappears, which is kind of an amazing thing. You can pause the picture and you can study it. Sound only exists in real time. It’s just the vibration in the air.

And to be clear, I don’t see motion picture sound as an art form. I see it, rather, as a form of art and it takes a long time to become a sculptor in sound who can work in a very simple style. After all, it’s the simplest lines that just blow your mind!

What blew your mind about this film, either while you worked on it or when you saw the finished product?
I really love the whole look of the film. I love the costumes, and I have great respect for the team that Quentin consistently pulls together. When I work on Quentin’s films, I never turn around and find somebody that doesn’t have a great idea or deep experience in their craft. Everywhere you turn, you bump into extraordinary talent.

Dakota Fanning’s scene at the Spahn Ranch… I mean, wow! Knocks my socks off. That’s really great stuff. It’s a remarkable thing to work with a director who has that kind of love for filmmaking and that allows for really talented people to also get in the sandbox and play.


Beth Marchant is a veteran journalist focused on the production and post community and contributes to “The Envelope” section of the Los Angeles Times. Follow her on Twitter @bethmarchant.

Roger Deakins and 1917 win Theatrical prize at ASC Awards

The Theatrical Award for best cinematography in a motion picture went to Roger Deakins, ASC, BSC, for 1917 at the 34th American Society of Cinematographers Outstanding Achievement Awards.

Jarin Blaschke took the Spotlight Award for The Lighthouse and Fejmi Daut and Samir Ljuma won the inaugural Documentary Award for Honeyland. In the TV categories, winners included Colin Watkinson, ASC, BSC, for The Handmaid’s Tale; John Conroy, ISC, for The Terror: Infamy; and C. Kim Miles, CSC, MySC, for Project Blue Book.

TCM’s Ben Mankiewicz hosted the awards gala, which was held at the Ray Dolby Ballroom at Hollywood & Highland.

Below is the complete list of winners and nominees:

Theatrical Release Category – presented by Diane Lane

Roger Deakins, ASC, BSC – “1917” – WINNER

Phedon Papamichael, ASC, GSC – “Ford v Ferrari”

Rodrigo Prieto, ASC, AMC – “The Irishman”

Robert Richardson, ASC – “Once Upon a Time in Hollywood”

Lawrence Sher, ASC – “Joker”


Spotlight Award Category – presented by Bartosz Bielenia

Jarin Blaschke – “The Lighthouse” – WINNER

Natasha Braier, ASC, ADF – “Honey Boy”

Jasper Wolf, NSC – “Monos”


Documentary Category – presented by Todd Phillips

Fejmi Daut and Samir Ljuma – “Honeyland” – WINNER

Nicholas de Pencier – “Anthropocene: The Human Epoch”

Evangelia Kranioti – “Obscuro Barroco”


Episode of a Series for Non-Commercial Television – presented by Emily Deschanel

David Luther – “Das Boot,” Gegen die Zeit (episode 6)

M. David Mullen, ASC – “The Marvelous Mrs. Maisel,” Simone

Chris Seager, BSC – “Carnival Row,” Grieve No More

Brendan Steacy, CSC – “Titans,” Dick Grayson

Colin Watkinson, ASC, BSC – “The Handmaid’s Tale,” Night – WINNER


Episode of a Series for Commercial Television – presented by Jane Lynch

Dana Gonzales, ASC – “Legion,” Chapter 20

C. Kim Miles, CSC, MySC – “Project Blue Book,” The Flatwoods Monster – WINNER

Polly Morgan, ASC, BSC – “Legion,” Chapter 23

Peter Robertson, ISC – “Vikings,” Hell

David Stockton, ASC – “Gotham,” Ace Chemicals


Motion Picture, Miniseries, or Pilot Made for Television – presented by Michael McKean

John Conroy, ISC – “The Terror: Infamy,” A Sparrow in a Swallow’s Nest – WINNER

P.J. Dillon, ISC – “The Rook,” Chapter 1

Chris Manley, ASC – “Doom Patrol,” pilot

Martin Ruhe, ASC – “Catch-22,” Episode 5

Craig Wrobleski, CSC – “The Twilight Zone,” Blurryman

With the exception of Deakins, all of the awards were handed out to first-time winners. Deakins collected the top honor in 2018 for “Blade Runner 2049” and previously for “Skyfall,” “The Man Who Wasn’t There” and “The Shawshank Redemption.”

Honorary awards were also presented, including:

Frederick Elmes (left) talking about his award.

The ASC Board of Governors Award was given to Werner Herzog by Paul Holdengräber, interviewer/curator/writer and executive director of the Onassis Foundation. The award recognizes Herzog’s significant and indelible contributions to cinema. It is the only ASC Award not given to a cinematographer and is reserved for filmmakers who have been champions of the visual art form.

  • The ASC Lifetime Achievement Award was presented to Frederick Elmes, ASC, by writer-director Lisa Cholodenko. The duo collaborated on the Emmy-winning “Olive Kitteridge.”
  • The ASC Career Achievement in Television Award was handed out to Donald A. Morgan, ASC, by actor Tim Allen. The two work together on the award-winning “Last Man Standing,” and previously collaborated on “Home Improvement.”
  • The International Award was bestowed upon Bruno Delbonnel, ASC, AFC, by writer-director Joel Coen. The duo has joined forces on several films, including the Oscar-nominated “Inside Llewyn Davis” and “The Ballad of Buster Scruggs.”
  • This year’s President’s Award went to Don McCuaig, ASC. It was given to him by longtime friend and actor-stuntman Mickey Gilbert.
  • The ASC Bud Stone Award of Distinction was given to Kim Snyder, president and CEO of Panavision. This award is presented to an ASC Associate Member who has demonstrated extraordinary service to the society and/or has made a significant contribution to the motion-picture industry.

 Main Image: Roger Deakins and his wife James Deakins

Colorist Chat: Light Iron supervising colorist Ian Vertovec

“As colorists, we are not just responsible for enhancing each individual shot based on the vision of the filmmakers, but also for helping to visually construct an emotional arc over time.”

NAME: Ian Vertovec

TITLE: Supervising Colorist

COMPANY: Light Iron

CAN YOU DESCRIBE YOUR ROLE IN THE COMPANY?
Light Iron is a Hollywood-based collaborator for motion picture finishing, with a studio in New York City as well.

GLOW

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
As colorists, we are not just responsible for enhancing each individual shot based on the vision of the filmmakers, but also for helping to visually construct an emotional arc over time. For example, a warm scene feels warmer coming out of a cool scene as opposed to another warm scene. We have the ability and responsibility to nudge the audience emotionally over the course of the film. Using color in this way makes color grading a bit like a cross between photography and editing.

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Once in a while, I’ll be asked to change the color of an object, like change a red dress to blue or a white car to black. While we do have remarkable tools at our disposal, this isn’t quite the correct way to think about what we can do. Instead of being able to change the color of objects, it’s more like we can change the color of the light shining on objects. So instead of being able to turn a red dress to blue, I can change the light on the dress (and only the dress) to be blue. So while the dress will appear blue, it will not look exactly how a naturally blue dress would look under white light.
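
A toy example makes the distinction concrete. In this numpy sketch (the channel values are invented purely for illustration), multiplying a keyed region by a “light” color only scales the reflectance that is already there, which never quite matches a natively blue fabric under white light:

import numpy as np

# A "red dress" pixel: strong red reflectance, little green or blue.
# All values here are invented for illustration.
red_dress = np.array([0.80, 0.10, 0.12])

# Grading effectively recolors the LIGHT on the keyed region, i.e. a
# per-channel gain, rather than swapping the fabric's reflectance.
blue_light = np.array([0.15, 0.30, 2.50])
relit = red_dress * blue_light   # -> [0.12, 0.03, 0.30]: reads blue, but dark

# A dress that was actually blue under white light might look like this:
truly_blue = np.array([0.10, 0.15, 0.85])

print(relit, truly_blue)
# The relit dress can be pushed toward blue, but its channel ratios never
# match a naturally blue dress -- exactly the limitation described above.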

WHAT’S YOUR FAVORITE PART OF THE JOB?
There is a moment with new directors, after watching the first finished scene, when they realize they have made a gorgeous-looking movie. It’s their first real movie, which they never fully saw until that moment — on the big screen, crystal clear and polished — and it finally looks how they envisioned it. They are genuinely proud of what they’ve done, as well as appreciative of what you brought out in their work. It’s an authentic filmmaking moment.

WHAT’S YOUR LEAST FAVORITE?
Working on multiple jobs at a time and long days can be very, very draining. It’s important to take regular breaks to rest your eyes.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Something with photography, VFX or design, maybe.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I was doing image manipulation in high school and college before I even knew what color grading was.

Just Mercy

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Just Mercy, Murder Mystery, GLOW, What We Do in the Shadows and Too Old to Die Young.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Sometimes your perspective and a filmmaker’s perspective for a color grade can be quite divergent. There can be a temptation to take the easy way and either defer or overrule. I find tremendous value in actually working out those differences and seeing where and why you are having a difference of opinion.

It can be a little scary, as nobody wants to be perceived as confrontational, but if you can civilly explain where and why you see a different approach, the result will almost always be better than what either of you thought possible in the first place. It also allows you to work more closely and understand each other’s creative instincts more accurately. Those are the moments I am most proud of — when we worked through an awkward discord and built something better.

WHERE DO YOU FIND INSPIRATION?
I have a fairly extensive library of Pinterest boards — mostly paintings — but it’s real life and being in the moment that I find more interesting. The color of a green leaf at night under a sodium vapor light, or how sunlight gets twisted by a plastic water bottle — that is what I find so cool. Why ruin that with an Insta post?

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
FilmLight Baselight’s Base Grade, FilmLight Baselight’s Texture Equalizer and my Red Hydrogen.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Instagram mostly.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
After working all day on a film, I often don’t feel like watching another movie when I get home because I’ll just be thinking about the color. I usually unwind with a video game, book or podcast. The great thing about a book or a video game is that it demands 100% of your attention. You can’t simultaneously browse social media or the news or think about work. You have to be 100% in the moment, and it really resets your brain.

FXhome’s HitFilm Express 14, ‘Pay What You Want’ option

FXhome has a new “Pay What You Want” goodwill program inspired by the HitFilm Express community’s requests to help pay for development of the historically free video editing and VFX software. Pay What You Want gives users the option to contribute financially, ensuring that those funds will be allocated to future development and improvements to HitFilm.

Additionally, FXhome will contribute a percentage of the proceeds of Pay What You Want to organizations dedicated to global causes important to the company and its community. At its launch, the FXhome Pay What You Want initiative will donate a portion of its proceeds to the WWF and the Australia Emergency Bushfire Fund. The larger the contribution from customers, the more FXhome will donate.

HitFilm Express remains a free download; however, first-time customers will now have the option to “Pay What You Want” for the software. They’ll also receive some exclusive discounts on HitFilm add-on packs and effects.

Coinciding with the release of Pay What You Want, FXhome is releasing HitFilm Express 14, the first version of HitFilm Express to be eligible for the Pay What You Want initiative. HitFilm Express 14 features a new and simplified export process, new text controls, a streamlined UI and a host of new features.

For new customers who would like to download HitFilm Express 14 and also contribute to the Pay What You Want program, there are three options available:

• Starter Pack Level: With a contribution of as little as $9, new HitFilm Express 14 customers will also receive a free Starter Pack of software and effects that includes:
o Professional dark mode interface
o Edit tools including Text, Split Screen Masking, PiP, Vertical Video, Action Cam Crop
o Color tools including Exposure, Vibrance, Shadows and Highlights, Custom Gray, Color Phase, Channel Mixer and 16-bit color
o Additional VFX packs including Shatter, 3D Extrusion, Fire, Blood Spray and Animated Lasers
• Content Creator Level: With contributions of $19 or more, users will receive everything included in the Starter Pack, as well as:
o Edit: Repair Pack with Denoise, Grain Removal and Rolling Shutter
o Color: LUT Pack with LUTs and Grading Transfer
o Edit: Beautify Pack with Bilateral Blur and Pro Skin Retouch
• VFX Artist Level: Users who contribute from $39 to $99 get everything in the Starter Pack and Content Creator levels plus:
o Composite Toolkit Pack with Wire Removal, Projector, Clone and Channel Swapper
o Composite Pro-Keying Pack for Chroma Keying
o Motion Audio Visual Pack with Atomic Particles, Audio Spectrum and Audio Waveform
o VFX Neon Lights Pack with Lightsword Ultra (2-Point Auto), Lightsword Ultra (4-Point Manual), Lightsword Ultra (Glow Only) and Neon Path
o VFX Lighting Pack with Anamorphic Lens Flares, Gleam, Flicker and Auto Volumetrics

What’s new in HitFilm Express 14
HitFilm Express 14 adds a number of VFX workflow enhancements to enable even more sophisticated effects for content creators, including a simplified export workflow that allows users to export content directly from the timeline and comps, new text controls, a streamlined UI and a host of new features. Updates include:

• Video Textures for 3D Models: Creators who already have the 3D: Model Render Pack can now use a video layer as a texture on a 3D model to add animated bullet holes, cracked glass or changing textures.
• Improvements to the Export Process: In HitFilm Express 14, the Export Queue is now an Export Panel and is much easier to use. Exporting can also be done directly from the timeline and from comps. These “in-context” exports will export either the content between the In and Out points, if set, or the entire timeline, using the current default preset (which can be changed from the menu).
• Additional Text Controls: Customizing text in HitFilm Express 14 is now even simpler, with Text panel options for All Caps, Small Caps, Subscript and Superscript. Users can also change the character spacing, horizontal or vertical scale, as well as baseline shift (for that Stranger-Things-style titling).
• Usability and Workflow Enhancements: In addition to the new and improved export process, FXhome has made changes to the interface that further simplify the post production process, including a new “composite button” in the media panel plus new double-click and keyboard shortcuts. A new Masking feature adds automation to the workflow: when users double-click the Rectangle or Ellipse tools, a centered mask is automatically placed to fill the center of the screen. Masks are also automatically assigned colors, which can be changed to more easily identify different masks.
• Effects: Users can now double-click the effects panel to apply to the selected layer and drop 2D effects directly onto layers in the viewer. Some effects — such as the Chroma Key and Light Flares — can be dropped on a specific point, or users can select a specific color to key by. Users can also now favorite “effects” for quick and easy access to their five most recently used effects from the ‘Effects’ menu in the toolbar.
• Additional Improvements: Users can now use Behavior effects from the editor timeline, click-drag across multiple layers to toggle “solo,” “locked” or “visibility” settings in one action, and access templates directly from the media panel with the new Templates button. Menus have also been added to the tab of each panel to make customization of the interface easier.
• Open Imerge Pro files in HitFilm: Imerge Pro files can now be opened directly from HitFilm as image assets. Any changes made in the Imerge Pro project will be automatically updated with any save, making it easier to change image assets in real time.
• Introducing Light Mode: The HitFilm Express interface is now available in Light Mode and will open in Light Mode the first time you open the software. Users with a pre-existing HitFilm Express license can easily change back to the dark theme if desired.

HitFilm Express 14 is available immediately and is a free download. Customers downloading HitFilm Express 14 for the first time are eligible to participate in the new Pay What You Want initiative. Free effects and software packs offered in conjunction with Pay What You Want are only available at initial download of HitFilm Express 14.

Rob Legato talks The Lion King‘s Oscar-nominated visual effects

By Karen Moltenbrey

There was a lot of buzz before — and after — this summer’s release of Disney’s remake of the animated classic The Lion King. And what’s not to love? From the animals to the African savannas, Disney brought the fabled world of Simba to life in what is essentially a “live-action” version of the beloved 1994 2D feature of the same name. Indeed, the filmmakers used tenets of live-action filmmaking to create The Lion King, and themselves call it a visual effects film. However, there are those who consider this remake, like the original, an animated movie, as 2019’s The Lion King used cutting-edge CGI for the photoreal beasts and environments.

Rob Legato

Whether you call it “live action” or “animation,” one thing’s for sure. This is no ordinary film. And, it was made using no ordinary production process. Rather, it was filmed entirely in virtual reality. And it’s been nominated for a Best Visual Effects Oscar this year.

“Everything in it is a visual effect, created in the same way that we would make a visual effects-oriented film, where we augment or create the backgrounds or create computer-generated characters for a scene or sequence. But in this case, that spanned the entire movie,” says VFX supervisor Rob Legato. “We used a traditional visual effects pipeline and hired MPC, which is a visual effects studio, not an animation house.”

MPC, which created the animals and environments, crafted all of the CG elements and handled the virtual production, working with Magnopus to develop the tools that would take the filmmakers from previz through shooting and, eventually, into post production. Even the location scouting occurred within VR, with Legato, director Jon Favreau and others, including cinematographer Caleb Deschanel, simultaneously walking through the sets and action by using HTC Vive headsets.

Caleb Deschanel (headset) and Rob Legato. Credit: Michael Legato

The Animations and Environments
MPC, known for its photorealistic animals and more, had worked with Disney and Favreau on the 2016 remake of The Jungle Book, which was shot within a total greenscreen environment and used realistic CG characters and sets with the exception of the boy Mowgli. (It also used VR, albeit for previsualization only.) The group’s innovative effort for that work won an Oscar for visual effects. Apparently that was just the tip of the spear, so to speak, as the team upped its game with The Lion King, making the whole production entirely CG and taking the total filmmaking process into virtual reality.

“It had to look as believable as possible. We didn’t want to exaggerate the performances or the facial features, which would make them less realistic,” says Legato of the animal characters in The Lion King.

The CG skeletons were built practically bone for bone to match their real-life counterparts, and the digital fur matched the hair variations of the various species found in nature. The animators, meanwhile, studied the motion of the real-life animals and moved the digital muscles accordingly.

“Your eye picks up when [the animal] is doing something that it can’t really do, like if it stretches its leg too far or doesn’t have the correct weight distribution that’s affecting the other muscles when it puts a paw down,” says Legato, contending that it is almost impossible to tell the CG version of the characters from the real thing in a non-talking shot or a still frame.

To craft the animals and environments, the MPC artists used Autodesk’s Maya as the main animation program, along with SideFX Houdini for water and fire simulations and Pixar’s RenderMan for rendering. MPC also used custom shaders and tools, particularly for the fur, mimicking that of the actual animal. “A lion has so many different types of hair — short hair around the body, the bushy mane, thick eyebrow hairs and whiskers. And every little nuance was recreated and faithfully reproduced,” Legato adds.

MPC artists brought to life dozens and dozens of animals for the film and then generated many more unique variations — from lions to mandrills to hyenas to zebras and more, even birds and bugs. And then the main cast and background animals were placed within a photoreal environment, where they were shot with virtual cameras that mimicked real cameras.

The world comprises expansive, open landscapes. “There were many, many miles of landscapes that were constructed,” says Legato. The filmmakers would film within pockets that were dressed and populated for different scenes, from Pride Rock to the interior of a cave to the savanna to the elephant graveyard — all built in CGI.

“Everything was simulated to be the real thing, so the sum total of the illusion is that it’s all real. And everything supports each other — the grounds, the characters, what they are physically doing. The sum total of that adds up to where your brain just says, ‘OK, this must be real. I’ll stop looking for flaws and will now just watch the story,’” says Legato. “That was the creative intent behind it.”

Virtual Production
All the virtual camera work was accomplished within Unity’s engine, so all the assets were ported in and out of that game engine. “Everyone would then know where our cameras were, what our camera moves were, how we were following the action, our lens choices, where the lights were placed … all those things,” says Legato.

Magnopus created the VR tools specific for the film, which ran on top of Unity to get the various work accomplished, such as the operation of the cameras. “We had a crane, dolly and other types of cameras encoded so that it basically drove its mate in the computer. For instance, we created a dolly and then had a physical dolly with encoders on it, so everything was hand operated, and we had a dolly grip and a camera assistant pulling focus. There was someone operating the cameras, and sometimes there was a crane operator. We did SteadiCam as well through an actual SteadiCam with a sensor on it to work with OptiTrack [motion capture that was used to track the camera],” explains Legato. “We built a little rig for the SteadiCam as well as one for a drone we’d fly around the stage, and we’d create the illusion that it was a helicopter shot while flying around Africa.”
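
Magnopus’ tools aren’t public, but the basic encoder-to-virtual-camera idea is simple to sketch. In this hypothetical Python illustration (the tick scales and update loop are invented), raw counts from a dolly encoder and a pan wheel are remapped each frame onto the virtual camera that “mates” with the physical rig:

from dataclasses import dataclass

# Illustrative only: maps raw encoder ticks from a physical dolly and a
# pan wheel onto a virtual camera, the way an encoded rig drives its
# "mate in the computer." The scale factors are invented for the example.
TICKS_PER_METER = 4000.0     # hypothetical dolly-track encoder resolution
TICKS_PER_DEGREE = 120.0     # hypothetical pan-wheel encoder resolution

@dataclass
class VirtualCamera:
    x: float = 0.0           # meters along the virtual dolly track
    pan: float = 0.0         # degrees of yaw

    def update_from_encoders(self, dolly_ticks: int, pan_ticks: int) -> None:
        self.x = dolly_ticks / TICKS_PER_METER
        self.pan = pan_ticks / TICKS_PER_DEGREE

cam = VirtualCamera()
for dolly_ticks, pan_ticks in [(0, 0), (2000, 360), (8000, 1440)]:
    cam.update_from_encoders(dolly_ticks, pan_ticks)   # once per frame
    print(f"x={cam.x:.2f} m, pan={cam.pan:.1f} deg")   # handed to the engine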

Because the area within VR was so vast, a menu system was created so the filmmakers could locate one another within the virtual environment, making location scouting much easier. They also could take snapshots of different areas and angles and share them with the group. “We were standing next to each other [on stage], but within the virtual environment, we could be miles apart and not see each other because we’re maybe behind trees or rocks.”

As Legato points out, the menu tool is pretty robust. “We basically built a game of film production. Everything was customizable,” he says. Using iPads, the group could play the animation. As the camera was in operation, they could stop the animation, wind it backward, speed it forward, shoot it in slow motion or faster motion. “These options were all accessible to us,” he adds.

Legato provides the following brief step-by-step overview of how the virtual production occurred. First, the art department created the sets — Africa with the trees, ponds, rivers, mountains, waterfalls and so forth. “Based on the script, you know somewhat where you need to be [in the set],” he says. Production designer James Chinlund would make a composite background, and then they — along with Favreau, Deschanel and animation supervisor Andrew Jones — would go into VR.

“We had built these full-size stationary chess pieces of the animals, and in VR, we’d have these tools that let us grab a lion, for instance, or a meerkat, and position them, and then we’d look through the lens and start from there,” says Legato. “We would either move them by hand or puppeteer a simple walk cycle to get the idea of the blocking.”

Jones and his team would animate that tableau and port it back into the game engine as an animation cycle. “We’d find camera angles and augment them. We’d change some of the animation or slow it down or move the animals in slightly different positions. And then we’d shoot it like it’s on a live-action stage,” explains Legato. “We’d put a dolly track down, cover the action with various types of lenses, create full-coverage film dailies… We could shoot the same scene in as many different angles as we’d wish. We could then play it out to a video deck and start editing it right away.” The shots they liked might get rendered with more light or motion blur, but a lot of the time, they’d go right off the video tap.

Meanwhile, MPC recorded everything the filmmakers did and moved — every leaf, rock, tree, animal. Then, in post, all of that information would be reconverted back into Maya sets and the animation fine-tuned.

“In a nutshell, the filmmakers were imparting a live-action quality to the process — by not faking it, but by actually doing it,” says Legato. “And we still have the flexibility of full CGI.”

The Same, But Different
According to Legato, it did not take the group long to get the hang of working in VR. And the advantages are many — chief among them, time savings when it comes to planning and creating the sequence editorially, and then instantly being able to reshoot or iterate the scene inexpensively. “There is literally no downside to exploring a bold choice or an alternate angle on the concept,” he points out.

Yes, virtual filmmaking is the future, contends Legato.

So, back to the original question: Is The Lion King a VFX film or an animated film? “It’s perhaps a hybrid,” says Legato. “But, if you didn’t know how we did it and if the animals didn’t talk, you’d think it was done in the traditional manner of a live-action film. Which it is, visually speaking. You wouldn’t necessarily describe it as looking like ‘an animated film’ because it doesn’t really look like an animated film, like a Pixar or DreamWorks movie. By labeling it as such, you’re putting it into a hole that it’s not. It’s truly just a movie. How we achieved it is immaterial, as it should be.”

Legato and his colleagues call it “live action,” which it truly is. But some, including the Golden Globes, categorized it as “animation.” (They also called 2015’s The Martian and 2010’s The Tourist “comedies.”)

Call it what you will; the bottom line is that the film is breathtaking and the storytelling is amazing. And the filmmaking is inventive and pushes traditional boundaries, making it difficult to perhaps fit into a traditional category. Therefore, “beautiful,” “riveting,” “creative” and “innovative” might be the only descriptions necessary.


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

Check out MPC’s VFX breakdown on the film:

Joce Capper joins Cinelab London as creative director

Film lab and post facility Cinelab London has brought on Joce Capper as creative director/strategic business development. Capper has over 25 years of experience managing and growing post and VFX companies, most notably serving as managing director of Rushes, one of the UK’s most widely admired post houses, for 20 years.

Capper’s open approach to leading talent, staff and new technologies, combined with her resourceful understanding of the scalable processes needed to meet budgets and timescales, has seen her play a huge part in delivering many multi-award-winning projects for leading brands and clients across the advertising, music, entertainment and feature film sectors during her career.

Since the closure of Rushes, Capper has spent the past 18 months in management consulting for Supernova Heights. In this role, she worked across multiple companies in the media industry, including Cinelab London. Now, she will take up her role there full-time.

“I’ve learned a huge amount about myself in the last year,” says Capper. “How much I enjoy the creative industries, and that being involved in productions with creative staff is key to my happiness. I am self-motivated, but I need to be part of a friendly and talented team, be somewhere I can make a difference and believe wholeheartedly in what the company is offering.”

Capper will work with the executive management and sales teams to continue to grow the profile of Cinelab London, its client base and its international reach. Part of her responsibility will also be to help educate the industry and the next generation of filmmakers on the skills and craft needed when shooting on film.

“I have known Joce for many years, previously working with her in operational and corporate roles at Ascent Media and Deluxe Entertainment,” explains Adrian Bull, co-founder/CEO of Cinelab London. “She will help us push the business forward; I am delighted to have her working with us full-time.”

Nomad Editorial hires eclectic editor Dan Maloney

Nomad Editing Company has added editor Dan Maloney to its team. Maloney is best known for his work cutting wry, eclectic comedy spots in addition to more emotional content. While his main tool is Avid Media Composer, he is also well-versed in Adobe Premiere.

“I love that I get to work in so many different styles and genres. It keeps it all interesting,” he says.

Prior to joining Nomad, Maloney cut at studios such as Whitehouse Post, Cut+Run, Spot Welders and Deluxe’s Beast. Throughout his career, Maloney has used his eye for composition on a wide range of films, documentaries, branded content and commercials, including the Tide Interview spot that debuted at Super Bowl XLII.

“My editing style revolves mostly around performance and capturing that key moment,” he says. “Whether I’m doing a comedic or dramatic piece, I try to find that instance where an actor feels ‘locked in’ and expand the narrative out from there.”

According to Nomad editor/partner Jim Ulbrich, “Editing is all about timing and pace. It’s a craft and you can see Dan’s craftsmanship in every frame of his work. Each beat is carefully constructed to perfection across multiple mediums and genres. He’s not simply a comedy editor, visual storyteller, or doc specialist. He’s a skilled craftsman.”

Adobe Premiere Productions: film projects, collaborative workflows

By Mike McCarthy

Adobe has announced a new set of features coming to its NLE, Premiere Pro. Premiere now supports “Productions,” which allow easier management of sets of projects shared between different users. The announcement, which came during the Sundance Film Festival, is targeted at filmmakers working on large-scale projects with teams of people collaborating on site.

Productions extends and refines Premiere’s existing “Shared Project” model, making it easier to manage work spread across a large number of individual projects, which can become unwieldy with the current implementation.

Shared Projects should not be confused with Team Projects, an online project-sharing toolset for teams in different locations that each have their own local media, or with Adobe Anywhere, a cloud-based streaming editing platform with no local files. Shared Projects are used between users on a local network, usually with high-quality media, with simple mechanisms for passing work between them. Shared Projects were introduced in Premiere Pro 2018 and included three components. Here, I’m going to tell you what the issues were and how the new Productions feature solves them:

1) The ability to add a shortcut to another project into the project panel, which was next to useless. The projects were in no other way connected with each other, and incrementing the target project to a new name (V02) broke the link. The only benefit was to see who might have the shortcut-ed project open and locked, which brings us to:

2) The ability to lock projects that are open on one system, preventing other users from inadvertently editing them at the same time and overwriting each other’s work. This should have been added long ago; it was previously managed through a process called “shout down the hall” before opening projects. (A toy sketch of this kind of lock-file handshake follows this list.)

3) And most significantly, the ability to open more than one project at the same time. The previous approach was to import other projects into your existing project, but this resulted in massive project files that took forever to load, among other issues. Opening more than one project at once allows a production to be broken into smaller individual projects, so different people can more easily work on different parts at the same time.
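
To make the locking idea concrete, here is a toy version of the kind of lock-file handshake an NLE can use on a shared volume. It illustrates the general technique only; it is not Adobe’s actual mechanism, and the project filename is hypothetical:

import getpass
import os

# Toy illustration of project locking on a shared volume. A sidecar
# ".lock" file records who has the project open; everyone else gets
# the project read-only until the lock is released.
def open_project(path):
    lock = path + ".lock"
    try:
        # O_EXCL makes creation atomic: only one editor can win the lock.
        fd = os.open(lock, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.write(fd, getpass.getuser().encode())
        os.close(fd)
        return "read-write"
    except FileExistsError:
        with open(lock) as f:
            holder = f.read()
        print(f"{path} is locked by {holder}; opening read-only.")
        return "read-only"

def close_project(path):
    os.remove(path + ".lock")   # release so the next editor can write

mode = open_project("reel_03.prproj")   # hypothetical project file
if mode == "read-write":
    close_project("reel_03.prproj")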

For the last two-plus years, large films have been able to break down their work into many smaller projects and distribute those projects between numerous users who are working on various parts. And those users can pass the pieces back and forth without concern for overwriting each other’s work. But there was no central way to control all of those projects, and the master project/Shared Project shortcut system required you either not to version your projects (bad file management) or to re-link every project version to the master project (tedious).

You also end up with lots of copies of your media, because every time an asset is used in a different project, a new copy is created in that project. If you update or edit an asset in one project, it won’t change the copies used in other projects (master clip effects, relinking, reinterpreting footage, proxies, etc.).

Problems Solved
Premiere’s new Production Panel and tool set are designed to solve those problems. First, it gives you a panel to navigate and explore all of the projects within your entire production, however you structure them within your master project folder. You can see who has what open and when.

When you copy an asset into a sequence from another project, it maintains a reference to the source project, so subsequent changes to that asset (color correction, attaching full-res media, etc.) can propagate to the instance in the sequence of the other project, provided both projects are open concurrently so they can sync.

If the source project can’t be found, the child instance is still a freestanding piece of media that fully functions; it just no longer receives synchronized updates from the master copy. (So you don’t have a huge web of interconnected projects that will all fail if one of them is corrupted or deleted.)
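
Conceptually, each copied clip behaves like a cached reference: it syncs while its source project is reachable and keeps functioning from its last-known state when it isn’t. A minimal sketch of that pattern (the class and field names are invented, not Adobe’s data model):

# Sketch of the parent/child clip model described above: invented names,
# not Adobe's data model. The child syncs when its source project is
# reachable and otherwise keeps working from its last-known state.
class ClipReference:
    def __init__(self, source_project, clip_id, cached_grade):
        self.source_project = source_project   # e.g. "dailies_day01.prproj"
        self.clip_id = clip_id
        self.grade = cached_grade              # freestanding local copy

    def sync(self, open_projects):
        # Propagate changes only if the source is open concurrently.
        src = open_projects.get(self.source_project)
        if src is not None:
            self.grade = src.grades[self.clip_id]
        # If not, nothing breaks: we simply keep the cached grade.

class Project:
    def __init__(self, grades):
        self.grades = grades

dailies = Project(grades={"A001_C003": "warm, +0.2 exposure"})
child = ClipReference("dailies_day01.prproj", "A001_C003", "ungraded")
child.sync({"dailies_day01.prproj": dailies})   # source open: grade updates
child.sync({})                                  # source closed: cache survives
print(child.grade)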

All projects in a Production have the same project settings (scratch disks, GPU renderer, etc.), keeping them in sync and allowing you to update those settings across the production and share render files between users. And all files are stored on your local network for maximum performance and security.

In practice, this allows all of the source media to live in dedicated “dailies” projects, possibly a separate project for every day of filming. Then each scene or reel can be its own project, with every instance in its sequences referencing back to a master file in the dailies. Different editors and assistants can edit different scenes, and all of them can have any source project open concurrently in read-only mode without conflict. As soon as someone saves changes, an icon alerts other users that they can update the copy they have open and unlock it to continue working.

Some Limitations
Moving a sequence from one project to another doesn’t retain a link to the original because that could become a mess quickly. But it would be nice to be able to make edits to “reels” and have those changes reflected in a long-play project that strings those reels together. And with so many projects open at once, it can become difficult to keep track of what sequences go with what project panels.

Ideally, a color-coded panel system would help with that, either with random colors for contrast or with user-assigned colors by type of project. In that case it would still be good to highlight what other panels are associated with the selected panel, since two projects might be assigned the same color.

Summing Up
Regardless of those potential changes, I have been using Shared Projects to its fullest potential on a feature film throughout 2019, and I look forward to the improvements that the new Production panel will bring to my future workflows.

Check out this video rundown:


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Behind the Title: Sound Lounge ADR mixer Pat Christensen

This ADR mixer was a musician as a kid and took engineering classes in college, making him perfect for this job.

Name: Pat Christensen

Company: Sound Lounge (@soundloungeny)

What’s your job title?
ADR mixer

What does Sound Lounge do?
Sound Lounge is a New York City-based audio post facility. We provide sound services for TV, commercials, feature films, television series, digital campaigns, games, podcasts and other media. Our services include sound design, editing and mixing; ADR recording and voice casting.

What does your job entail?
As an ADR mixer, I re-record dialogue for film and television. ADR is necessary when dialogue cannot be recorded properly on the set, when lines change for creative reasons or when additional dialogue is needed. My stage is set up differently from a standard mix stage, as it includes a voiceover booth for actors.

We also have an ADR stage with a larger recording environment to support groups of talent. The stage also allows us to enhance sound quality and record performances with greater dynamics, high and low. The recording environment is designed to be “dead,” that is, without ambient sound. That results in a clean recording, so when it gets to the next stage, the mixer can add reverb or other processing to make it fit the environment of the finished soundtrack.
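
That division of labor is easy to demonstrate: because nothing is baked into a dead recording, the line can later be “placed” in any space by convolving it with that space’s impulse response. A minimal sketch of the idea (the audio files are hypothetical, and both are assumed to be mono at the same sample rate):

import numpy as np
from scipy.io import wavfile
from scipy.signal import fftconvolve

# Why ADR stages are kept "dead": a dry recording can later be placed in
# any space by convolving it with that space's impulse response (IR).
# Filenames are hypothetical; both are assumed mono at the same rate.
rate, dry = wavfile.read("adr_line_dry.wav")
_, ir = wavfile.read("stairwell_ir.wav")

dry = dry.astype(np.float64)
ir = ir.astype(np.float64)

wet = fftconvolve(dry, ir)                          # the line, now "in" the stairwell
wet *= np.max(np.abs(dry)) / np.max(np.abs(wet))    # normalize the level

mix = 0.7 * dry + 0.3 * wet[: len(dry)]             # blend dry/wet to taste
wavfile.write("adr_line_placed.wav", rate, mix.astype(np.int16))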

What would people find most surprising about your job?
People who aren’t familiar with ADR are often surprised that it’s possible to make an actor’s voice lipsync perfectly with the image on screen and sound indistinguishable from dialogue recorded on the day.

What’s your favorite part of the job?
Interacting with people — the sound team, the director or the showrunner, and the actors. I enjoy helping directors in guiding the actors and being part of the creative process. I act as a liaison between the technical and creative sides. It’s fun and it’s different every day. There’s never a boring session.

What’s your least favorite?
I don’t know if there is one. I have a great studio and all the tools that I need. I work with good people. I love coming to work every day.

What’s your most productive time of the day?
Whenever I’m booked. It could be 9am. It could be 7am. I do night sessions. When the client needs the service, I am ready to go.

If you didn’t have this job, what would you be doing instead?
In high school, I played bass in a punk rock band. I learned the ins and outs of being a musician while taking classes in engineering. I also took classes in automotive technology. If I’d gone that route, I wouldn’t be working in a muffler shop; I’d be fine-tuning Formula 1 engines.

How early on did you know that sound would be your path?
My mom bought me a four-string Washburn bass for Christmas when I was in the eighth grade, but even then I was drawn to the technical side. I was super interested in learning about audio consoles and other gear and how they were used to record music. Luckily, my high school offered a radio and television class, which I took during my senior year. I fell in love with it from day one.

Silicon Valley

What are some of your recent projects?
I worked on the last season of HBO’s Silicon Valley and the second season of CBS’ God Friended Me. We also did Starz’s Power and the new Adam Sandler movie Palm Springs. There are many more credits on my IMDB page. I try to keep it up-to-date.

Is there a project that you’re most proud of?
Power. We’ve done all seven seasons. It’s been exciting to watch how successful that show has become. It’s also been fun working with the actors and getting to know many of them on a personal level. I enjoy seeing them whenever they come in. They trust me to bridge the gap between the booth and the original performance and deliver something that will be seen, and heard, by millions of people. It’s very fulfilling.

Name three pieces of technology you cannot live without.
A good microphone, a good preamp and good speakers. The speakers in my studio are ADAM A7Xs.

What social media channels do you follow?
Instagram and Facebook.

What do you do to relax?
I play hockey. On weekends, I enjoy getting on the ice, expending energy and playing hard. It’s a lot of fun. I also love spending time with my family.

Quick Chat: Director Sorrel Brae on Rocket Mortgage campaign

By Randi Altman

Production company Native Content and director Sorrel Brae have collaborated once again with Rocket Mortgage’s in-house creative team on two new spots in the ongoing “More Than a House” campaign. Brae and Native had worked together on the campaign’s first four offerings.

The most recent spots are More Than a Tradition and More Than a Bear. More Than a Tradition shows a ‘50s family sitting down to dinner and having a fun time at home. Then the audience sees the same family in modern times, hammering home how traditions become traditions.

More Than a Bear combines fantasy and reality as it shows a human-sized teddy bear on an operating table. Then viewers see a worried boy looking on as his mother repairs his stuffed animal. Each spot opens with the notes of Bob Dylan’s “The Man In Me,” which is featured in all the “More Than a House” spots.

More Than a Bear was challenging, according to Brae, because there was some darker material in this piece as compared to the others — viewers aren’t sure at first if the bear will make it. Brae worked closely with DP Jeff Kim on the lighting and color palette to find a way to keep the tone lighthearted. By embracing primary colors, the two were able to channel a moodier tone and bring viewers inside a scared child’s imagination while still maintaining some playfulness.

We reached out to director Brae to find out more.

Sorrel Brae

What did you shoot these two spots on, and why?
I felt that in order for the comedy to land and the idea to shine, the visual separation between fantasy and reality had to be immediate, even shocking. Shooting on an Alexa Mini, we used different lenses for the two looks: Hawk V-Lite Vintage ’74 anamorphic for epic and cinematic fantasy, and spherical Zeiss and Cooke S4 primes for reality. The notable exception was in the hospital for the teddy bear spot, where our references were the great Spielberg and Zemeckis films from the ‘80s, which are primarily spherical and have a warmer, friendlier feeling.

How did you work with the DP and the colorist on the look? And how would you describe the look of each spot, and the looks within each spot? 
I was fortunate to bring on longtime collaborators DP Jeffrey Kim and colorist Mike Howell for both spots. Over the years, Jeff and I have developed a shorthand for working together. It all starts with defining our intention and deciding how to give the audience the feelings we want them to have.

In Tradition, for example, that feeling is a warm nostalgia for a bygone era that was probably a fantasy then, just as it is now. We looked to period print advertisements, photographs, color schemes, fonts — everything that spoke to that period. Crucial to pulling off both looks in one day was Heidi Adams’ production design. I wanted the architecture of the house to match when cutting between time periods. Her team had to put a contemporary skin on a 1950s interior for us to shoot “reality” and then quickly reset the entire house back to 1950s to shoot “fantasy.”

The intention for More Than a Bear was trickier. From the beginning I worried a cinematic treatment of a traumatic hospital scene wouldn’t match the tone of the campaign. My solution with Jeff was to lean into the look of ‘80s fantasy films like E.T. and Back to the Future with primary colors, gelled lights, a continuously moving camera and tons of atmosphere.

Mike at Color Collective even added a retro Ektachrome film emulation for the hospital and a discontinued Kodak 5287 emulation for the bedroom to complete the look. But the most fun was the custom bear that costume designer Bex Crofton-Atkins created for the scene. My only regret is that the spot isn’t 60 seconds because there’s so much great bear footage that we couldn’t fit into the cut.

What was this edited on? Did you work with the same team on both campaigns?
The first four spots of this campaign were cut by Jai Shukla out of Nomad Edit. Jai did great work establishing the rhythm between fantasy and reality and figuring out how to weave in Bob Dylan’s memorable track for the strongest impact. I’m pretty sure Jai cuts on Avid, which I like to tease him about.

These most recent two spots (Tradition and Teddy Bear) were cut by Zach DuFresne out of Hudson Edit, who did an excellent job navigating scripts with slightly different challenges. Teddy Bear has more character story than any of the others, and Tradition relies heavily on making the right match between time periods. Zach cuts on Premiere, which I’ve also migrated to (from FCP 7) for personal use.

Were any scenes more challenging than the others?
What could be difficult about kids, complex set design, elaborate wardrobe changes and detailed camera moves on a compressed schedule? In truth, it was all equally challenging and rewarding.

Ironically, the shots that gave us the most difficulty probably look the simplest. In Tradition there’s a Steadicam move that introduces us to the contemporary world, has match cuts on either end and travels through most of the set and across most of the cast. Because everyone’s movements had to perfectly align with a non-repeatable camera, that one took longer than expected.

And on Teddy Bear, the simple shot looking up from the patient’s POV as the doctor/mom looms overhead was surprisingly difficult. Because we were on an extremely wide lens (12mm or similar), our actress had to nail her marks down to the millimeter, otherwise it looked weird. We probably shot that one setup 20 times.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Review: FilmConvert Nitrate for film stock emulation

By Brady Betzel

If you’ve been around any sort of color grading forum or conference, you’ve definitely heard some version of this: Film is so much better than digital. While I don’t completely disagree with the sentiment, let’s be real. We are in a digital age, and the efficiency and cost of digital recording are, in most cases, far superior to film’s.

Personally, I love the way film looks; it has an essence that is very difficult to duplicate — from the highlight roll-offs to the organic grain — but it is very costly. That essence is why film is so hard to imitate digitally, and why so many companies try and often fail.

Sony A7iii footage

One company that has had grassroots success with digital film stock emulation is FilmConvert. The original plugin, known as FilmConvert Pro, works with Adobe’s Premiere and After Effects, Avid Media Composer and as an OFX plugin for apps like Blackmagic’s DaVinci Resolve.

Recently, FilmConvert expanded its lineup with the introduction of Nitrate, a film emulation plugin that can take Log-based video and transform it into fully color-corrected media with a natural grain similar to that of commonly loved film stocks. Currently, Nitrate works with Premiere and After Effects, with an OFX version for Resolve. A plugin for FCPX is coming in March.

The original FilmConvert Pro plugin works great, but it adjusts your image through an sRGB pipeline. That means FilmConvert Pro applies its color effects after your “base” grade is locked in, all while living in an sRGB world. Although you download camera-specific “packs” that apply the film emulation — custom-made for your sensor and color space — you are still locked into an sRGB pipeline, with little wiggle room. That can mean blown-out highlights and muddy shadows, with little ability to recover any detail.

Sony A7iii footage

I imagine FilmConvert Pro was introduced at a time when a lot of users shot with cameras like the Canon 5D or other sRGB cameras that weren’t shooting in a Log color space. Think of using a LUT and trying to adjust the highlights and shadows after the LUT; typically, you will have a hard time getting any detail back, losing dynamic range even if your footage was shot Log. But if you color before the LUT (think Log footage), you can typically recover a lot of information as long as your shot was exposed properly. That blown-out sky might be recoverable if it was shot in a Log color space. This is what FilmConvert is solving with its latest offering, Nitrate.
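To make that order-of-operations point concrete, here’s a minimal sketch — hypothetical pixel values, not FilmConvert’s actual code — of why grading before a clipping transform preserves highlight detail while grading after it can’t:

```python
import numpy as np

def display_lut(x):
    # Stand-in for a display LUT: clips anything above 1.0, like an sRGB pipeline.
    return np.clip(x, 0.0, 1.0)

log_pixels = np.array([0.8, 1.2, 1.6])  # "overbright" highlights from a Log source

graded_then_lut = display_lut(log_pixels * 0.6)  # pull exposure before the LUT
lut_then_graded = display_lut(log_pixels) * 0.6  # pull exposure after the LUT

print(graded_then_lut)  # [0.48 0.72 0.96] -- highlight separation survives
print(lut_then_graded)  # [0.48 0.6  0.6 ] -- detail above 1.0 was already crushed
```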

How It Works
FilmConvert’s Nitrate works in a Cineon-Log processing pipeline for its emulation, as well as a full Log image processing pipeline. This means your highlights and shadows are not being heavily compressed into an sRGB color space, which allows you to fine-tune them without losing as much detail. Simply put, the plugin works more naturally with your footage.
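For reference, here’s roughly what a Cineon-style log encode looks like, using the classic published constants (black at code 95, white at 685, 0.002 density per code value, 0.6 gamma) — a sketch of the general idea, not FilmConvert’s exact math:

```python
import numpy as np

def lin_to_cineon(lin):
    # Classic Cineon encode to 10-bit code values: white sits at 685, not at the top.
    code = 685.0 + np.log10(np.maximum(lin, 1e-6)) * (0.6 / 0.002)
    return np.clip(code, 0.0, 1023.0)

def cineon_to_lin(code):
    # Inverse: code 685 maps back to linear 1.0; codes above it hold overbrights.
    return 10.0 ** ((code - 685.0) * 0.002 / 0.6)

print(lin_to_cineon(np.array([0.18, 1.0, 2.0])))
# ~[461.6, 685.0, 775.3] -- a value at 2x white still gets its own distinct codes
```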

Among the other updates, FilmConvert has overhauled the GUI to be more natural and fluid. The color wheels have been redesigned, a new color tint slider has been added to quickly remove any green or magenta color cast, there is a new Color Curve control, and there is now a Grain Response curve.

Grain Response

The Grain Response curve takes adding grain to your footage up a notch. Not only can you select between 8mm and 35mm grain sizes (with many more in between), but you can also adjust how that grain is applied from shadows to highlights. If you want your highlights to show more grain, just pull the Grain Response curve higher at that end. In the same window, you can adjust the grain size, softness, strength and saturation via sliders.
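As a toy illustration of the concept — my own sketch, not FilmConvert’s implementation — a grain response curve simply weights the grain per pixel by luminance, so raising the curve at the highlight end pushes more grain into bright areas:

```python
import numpy as np

rng = np.random.default_rng(0)

def apply_grain(image, strength=0.05, curve_x=(0.0, 0.5, 1.0), curve_y=(0.2, 0.6, 1.0)):
    # curve_x/curve_y define the response curve; this default ramps grain toward highlights.
    luma = image.mean(axis=-1, keepdims=True)     # rough per-pixel luminance
    response = np.interp(luma, curve_x, curve_y)  # grain weight read off the curve
    grain = rng.normal(0.0, 1.0, image.shape)     # white noise standing in for scanned grain
    return np.clip(image + strength * response * grain, 0.0, 1.0)

frame = rng.random((1080, 1920, 3))  # hypothetical RGB frame, values in [0, 1]
grained = apply_grain(frame)         # bright regions end up with more visible grain
```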

Among the 19 film emulation options, there are many unique and great-looking presets covering multiple brands and film stocks, from the “KD 5207 Vis3” to the “Plrd 600.” For instance, “KD 5207 Vis3” is based on Kodak Vision3 5207/250D, which Kodak’s website describes in more detail:

“Vision3 250D Film offers outstanding performance in the extremes of exposure — including increased highlight latitude, so you can move faster on the set and pull more detail out of the highlights in post. You’ll also see reduced grain in shadows, so you can push the boundaries of underexposure and still get outstanding results.”

One of my favorite emulations in Nitrate — “Fj Velvia 100” or Fujichrome Velvia 100 — is described on FilmConvert’s website:

“FJ Velvia 100 is based on the Fujichrome Velvia 100 photographic film stock. Velvia is a daylight-balanced color reversal film that provides brighter ultra-high-saturation color reproduction. The Velvia is especially suited to scenery and nature photography as well as other subjects that require precisely modulated vibrant color reproduction.”

Accurate Grain

FilmConvert’s website offers a full list of the 19 film stocks, as well as examples and detailed descriptions of each film stock.

Working With FilmConvert Nitrate
I used Nitrate strictly in Premiere Pro because the OFX version (specifically for Resolve) wasn’t available at the time of this review.

Nitrate works pretty well inside Premiere and, surprisingly, plays back fluidly — probably thanks to its GPU acceleration. Even with Sony a7 III UHD footage, Premiere was able to keep up with Lumetri Color layered underneath the FilmConvert Nitrate plugin. To be transparent, I tested Nitrate on a laptop with an Intel i7 CPU and an Nvidia RTX 2080 GPU, so that definitely helps.

At first, I struggled to see where I would fit FilmConvert’s Nitrate plugin into my normal workflow so I could color correct my own footage and add a grade later. However, when I started cycling through the different film emulations, I quickly saw that they were adding a lot of life to the images and videos. Whether it was the grain that comes from the updated 6K grain scans in Nitrate or the ability to identify which camera and color profile you used when filming via the downloadable camera packs, FilmConvert’s Nitrate takes well-colored footage and elevates it to finished film levels.

It’s pretty remarkable; I came in thinking FilmConvert was essentially a preset LUT plugin and wasn’t expecting it to be great. To my surprise, it was, and it will quickly and easily add an extra edge of professional polish to your footage.

Test 1
In my first test, I threw some clips I had shot on a Sony a7 III camera in UHD (SLog3/SGamut3) into a timeline, applied the FilmConvert Nitrate plugin and realized I needed to download the Sony camera pack. This pack was about 1GB, but others — like the Canon 5D Mark II — came in at just over 300MB. Not the end of the world, but if you have multiple cameras, you are going to need to download quite a few packs, and the download size will add up.

Canon 5D

I tried using just the Nitrate plugin to do color correction and film emulation from start to finish, but I found the tools a little cumbersome and not really my style. I am not the biggest fan of Lumetri’s color correction tools, but I used them to get a base grade and applied Nitrate over that. I tend to like to keep looks on their own layer, so coloring under Nitrate felt more natural to me.

A quick way to cycle through a bunch of looks is to apply Nitrate to an adjustment layer and hit the up or down arrows. As I was flicking through the different looks, I noticed that FilmConvert does a great job processing the film emulations for the specified camera. All of the emulations looked good with or without a color balance done ahead of time.

It’s like adding a LUT and then a grade all in one spot. I was impressed by how quickly this worked and how good they all looked. When I was done, I rendered my one-minute sequence out of Adobe Media Encoder, which took 45 seconds to encode a ProResHQ and 57 seconds for an H.264 at 10Mb/s. For reference, the uncolored version of this sequence took 1:17 for the ProResHQ and :56 for the H.264 at 10Mb/s. Interestingly, the Nvidia RTX 2080 GPU definitely kicked in more when the FilmConvert Nitrate effect was added — a definite plus.

Test 2
I also shot some clips using the Blackmagic Pocket Cinema Camera (BMPCC) and the Canon 5D Mark II. With the BMPCC, I recorded CinemaDNG files in the film color space, which is essentially Log. With the 5D, the videos were recorded as H.264 MOV files (unless you shoot with the Magic Lantern hack, which allows you to record in a raw format). I brought in the BMPCC CinemaDNG files via the Media Browser, imported the 5D MOVs and applied the FilmConvert Nitrate plugin to the clips. Keep in mind you will need to download and install those camera packs if you haven’t already.

Pocket Cinema Camera

For the BMPCC clips I identified the camera and model as appropriate and chose “Film” under profile. It seemed to turn my CinemaDNG files a bit too orange, which could have been my white balance settings and/or the CinemaDNG processing done by Premiere. I could swing the orange hue out by using the temperature control, but it seemed odd to have to knock it down to -40 or -50 for each clip. Maybe it was a fluke, but with some experimentation I got it right.

With the Canon 5D Mark II footage, I chose the corresponding manufacturer and model as well as the “Standard” profile. This worked as it should. But I also noticed some other options like Prolost, Marvel, VisionTech, Technicolor, Flaat and Vision Color — these are essentially color profiles people have made for the 5D Mark II. You can find them with a quick Google search.

Summing Up
In the end, FilmConvert’s Nitrate will elevate your footage. The grain looks smooth and natural, the colors in the film emulations put a modern take on nostalgic color corrections (without looking too cheesy), and most cameras are supported via downloadable packs. If you don’t have a large budget for a color grading session, you should be throwing $139 at FilmConvert for its Nitrate plugin.

Nitrate in Premiere

When testing Nitrate on a few different cameras, I noticed that it even made color matching between cameras a little bit more consistent. Even if you have a budget for color grading, I would still suggest buying Nitrate; it can be a great starting block to send to your colorist for inspiration.

Check out FilmConvert’s website and definitely follow them on Instagram, where they are very active and show a lot of before-and-afters from their users — another great source of inspiration.

Main Image: Two-year-old Oliver Betzel shot with a Canon 5D with KD P400 Ptra emulsion applied.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

67th MPSE Golden Reel Winners

By Dayna McCallum

The Motion Picture Sound Editors (MPSE) Golden Reel Awards shared the love among a host of films when handing out awards this past weekend at their 67th annual ceremony.

The feature film winners included Ford v Ferrari for effects/Foley, 1917 for dialogue/ADR, Rocketman for the musical category, Jojo Rabbit for musical underscore, Parasite for foreign-language feature, Toy Story 4 for animated feature, and Echo in the Canyon for feature documentary.

The Golden Reel Awards, recognizing outstanding achievement in sound editing, were presented in 23 categories, including feature films, long-form and short-form television, animation, documentaries, games, special venue and other media.

Academy Award-nominated producer Amy Pascal (Little Women) surprised Marvel’s Victoria Alonso when she presented her with the 2020 MPSE Filmmaker Award (re-recording mixer Kevin O’Connell and supervising sound editor Steven Ticknor were honorary presenters).

The 2020 MPSE Career Achievement Award was presented to Academy Award-winning supervising sound editor Cecelia “Cece” Hall by two-time Academy Award-winning supervising sound editor Stephen H. Flick.

“Business models, formats and distribution are all changing,” said MPSE president-elect Mark Lanza during the ceremony. “Original scripted TV shows have set a record in 2019. There were 532 original shows this year. This number is expected to surge in 2020. Our editors and supervisors are paving the way and making our product and the user experience better every year.”

Here is the complete list of winners:

Outstanding Achievement in Sound Editing – Animation Short Form

3Below: Tales of Arcadia

Netflix

Supervising Sound Editor: Otis Van Osten
Sound Designer: James Miller
Dialogue Editors: Jason Oliver, Carlos Sanches
Foley Artists: Aran Tanchum, Vincent Guisetti
Foley Editor: Tommy Sarioglou 

Outstanding Achievement in Sound Editing – Non-Theatrical Animation Long Form

Lego DC Batman: Family Matters

Warner Bros. Home Entertainment

Supervising Sound Editors: Rob McIntyre, D.J. Lynch
Sound Designer: Lawrence Reyes
Sound Effects Editor: Ezra Walker
ADR Editor: George Peters
Foley Editors: Aran Tanchum, Derek Swanson
Foley Artist: Vincent Guisetti

Outstanding Achievement in Sound Editing – Feature Animation

Toy Story 4

Walt Disney Studios Motion Pictures

Supervising Sound Editor: Coya Elliott
Sound Designer: Ren Klyce
Supervising Dialogue Editor: Cheryl Nardi
Sound Effects Editors: Kimberly Patrick, Qianbaihui Yang, Jonathon Stevens
Foley Editors: Thom Brennan, James Spencer
Foley Artists: John Roesch, MPSE, Shelley Roden, MPSE

Outstanding Achievement in Sound Editing – Non-Theatrical Documentary

Serengeti

Discovery Channel

Supervising Sound Editor: Paul Cowgill
Foley Editor: Peter Davies 
Music Editor: Alessandro Baldessari
Foley Artist: Paul Ackerman

Outstanding Achievement in Sound Editing – Feature Documentary

Echo in the Canyon

Greenwich Entertainment

Sound Designer: Robby Stambler, MPSE
Dialogue Editor: Sal Ojeda, MPSE

Outstanding Achievement in Sound Editing – Computer Cinematic

Call of Duty: Modern Warfare (2019)

Activision Blizzard

Audio Director: Stephen Miller
Supervising Sound Editor: Dave Rowe
Supervising Sound Designers: Charles Deenen, MPSE, Csaba Wagner
Supervising Music Editor: Peter Scaturro
Lead Music Editor: Ted Kocher
Principal Sound Designer: Stuart Provine
Sound Designers: Bryan Watkins, Mark Ganus, Eddie Pacheco, Darren Blondin
Dialogue Lead: Dave Natale
Dialogue Editors: Chrissy Arya, Michael Krystek
Sound Editors: Braden Parkes, Nick Martin, Tim Walston, MPSE, Brent Burge, Alex Ephraim, MPSE, Samuel Justice, MPSE
Music Editors: Anthony Caruso, Scott Bergstrom, Adam Kallibjian, Ernest Johnson, Tao-Ping Chen, James Zolyak, Sonia Coronado, Nick Mastroianni, Chris Rossetti
Foley Artists: Gary Hecker, MPSE, Rick Owens, MPSE

Outstanding Achievement in Sound Editing – Computer Interactive Game Play

Call of Duty: Modern Warfare (2019)

Infinity Ward

Audio Director: Stephen Miller
Senior Lead Sound Designer: Dave Rowe
Senior Lead Technical Sound Designer: Tim Stasica
Supervising Music Editor: Peter Scaturro
Lead Music Editor: Ted Kocher
Principal Sound Designer: Stuart Provine
Senior Sound Designers: Chris Egert, Doug Prior
Supervising Sound Designers: Charles Deenen, MPSE, Csaba Wagner
Sound Designers: Chris Staples, Eddie Pacheco, MPSE, Darren Blondin, Andy Bayless, Ian Mika, Corina Bello, John Drelick, Mark Ganus
Dialogue Leads: Dave Natale, Bryan Watkins, Adam Boyd, MPSE, Mark Loperfido
Sound Editors: Braden Parkes, Nick Martin, Brent Burge, Tim Walston, Alex Ephraim, Samuel Justice
Dialogue Editors: Michael Krystek, Chrissy Arya, Cesar Marenco
Music Editors: Anthony Caruso, Scott Bergstrom, Adam Kallibjian, Ernest Johnson, Tao-Ping Chen, James Zolyak, Sonia Coronado, Nick Mastroianni, Chris Rossetti
Foley Artists: Gary Hecker, MPSE, Rick Owens, MPSE

Outstanding Achievement in Sound Editing – Non-Theatrical Feature

Togo

Disney+

Supervising Sound Editors: Odin Benitez, MPSE, Todd Toon, MPSE
Sound Designer: Martyn Zub, MPSE
Dialogue Editor: John C. Stuver, MPSE
Sound Effects Editors: Jason King, Adam Kopald, MPSE, Luke Gibleon, Christopher Bonis
ADR Editor: Dave McMoyler
Supervising Music Editor: Peter “Oso” Snell, MPSE
Foley Artists: Mike Horton, Tim McKeown
Supervising Foley Editor: Walter Spencer

Outstanding Achievement in Sound Editing – Special Venue

Vader Immortal: A Star Wars VR Series “Episode 1”

Oculus

Supervising Sound Editors: Kevin Bolen, Paul Stoughton
Sound Designer: Andy Martin
Supervising ADR Editors: Gary Rydstrom, Steve Slanec
Dialogue Editors: Anthony DeFrancesco, Christopher Barnett, MPSE, Benjamin A. Burtt, MPSE
Foley Artists: Shelley Roden, MPSE, Jana Vance

Outstanding Achievement in Sound Editing – Foreign Language Feature

Parasite

Neon

Supervising Sound Editor: Choi Tae Young
Sound Designer: Kang Hye Young
Supervising ADR Editor: Kim Byung In
Sound Effects Editor: Kang Hye Young
Foley Artists: Park Sung Gyun, Lee Chung Gyu
Foley Editor: Shin I Na

Outstanding Achievement in Sound Editing – Live Action Under 35:00

Barry “ronny/lily”

HBO

Supervising Sound Editors: Sean Heissinger, Matthew E. Taylor
Sound Designer: Rickley W. Dumm, MPSE
Sound Effects Editor: Mark Allen
Dialogue Editors: John Creed, Harrison Meyle
Music Editor: Michael Brake
Foley Artists: Alyson Dee Moore, Chris Moriana
Foley Editors: John Sanacore, Clayton Weber

Outstanding Achievement in Sound Editing – Episodic Short Form – Music

Wu Tang: An American Saga “All In Together Now”

Hulu 

Music Editor: Shie Rozow

Outstanding Achievement in Sound Editing – Episodic Short Form – Dialogue/ADR

Modern Love “Take Me as I Am”

Prime Video

Supervising Sound Editor: Lewis Goldstein
Supervising ADR Editor: Gina Alfano, MPSE
Dialogue Editor: Alfred DeGrand

Outstanding Achievement in Sound Editing – Episodic Short Form – Effects / Foley

The Mandalorian “Chapter One”

Disney+

Supervising Sound Editors: David Acord, Matthew Wood
Sound Effects Editors: Bonnie Wild, Jon Borland, Chris Frazier, Pascal Garneau, Steve Slanec
Foley Editor: Richard Gould
Foley Artists: Ronni Brown, Jana Vance

Outstanding Achievement in Sound Editing – Student Film (Verna Fields Award)

Heatwave

National Film and Television School

Supervising Sound Editor: Kevin Langhamer

Outstanding Achievement in Sound Editing – Single Presentation

El Camino: A Breaking Bad Movie

Netflix

Supervising Sound Editors: Nick Forshager, Todd Toon, MPSE
Supervising ADR Editor: Kathryn Madsen
Sound Effects Editor: Luke Gibleon
Dialogue Editor: Jane Boegel
Foley Editor: Jeff Cranford
Supervising Music Editor: Blake Bunzel
Music Editor: Jason Tregoe Newman
Foley Artists: Gregg Barbanell, MPSE, Alex Ullrich 

Outstanding Achievement in Sound Editing – Episodic Long Form – Music

Game of Thrones “The Long Night”

HBO 

Music Editor: David Klotz

Outstanding Achievement in Sound Editing – Episodic Long Form – Dialogue/ADR

Chernobyl “Please Remain Calm”

HBO

Supervising Sound Editor: Stefan Henrix
Supervising ADR Editor: Harry Barnes
Dialogue Editor: Michael Maroussas

Outstanding Achievement in Sound Editing – Episodic Long Form – Effects / Foley

Chernobyl “1:23:45”

HBO

Supervising Sound Editor: Stefan Henrix
Sound Designer: Joe Beal
Foley Editors: Philip Clements, Tom Stewart
Foley Artist: Anna Wright

Outstanding Achievement in Sound Editing – Feature Motion Picture – Music Underscore

Jojo Rabbit

Fox Searchlight Pictures

Music Editor: Paul Apelgren

Outstanding Achievement in Sound Editing – Feature Motion Picture – Musical

Rocketman

Paramount Pictures

Music Editors: Andy Patterson, Cecile Tournesac

Outstanding Achievement in Sound Editing – Feature Motion Picture – Dialogue/ADR

1917

Universal Pictures

Supervising Sound Editor: Oliver Tarney, MPSE
Dialogue Editor: Rachael Tate, MPSE

Outstanding Achievement in Sound Editing – Effects / Foley

Ford v Ferrari

Twentieth Century Fox 

Supervising Sound Editor: Donald Sylvester
Sound Designers: Jay Wilkenson, David Giammarco
Sound Effects Editor: Eric Norris, MPSE
Foley Editor: Anna MacKenzie
Foley Artists: Dan O’Connell, John Cucci, MPSE, Andy Malcolm, Goro Koyama


Main Image Caption: Amy Pascal and Victoria Alonso


Director James Mangold on Oscar-nominated Ford v Ferrari

By Iain Blair

Filmmaker James Mangold has been screenwriting, producing and directing for years. He has made films about country legends (Walk the Line), cowboys (3:10 to Yuma), superheroes (Logan) and cops (Cop Land), and has tackled mental illness (Girl, Interrupted) as well.

Now he’s turned his attention to race car drivers and endurance racing with his movie Ford v Ferrari, which has earned Mangold an Oscar nomination for Best Picture. The film also received nods for its editing, sound editing and sound mixing.

James Mangold (beard) on set.

The high-octane drama was inspired by a true-life friendship that forever changed racing history. In 1959, Carroll Shelby (Matt Damon) is on top of the world after winning the most difficult race in all of motorsports, the 24 Hours of Le Mans. But his greatest triumph is followed quickly by a crushing blow — the fearless Texan is told by doctors that a grave heart condition will prevent him from ever racing again.

Endlessly resourceful, Shelby reinvents himself as a car designer and salesman working out of a warehouse space in Venice Beach with a team of engineers and mechanics that includes hot-tempered test driver Ken Miles (Christian Bale). A champion British race car driver and a devoted family man, Miles is brilliant behind the wheel, but he’s also blunt, arrogant and unwilling to compromise.

After Shelby’s vehicles make a strong showing at Le Mans against Italy’s venerable Enzo Ferrari, Ford Motor Company recruits the firebrand visionary to design the ultimate race car, a machine that can beat even Ferrari on the unforgiving French track. Determined to succeed against overwhelming odds, Shelby, Miles and their ragtag crew battle corporate interference, the laws of physics and their own personal demons to develop a revolutionary vehicle that will outshine every competitor. The film culminates in the historic showdown between the US and Italy at the grueling 1966 24 Hours of Le Mans.

Mangold’s below-the-line talent, many of whom have collaborated with the director before, includes Academy Award-nominated director of photography Phedon Papamichael; film editors Michael McCusker, ACE, and Andrew Buckland; visual effects supervisor Olivier Dumont; and composers Marco Beltrami and Buck Sanders.

L-R: Writer Iain Blair and Director James Mangold

I spoke with Mangold — whose other films include Logan, The Wolverine and Knight and Day — about making the film and his workflow.

You obviously love exploring very different subject matter in every film you make.
Yes, and I do every movie like a sci-fi film — meaning inventing a new world that has its own rules, customs, language, laws of physics and so on, and you need to set it up so the audience understands and they get it all. It’s like being a world-builder, and I feel every film should have that, as you’re entering this new world, whether it’s Walk the Line or The French Connection. And the rules and behavior are different from our own universe, and that’s what makes the story and characters interesting to me.

What sort of film did you set out to make?
Well, given all that, I wanted to make an exciting racing movie about that whole world, but it’s also that it was a moment when racing was free of all things that now turn me off about it. The cars were more beautiful then, and free of all the branding. Today, the cars are littered with all the advertising and trademarks — and it’s all nauseating to me. I don’t even feel like I’m watching a sport anymore.

When this story took place, it was also a time when all the new technology was just exploding. Racing hasn’t changed that much over the past 20 years. It’s just refining and tweaking to get that tiny edge, but back in the ‘60s they were still inventing the modern race car, and discovering aerodynamics and alternate building materials and methods. It was a brand-new world, so there was this great sense of discovery and charm along with all that.

What were the main technical challenges in pulling it all together?
Trying to do what I felt all the other racing movies hadn’t really done — taking the driving out of the CG world and putting it back in the real world, so you could feel the raw power and the romanticism of racing. A lot of that’s down to the particulates in the air, the vibrations of the camera, the way light moves around the drivers — and the reality of behavior when you’re dealing with incredibly powerful machines. So right from the start, I decided we had to build all the race cars; that was a huge challenge right there.

How early on did you start integrating post and all the VFX?
Day one. I wanted to use real cars and shoot the Le Mans and other races in camera rather than using CGI. But this is a period piece, so we did use a lot of CGI for set extensions and all the crowds. We couldn’t afford 50,000 extras, so just the first six rows or so were people in the stands; the rest were digital.

Did you do a lot of previz?

A lot, especially for Le Mans, as it was such a big, three-act sequence with so many moving parts. We used far less for Daytona. We did a few storyboards, and then my second unit director, Darrin Prescott — who has choreographed car chases and races in such movies as Drive, Deadpool 2, Baby Driver and The Bourne Ultimatum — and I planned it out using matchbox cars.

I didn’t want that “previzy” feeling. Even when I do a lot of previz, whether it’s a Marvel movie or like this, I always tell my previz team “Don’t put the camera anywhere it can’t go.” One of the things that often happens when you have the ability to make your movie like a cartoon in a laboratory — which is what previz is — is that you start doing a lot of gimmicky shots and flying the camera through keyholes and floating like a drone, because it invites you to do all that crazy shit. It’s all very show-offy as a director — “Look at me!” — and a turnoff to me. It takes me out of the story, and it’s also not built off the subjective experience of your characters.

This marks your fifth collaboration with DP Phedon Papamichael, and I noticed there are none of the big swooping camera moves or beauty shots you see in all the car commercials.
Yes, we wanted it to look beautiful, but in a real way. There’s so much technology available now, like gyroscopic setups and arms that let you chase the cars in high-speed vehicles down tracks. You can do so much, so why do you need to do more? I’m conservative that way. My goal isn’t to brand myself through my storytelling tricks.

How tough was the shoot?
It was one of the most fun shoots I’ve ever had, with my regular crew and a great cast. But it was also very grueling, as we were outside a lot, often in 115-degree heat in the desert on blacktop. And locations were big challenges. The original Le Mans course doesn’t exist anymore as it was then, so we used several locations in Georgia to double for it. We shot the races wide-angle anamorphic with a team of a dozen professional drivers, and with anamorphic you can shoot the cars right up into the lens — just inches away from camera — while they’d be doing 150 mph or 160 mph.

Where did you post?
All on the Fox lot at my offices. We scored at Capitol Records and mixed the score in Malibu at my composer’s home studio. I really love post, and for me it’s all part of the same process — the same cutting and pasting I do when I’m writing, and even when I’m directing. You’re manipulating all these elements and watching it take form — and particularly in this film, where the sound design, music and dialogue are all playing off one another and are so key. Take the races: by themselves, they look like nothing. It’s just a car whipping by. The power of it all only happens with the editing.

You had two editors — Michael McCusker and Andrew Buckland. How did that work?
Mike’s been with me for 20 years, so he’s kind of the lead. Mike and Drew take and trade scenes, and they’re good friends so they work closely together. I move back and forth between them, which also gives them each some space. It’s very collaborative. We all want it to look beautiful and elegant and well-designed, but no one’s a slave to any pre-existing ideas about structure or pace. (Check out postPerspective‘s interview with the editing duo here.)

What were the big editing challenges?
It’s a car racing movie with drama, so we had to hit you with adrenalin and then hold you with what’s a fairly procedural and process-oriented film about these guys scaling the corporate wall to get this car built and on the track. Most of that’s dramatic scenes. The flashiest editing is the races, which was a huge, year-long effort. Mike was cutting the previz before we shot a foot, and initially we just had car footage, without the actors, so that was a challenge. It all transformed once we added the actors.

Can you talk about working on the visual effects with Method’s VFX supervisor Olivier Dumont?
He did an incredible job — no one realizes just how many visual effects shots there are. They’re really invisible, and that’s what I love: the film feels 100% analog, but of course it isn’t. It’s impossible to build giant race tracks as they were in the ‘60s. But having real foregrounds really helped. We had very few scenes where actors were wandering around in a green void like on so many movies now. So you’re always anchored in the real world, and then all the set extensions were in softer focus or backlit.

This film really lends itself to sound.
Absolutely, as every car has its own signature sound, and we cut rapidly from interiors to exteriors, from cars to pits and so on. The aural perspective shifts are exciting, but we also tried to keep it simple and not lose the dramatic identity of the story. We even removed sounds in the mix if they weren’t important, so we could focus on what was.

Where did you do the DI, and how important is it to you?
At Efilm with Skip Kimball (working on Blackmagic DaVinci Resolve), and it was huge on this — especially dealing with the 24-hour race, the changing light, and the rain and night scenes. Having to match five different locations was a nightmare, so we worked on all that and the overall look from early on in the edit.

What’s next?
Don’t know. I’ve got two projects I’m working on. We’ll see.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Talking with Franki Ashiruka of Nairobi’s Africa Post Office

By Randi Altman

After two decades of editing award-winning film and television projects for media companies throughout Kenya and around the world, Franki Ashiruka opened Africa Post Office, a standalone post house in Nairobi, Kenya. The studio provides color grading, animation, visual effects, motion graphics, compositing and more. In addition, it maintains a database of the Kenyan post production community that allows it to ramp up with the right artists when the need arises.

Here she talks about the company, its workflow and being a pioneer in Nairobi’s production industry.

When did you open Africa Post Office, and what was your background prior to starting this studio?
Africa Post Office (APO) opened its doors in February 2017. Prior to starting APO, I was a freelance editor with plenty of experience working with well-established media houses such as Channel 4 (UK), Fox International Channels (UK), 3D Global Leadership (Nigeria), PBS (USA), Touchdown (New Zealand), Greenstone Pictures (New Zealand) and Shadow Films (South Africa).

In terms of Kenya-based projects, I’ve worked with a number of production houses including Quite Bright Films, Fat Rain Films, Film Crew in Africa, Mojo Productions, Multichoice, Zuku, Content House and Ginger Ink Films.

I imagine female-run, independent studios in Africa are rare?
On the contrary, Kenya has reached a point where more and more women are emerging as leaders of their own companies. I actually think there are more women-led film production companies than male-led. The real challenge was that before APO, there was nothing quite like it in Nairobi. Historically, video production here was very vertical — if you shot something, you’d need to also manage post within whatever production house you were working in. There were no standalone post houses until us. That said, with my experience, even though hugely daunting, I never thought twice about starting APO. It is what I have always wanted to do, and if being the first company of our kind didn’t intimidate me, being female was never going to be a hindrance.

L-R: Franki Ashiruka, Kevin Kyalo, Carole Kinyua and Evans Wenani

What is the production and post industry like in Nairobi? 
When APO first opened, the workload was commercial-heavy, but in the last two years that has steadily declined. We’re seeing this gap filled by documentary films, corporate work and television series. Feature films are also slowly gaining traction and becoming the focus of many up-and-coming filmmakers.

What services do you provide, and what types of projects do you work on?
APO has a proven track record of successful delivery on hundreds of film and video projects for a diverse range of clients and collaborators, including major corporate entities, NGOs, advertising and PR agencies, and television stations. We also have plenty of experience mastering according to international delivery standards. We’re proud to house a complete end-to-end post ecosystem of offline and online editing suites.

Most importantly, we maintain a very thorough database of the post production community in Kenya.
This is of great benefit to our clients who come to us for a range of services including color grading, animation, visual effects, motion graphics and compositing. We are always excited to collaborate with the right people and get additional perspectives on the job at hand. One of our most notable collaborators is Ikweta Arts (Avatar, Black Panther, Game of Thrones, Hacksaw Ridge), owned and run by Yvonne Muinde. They specialize in providing VFX services with a focus on quality matte painting/digital environments, art direction, concept and post visual development art. We also collaborate with Keyframe (L’Oréal, BMW and Mitsubishi Malaysia) for motion graphics and animations.

Can you name some recent projects and the work you provided?
We are incredibly fortunate to be able to select projects that align with our beliefs and passions.

Our work on the short film Poacher (directed by Tom Whitworth) won us three global Best Editing Awards from the Short to the Point Online Film Festival (Romania, 2018), Feel the Reel International Film Festival (Glasgow, 2018) and Five Continents International Film Festival (Venezuela, 2019).

Other notable work includes three feature documentaries for the Big Story segment on China Global Television Network, directed by Juan Reina (director of the Netflix Original film Diving Into the Unknown); Lion’s Den (Quite Bright Films), an adaptation of ABC’s Shark Tank; and The Great Kenyan Bake Off (Showstopper Media), adapted from the BBC series The Great British Bake Off. We also worked on Disconnect, a feature film produced by Kenya’s Tosh Gitonga (Nairobi Half Life), a director who is passionate about taking Africa’s budding film industry to the next level. We have also worked on a host of television commercials for clients across East Africa, including Kenya, Rwanda, South Sudan and Uganda.

What APO is most proud of though, is our clients’ ambitions and determination to contribute toward the growth of the African film industry. This truly resonates with APO’s mantra.

You recently added a MAM and some other gear. Can you talk about the need to upgrade?
Bringing on the EditShare EFS 200 nodes has significantly improved the collaborative possibilities of APO. We reached a point where we were quickly growing, and the old approach just wasn’t going to cut it.

Prior to centralizing our content, projects lived on individual hard disks. This meant that if I was editing and needed my assistant to find me a scene or a clip, or I needed VFX on something, I would have to export individual clips to different workstations. This created workflow redundancies and increased potential for versioning issues, which is something we couldn’t afford to be weighed down with.

The remote capabilities of the EditShare system were very appealing as well. Our color grading collaborator, Nic Apostoli of Comfort and Fame, is based in Cape Town, South Africa. From there, he can access the footage on the server and grade it while the client reviews with us in Nairobi. Flow media asset management also helps in this regard. We’re able to effectively organize and index clips, graphics, versions, etc. into clearly marked folders so there is no confusion about what media should be used. Collaboration among the team members is now seamless regardless of their physical location or tools used, which include the Adobe Creative Suite, Foundry Nuke, Autodesk Maya and Maxon Cinema 4D.

Any advice for others looking to break out on their own and start a post house?
Know what you want to do, and just do it! Thanks Nike …


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

ACE Eddie Awards: Parasite and Jojo Rabbit among winners

By Dayna McCallum

The 70th Annual ACE Eddie Awards concluded with wins for Parasite (edited by Jinmo Yang) for Best Edited Feature Film (Dramatic) and Jojo Rabbit (edited by Tom Eagles) for Best Edited Feature Film (Comedy). Yang’s win marks the first time in ACE Eddie Awards history that a foreign language film won the top prize.

The winner of the Best Edited Feature Film (Dramatic) category has gone on to win the Oscar for film editing in 11 of the last 15 years. In other feature categories, Toy Story 4 (edited by Axel Geddes, ACE) won Best Edited Animated Feature Film and Apollo 11 (edited by Todd Douglas Miller) won Best Edited Documentary.

For the second year in a row, Killing Eve won for Best Edited Drama Series (Commercial Television) for “Desperate Times” (edited by Dan Crinnion). Tim Porter, ACE, took home his second Eddie for Game of Thrones “The Long Night” in the Best Edited Drama Series (Non-Commercial Television) category, and Chernobyl “Vichnaya Pamyat” (edited by Jinx Godfrey and Simon Smith) won Best Edited Miniseries or Motion Picture for Television.

Other television winners included Better Things “Easter” (edited by Janet Weinberg, ACE) for Best Edited Comedy Series (Commercial Television), and last year’s Eddie winner for Killing Eve, Gary Dollner, ACE, for Fleabag “Episode 2.1” in the Best Edited Comedy Series (Non-Commercial Television) category.

Lauren Shuler Donner received ACE’s Golden Eddie honor, presented to her by Marvel’s Kevin Feige. In her heartfelt acceptance speech, she noted to an appreciative crowd, “I’ve witnessed many times an editor make chicken salad out of chicken shit.”

Alan Heim and Tina Hirsch received Career Achievement awards, presented by filmmakers Nick Cassavetes and Ron Underwood, respectively. Cathy Repola, national executive director of the Motion Picture Editors Guild, was presented with the ACE Heritage Award. American Cinema Editors president Stephen Rivkin, ACE, presided over the evening’s festivities for the final time, as his second term is ending. Actress D’Arcy Carden, star of NBC’s The Good Place, served as the evening’s host.

Here is the complete list of winners:

BEST EDITED FEATURE FILM (DRAMATIC):
Parasite 
Jinmo Yang

BEST EDITED FEATURE FILM (COMEDY):
Jojo Rabbit
Tom Eagles

BEST EDITED ANIMATED FEATURE FILM:
Toy Story 4
Axel Geddes, ACE

BEST EDITED DOCUMENTARY (FEATURE):
Apollo 11
Todd Douglas Miller

BEST EDITED DOCUMENTARY (NON-THEATRICAL):
What’s My Name: Muhammad Ali
Jake Pushinsky, ACE

BEST EDITED COMEDY SERIES FOR COMMERCIAL TELEVISION:
Better Things: “Easter”
Janet Weinberg, ACE

BEST EDITED COMEDY SERIES FOR NON-COMMERCIAL TELEVISION:
Fleabag: “Episode 2.1”
Gary Dollner, ACE

BEST EDITED DRAMA SERIES FOR COMMERCIAL TELEVISION: 
Killing Eve: “Desperate Times”
Dan Crinnion

BEST EDITED DRAMA SERIES FOR NON-COMMERCIAL TELEVISION:
Game of Thrones: “The Long Night”
Tim Porter, ACE

BEST EDITED MINISERIES OR MOTION PICTURE FOR TELEVISION:
Chernobyl: “Vichnaya Pamyat”
Jinx Godfrey & Simon Smith

BEST EDITED NON-SCRIPTED SERIES:
VICE Investigates: “Amazon on Fire”
Cameron Dennis, Kelly Kendrick, Joe Matoske, Ryo Ikegami

ANNE V. COATES AWARD FOR STUDENT EDITING
Chase Johnson – California State University, Fullerton


Main Image: Parasite editor Jinmo Yang

The 71st NATAS Technology & Engineering Emmy Award winners

The National Academy of Television Arts & Sciences (NATAS) has announced the recipients of the 71st Annual Technology & Engineering Emmy Awards. The event will take place in partnership with the National Association of Broadcasters, during the NAB Show on Sunday, April 19 in Las Vegas.

The Technology & Engineering Emmy Awards are awarded to a living individual, a company or a scientific or technical organization for developments and/or standardization involved in engineering technologies that either represent so extensive an improvement on existing methods or are so innovative in nature that they materially have affected television.

A committee of engineers working in television considers technical developments in the industry and determines which, if any, merit an award.

“The Technology & Engineering Emmy Award was the first Emmy Award issued in 1949, and it laid the groundwork for all the other Emmys to come,” says Adam Sharp, CEO/president of NATAS. “We are especially excited to be honoring Yvette Kanouff with our Lifetime Achievement Award in Technology & Engineering.”

Kanouff has held CTO and president roles at various companies in the cable and media industry. Over the years, she has spearheaded transformational technologies, such as video on demand, cloud DVR, digital and on-demand advertising, streaming security and privacy.

And now the Awards recipients:

Pioneering System for Live Performance-Based Animation Using Facial Recognition
– Adobe

HTML5 Development and Deployment of a Full TV Experience on Any Device
– Apple
– Google
– LG
– Microsoft
– Mozilla
– Opera
– Samsung

Pioneering Public Cloud-Based Linear Media Supply Chains
– AWS
– Discovery
– Evertz
– Fox Neo (Walt Disney Television)
– SDVI

Pioneering Development of Large Scale, Cloud Served, Broadcast Quality, Linear Channel Transmission to Consumers
– Sling TV
– Sony PlayStation Vue
– Zattoo

Early Development of HSM Systems That Created a Pivotal Improvement in Broadcast Workflows
– Dell (Isilon)
– IBM
– Masstech
– Quantum

Pioneering Development and Deployment of Hybrid Fiber Coax Network Architecture
– Cable Labs

Pioneering Development of the CCD Image Sensor
– Bell Labs
– Michael Tompsett

VoCIP (Video over Bonded Cellular Internet)
– Aviwest
– Dejero
– LiveU
– TVU Networks

Ultra-High Sensitivity HDTV Camera
– Canon
– Flovel

Development of Synchronized Multi-Channel Uncompressed Audio Transport Over IP Networks
– ALC NetworX
– Audinate
– Audio Engineering Society
– Kevin Gross
– QSC
– Telos Alliance
– Wheatstone

Emmy statue image courtesy of ATAS/NATAS

Behind the Title: Design director Liron Eldar-Ashkenazi

NAME: Liron Eldar-Ashkenazi (@iamlirona)

WHAT’S YOUR JOB TITLE?
Design Director

WHAT DOES THAT ENTAIL?
I help companies execute on their creative hopes and dreams, both hands-on and as a consultant and director.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Educating my clients about the lay of the land when it comes to getting what they want creatively. People typically think coming up with creative concepts is easy and quick. A big part of my job is helping companies see the full scope of taking a project from beginning to end with success while being mindful of timeline and budget.

HOW LONG HAVE YOU BEEN WORKING IN MOTION GRAPHICS?
I was accepted to the prestigious position of motion graphics artist in the Israel Defense Forces when I was 18 — in Israel, all women and men have to serve in the military. It’s now been about 12 years that I’ve been creating and animating.

HOW HAS THE INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING? WHAT’S BEEN GOOD, WHAT’S BEEN BAD?
I see a lot more women 3D artists and animators. It’s so refreshing! It used to be a man’s world, and I’m so thrilled to see the shift. Overall, it’s becoming a bit more challenging as screens are changing so fast and there are so many of them. Everything you create has to suit a thousand different use cases, and coming up with the right strategy for that takes longer than it did when we were only thinking in 15- and 30-second 16:9 spots.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I love that there are so many facets to my work under one title. Coming up with concepts, designing, animating, creating prints and artworks, working with typography — it’s all so much more rewarding than in the days when you only had one job: lighting, texturing, animating or designing. Now an artist is free to do multiple things, and it’s well appreciated.

WHAT’S YOUR LEAST FAVORITE?
Long rendering times. I think computers are becoming stronger, but we also demand more and more from them. I still hate sitting and waiting for a computer to show me what I’m working on.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
Morning! I’m a morning person who loves to start early and finish when there’s still light out.

WHY DID YOU CHOOSE THIS PROFESSION?
I didn’t really choose it; it chose me.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
At age 16 I knew I would never be great at sitting on my behind and just studying the text. I knew I needed to create in order to succeed. It’s my safe space and what I do best.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Some other form of visual artist, or a psychologist.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Right before I left The-Artery, where I’d been working as a design director for the past three years, we created visuals for a really interesting documentary. All the content was created in 3D using Cinema 4D and Octane. We produced about 18 different spots explaining different concepts. My team and I did everything from concept to rendering. It’ll be amazing to see it when it comes out.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
At The-Artery, I was in charge of a really interesting branding project for a fintech company. We created an entire visual language in 3D for their everyday marketing, website and blog use. All content was designed and rendered using Cinema 4D, and it was so great combining a branding exercise with motion graphics to bring all the visuals to life.

YOU HAVE RECENTLY PRESENTED YOUR WORKFLOW AT TRADE SHOWS AND ROAD TOURS. TELL US ABOUT SHARING YOUR WORK PUBLICLY.
I was invited by Maxon, the developer of Cinema 4D, to give a live-demo presentation at SIGGRAPH 2019. It was an exceptional experience, and I received really lovely responses from the community and from artists looking to combine more graphic design into their motion graphics and 3D pipelines. I shared some cool methods I’ve developed in Cinema 4D for creating fine-art looks for renders.

PRESENTLY, YOU ARE WORKING AS ARTIST IN RESIDENCE AT FACEBOOK. HOW DID THIS COME ABOUT AND WHAT KIND OF WORK ARE YOU DOING?
Facebook somehow found me. I assume it was through my Instagram account, where I share my wild, creative experiments. The program is a six-week residency at their New York office, where I get to flex my analog muscles and create prints at their Analog lab. In the lab, they have all the art supplies you can ask for along with an amazing Risograph printer. I’ve been creating posters and zines from my 3D rendered illustrations.

WHAT SOFTWARE TOOLS DO YOU USE DAY-TO-DAY?
Maxon Cinema 4D is my primary tool. I design almost everything I create in it, including work that seems to be flat and graphic.

WHERE DO YOU FIND INSPIRATION NOW?
I find talking to people and brainstorming has always been the thing that sparks the most creativity in me. Solving problems is another way I tackle every design assignment. I always need to figure out what needs to be fixed, be better or change completely, and that’s what I find most inspires me to create.

THIS IS A HIGH-STRESS JOB WITH DEADLINES AND CLIENT EXPECTATIONS. WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Planning is critical for me to feel confident about projects and helps me avoid stress in general. Giving my work 100% and not promising any false expectations to my clients also helps limit stress. It’s key to be honest from the get-go if I think something wouldn’t work in the timeline, or if late changes would hurt the final product. If I do get to a point that I’m really stressed, I find that running, going out dancing or dancing to my favorite music at home, and generally listening to music are all helpful.

Directing bookend sequences for Portals, a horror anthology film

By Hasraf “HaZ” Dulull

Portals is a genre-bending feature film anthology focusing on a series of worldwide blackouts — after which millions of mysterious objects appear everywhere across the planet. While many flee from the sentient objects, some people are drawn toward and into them with horrifying consequences.

Portals

The film was in the final stages of post when writer/director Liam O’Donnell (Beyond Skyline and the upcoming Skylines film) called to see if I would like to get involved and direct some bookend sequences to add more scope and setup, which the producers felt was very much needed. I loved the premise and the world of the anthology, so I said yes. I pitched an idea for an ending that quickly evolved into an extra segment at the end of the film, which I directed. That’s why there are officially four directors on the show, with me getting executive producer and “end-segment created by” credits.

Two of the other sequences are around 20 to 25 minutes each, and O’Donnell’s sequence is around 35 minutes; the film runs 85 minutes total. Eduardo Sanchez and Gregg Hale (The Blair Witch Project) co-directed their segments. So the anthology feature is really three long segments plus my bookend sequences. The only connections among all the stories are the objects that appear, the event itself and the actual “portal” — everything else is unique to each segment’s story. In terms of production, the only consistencies throughout the anthology were the camera language — that slight handheld feel — and, of course, the music and sound.

I had to watch the latest cut of the entire anthology film to get my head into that world, but I was given freedom to bring my own style to my sequences. That is exactly the point of an anthology — for each director to bring his or her own sensibilities to the individual segments. Besides Liam, the main producers I worked closely with on this project were Alyssa Devine and Griffin Devine from Pigrat Productions. They are fans of my first feature film, The Beyond, so they really encouraged the grounded tone I had demonstrated in that film.

The portal in Portals.

I’ve been a huge advocate of Blackmagic cameras and technology for a long time. Additionally, I knew I had a lot to shoot in a very short time span (two days!), so I needed a camera that was light and flexible yet able to shoot 4K. I brought on cinematographer Colin Emerson, who shoots in a very loose way but always makes his stuff look cinematic. We watched the cut of the film and noticed the consistently loose nature of the cinematography on all the segments. Colin uses the Fig Rig a lot, and I love the way that rig works; the Blackmagic Pocket Cinema Camera 4K fits nicely on it along with the DSLR lenses he likes to use. The other reason was to be able to use Blackmagic’s new BRaw format.

We also shot the segment using a skeleton crew, which comprised me as director/producer; VFX supervisor/1st AD John Sellings, who also did some focus pulling; James De Taranto (sound recording); DP/camera op Colin Emerson; FX makeup artists Kate Griffith and Jay James; and our two actors, Georgina Blackledge and Dare Emmanuel, both of whom I worked with on my feature film The Beyond.

The Post
One thing I wanted to make sure of was that the post team at The Institution in LA would be able to take my Resolve files and literally work from them for the picture post. One of the things I did during prep (before we even cast) was shoot some tests to show what I had in mind in terms of look and feel. We also tested the BRaw and color workflow between my setup in London and the LA team; Colin and I did this during the location recce. This proved extremely useful in ensuring we set our camera to the exact specs the post house wanted, so we shot at 23.98fps in 4K (4096×1716, 2.39:1 cropped) using Blackmagic Design’s log color space.

HaZ’s segments were captured with the Blackmagic Pocket Cinema Camera.

During the test, I did some quick color tests to show the producers in LA the tone and mood I was going for and to make sure everyone was on board before I shot it. The look was very post-apocalyptic, as the segment is set after the main events have happened. I wanted the locations to contrast with each other: one interior and one exterior with greens.

Colin is used to shooting most of his stuff on the Panasonic GH series, but he had the Pocket Cinema Camera and was looking for the right project to use it on. He found he could use all of his usual lenses because the Pocket Cinema Camera has the same mount. The lenses we used were the Sigma 18-35mm f/1.8 with a Metabones Speed Booster, the Olympus 12mm f/2 and the Lumix 35-100mm f/2.8.

Colin used the onboard monitor screen on the Pocket Cinema Camera, while I used a tethered external monitor — the Ikan DH5e — for directing. We used a 1TB Samsung external SSD, securely attached to the rig cage, along with a 64GB CFast card. The resolution we shot in was determined by the tests we did. We set up the rushes for post after each of the two shoot days, so during the day we would swap out drives and back things up. At the end of the day, we would bring in all the picture and sound rushes and use the amazing auto-sync feature in Blackmagic DaVinci Resolve to set it all up. This way, when I headed back home, I could start editing right away inside Resolve.

Resolve

I have to admit, we were hesitant at first because I had been shooting and capturing Log in QuickTime ProRes 4:4:4:4, and I always avoided DNG raw because of the huge file sizes and data transfer. But the team at Blackmagic has always been so supportive, providing help right up till the end of the shoot, and after testing BRaw I was impressed. We had so much control, as all that information is accessible within Resolve. I was able to set the temp look during editing, and the colorist worked from there. Skin tones were of utmost importance; because of the intimate nature of the drama, I wanted a natural look to the skin tones, and I am really happy with the way they came out in the end.

The post team couldn’t believe how cinematic the footage looked when we told them we shot it on the Pocket Cinema Camera, since the other segments were shot on cameras like Red. We delivered to the same 4K spec as the other segments in the film.

HaZ on set, second from right.

I used the AMD Radeon RX Vega 56 version of the Blackmagic eGPU because I wanted to edit on my MacBook Pro (late 2017) and needed the power to run 4K in realtime. I was so impressed with how much power it provided; it was like having a new MacBook Pro without having to buy one. The eGPU also gave me all the connectivity I needed (two Thunderbolt 3 and four USB 3 ports), which is a limitation of the MacBook Pro on its own.

The beauty of keeping everything native was that there wasn’t much work to do when porting; it’s just plug and play, and Resolve detects the eGPU, which you can then set as the default. The BRaw format makes it manageable to preview and play back in realtime, and since it’s native, Resolve doesn’t need to do any transcoding in the background. I have always been a huge fan of the tracking in Resolve, and I was able to do eye effects very easily without them being budgeted or done as VFX shots. I was able to get the VFX render assets from the visual effects artist (Justin Martinez) in LA and do quick slap comps during editing. I love that I can set looks and store them as memories, which I can then recall very quickly and apply to a bunch of shots. This let me have a slick-looking preview rough cut of the film.
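
That look-recall step can be scripted, too. As a rough sketch (not necessarily how it was done on Portals), assuming a temp look has been exported as a .cube LUT, Resolve’s scripting API can stamp it onto every clip on a track; the LUT path and node index here are hypothetical.

import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
timeline = project.GetCurrentTimeline()

# Hypothetical path to a look exported from the gallery as a .cube LUT.
LOOK_LUT = "/Library/Application Support/Blackmagic Design/DaVinci Resolve/LUT/temp_look.cube"

# Apply the look to the first node of every clip on video track 1.
for item in timeline.GetItemListInTrack("video", 1):
    item.SetLUT(1, LOOK_LUT)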

Portals

I sent a hard drive containing all the organized rushes to the team in LA while I was doing the final tweaks to the edit. Once the edit was signed off, or if any last-minute notes came in, I would address them and email my Resolve file. It was super simple, and the colorist (Oliver Ojeil) and post team (Chad Van Horn and Danny Barone) in LA appreciated the workflow because there really wasn’t any conforming for them to do apart from a one-click relink of the media location; they could just take my Resolve file and start working away with it.
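
To give a sense of how light that handoff is, here is a hedged sketch of the LA-side steps using Resolve’s Python scripting API: import the emailed .drp file, open it and relink the media pool to the local copy of the rushes. The paths and project name are stand-ins.

import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
pm = resolve.GetProjectManager()

# Import the emailed Resolve project file, then open it.
pm.ImportProject("/Users/colorist/Inbox/Portals_HaZ_Segment.drp")
project = pm.LoadProject("Portals_HaZ_Segment")

# Point the clips in the root bin at the local rushes folder: the one-click relink.
media_pool = project.GetMediaPool()
clips = media_pool.GetRootFolder().GetClipList()
media_pool.RelinkClips(clips, "/Volumes/LA_RAID/Portals/Rushes")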

We used practical effects to keep the horror as real and grounded as possible and used VFX to augment it further. We were fortunate to get special effects makeup artist Kate Griffiths. Even given the tight schedule, she was able to create a terrifying effect, which I won’t give away. You need to watch the film to see it! We had to shoot those makeup-FX-heavy shots at the end of the day, which meant being smart about the schedule given the hours-long makeup process. Kate was also on hand to provide effects like the liquid coming out of the eyes, sweat and so on, every detail of which the camera picked up for us so we could bring it out in the grade.

The Skype-style shots at the start of the film (phone and computer monitor shots) had their VFX screen elements placed as a separate layer so the post team in LA could grade them separately and control the filters applied on them. For some of the wide shots showing our characters entering and leaving the portal, we keyframed some movement of the 4K shot along with motion blur to give the effect of in-camera movement. I also used the camera shake within Resolve, which comes with so many options to create bespoke movement on static frames.

Portals is now available on iTunes and other VOD platforms.


HaZ Dulull is known for his sci-fi feature films The Beyond and 2036 Origin Unknown, as well as in television for his pilot and episodes of Disney’s Fast Layne. He is currently busy with projects at various stages of development and production at his production company, hazfilm.com.

Picture Shop VFX acquires Denmark’s Ghost VFX

Burbank’s Picture Shop VFX has acquired Denmark’s Ghost VFX. The Copenhagen-based studio, founded in 1999, provides high-end visual effects work for film, television and streaming platforms. The move helps Picture Shop “increase its services worldwide and broaden its talent and expertise,” according to Picture Shop VFX president Tom Kendall.

Over the years, Ghost has contributed to more than 70 features and other titles. Ghost’s credits include Star Wars: The Rise of Skywalker, The Mandalorian, The Walking Dead, See, Black Panther and Star Trek: Discovery.

“As we continue to expand our VFX footprint into the international market, I am extremely excited to have Ghost join Picture Shop VFX,” says Bill Romeo, president of Picture Head Holdings.

Ghost’s Christensen says the studio takes up three floors and 13,000 square feet in a “vintage and beautifully renovated office building” in Copenhagen. The team’s main tools are Autodesk Maya, Foundry Nuke and SideFX Houdini.

“We are really looking forward to a tight-knit collaboration with all the VFX teams in the Picture Shop group,” says Christensen. “Right now, Ghost will continue servicing current clients and projects, but we’re really looking forward to exploring the massive potential of being part of a larger and international family.”

Picture Shop VFX is a division of Picture Head Holdings, which has locations in Los Angeles, Vancouver, the United Kingdom and Denmark.

Main Image: Ghost artists at work.

Sohonet beefs up offerings with Exchange acquisition

Sohonet, which provides connectivity, media services and network security for media and entertainment, has acquired Exchange Communications, which has been providing IT services to film and television productions for more than 20 years. The acquisition broadens the range of connectivity and collaboration solutions that each organization can offer its customers.

Sohonet has a global network of over 500 media companies, as well as realtime collaboration, cloud-acceleration and file-transfer tools, while Exchange offers fixed production studio services, such as phones and video surveillance, and rapidly deployable remote production communications. Together, the companies will serve the rapidly growing and changing production industry across features, episodic content and advertising.
Sohonet will invest in the expansion of Exchange Communications services in other geographies, initially focusing on Canada and the UK.

Review: HP’s ZBook G6 mobile workstation

By Brady Betzel

In a year that’s seen AMD reveal an affordable 64-core processor with its Threadripper 3, it appears as though we are picking up steam toward next-level computing.

Apple finally released its much-anticipated Mac Pro (which comes with a hefty price tag for the 1.5TB upgrade), and custom-build workstation companies — like Boxx and Puget Systems — can customize good-looking systems to fit any need you can imagine. Additionally, over the past few months, I have seen mobile workstations leveling the playing field with their desktop counterparts.

HP is well-known in the M&E community for its powerhouse workstations. Since I started my career, I have worked on either a Mac Pro or an HP, and both have their strong points. However, for workstation users who must be able to travel with their systems, there have always been technical abilities you had to give up in exchange for a smaller footprint. That is, until now.

The newly released HP ZBook 15 G6 has become the rising tide that will float all the boats in the mobile workstation market. I know I’ve said it before, but the classification of “workstation” is much more than a term companies throw around. Systems with workstation-level classification (at least from HP) are meant to be powered on and run at high levels 24 hours a day, seven days a week, 365 days a year.

They are built with high-quality, enterprise-level components, such as ECC (error-correcting code) memory. ECC memory self-corrects the errors it detects, preventing things like blue screens of death and other freezes. ECC memory comes at a cost, and that is why these workstations are priced a little higher than a standard computer system. In addition, the warranties are a little more inclusive; the HP ZBook 15 G6 comes with a standard three-year on-site service warranty.

Beyond the “workstation” classification, the ZBook 15 G6 is amazingly powerful, brutally strong and incredibly colorful and bright. But what really matters is under the hood. I was sent an HP ZBook 15 G6 that retails for $4,096 and contains the following specs:
– Intel Xeon E-2286M (eight cores/16 threads — 2.4GHz base/5GHz Turbo)
– Nvidia Quadro RTX 3000 (6GB VRAM)
– 15.6-inch UHD HP DreamColor display, anti-glare, WLED-backlit, 600 nits, 100% DCI-P3
– 64GB DDR4 2667MHz
– 1TB PCIe Gen 3 x4 NVMe SSD TLC
– FHD webcam 1080p plus IR camera
– HP collaboration keyboard with dual point stick
– Fingerprint sensor
– Smart Card reader
– Intel Wi-Fi 6 AX 200, 802.11ac 2×2 +BT 4.2 combo adapter (vPro)
– HP long-life battery four-cell 90 Wh
– Three-year limited warranty

The ZBook 15 G6 is a high-end mobile workstation with a price that reflects it. However, as I said earlier, true workstations are built to withstand constant use and, in this case, abuse. The ZBook 15 G6 has been designed to pass up to 21 extensive MIL-STD 810G tests, which is essentially worst-case scenario testing. For instance, drop testing of around four feet, sand and dust testing, radiation testing (the sun beating down on the laptop for an extended period) and much more.

The exterior of the G6 is made of aluminum and built to withstand abuse. The latest G6 is a little bulky/boxy, in my opinion, but I can see why it would hold up to some bumps and bruises, all while working at blazingly fast speeds, so bulk isn’t a huge issue for me. Because of that bulk, you can imagine that this isn’t the lightest laptop either. It weighs in at 5.79 pounds for the lowest end and measures 1 inch by 14.8 inches by 10.4 inches.

On the bottom of the workstation is an easy-to-access panel for performing repairs and upgrades yourself. I really like the bottom compartment. I opened it and noticed I could throw in an additional NVMe drive and an SSD if needed. You can also access memory here. I love this because not only can you perform easy repairs yourself, but you can perform upgrades or part replacements without voiding your warranty on the original equipment. I’m glad to see that HP kept this in mind.

The keyboard is smaller than a full-size version but has a number keypad, which I love using when typing in timecodes. It is such a time-saver for me. (I credit entering repair order numbers when I fixed computers at Best Buy as a teenager.) Along the top of the keyboard are some handy shortcuts if you do web conferences or calls on your computer, including answering and ending calls. The Bang & Olufsen speakers are some of the best laptop speakers I’ve heard. While they aren’t quite monitor-quality, they have some nice sound on the low end that I was able to fine-tune in the Bang & Olufsen audio control app.

Software Tests
All right, enough of the technical specs. Let’s get on to what people really want to know: how the HP ZBook 15 G6 performs in apps like Blackmagic’s DaVinci Resolve and Adobe Premiere Pro. I used sample Red and Blackmagic Raw footage that I use a lot in testing. You can grab the Red footage here and the BRaw footage here. Keep in mind you will need to download the BRaw software (which you can find here) to edit with BRaw inside of Adobe products.

Performance monitor while exporting in Resolve with VFX.

For testing in Resolve and Premiere, I strung out one minute each of 4K, 6K and 8K Red media in one sequence and the 4608×2592 4K and 6K BRaw media in another. In the middle of my testing, Resolve received a major Red SDK upgrade that allows better realtime playback of Red raw files if you have an Nvidia CUDA-based GPU.

I tested first in Resolve 16.1.1 and then in Resolve 16.1.2. Both sequences were set to UHD (3840×2160) resolution. For each codec, one sequence contains just color correction, while another contains color correction plus effects. The Premiere color-and-effects sequence contains basic Lumetri color correction, noise reduction (50) and a Gaussian blur of 0.4. In Resolve, the only difference in the color-and-effects sequence is that the noise reduction is spatial, set to Enhanced, Medium and 25/25.

In Resolve, the 4K Red media would play in realtime, while the 6K (RedCode 3:1) would drop to about 14fps to 15fps and the 8K (RedCode 7:1) would play at 10fps at full resolution with just color correction. With effects, the 4K media would play at 20fps, 6K at 3fps and 8K at 10fps. The Blackmagic Raw video would play in realtime with just color correction and at around 3fps to 4fps with effects.

This is where I have to talk about just how loud the fans in the ZBook 15 G6 can get. When running exports and benchmarks, the fans were noticeable and a little distracting. Obviously, we were running some high-end, processor- and GPU-intensive tests, but still, the fans stood out. However, the bottom of the mobile workstation never got terribly hot, unlike the MacBook Pros I’ve tested before. So my lap was not on fire.

In my export testing, I used the same sequences as before, exporting from Adobe Premiere Pro 2020 via Adobe Media Encoder to UHD files in different containers and codecs: H.264 (MOV), H.265 (MOV), ProResHQ, DPX, DCP and MXF OP1a (XDCAM). The MXF OP1a export was 1920×1080p.
Here are my results:

Red (4K, 6K, 8K)
– Color only: H.264 – 5:27; H.265 – 4:45; ProResHQ – 4:29; DPX – 3:37; DCP – 10:38; MXF OP1a – 2:31
– Color, noise reduction (50), Gaussian blur (0.4): H.264 – 4:56; H.265 – 4:56; ProResHQ – 4:36; DPX – 4:02; DCP – 8:20; MXF OP1a – 2:41

Blackmagic Raw
– Color only: H.264 – 2:05; H.265 – 2:19; ProResHQ – 2:04; DPX – 3:33; DCP – 4:05; MXF OP1a – 1:38
– Color, noise reduction (50), Gaussian blur (0.4): H.264 – 1:59; H.265 – 2:22; ProResHQ – 2:07; DPX – 3:49; DCP – 3:45; MXF OP1a – 1:51

What is surprising is that when adding effects like noise reduction and a Gaussian blur in Premiere, the export times stayed similar. While using the ZBook 15 G6, I noticed my export times improved when I upgraded driver versions, so I redid my tests with the latest Nvidia drivers to be consistent. The drivers also solved an issue in which Resolve wasn’t reading BRaw properly, so always research your drivers.

The Nvidia Quadro RTX 3000 really pulled its weight when editing and exporting in both Premiere and Resolve. In fact, in previous versions of Premiere, I noticed that the GPU was not really being used as well as it should have been. With the Premiere Pro 2020 upgrade it seems like Adobe really upped its GPU usage game — at some points I saw 100% GPU usage.

In Resolve, I performed similar tests, but I exported a DNxHR QuickTime file instead of ProResHQ and an IMF package instead of a DCP. For the most part, these are stock exports from Resolve’s Deliver page, except that I forced video levels and set debayer and resizing to highest quality. Here are my results from Resolve versions 16.1.1 and 16.1.2 (16.1.2 results are in parentheses).

Red (4K, 6K, 8K)
– Color only: H.264 – 2:17 (2:31); H.265 – 2:23 (2:37); DNxHR – 2:59 (3:06); IMF – 6:37 (6:40); DPX – 2:48 (2:45); MXF OP1a – 2:45 (2:33)
– Color, noise reduction (Spatial, Enhanced, Medium, 25/25), Gaussian blur (0.4): H.264 – 5:00 (5:15); H.265 – 5:18 (5:21); DNxHR – 5:25 (5:02); IMF – 5:28 (5:11); DPX – 5:23 (5:02); MXF OP1a – 5:20 (4:54)

Blackmagic Raw
– Color only: H.264 – 0:26 (0:25); H.265 – 0:31 (0:30); DNxHR – 0:50 (0:50); IMF – 3:51 (3:36); DPX – 0:46 (0:46); MXF OP1a – 0:23 (0:22)
– Color, noise reduction (Spatial, Enhanced, Medium, 25/25), Gaussian blur (0.4): H.264 – 7:51 (7:53); H.265 – 7:45 (8:01); DNxHR – 7:53 (8:00); IMF – 8:13 (7:56); DPX – 7:54 (8:18); MXF OP1a – 7:58 (7:57)
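
For repeat tests like these, Deliver-page exports can also be queued through Resolve’s Python scripting API. This is a simplified sketch rather than the exact setup used for the timings above: the target path is made up, and the forced video levels and highest-quality debayer/resize toggles were set in the Deliver page UI, not in script.

import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()

# Pick one of the tested deliverables; query project.GetRenderFormats() and
# project.GetRenderCodecs("mov") to confirm the exact strings on your system.
project.SetCurrentRenderFormatAndCodec("mov", "H264")
project.SetRenderSettings({
    "SelectAllFrames": 1,
    "TargetDir": "/Volumes/EXPORTS/zbook_tests",   # hypothetical path
    "CustomName": "red_color_only_h264",
})
project.AddRenderJob()
project.StartRendering()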

Interesting to note: Exporting Red footage with color correction only was significantly faster from Resolve, but for Red footage with effects applied, export times were similar between Resolve and Premiere. With the CUDA Red SDK update to Resolve in 16.1.2, I thought I would see a large improvement, but I didn’t. I saw an approximate 10% increase in playback but no improvement in export times.
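
To put numbers on that, here is a quick Python helper that parses the m:ss times above and reports the change from 16.1.1 to 16.1.2; the figures are the Red color-only results from this review.

def to_seconds(mmss):
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

red_color_only = {          # codec: (16.1.1 time, 16.1.2 time)
    "H.264":    ("2:17", "2:31"),
    "H.265":    ("2:23", "2:37"),
    "DNxHR":    ("2:59", "3:06"),
    "IMF":      ("6:37", "6:40"),
    "DPX":      ("2:48", "2:45"),
    "MXF OP1a": ("2:45", "2:33"),
}

for codec, (old, new) in red_color_only.items():
    change = (to_seconds(new) - to_seconds(old)) / to_seconds(old) * 100
    print(f"{codec}: {old} -> {new} ({change:+.1f}%)")

Run it and the changes land within roughly plus or minus 10% in both directions, which lines up with the conclusion above: the Red SDK update helped playback, not export times.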

Puget

Puget Systems has some great benchmarking tools, so I reached out to Matt Bach, Puget Systems’ senior labs technician, about my findings. He suggested that the mobile Xeon could possibly still be the bottleneck for Resolve. In his testing he saw a larger increase in speed with AMD Threadripper 3 and Intel i9-based systems. Regardless, I am kind of going deep on realtime playback of 8K Red Raw media on a mobile workstation — what a time we are in. Nonetheless, Blackmagic Raw footage was insanely fast when exporting out of Resolve, while export time for the Blackmagic Raw footage with effects was higher than I expected. There was a consistent use of the GPU and CPU in Resolve much like in the new version of Premiere 2020, which is a trend that’s nice to see.

In addition to Premiere and Resolve testing, I ran some common benchmarks that provide a good 30,000-foot view of the HP ZBook 15 G6 when comparing it to other systems. I decided to use the Puget Systems benchmarking tools. Unfortunately, at the time of this review, the tools were only working properly with Premiere and After Effects 2019, so I ran the After Effects benchmark using the 2019 version. The ZBook 15 G6 received an overall score of 802, render score of 79, preview score of 75.2 and tracking score of 86.4. These are solid numbers that beat out some desktop systems I have tested.

Corona

To test some 3D applications, I ran Cinebench R20, which gave a CPU score of 3,243, a CPU (single-core) score of 470 and an M/P ratio of 6.90x. I recently began running the Gooseberry benchmark scene in Blender to get a better sense of 3D rendering performance; it took 29:56 to render. Using the Corona benchmark, it took 2:33 to render 16 passes at 3,216,368 rays/s. In OctaneBench, the ZBook 15 G6 received a score of 139.79. In the V-Ray benchmark, it received 9,833 ksamples in the CPU test and 228 mpaths in the GPU test. I’m not going to lie; I really don’t know a lot about what these benchmarks are trying to tell me, but they might help you decide whether this is the mobile workstation for your work.

Cinebench

One benchmark I thought was interesting between driver updates for the Nvidia Quadro RTX 3000 was the Neat Bench from Neat Video — the noise reduction plugin for video. It measures whether your system should use the CPU, GPU or a combination thereof to run Neat Video. Initially, the best combination result was to use the CPU only (seven cores) at 11.5fps.

After updating to the latest Nvidia drivers, the best combination result was to use the CPU (seven cores) and GPU (Quadro RTX 3000) at 24.2fps. A pretty incredible jump just from a driver update. Moral of the story: Make sure you have the correct drivers always!

Summing Up
Overall, the HP ZBook 15 G6 is a powerful mobile workstation that will work well across the board. From 3D to color correction apps, the Xeon processor in combination with the Quadro RTX 3000 will get you running 4K video without a problem. With the HP DreamColor anti-glare display offering up to 600 nits of brightness, covering 100% of the DCI-P3 color space and including an HDR option, you can rely on the built-in display for color accuracy if you don’t have an output monitor attached. And with features like two USB Type-C ports (Thunderbolt 3 plus DP 1.4 plus USB 3.1 Gen 2), you can connect external monitors for a larger view of your work.

The HP Fast Charge will get you out of a dead-battery fiasco with the ability to go from 0% to 50% charge in 45 minutes. All of this for around $4,000 seems a pretty low price to pay, especially because it includes a three-year on-site warranty and because the device is certified, through HP’s independent software vendor (ISV) verifications, to work seamlessly with many of the apps pros use.

If you are looking for a mobile workstation upgrade, are moving from desktop to mobile or want an alternative to a MacBook Pro, you should price a system out online.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Conductor Companion app targets VFX boutiques and freelancers

Conductor Technologies has introduced Conductor Companion, a desktop app designed to simplify the use of the cloud-based rendering service. Tailored for boutique studios and freelance artists, Companion streamlines the Conductor on-ramp and rendering experience, allowing users to easily manage and download files, write commands and handle custom submissions or plug-ins from their laptops or workstations. Along with this release, Conductor has added initial support for Blender creative software.

“Conductor was originally designed to meet the needs of larger VFX studios, focusing our efforts on maximizing efficiency and scalability when many artists simultaneously leverage the platform and optimizing how Conductor hooks into those pipelines,” explains CEO Mac Moore. “As Conductor’s user base has grown, we’ve been blown away by the number of freelance artists and small studios that have come to us for help, each of which has their own unique needs. Conductor Companion is a nod to that community, bringing all the functionality and massive render resource scale of Conductor into a user-friendly app, so that artists can focus on content creation versus pipeline management. And given that focus, it was a no-brainer to add Blender support, and we are eager to serve the passionate users of that product.”

Moore reports that this app will be the foundation of Conductor’s Intelligence Hub in the near future, “acting as a gateway to more advanced functionality like Shot Analytics and Intelligent Bid Assist. These features will leverage AI and Conductor’s cloud knowledge to help owners and freelancers make more informed business decisions as it pertains to project-to-project rendering financials.”

Conductor Companion is currently in public beta. You can download the app here.

In addition to Blender, applications currently supported by Conductor include Autodesk Maya and Arnold; Foundry’s Nuke, Cara VR, Katana, Modo and Ocula; Chaos Group’s V-Ray; Pixar’s RenderMan; Isotropix’s Clarisse; Golaem; Ephere’s Ornatrix; Yeti; and Miarmy.

The Mill opens boutique studio in Berlin

Technicolor’s The Mill has officially launched in Berlin. The new boutique studio is located in the heart of the city, in the creative hub of Mitte, near many of Germany’s agencies, production companies and brands.

The Mill has been working with German clients for years. Recent projects include Mercedes’ Bertha Benz spot with director Sebastian Strasser; Netto’s The Easter Surprise, directed in-house by The Mill; and BMW’s The 8 with director Daniel Wolfe. The new studio will bring The Mill’s full range of creative services, from color to experiential and interactive, as well as visual effects and design.

The Mill Berlin crew

Creative director Greg Spencer will lead the creative team. He is a multi-award-winning creative, with several VES, Cannes Lions and British Arrow awards. His recent projects include Carlsberg’s The Lake, PlayStation’s This Could Be You and Eve Cuddly Toy. Spencer also played a role in some of Mill Film’s major titles: he was the 2D supervisor for Les Misérables and also worked on the Lord of the Rings trilogy. His resume also includes campaigns for brands such as Nike and Samsung.

Executive producer Justin Stiebel moves from The Mill London, where he has been since early 2014, to manage client relationships and new business. Since joining the company, Stiebel has produced spots such as Audi’s Next Level and Mini’s “The Faith of a Few” campaign. He has also collaborated with directors such as Sebastian Strasser, Markus Walter and Daniel Wolfe on brands like Mercedes, Audi and BMW.

Sean Costelloe is managing director of The Mill London and The Mill Berlin.

Main Image Caption: (L-R) Justin Stiebel and Greg Spencer

Quantum F1000: a lower-cost NVMe storage option

Quantum is now offering the F1000, a lower-priced addition to the Quantum F-Series family of NVMe storage appliances. Using the software-defined architecture introduced with the F2000, the F1000 offers “ultra-fast streaming” performance and response times at a lower entry price. The F-Series can be used to accelerate the capture, edit and finishing of high-definition content and to accelerate VFX and CGI render speeds up to 100 times for developing augmented and virtual reality.

The Quantum F-Series was designed to handle content such as HD video used for movie, TV and sports production, advertising content or image-based workloads that require high-speed processing. Pros are using F-Series NVMe systems as part of Quantum’s StorNext scale-out file storage cluster and leveraging the StorNext data management capabilities to move data between NVMe storage pools and other storage pools. Users can take advantage of the performance boost NVMe provides for workloads that require it, while continuing to use lower-cost storage for data where performance is less critical.

Quantum F-Series NVMe appliances accelerate pro workloads and also help customers move from Fibre Channel networks to less expensive IP-based networks. User feedback has shown that pros need a lower cost of entry into NVMe technology, which is what led Quantum to develop the F1000. According to Quantum, the F1000 offers performance that is five to 10 times faster than an equivalent SAS SSD storage array at a similar price.

The F1000 is available in two capacities: 39TB and 77TB. It offers the same connectivity options as the F2000 (32Gb Fibre Channel or iSER/RDMA using 100Gb Ethernet) and is designed to be deployed as part of a StorNext scale-out file storage cluster.

DP Chat: The Grudge’s Zachary Galler

By Randi Altman

Being on set is like coming home for New York-based cinematographer Zachary Galler, who as a child would tag along with his father while he directed television and film projects. The younger Galler started in the industry as a lighting technician and quickly worked his way up to shooting various features and series.

His first feature as a cinematographer, The Sleepwalker, premiered in 2014 and was later distributed by IFC. His second feature, She’s Lost Control, was awarded the C.I.C.A.E. Award at the Berlin International Film Festival later that year. His television credits include all eight episodes of Discovery’s scripted series Manhunt: Unabomber, Hulu’s The Act and USA’s Briarpatch (coming in February). He recently completed the Nicolas Pesce-directed thriller The Grudge, which stars John Cho and Betty Gilpin and is in theaters now.

Tell us about The Grudge. How early did you get involved in planning, and what direction were you given by the director about the look he wanted?
Nick and I worked together on a movie he directed called Piercing. That was our first collaboration, but we discovered that we had very similar ideas and working styles, and we formed a special relationship. Shortly after that project, we started talking about The Grudge, and about a year later we were shooting. We talked a lot about how this movie should feel and how we could achieve something new, different from anything either of us had done before. We used a lot of look-books and movie references to communicate, so when it came time to shoot, we had the visual language down fluently, and that allowed us to keep each other consistent in execution.

How would you describe the look?
Nick really liked the bleach-bypass look of David Fincher’s Se7en, and I thought about a mix of that and the work of photographer Bill Henson. We also knew that we had to differentiate between the different storyline threads in the movie, so we had lots to figure out. One of the threads is darker and looks very yellow, while another is warmer and more classic. A third is slightly more desaturated and darker still. We kept the same bleach-bypass look throughout but adjusted our color temperature, contrast and saturation accordingly. For a horror movie like this, I really wanted to be able to control where the shadow detail turned to black, because some of our scare scenes relied on that, so we lit accordingly and were able to fine-tune most of it in-camera.

How did you work with the director and colorist to achieve that look?
We worked with FotoKem colorist Kostas Theodosiou (who used Blackmagic Resolve). I was shooting a TV show during the main color pass, so I only got to check in to set looks and approve final color, but Nick and Kostas did a beautiful job. Kostas is a master of contrast control and very tastefully helped us ride that line between where there should be detail and where there shouldn’t. He was definitely an important part of the collaboration and helped make the movie better.

Where was it shot and how long was the shoot?
We shot the movie in 35 days in Winnipeg, Canada.

How did you go about choosing the right camera and lenses for this project and why these tools?
Nick decided early on that he wanted to shoot this film anamorphic. Panavision has been an important partner for me on most of my projects, and I knew that I loved their glass. We got a range of different lenses from Panavision Toronto to help us differentiate our storylines — we shot one on T Series, one on Primo anamorphics and one on G Series anamorphics. The Alexa Mini was the camera of choice because of its low light sensitivity and more natural feel.

Now more general questions…

How did you become interested in cinematography?
My father was a director, so I would visit him on set a lot when I was growing up. I didn’t know quite what I wanted to do when I was young, but I knew it involved being on set. After dropping out of film school, I got a job in a lighting rental warehouse and started driving trucks and delivering lights to sets in New York. I had always loved taking pictures as a kid, and as I worked and learned more, I realized that what I wanted to be was a DP. I was very lucky to find some great collaborators early in my career who both pushed me and allowed me to fail. This is the greatest job in the world.

What inspires you artistically? And how do you simultaneously stay on top of advancing technology that serves your vision?
Artistically, I am inspired by painters, photographers and other DPs. There are so many people doing such amazing work right now. As far as technology is concerned, I’m a bit slow with adopting, as I need to hold something in my hands or see what it does before I adopt it. I have been very lucky to get to work with some great crews, and often a camera assistant, gaffer or key grip will bring something new to the table. I love that type of collaboration.


DP Zachary Galler (right) and director Nicolas Pesce on the set of Screen Gems’ The Grudge.

What new technology has changed the way you work?
For some reason, I was resistant to using LUTs for a long time. The Grudge was actually the first time I relied on something that wasn’t close to just plain Rec 709. I had always figured that if I could get the 709 feeling good, I’d be in great shape when I got into color. Now I realize how helpful LUTs can be and how much further you can push. I also think the Astera LED tubes are amazing. They allow you to do so much so fast and put light in places that would be very hard to reach with traditional lighting units.

What are some of your best practices or rules you try to follow on each job?
I try to be pretty laid back on set, and I can only do that because I’m very picky about who I hire in prep. I try and let people run their departments as much as possible and give them as much information as possible — it’s like cooking, where you try and get the best ingredients and don’t do much to them. I’ve been very lucky to have worked with some great crews over the years.

What’s your go-to gear — things you can’t live without?
I really try and keep an open mind about gear. I don’t feel romantically attached to anything, so that I can make the right choices for each project.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years.