

SGO’s Mistika Boutique now compatible with AJA’s T-Tap

SGO has partnered with AJA Video Systems to make its subscription-based full-finishing software solution Mistika Boutique compatible with AJA’s T-Tap portable Thunderbolt-powered I/O device in the latest Mistika 10 release. SGO also launched a special promotion offering a free 90-day trial of Mistika Boutique with new AJA T-Tap, Kona 1, Kona 4, Io 4K and Io 4K Plus purchases.

Mistika Boutique, which is designed for Windows and macOS, runs on industry-standard, off-the-shelf hardware. It features the complete spectrum of professional finishing tools, from conform to VFX, color grading, Stereo 3D, VR and more. Pricing ranges from $112 to $338 per month. When combined with AJA T-Tap — an affordable, compact video and audio output device that allows professionals to monitor high-quality 10-bit HD, SD, HDR and 2K video with embedded audio output from any compatible Mac or PC — Mistika Boutique offers users a new cost-effective option for feeding Mac and PC outputs to preferred displays for an enhanced finishing experience.

“In recent months, the importance of strong remote post production capabilities has become paramount, and our partnership with SGO aims to make mobile finishing that much simpler, providing an affordable way to get your Mistika Boutique output from your laptop or computer to a range of supported 3G-SDI and HDMI displays,” says Nick Rashby, president of AJA Video Systems.

“Mistika Boutique was created for any type of post facility, including smaller studios or even freelance artists who want to take full advantage of the capabilities that our Mistika Technology software provides, with the added flexibility to work with their preferred hardware. T-Tap is an intuitive and cost-efficient device, making it a perfect complement for Mistika Boutique users finishing 2D, 3D and even VR content on a laptop or computer,” reports Geoff Mills, managing director at SGO. “Having worked with AJA previously to integrate Kona and Corvid cards into our high-end, turnkey Mistika Ultima finishing systems, delivering Mistika Boutique support for T-Tap was a natural progression.”

Media Composer 2020: customizable UI, ProRes for Windows, Catalina

Avid’s new Media Composer 2020 release includes a redesigned customizable user interface, a new Universal Media Engine, finishing and delivery tools, and support for Apple ProRes for Windows and Catalina. These updates are based on user feedback and are available now.

Here are some details of the updates:
– Customization: Users can tailor their workspace to exactly how they want to work. Improvements to the paneled UI increase ease of use and accelerate editing and mastering. A new Timeline Sequence Map increases efficiency by letting creators navigate their entire sequence without taking up the whole screen, while the Blank Panel unclutters the UI and stops panels from resizing.

– More precise finishing/delivery: Expanding on the editing and finishing capabilities introduced a year ago, Media Composer 2020 is offering users the ability to fine-tune color with greater precision and make more granular gain value adjustments when working in ACES (Academy Color Encoding System) spaces. Users can finish high-resolution and HDR projects with total color precision and interoperability, ensuring pristine picture quality throughout their workflow.

– Next-generation Avid Media Engine: The Universal Media Engine enables users to accelerate their workflows by reducing the reliance on QuickTime to deliver better media importing, playback, editing and export performance. The media engine increases processing speed of hi-res HDR media and provides native support for a wider range of formats, including direct media access and OpenEXR for over-the-top services such as Netflix. Media Composer 2020 allows users to more easily create content for mobile video platforms and social media by providing 9×16 and 1:1 Mask Margins and FrameFlex framing presets (the crop geometry behind these presets is sketched after this list).

– Apple ProRes for Windows and Catalina: Like Mac users, Windows users can now create, edit, collaborate and export ProRes media natively, with encoding supported on Windows machines for media creation and export to MOV, MXF OP1a and MXF OP-Atom workflows. Creators can also use Media Composer on Apple’s latest macOS Catalina, a 64-bit OS that provides superior performance while leveraging the power of the new Mac Pro.

– Media Composer | Enterprise: Media Composer | Enterprise expands its role-based customization capabilities to enable users to deploy or update site settings across an organization and deploy user settings independently to individuals or groups quickly without impacting any existing site settings.
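For readers curious about what a 9×16 or 1:1 Mask Margin implies geometrically, here is a minimal Python sketch of the centered-crop math. It assumes a UHD 3840×2160 source and is just the arithmetic, not Avid’s implementation.

```python
# Centered-crop math behind 9x16 and 1:1 mask margins on a 16:9 source.
# Illustrative only -- not Avid's code; the UHD raster is an assumption.

def centered_crop(src_w: int, src_h: int, target_ratio: float) -> tuple[int, int]:
    """Largest centered crop of (src_w, src_h) with width/height = target_ratio."""
    if src_w / src_h > target_ratio:
        return round(src_h * target_ratio), src_h   # source wider: trim the sides
    return src_w, round(src_w / target_ratio)       # source taller: trim top/bottom

for name, ratio in [("9x16", 9 / 16), ("1:1", 1.0)]:
    w, h = centered_crop(3840, 2160, ratio)
    print(f"{name} mask on 3840x2160 -> {w}x{h}")
# 9x16 mask on 3840x2160 -> 1215x2160
# 1:1 mask on 3840x2160 -> 2160x2160
```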

 


DP Chat: The Baby-Sitters Club’s Adam Silver talks collaboration and color

By Randi Altman

Netflix’s The Baby-Sitters Club, based on the best-selling book series by Ann M. Martin, follows a group of entrepreneurial middle-school girls as they start a babysitting business in the town of Stoneybrook, Connecticut. There are dad dilemmas, crushes, Halloween spookiness and more. The show stars Sophie Grace, Momona Tamada, Shay Rudolph, Malia Baker, Xochitl Gomez, Alicia Silverstone and Mark Feuerstein.

The cinematographer on The Baby-Sitters Club is Adam Silver, founder of the Santa Monica-based production company National Picture Show, which creates content across multiple platforms. Silver’s recent DP projects include Pen15, Into the Dark and the Valley Girl remake. He also served dual roles as director and DP on the TV adaptation of Heathers and as producer and DP on the films Daddio and A Deadly Adoption. Proving his ability to move between types of projects, Silver also shoots commercial campaigns, such as those for Bud Light, 3M and Meta.

His most recent endeavor, The Baby-Sitters Club, started streaming on Netflix on July 3. Here Silver talks to us about the show, his process and inspiration.

How early did you get involved in planning for the season? And what direction did showrunner Rachel Shukert give you about the vision she had for this new series?
I came onto the project with about six weeks of prep before we started shooting in Vancouver. I’d known EP/director Lucia Aniello socially and had seen a lot of her comedy work. I had also watched Rachel’s work on GLOW and other shows. It was exciting to do a project with both of them.

From the outset, Rachel and Lucia envisioned a look that was naturalistic and felt real but also poppy and fun to look at. So I took this initial guideline and then got to run with it and hone it to a specific set of aesthetics and grammar, all while creating space for each director to come in and personalize it. Working closely with Lucia, I put our ideas into a visual presentation for the EPs, studio and network. They loved it, so we were off and running.

Can you talk about developing that happy and bright look?
I felt the coolest version of the show was something grounded in naturalism and realism — something that felt truthful and authentic. We wanted to enable the audience to connect emotionally with the characters, but balance that with something visually dynamic and fun to watch. We wanted something that had a sense of childlike whimsy and playfulness to serve the comedy and was inherent in the book-to-series adaptation.

How much did the books the show is based on play into the look of the show, if at all?
We were very inspired by the spirit of the books. Lucia and Rachel were superfans, to put it lightly, and we all wanted something that felt like a compelling friendship/adventure story — for and about girls.

As I was doing visual research in prep, it was very easy to find references set in the world of boys — I had grown up with films like The Goonies, E.T. and Stand by Me. Now there’s Stranger Things, etc., but it was surprisingly hard to find visual references or an equivalent series for girls. Which is, of course, what the books are, and which meant that this was such a great time to make this show.

We wanted the visual style to capture a sense of excitement and adventure and I felt there were ways to reflect that in the photography — with a dynamic camera, a sense of playfulness, a richness and vibrancy to the color all while staying grounded in realism. And I really wanted to stay away from the type of old-school kids show that is too cutesy or bubble gum; I think kid audiences are way too sophisticated for that now.

There’s also an iconography associated with the original books from the cover art and other renderings. For example, the classic cover of the five main characters framed in Claudia’s room, sitting around the rotary telephone, which is another iconic device from the books. We wanted to keep those very much alive in the Netflix version, but with a modern twist.

How did you work with director Lucia Aniello and Light Iron colorist Corinne Bogdanowicz to achieve that look?
It always starts with story and what the show is about at its core. The drama and comedy of this show are born from the relationships between the five main characters. I thought a lot about how to visualize these relationship dynamics and how to use the frame to help tell this part of the story.

Lucia and I really liked the idea of a widescreen aspect ratio that could capture four of five kids in the same shot. We felt a wider frame could help articulate themes about group versus the individual, together versus alone, etc. I find the wider frame works well to isolate a character that’s feeling alone.

While 16:9 didn’t feel wide enough, traditional anamorphic 2.40 actually felt too wide for the streaming format. We felt it might lose a sense of intimacy. I had gone through a similar process on Heathers (Paramount TV) and suggested we do some tests and find our own proprietary frame that felt right to the show. I got the network and post team to approve the idea, and after testing we settled on a ratio of 2.1:1. Very specific, but I liked it, and that’s what felt right to Lucia so we made it happen!

Working with Lucia, our general process was to hone the look using visual references, then I proposed a couple different lens and camera options to test during prep. She came into Sim Camera (our camera partner) with me, and we went through a few setups. Then, using our test footage up in Vancouver, I did a remote color session with colorist Corinne Bogdanowicz, who was in LA working on the FilmLight Baselight.

Huge props to Light Iron’s Katie Fellion for setting that up and figuring out the tech. Corinne helped create a show LUT and some looks, which were very helpful during production. Throughout prep, in addition to exhaustive location scouting, Lucia and I went on to shot-list most of her episodes, which was key for production efficiency, especially given the limited hours with the kid cast.

What was it like shooting in Vancouver, and how long was the shoot?
It was fantastic; we had some of the best technical crews I’ve ever had: 1st AC Mikah Sharkey, who was the anchor of the camera department; operators Mikey Jechort and Brett Manyluk; gaffer Mark Alexander; and key grip Amrit Bawa.

But the town also had its challenges. We were one of 70 or 80 TV productions working at the time, which put a strain on resources. We also had tricky situations with the weather and shooting outdoors. For scheduling reasons, we had to shoot some of our summer episodes in the fall when the weather had turned, so rain became a regular part of our production. We tried to embrace it as much as possible, and Rachel and the writers did an amazing job of adjusting the scripts to incorporate the rain.

How did you choose the right camera and lenses for this project? Why was this the right combination of tools?
I’ve traditionally been a huge fan of the ARRI Alexa Mini for fast-paced TV production, but with the Netflix 4K requirement, I took it as an opportunity to try some new stuff. I hadn’t shot Red for several years but had heard great things about the Monstro chip and was excited to test it.

I paired the DSMC2 Monstro with a couple different lens packages, including both spherical and anamorphic. We liked the feel of the anamorphics right away; they captured the wider aspect ratio. We also liked the bokeh and rendering of an out-of-focus background. Even though we weren’t using its full width (essentially chopping off the extreme sides of the frame for a 2.1:1 finish), there was something about the bendiness on the wider anamorphic primes when framing a group of actors in close proximity that we felt encircled the viewer, drawing them into the group. Though I love the Cooke anamorphic/i primes, I thought this show needed a bit crisper, cleaner look. After testing both, we went with the ARRI/Zeiss Master anamorphics.
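As a rough illustration of the framing trade-off Silver describes, here is the math for a fixed frame height: how much width each aspect ratio occupies, and what a 2.40:1 capture gives up when extracted to 2.1:1. The 2160-pixel height is an assumption for illustration, not the show’s delivery spec.

```python
# Width used by each aspect ratio at a fixed frame height, and the side-crop
# implied by finishing a 2.40:1 anamorphic capture at 2.1:1.
HEIGHT = 2160  # assumed frame height, for illustration only

for name, ratio in [("16:9", 16 / 9), ("2.1:1", 2.1), ("2.40:1", 2.40)]:
    print(f"{name:>7} -> {round(HEIGHT * ratio)} px wide")

trimmed = round(HEIGHT * (2.40 - 2.1))
print(f"2.40:1 cropped to 2.1:1 trims ~{trimmed} px, "
      f"{(2.40 - 2.1) / 2.40:.1%} of the width")
```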

When testing the Red Monstro, I paid close attention to its color rendition, since my preferences for the Alexa were a lot about the color science, the system’s filmic color rendition and smooth skin tones. I ended up really liking the Monstro’s color.

DIT Mason Denysek helped to keep our color consistent with his live grade on set and into dailies. Then, in final grade at Light Iron, I was able to dial it in with Corinne and supervise most of it directly.

Any challenging scenes that you are particularly proud of or found most challenging?
Overall, the trickiest part of the production was having enough time with our amazing kid actors. All our young leads were so professional and prepared, but because of their ages, we had very limited hours with them. Each day became both a race and a math puzzle to figure out how to shoot all their scene work before we had to wrap them. Our producer Meg Shave and the AD team worked some magic with scheduling and other tricks to give us what we needed.

Adam Silver on set of After with director Jenny Gage.

How did you become interested in cinematography?
I started in the business in New York, moving there after college and working on set. I spent four or five years coming up in the lighting and grip departments. I had studied still photography in college and always liked the visual side of filmmaking.

After a few years working in the industry in New York, I went on to graduate film school. I mostly trained in writing and directing, but because I brought a lighting and photography background, I gravitated to cinematography, shooting dozens of my classmates’ shorts. These days I’m a director as well, but I will always be a cinematographer; I truly love the craft, and it’s in many ways the backbone of filmmaking.

What inspires you artistically?
I’m often driven by wanting to work with a particular artist or filmmaker and will go after projects that have interesting people attached to them.

How do you keep up on new technology?
I’m not the kind of DP that attends gear conferences or anything, and I’ve never wanted to own equipment. I stay on top of it by being as truthful as I can to the story: The story will create a need for a certain type of approach or technique or grammar or style, and if it’s something I haven’t done before, I’ll be forced to learn the tech of it. Prep is key; it’s where all that research happens.

Any best practices that you try to follow on each job?
The longer I do this job, the simpler my lighting gets. I also feel a sense of duty to the idea of truth. That may sound amorphous, and it can mean a lot of things, but just one example is in lighting. There is truth in lighting the way it is in writing or performance.

Not long ago, I was shooting Pen15, and that’s a great example of this. The creators (also the leads), Maya Erskine and Anna Konkle, wrote the show based on their very personal experiences from middle school, and they have an infallible barometer for truth. If anything in the show feels inauthentic, including the lighting, they immediately flag it. I love this. It keeps all of us honest, and it’s one reason the show is so good.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


Vidispine relaunch integrates Arvato Systems IT services

IT specialist Arvato Systems has integrated its media and entertainment-related product portfolio with the Vidispine media supply chain business, which it acquired in 2017, presenting them both under a relaunched Vidispine brand. The move is designed to bring the strengths of the two brands together in a more cohesive way and give users a clearer understanding of the product offerings.

Behind the new Vidispine brand is a complete content ecosystem of the company’s own product portfolio and professional services together with a community of partner vendors, consultants, service providers and developers.

The restructured portfolio contains solutions for the broadcast and media and entertainment industries, including enterprise media asset management, content planning and rights management, ad tech and the cross-industry content platform VidiNet.

From the Vidispine site: “Since the acquisition of Vidispine by Arvato Systems, despite being increasingly integrated in terms of technology and operations, the two brands have continued to go to market as separate entities with parallel marketing and account management activities. As the product portfolios become increasingly joined, such as applications becoming available on VidiNet, it seems only natural to unify the technologies under one ‘go-to-market’ brand that reflects the audience that the joined portfolio addresses.”


Culture Clash: The sound design of Mrs. America

By Patrick Birk

I think it’s fair to say that America is divided… and changing. But with the perfect storm that has been 2020 thus far, polarization has hit a fever pitch many have not seen in their lifetime. It may be apt, then, that FX and Hulu would release Mrs. America, a limited series depicting the fierce struggle that erupted in the US surrounding the movement to ratify the Equal Rights Amendment.

Scott Gershin

Set in the 1970s, the show explores one of the most contentious elements of the culture war and tells the stories of Phyllis Schlafly (Cate Blanchett) — a conservative activist who led the charge against the women’s liberation movement — and feminists such as Gloria Steinem (Rose Byrne), Shirley Chisholm (Uzo Aduba), Jill Ruckelshaus (Elizabeth Banks) and Betty Friedan (Tracey Ullman).

Scott Gershin was the supervising sound editor and designer for the series. His long list of credits includes Nightcrawler, American Beauty, Pacific Rim, Team America, Hellboy II, JFK, The Doors, Shrek and The Book of Life. The methods Gershin and his team put together to complete the show during quarantine give me hope for those of us in the arts during these clearly changing times.

Gershin and his editorial team, part of Sound Lab (at Keywords Studio), partnered up with walla group The Loop Squad to record Episodes 1 through 8 at the Todd-AO ADR stage in Los Angeles. The show was mixed at Burbank’s Westwind Sound with a team that included mixers Christian Minkler (dialogue and music) and Andrew King (sound effects).

Mrs. America takes place throughout the ‘70s. Do you enjoy working on period pieces?
I love it. You have to research and learn about the events of that period. You need to be able to smell it and hear it. I believe that we captured that time, its tone and its vernacular. A lot of it is very subtle, but if we did it wrong, you would notice it.

The subtlety in the sound design served the show well. You never get the impression that it was there for its own sake.
I have worked on a range of projects. On the quiet side is American Beauty. On the loud side is Pacific Rim. In both cases, nobody should know I exist. If the illusion is correct, you enjoy the story, you buy the illusion. Interestingly enough, there was so much design in American Beauty that nobody knows about. An example is the use of silence; it was done strategically to create an aural contrast to support the pace and the actors’ performances. We recreated subtle sounds, such as when they were eating at the table. It was all manufactured to match the dialogue’s ambience in that scene. As the audience watches a show, they should think that everything they’re hearing was recorded at that time, whether it’s fanciful and sci-fi, or it’s realistic.

What’s an example of what you thought this show needed?
I come from movies, so a major goal was to make sure this show could have the same level of detail that I would put into a film, despite budgetary limitations. The first thing I did was to go into my library, which is pretty big. I realized I had no women-only crowd recordings, so I called some fellow sound pros. They had men and women, and the occasional solo woman laughing or crying, but not crowds of women. That’s when I realized I had to create it myself. While I do this often on my films, I had to find a way to accomplish this within the budget I had and across nine episodes.

That was the fun — trying to capture that variety of accents, the vernaculars, in which different cultures and areas within the United States communicated during that time. Then there was capturing the acoustical spaces needed for the show, thinking about the right microphones to use, where they should be placed and how I could combine them with certain sound effects to help the illusion of very large venues, such as rallies or political conventions.

Scott Gershin (center) and the Mrs. America walla group.

Like during the Reagan era toward the end?
Yes. In a couple of episodes, there were chants and singalongs. I combined the walla group recording (somewhere between six and 15 actresses, depending on the episode) with concert crowds, which I had to manipulate to sound like women. I’d envelope (a form of precision blending) those crowds against the recorded walla group to give the illusion that a convention hall of women was chanting and singing, even though they weren’t.

We created a tickle of a certain sound to give it that reverb-y, mass-y kind of thing. It’s a lot of experimenting and a lot of “No, that didn’t work. Ooh, that worked. That’s kind of cool.” Then occasionally we’d be lucky that music was in the right place to mask it a little bit. So it’s a bit of a sonic puzzle, an audio version of smoke and mirrors.
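To give a sense of how that “enveloping” might work under the hood, here is a minimal NumPy sketch, assuming mono arrays at a common sample rate: the big concert crowd is ridden by the amplitude envelope of the smaller walla recording, so the mass of the crowd follows the group’s dynamics. Real tools do this with far more finesse.

```python
import numpy as np

def amplitude_envelope(x: np.ndarray, sr: int, window_ms: float = 50.0) -> np.ndarray:
    """Smoothed absolute-value envelope of x (same length as x)."""
    win = max(1, int(sr * window_ms / 1000))
    return np.convolve(np.abs(x), np.ones(win) / win, mode="same")

def envelope_blend(walla: np.ndarray, crowd: np.ndarray, sr: int) -> np.ndarray:
    """Shape the big crowd with the walla group's envelope (equal-length mono)."""
    env = amplitude_envelope(walla, sr)
    env /= env.max() + 1e-12            # normalize so peaks pass at unity gain
    return crowd * env

# Dummy stand-ins for the actual recordings:
sr = 48000
walla = np.random.randn(sr * 5) * np.linspace(0.0, 1.0, sr * 5)  # swelling chant
crowd = 0.5 * np.random.randn(sr * 5)                            # steady big crowd
shaped = envelope_blend(walla, crowd, sr)
```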

It’s like being a painter. I love minimalism for the right shows and rocking the room for others. This show wasn’t about either. In discussions with Dahvi Waller (writer and showrunner), Anna Boden and Ryan Fleck (directors and executive producers), Stacey Sher (executive producer), Ebony Jones (post producer) and the picture editors for each episode (Todd Downing, Emily E. Greene and Robert Komatsu), we agreed on dense textures and details. We didn’t want to go the route of dialogue, music and six sound effects; we wanted to create a rich tapestry of details within the environments, using Foley to enhance (while not interfering with) the actors’ performances while hearing the voice and the sound of the times. (Check out our interview with Mrs. America’s editors here.)

When you did need more specific varieties and dialects to come through in crowds and walla, how did you go about it?
I get very detail-oriented. For instance, when we talk about capturing the language of the time, a lot of this was embellished with The Loop Squad, our walla group. I wanted to make sure we were accurate. We didn’t want typical accents that are sometimes associated with conservatives or liberals; we wanted to capture the different tone and dialects of the region each group was from. The principal actresses did an amazing job portraying the different characters, so I wanted to follow suit and continue that approach.

For example, the scenes with Shirley Chisholm and the members of the Black feminist movement at the party. All the times you saw people’s mouths moving, there was no sound (the whole show was shot this way). It was all reproduced, so I wanted to make sure that we had the right vernacular, the right sonic style, the right representation – capturing the voice and sound of the times, the region, the culture.

So an emphasis on respectability politics?
Absolutely. At the party, there was a combination of different issues within the Black community. In addition to women’s rights, it was about Black rights and lesbian rights, and there were conflicts within that group of women.

Patty Connolly and Mark Sussman of The Loop Squad and I had to do a lot of research. It was important to find the right (loop) actresses who could portray that era, that time and culture, and come up with what the issues were that were being discussed within the different timelines that were covered in the show.

We had the opportunity to record a political rally held in LA. For the scene where Phyllis shows up in DC, and there’s a large group of women activists in front of the government building, the Bernie Sanders rally provided the exterior spatial perspective I needed. Adding in the walla group made it feel like it was all women discussing the issues of that time period.

What recording methods did you use?
Because I didn’t have a massive budget to record enormous amounts of people, I had to create hundreds of people with a small group of actors and actresses. For Mrs. America, I grabbed the big ADR room at the old Todd-AO building. Working with our ADR mixer, Jeffrey Roy, I brought in a bunch of my own mics and placed them in different places within the room. Traditionally, ADR stages use shotgun microphones to get rid of any ambience or size of the room. I didn’t do that at all. I wanted to use the acoustics of the room as an important component of the performance.

In using the room, I had to position the actors in strategic places within the room to accomplish a given scene. To get another perspective, I had them stand facing the wall one or two feet away, or in the middle of the room facing each other, or back to back in a line.

In Episode 3, when all the men were running to take back their seats in the convention center, I had them (the loop actors) running really fast in two opposing circles to try to create the feeling of motion and energy. By combining these perspectives and placing them in different speakers during the mix, it gave the scene a certain “spatial-ness” and energy. I loved using the acoustics of the room as a color and a major part of the illusion.

What mics were you using, and did you use any shotgun mics despite not relying on them?
The stage had a Sennheiser MKH 416 shotgun for specific lines, but I prefer using a Sennheiser MKH 800 more often than not. I like the midrange clarity better. For spatial effect, I used a pair of MKH 8040s in ORTF pattern in front (with the MKH 800 in the middle), while in the back I used the Sanken CSS-5 or the DPA 5100, which I moved around a bunch. This gave me the option to have a 5.0 perspective or to use the rear mics for an offstage or defocused perspective.

Each mic and their placement served as a kind of paint brush. When I sent my tracks to effects mixer Andy King at Westwind, I didn’t want to just bathe it in reverb because that would smear the spatial image. I wanted to preserve a 5-channel spatial spread or ambience of the room, so the left was different from the right and the front was different than the back, giving a kind of a movement within the room.

Working from home during the COVID-19 shutdown.

Did quarantine affect the post process?
Halfway through the mix, the virus hit. So little by little, we didn’t feel comfortable being in the same room together for safety reasons. We looked at different streaming technologies, which we had to figure out quickly, and decided to go with Streambox for broadcasting our mix in real time.

We ended up broadcasting privately to the showrunner, the producers and the picture editors. Our music editor Andrew Silver and I were online most of the time. At the end, the only people on the stage at Westwind were our two mixers, with our mix tech Jesse Ehredt in a room next to the dubbing stage and our first assistant Chris Richardson in his edit room down the hall. Everybody else was remote.

Doug Kent introduced us to Flemming Laursen and Dave Weathers of Center Point Post, who supplied us with Streambox. We came up with something that worked within the bandwidth of everyone’s download speeds at their houses, since the whole country was working and going to school online. This challenged everyone’s capabilities. When picture and audio started to degrade, Flemming and I decided to increase the buffer size and decrease the picture quality a little bit, which seemed to solve a lot of our issues during peak usage times.

We used Zoom to communicate, allowing us to give each other notes in real time to the stage. I’ve got a similar setup at my home studio to what I have in Burbank, so I was able to listen in a quality environment. At the end of the day, we sent out QuickTimes in both 5.1 and stereo for everyone to listen to, which supported their schedules. Also, if a streaming glitch happened while we were Zooming or streaming, we could verify that it wasn’t in the mix.

It added more time to the process, but we still got it done while maintaining the quality we strived for. Being online made the process efficient. Using Zoom, I would contact dialogue editor Mike Hertlein, who was working from home, for an alternate line or a fix during the mix (with clients on Streambox). Fifteen minutes later we had it in the session and were mixing it.
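An aside on those 5.1 and stereo QuickTime deliverables: the stereo version is typically derived from a fold-down. Here is a generic ITU-style Lo/Ro downmix as a sketch; the coefficients are the common −3 dB defaults, not necessarily what the Westwind mixers used.

```python
import numpy as np

def downmix_51_to_stereo(ch: dict[str, np.ndarray]) -> np.ndarray:
    """Lo/Ro fold-down of 5.1 stems (equal-length arrays L, R, C, Ls, Rs; LFE dropped)."""
    g = 0.7071  # -3 dB on center and surrounds, a common default
    lo = ch["L"] + g * ch["C"] + g * ch["Ls"]
    ro = ch["R"] + g * ch["C"] + g * ch["Rs"]
    return np.stack([lo, ro], axis=-1)
```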

Did you record walla groups remotely?
Yes, for some of Episode 8 and all of Episode 9. I’d normally record 10 to 15 actors at a time, recording five to eight takes of those 10 to 15 actors, each with a different acoustical perspective. Since Todd-AO was closed due to the pandemic, I had to come up with a different solution. I decided to have all the actors record in their closets or booths if they had them. They recorded into their own recording systems, with each actor having his or her own unique setup. The first thing I had to do was teach a number of actors how to record (basic audio and delivery).

I used Zoom to communicate and direct them through the different scenes. I could hear well enough through group chat on Zoom, and I was able to direct them and provide them with picture by sharing my second screen, like we do on an ADR stage. They would all record at once. From that point, I could direct an actor, saying, “You’re doing too much of this” or “You’re too loud.” I needed to maintain what we had done in previous episodes and keep that blended feel.

Can you talk about benefits and negatives to working this way?
A benefit was that every actor was on a separate track. When I record everybody in a group at Todd-AO, if one person’s off, the whole recording has to be scrapped. Separation let me choose whether to use someone’s take or not. They didn’t pollute each other’s performances.

When it came to editing, instead of being five or six tracks (each containing eight to 15 actors), now it was 100 tracks. I had five to eight takes of each actor, so when combined, it made for a lot of tracks. Editing those took quite a bit more time. I had to EQ and clean up each actor’s setups, using different types of reverbs to fit the room (which Andy King and Christian Minkler did as well). We had created such cool sounds from previous episodes; the goal was to see if we could match them. It was a bit of a white-knuckle ride. We honestly weren’t sure we could pull it off. But when we were finished, Dahvi let me know she really couldn’t hear a difference between Episode 9 and the previous episodes.

How did you approach the scene in Episode 8, where Alice mixes cocktails with a “Christian pill” and ends up sharing a meal with a group of lesbian feminists? Did you consciously lean toward the surreal given how much time it took to make the home recordings blend naturally?
We had lots of discussions. At first, we wanted to try doing something a little out there. Basically, “How does Alice hear this?” We wanted to be consistent, but we wanted to be able to tell the story. Sarah Paulson did such a great job of portraying being drugged that we thought maybe we should take a step back and let her run with it a little bit, rather than trying to make something that we don’t see. Picture editor Todd Downing did a fantastic job of editing, which enhanced Sarah’s performance — giving it a psychedelic feel without going way over the top.

We wanted to stay organic. We manipulated the mother’s voice on the phone a little when Alice’s pill started to take effect. For that scene, we recorded Alice’s mother’s lines on a phone during quarantine, and it worked out because the futz coming from recording on a phone translated quite well. To keep it organic, I did some subtle things: slowed down the crowds without affecting pitch and inserted backward and forward voices and blended them together so they would sound a little odd.
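For the curious, both tricks are easy to approximate in code. Below is a hedged sketch using librosa’s phase-vocoder time stretch (the file names are placeholders, not production assets): a rate below 1.0 slows the crowd without shifting its pitch, and mixing a line with its reverse gives the subtly wrong texture Gershin describes.

```python
import librosa

# Slow a crowd recording down without changing its pitch.
crowd, sr = librosa.load("crowd.wav", sr=None, mono=True)    # placeholder file
slowed = librosa.effects.time_stretch(crowd, rate=0.85)      # 15% slower, same pitch

# Blend a voice line with its reverse for a slightly odd texture.
voice, _ = librosa.load("line.wav", sr=sr, mono=True)        # placeholder file
eerie = 0.6 * voice + 0.4 * voice[::-1]                      # forward + backward mix
```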

During the scene with the nun, at a certain point we replaced the nun’s voice with Cate’s voice, so she heard Cate’s voice talking through the nun’s performance. We did a number of other things and supported the hard cuts and time travel feel.

Overall it seemed like half my job was coming up with ways to keep working, creating new workflows, dealing with constant change. You’d have an hour’s notice to come up with plan B, plan C, plan D, and “How do we do this?” We’d all talk about it and say, “Let’s try this.” If that worked, cool. On to the next challenge!


Patrick Birk is a musician, sound engineer and post pro at Silver Sound, a boutique sound house based in New York City.


Editing Ozark: Cindy Mollo, ACE, talks importance of tone

By Randi Altman

The Netflix drama Ozark, now streaming Season 3, has always been a fan favorite, but since the COVID-19 shutdown, it’s been credited with making quarantine just a bit more tolerable for a lot of people. The series stars Jason Bateman — who also directs and executive produces — and Laura Linney as Marty and Wendy Byrde, middle-aged parents who also happen to run a money-laundering business in the Ozarks. You know, your typical family story. Along with the Byrdes, there are a host of complex characters, including Wendy’s bipolar brother Ben, their business manager Ruth, and the calmly frightening drug cartel lawyer Helen.

Cindy Mollo

Veteran television editor Cindy Mollo, ACE (House of Cards, Mad Men, Homicide: Life on the Street), has edited 13 episodes over the show’s three seasons. One of the first things she asked about after reading the script for Episode 1 was tone. She says that while a discussion of tone is always important when starting a show, it was particularly important on Ozark.

“Knowing that Jason Bateman was going to star in the series, I needed somebody to tell me whether they wanted it to be funny or whether it was meant to be a drama,” Mollo says. “There were lines of dialogue that, depending on the cadence, could be delivered as jokes, and if you cut it with a comedy tempo, you would have a classic Jason Bateman comedy. But I had seen Jason in Bad Words, a film he directed, and The Gift, neither of which is a comedy, and knew he might be going in a different direction.”

Mollo spoke to executive producer Chris Mundy, who was having similar conversations with Bateman. They were all leaning toward a dark drama with some humor sprinkled in. “From the beginning we talked about how we had to steer the show in a certain direction, and while there could be things that were funny, we wouldn’t edit to make it funny. We had to just let things play out and allow them to be funny organically — because even criminals do funny things from time to time.”

We recently chatted with Mollo, who shared editing responsibilities with Viks Patel and Heather Goodwin Floyd (Mollo’s former assistant on the first two seasons of the show), about editing Ozark, her workflow and more.

What is your typical process like on Ozark?
You get the script for your episode, and sometime before the first day of shooting you have a tone meeting with showrunner Chris Mundy as well as the director of the episode. Chris goes through every scene. This is so important — I’ve been on shows that don’t do tone meetings, and the intent of the scene can be missed. But thanks to the tone meetings, I always know what Chris and the director are intending.

I also take detailed notes because it might be four weeks until I get the scene that was discussed, and if I have forgotten the intention, then I might go off in the wrong direction. As dailies come in, I look at my notes so I know what I’m looking for. I watch the dailies and pay attention to the best performances and the best way to bring the audience into a scene and the best way to end the scene.

Most of our directors have a definite plan, particularly Jason. His coverage is very lean and purposeful, so when I open a bin, I see that he has pretty much planned how he wants to get into a scene. You have that in mind as you are watching more and more setups. “Well, I’ll start with that shot, and this performance here is fabulous. How will I build a scene around this performance with these shots?” On Ozark, all of the performances are very good, so you have a wealth of riches.

I’m assuming some surprises come up when editing a show?
Yes, they do. An example is Season 2’s finale episode. I had three takes on Marty’s reaction in a scene between him and Wendy. In the third take, Jason cried, took some long pauses and was choking back tears. It threw me at first because it didn’t match; Jason hadn’t done that when the camera was over his shoulder and on Wendy. That was something that had evolved over the course of the performance of the scene, and it was beautiful. We wanted to use it.

I had been assembling the scene in my head as I watched dailies, but after that take, I had to go back and re-watch Wendy’s coverage. I wanted to see where she took some long pauses so it could seem like she was listening to him cry, or some long moments when she was waiting for him to compose himself. That was a very simple scene, but it took a little longer to put together because something wonderful happened in the footage that was unexpected.

Wendy’s brother Ben is manic. I remember feeling very anxious when he was on screen. I imagine that a lot of that is the acting, but you must have also helped to amp that up?
I have to give actor Tom Pelphrey and the directors so much credit. We first see Ben teaching in a school, and he seems to be a very nice, well-intentioned man, but then he goes ballistic. Then we meet Ben at the casino, and we are already looking at this guy for what craziness he might bring. There were a couple of scenes in my episode, directed by Alik Sakharov, where we talked about not minding that Ben is unpredictable because of his mental illness. We also knew he had to be a slow burn through earlier scenes and then peak.

Tom did all the work; he never went too big. Even at the gala at the casino. He never went as big as he went in front of Helen and her daughter when he confronted them at her house in the eighth episode. He attenuated that perfectly. It is frightening when you are looking at the shot of Helen protecting and holding her daughter. There was a version of the end of that scene when she actually said, “Oh, you are dead,” but we realized we didn’t need it. The look on her face tells you everything. As she’s watching him go, we are on a tight shot of her and the music is tense, and you just know that that was the nail in his coffin.

When you are editing scenes like that one at Helen’s, do you have a specific process?
In the tone meeting with Chris Mundy and the director, we focused on Ben’s arc over Episodes 7 and 8, which I was cutting. We talked a lot about how to modulate his behavior and not have him get too manic too soon. I also thought a lot about shot sizes and wanted to make sure that in Ben’s most crazed moments we were in his tighter coverage.

I built the scene from the middle back to the beginning, and then from the middle to the end because I wanted to control that peak eruption. I think in some cases we switched to a more medium shot just to keep Helen and her daughter in the frame instead of using an isolated closeup of Ben. But because his performances were so consistent, that was kind of an easy swap.

We had some takes where he was so animated and crazy that he was spitting, and I loved that because you are out of control when you are spitting. I thought that showed how far gone he was, but you wouldn’t want him spitting for the whole scene. You just want to use that selectively. It was a fun scene, and it was a hand-held camera, so it’s all really kinetic.

How many takes do they tend to do on Ozark?
On that Helen scene, in particular, we had eight different setups. Probably four takes of their close-up setups, two each of the mediums, and various wide shots of Helen and her daughter at the table in their yard. I don’t think I used the extreme wide in the final cut — it was low and pretty distant from the action. Sometimes I play with everything even though I already have an idea in my head of how I want to cut a scene, but this time I didn’t.

How did you get dailies? You were in LA, and they were in Atlanta?
We shot in the Atlanta area. At wrap, they would take the camera cards to Company 3 Atlanta, process the dailies and send them to Company 3 in LA. Then we got the footage piped over to us the morning after it had been shot. Our assistants then took the dailies, sorted through them and grouped the cameras together — if something was shot with two or three cameras, they get hooked up together. I then got a bin for each scene that had been shot.

This season we were in the same building as the writers in LA, and it felt really luxurious. I could just walk into Chris’ office to ask him a question, or he would poke his head in and ask, “Can I see how this scene was shot for Episode 5 because we are going to reference it in Episode 8.” It allows you to be really fluid and interactive because we are all in the same physical space.

The first two seasons were shot on the Panasonic VariCam, but that changed this year. How did that affect how you work?
We switched to the 6K Sony Venice for Season 3. Thanks to that higher resolution, you can go in and blow up a part of the frame, which means you can take a two-shot of the characters and make it a single, assuming that the original image is sharply focused. You have a lot of latitude, so when we needed to, we were blowing shots up and making close-ups where there weren’t any.
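The headroom Mollo describes is easy to quantify. A minimal sketch, assuming a 6048-pixel-wide Venice 6K capture finished at UHD (the show’s exact recording mode isn’t specified):

```python
capture_w = 6048   # assumed Sony Venice 6K full-frame width (px)
deliver_w = 3840   # UHD delivery width (px)

print(f"Up to {capture_w / deliver_w:.2f}x punch-in before upscaling")  # ~1.57x
```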

Which episodes did you cut this season, and how did you work with the other editors? Did you ever ask them to look at your footage?
I edited Episodes 1, 2, 5, 7 and 8. I work closely with my assistant, Mary Chin, and I’m always trying to mentor her — to help her get a little more experience with editing and learning how to change a scene, how to talk about a scene, etc. So I usually have her looking at scenes with me, or since Chris is so close, I’ll bring him in. When I did interact with the other editors, it would be to talk about how we were handling new characters in the show.

You use an Avid Media Composer?
Yes, Version 8.9.4. We use Avid Nexis for storage, and in terms of storage per episode we use 750GB to 800GB for dailies. With score, sound effects, etc., it’s between 1TB and 1.5TB per episode, and for the season, we average between 13TB and 14TB.
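As a back-of-envelope check on those figures (the ten-episode season count is an assumption for the math):

```python
per_episode_tb = (1.0 + 1.5) / 2   # ~1.25 TB per episode with score and SFX
episodes = 10                      # assumed Ozark Season 3 episode count

print(f"~{per_episode_tb * episodes:.1f} TB per season")  # ~12.5 TB, near the quoted 13-14 TB
```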

Do you have a special bin set up with selects?
Yes. I work in frame view, which uses thumbnails of each take. Mary will lay out the different setups in order. If there’s a setup that is meant for the open of the scene, she will put that one first, even if it was shot last. And if there were four takes, they all get laid out next to each other … one, two, three, four.

For takes shot with multiple cameras, I will have one frame that is an icon that represents the group of these two cameras, and below it a thumbnail for the A camera and the B camera. This allows me to see what angles each camera was shooting in the thumbnail, then I have an icon I can drag into my timeline that is the two cameras married together. I can switch between the two. I try to use the selected takes as a guide, but you have to feel free to pull from anywhere because there might be a take that wasn’t selected but has one great moment, and you have to use it.

Can you talk about the differences between editing episodics versus films? Less time on the TV shows?
On Ozark, we cross-board — meaning we shoot two episodes at once, so it is in the area of 22 days. This is similar to the amount of time you might shoot on a small, low-budget feature, but you are doing two stories in that time, and it’s with the same characters, so they are not totally isolated stories … but, again, you are doing two episodes in the time that someone might be doing a very small feature!

The other difference between editing a feature and an episodic is that features are still very much director-driven, while episodic is driven by the writer-producer or showrunner. So while you do have to work with the director and make his or her cut exactly how he/she wants it to be, ultimately the next step will be to show the cut to the showrunner and make sure that all those things talked about in the tone meeting have been realized.

What episode did you submit for Emmy consideration?
I submitted the first episode of Season 3, called “War Time.” The season was meant to begin and end with the violence of the Mexican cartels and the harsh world that the Byrdes are now a part of. The first scene of the first episode took place in Mexico as men in three SUVs drive into a cul-de-sac and enter a house where people are counting drug money. They slaughter them and burn the house. The carnage is the message from one cartel to the other.

Next we cut to black and then to an incongruously happy and cheesy commercial for the Missouri Belle casino, with Wendy and Marty on the top deck toasting the camera. The commercial is actually on Wendy’s laptop as a producer is presenting it to her for notes. She dismisses the producer, and while Charlotte tells her the mundane things on her schedule for the day, Wendy sees a news piece about the torching of the homes in Mexico. Cut to Marty arriving at the new casino.

The sequence in the cul-de-sac and the destruction of a house were too expensive to shoot — approximately $1 million for a two- to three-minute sequence — so we delayed finishing the episode until we could come up with a more affordable concept. While we waited, I started the episode (and the season) with the cheesy commercial. It worked as a total gear shift from the uncertainty of the Season 2 finale, but it didn’t set the stakes as high as starting with an act of vengeful violence. So we waited.

Eventually our production team found a Latin market in Georgia that needed very little set dressing to look like a shopping center in Mexico. The idea was that a courier was making his regular money drop, but he had been corrupted and turned on the guys who were counting money in the back room, slaughtering everyone and blowing up the location with a couple of bombs. Then the cheesy commercial. It was less expensive but still very effective.

What’s next for you? Season 4 has been announced as the show’s final season.
I’m currently working on a feature-length documentary about the singer Pink, which will be completed before I return to Ozark. Since the final season will be extended (14 episodes), and we don’t have a start date yet, we only know that we will go well into 2021. So I can’t predict what will come next!


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


Framestore creates variety of animation styles for Libresse/Bodyform spots

Partnering with creatives Nick & Nadja at agency AMV BBDO, Framestore provided animation and VFX for the latest campaign for Libresse and Bodyform. The campaign was directed by Golden Globe winner and Emmy-nominated Nisha Ganatra (Late Night). Framestore provided six animated sequences, each featuring a different style of animation to show the inner worlds that act as reflections of the realities of the uterus. The film was created to dispel myths, encourage a positive conversation and address the life-changing moments in a woman’s life, from miscarriages and menopause to endometriosis.

Framestore creative director Sharon Lock worked with Nick & Nadja to select the styles of animation that would bring to life the emotions and unique perspectives of each story. Styles included 2D cel techniques and stop-frame animation, as well as hand-painted images created with oil paint on glass.

Lock worked with the team of artists to direct the animated sequences, serving as their central creative point of contact with the client and agency. Talking about bringing those visually different elements together into a single cohesive film, she says, “It was important that the animations produced for this film not only looked as good as possible but also made an emotional impact on audiences because of the nature of the film.

“We worked with animators who had wonderful storytelling abilities and whose work was unique and handmade and could communicate a range of tone and emotion to audiences in a short amount of time on screen.”

The team at Framestore, which included producers Niamh O’Donohoe and Emma Cook, was part of the film’s predominantly female cast and crew, which they felt made a big difference in creating something honest and powerful. “We were telling real stories about the experiences of being a woman, so having the team we did meant we had something of a shorthand,” explains O’Donohoe. “We could easily communicate what we needed because there was a mutual understanding of how these stories had to be presented, something that I feel beautifully reflects the messages that Libresse/Bodyform is always communicating.”

Framestore also delivered invisible VFX work for the film’s live-action portions and created a world of uteri, which represents the billions of women who are a part of the Libresse/Bodyform story. These visuals are featured in the opening and closing sequences that will become the brand’s main visual for this campaign. Framestore also provided the color grade.

“It was important that everyone worked really closely together to make sure every frame did its part in telling the stories, and I think the final piece speaks for itself. It was amazing to be part of such an inspiring and creative campaign,” concludes Lock.


Leon Silverman to chair HPA Industry Recovery Task Force

Industry veteran and former Hollywood Professional Association (HPA) president Leon Silverman will lead the HPA Industry Recovery Task Force (IRTF). He is a founder of HPA and continues to serve on its board. Over the course of his decades-long career, he has held executive roles at major studios and entertainment companies including Netflix, The Walt Disney Studios, Kodak and LaserPacific. In these roles, he has focused on the intersection of technology and creativity, working closely with a number of key industry organizations.

The HPA’s Industry Recovery Task Force is focused on the sustainable resumption of the production and post industry with the aim of understanding how to enable content creation in an evolving world impacted by the pandemic crisis. The task force’s upcoming virtual Town Hall events will share the latest health and safety, technical and creative best practices.

“We are at a pivotal moment at an important and challenging time in our industry,” says Silverman. “While the pandemic forces us to evolve the way we work to effectively create and deliver content, we also have a real opportunity to not just get back to work, but to move our industry forward. This Task Force will mobilize experts, artists and technological visionaries from a range of disciplines to thoughtfully collaborate on industry evolution and innovation. The HPA is well suited to help create a common ground and forum for this conversation, and while we may not be in the same room, we can still help bring our industry together. I sincerely believe we can emerge from this current crisis stronger and focused on enhancing creativity and content creation itself.”

The first IRTF Town Hall will be held in July and will be moderated by Hollywood Reporter tech editor Carolyn Giardina. HPA plans to continue this format over the following months as the impact of the pandemic evolves. These events will present the latest knowledge and processes for individuals and companies at work on sets, in post-on-set environments, visual effects companies, studios, production companies and post companies. The first event will feature a panel that includes medical experts, scientists, political leaders, post artists and members of guilds. Video case studies will take pros behind the scenes to learn how facilities and companies have managed the challenges of the pandemic.

“It is extremely important,” notes Silverman, “to collaborate with the key individuals who have scientific knowledge as well as those who have already set standards for returning to work to make sure we are in sync with their guidelines and can educate our HPA community. Ultimately, our aim is to build an incredibly collaborative, creative and technically sound future.”

The specific schedule and speakers for the upcoming town halls will be announced shortly.


VFX supervisor Jay Worth talks Season 2 of Netflix’s Altered Carbon

By Barry Goch

Netflix’s Altered Carbon is now streaming Season 2, with a new lead in Anthony Mackie as Takeshi Kovacs in a new sleeve. He’s the only surviving soldier of a group of elite interstellar warriors, continuing his centuries-old quest to find his lost love, Quellcrist Falconer (Renée Elise Goldsberry). After decades of planet-hopping and searching the galaxy, Kovacs finds himself recruited back to his home planet of Harlan’s World with the promise of finding Quell. In the world of Altered Carbon, lives can be continued after death by taking on a new sleeve and transferring the person’s stack, the device that stores their consciousness.

Jay Worth — Credit: Rob Flate

As you can imagine, there are a ton of visual effects used to tell Takeshi’s story. To find out more, we reached out to Jay Worth, an Emmy Award-winning VFX supervisor with 15 years of experience working in visual effects. His credits include Fringe, Person of Interest and Westworld, for which he won the Emmy for Outstanding Special Visual Effects in 2017.

How did you get involved in Altered Carbon?
I have a relationship with showrunner Alison Schapker. We go way back to the good old days of Fringe and a few other things. I had worked with the head of visual effects and post for Skydance, Dieter Ismagil, and then I had just come off of working on a Netflix show. It worked out for all three of those parties to come together and have me join the team. It was a fun bit of a reunion for us to get back together.

At what point did you come on board for Season 2?
I came in after it was shot in order to usher things through post and guide the final creative push through to delivery. VFX producer Tony Meagher and I were able to keep the ball rolling and push it through to the final. The VFX team at Double Negative and the other vendors we had were really able to carry it through from beginning to end as well.

Tell us about your review process. Where were you based?
We were in Los Angeles — the showrunners, Tony Meagher and I — but the rest of the team was in Toronto: our VFX coordinator, VFX editor, post team and DI facility (Deluxe Toronto). The VFX vendors were spread across Canada. The interesting thing for us was how to set up the review process while being in Los Angeles. We relied completely on ClearView and that amazing technology. We were able to do editorial reviews and full-range color UHD review sessions for final VFX shots. It was a beautiful process. We could review many things in the edit and make a checklist, and then, when we needed to look at a shot in color, we could go downstairs, flip a switch in our bay and have our beautifully calibrated setup. That afforded us the ability to work seamlessly even though we weren’t all in the same place.

This was the first time I had done a show that was so remote. I’ve done many shows where editorial is in one place and the VFX team is in another, but this was the first time I’d done something this ambitious. We did everything remotely, from editorial reviews to effects reviews to color and even the sound, and it was really an amazing, far more seamless process than I thought it would be when we started. The team at Skydance, the production team and the post team really had all the variables dialed in, and it was really painless considering we were spread out. The editorial team and the VFX team on the show side were just phenomenal in terms of how they were able to coordinate with everybody.

Before and After

This production predates the COVID-19 restrictions. Do you think that would have impacted your production?
It would have been a challenge, but not impossible. We would have probably ended up having more ClearView boxes for the team in order to work remotely. I’ve worked recently on other shows that have the colorists working from home, and they’re all tapping into the same box; it just happens to be a pipeline issue. It was doable before, but now there’s just a little bit more back and forth to set up the pipeline.

What was the hardest sequence on “Broken Angels,” the last episode of the season, and why?
One of the larger challenges in visual effects is how to convey something visually from a story perspective and still have it feel real and organic. A lot of times, it ends up being a more challenging hurdle to get over from a visual standpoint when the storytellers are trusting you to help convey these different story points. That’s really where visual effects shine: When you are willing to take on that risk and that narrative responsibility, that’s really where the fun lies.

For the finale, it was telling the story of Angelfire. People kind of understand the overarching idea of satellites and weapons from space, but we had to help people understand the communication between them. We also needed them to understand how it connects to the older technology and what that’s going to mean for our characters. That was by far the biggest challenge for that episode and for the season.

Tell us about the look development of the Angelfire.
It was definitely a journey, but it started with the page and trying to visualize it. Alison Schapker and EP James Middleton had written up what these moments were going to be: a communication tower and a force field around a planet they didn’t quite understand. That was part of the mystery for the viewers and the characters as they were going through the season.

Our goal, from a visual effects standpoint, was to show this ancient-yet-modern communication and to figure out how to visually tell the story of how these things are communicating … that they’re all kind of like-minded and they’re protective. We tee that up when Danica fires off the rocket with the rebels attached to it, so we can see firsthand what these orbitals can do. Then we see Angelfire come down on the soldiers in the forest.

We’re starting to understand more and more what this thing does so that we can understand what the sacrifice really means … to figure out what the orbitals are and how they could look and feel organic and threatening as well as benign and ultimately destructive. I feel like we ended at a point where it makes sense and it all works together, but at the beginning, when you have a blank canvas, it’s a rather daunting task to figure out what it all should look like.

We had so many conversations about how to depict Angelfire. Should it be more like glass breaking? Should it be like lightning? Should it be like a wave? Should it just crackle? Should it splash in? We had so many iterations of things that just didn’t feel or look quite right. It didn’t convey what we wanted it to convey. “It looks too digital; it looks fake.” To end up with something that felt integrated into the environment and the sky was a testament not only to the team’s perseverance but to Alison’s and James’ patience, leadership and ability to explain creatively what they were going for. I’m really happy with where we finally landed.

How did you lock in the final look?
We wanted it to feel organic and real for the audience. We had a lot of different meetings to talk about what perspective we were going to take — how high up we needed to be, how close we needed to be to understand that they were communicating with each other and still firing — and whether those perspectives should be down on the ground or up in the sky. We figured it out with editorial while we were locking episodes, which is a fairly normal process when you’re dealing with full-CG shots mixed with pieces that we shot on the day.

We obviously had numerous versions of animatics, and we had to figure out how it was going to work in the edit before we could lock down animation and timing. Honestly, for the final moments when Kovacs sacrificed himself and Angelfire was going off, we were tweaking those with editorial, and our editorial team did a phenomenal job of helping us realize the moment.

Any people or companies that you want to give a shout-out to?
Bob Munroe (a production-side VFX supervisor) and Tony Meagher. All the work they did was groundwork for everything that ended up on the screen. And all the vendors, like Double Negative, Mavericks, Spin, Switch and Krow. Also our VFX coordinating team and everybody up in Toronto. They were the backbone of everything we did this season. And it was just so much fun to work with Alison and James and the team.

Any advice for people wanting to work in visual effects?
From my standpoint, there are not enough people on the show side of things, so for anyone with a passion for it, there’s a lot of opportunity to get in.

I would say try to find your lane. Is it on the artist side? Is it on the coordinating and producing side? There are so many resources out there now. And now that the technology is available to everybody, it’s an amazing opportunity for creatives to get together, collaborate and make things that are compelling.

When I’m on a show or in the office, I can tell which PA or assistant has a fascination with VFX, and I always encourage them to come along. I have hired from within many times. It’s about trying to educate yourself and figure out what your passion is, and realizing there’s space for almost any role when it comes to visual effects. That’s the exciting thing about it.


Barry Goch is senior finishing artist at The Foundation and an instructor in post production at UCLA Extension.

MisterWives’ Rock Bottom music video: light, dark and neon

American indie pop band MisterWives’ Rock Bottom video was made to promote the band’s first single off its upcoming album. In the video, the band’s lead singer, Mandy Lee, is seen walking the sands and hills of a beach before stepping through a mirror to find the rest of her band on a dance floor. The video combines neon colors and different textures with dark, gray backgrounds: It opens with Lee in dark times and ends with her breaking through the mirror and shining with her band on a swirling dance floor.

To capture the look the band wanted, production was done in different locations at different times of day. This included shooting on a remote beach and in the California desert, into which director and colorist Jade Ehlers and his small crew had to hand-carry all of their camera gear and lighting, including a 100-pound mirror. Ehlers color graded the piece on Resolve and edited on Adobe Premiere.

“We wanted to go for a darker tone, with the neon colors in the darkness that showed that light can shine through even the dark times. The song is about showing it is more about the journey to get to the end of the tunnel than just sitting in the dark times, and the video had to capture that perfectly,” Ehlers says.

The video was shot with a Blackmagic Pocket Cinema Camera 6K, which was chosen because of its small size, high dynamic range and ability to shoot in low light, all essential requirements that allowed Ehlers to shoot at the locations that were best for the song.

“Honestly, because of how different all our scenes were, I knew we needed a camera that had great low light that would allow us to be sparing with light since this shoot had a lot of hiking involved,” he says. “The beach location was quite crazy, and we hiked all of the gear in, so having a small camera bag to carry everything in was great.”

Throughout the video, Ehlers had to adjust for different textures and unexpected lighting problems — including lighting the lead singer’s bright-green puffy dress against a gray background in the desert. Another challenge came from shooting the dance floor scenes, wherein the black floor was not putting out as much light as expected. To compensate and get the shots, Ehlers used the camera’s 13 stops of dynamic range and dual native ISO up to 25,600 along with the Blackmagic Raw codec for high-quality, lifelike color images and skin tones.

“Because of the bit range of the camera’s sensor, I was able to qualify the dress to make it pop a bit more, which was amazing and saved me a lot of extra work. And the dance floor scenes were great but were also harder than we imagined, so we had to push the camera higher on the ISO to get the exposure we needed,” concludes Ehlers.

Warner Bros. De Lane Lea’s Alfred controls assets in the cloud

Warner Bros. De Lane Lea (WB DLL) has launched Alfred, a secure, online portal that gives its clients complete control of their content. This platform plays a key role in the production and post production process, providing the ability to store, distribute and manage assets and data in a speedy and efficient manner.

Developed by the team at Warner Bros. De Lane Lea, Alfred has a number of applications that provide productions with flexible control over their content, including:
– Allowing clients to request media pulls, including VFX, marketing and conform, at any time
– Automatically sending pulls to selected vendors and recipients
– Offering full customization to suit all production requirements
– Providing media tracking for raw and delivered elements
– Giving clients visibility into the status of processes within the facility, including conform and VFX, QC reports and deliverables

The service is available to all WB DLL clients starting today.

Kevin Harwood, head of workflow at Warner Bros. De Lane Lea, oversaw the creation of Alfred from start to finish. “On a typical day, we’ll work through TBs of data and Alfred allows us and our clients to manage and track that data seamlessly, and for it to be accessed anywhere, at any time. Alfred is a flexible platform that can be moulded to each individual client’s needs.”

He continues, “Being able to access and manage content wherever you are is always important, especially now. We’re confident Alfred will help productions that need to find creative solutions when working remotely.”


EditShare adds seamless proxy editing to EFSv

EditShare has added a new seamless proxy-editing feature to its EFSv platform that it says can help make editing in the cloud more cost-efficient while also improving remote editing workflows.

According to EditShare, EFSv optimizes the use of both object and block storage located in the cloud to allow for savings of up to 75% compared to the existing costs of cloud storage and workstations, thereby making cloud editing more accessible to more facilities.
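For a sense of where a figure like 75% could come from, here is a rough, hypothetical cost sketch. The rates and proxy ratio below are illustrative assumptions, not EditShare’s published math; actual cloud pricing varies by provider and region.

```python
# Hypothetical cloud storage rates in USD per GB-month -- assumptions for
# illustration only, not EditShare's pricing model.
BLOCK_PER_GB = 0.10    # assumed rate for SSD-backed block storage
OBJECT_PER_GB = 0.023  # assumed rate for standard-tier object storage

originals_gb = 50_000  # 50TB of camera originals
proxy_ratio = 0.05     # proxies assumed to be ~5% the size of the originals

# Baseline: everything parked on block storage so the NLE can read it.
baseline = originals_gb * BLOCK_PER_GB

# Tiered layout: originals in object storage, only proxies on block storage.
tiered = (originals_gb * OBJECT_PER_GB
          + originals_gb * proxy_ratio * BLOCK_PER_GB)

print(f"baseline ${baseline:,.0f}/mo, tiered ${tiered:,.0f}/mo, "
      f"savings {1 - tiered / baseline:.0%}")
# -> baseline $5,000/mo, tiered $1,400/mo, savings 72% with these assumptions
```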

With this new feature, EFSv eliminates the workarounds associated with proxy editing and conforming to create a seamless proxy-editing experience in all editors, including those from Adobe, Avid and Blackmagic.

“The economics of cloud production have been a barrier to widespread adoption for media companies everywhere. Often it is considered as a backup plan or short-term workaround to a specific situation or project,” says Conrad Clemson, CEO of EditShare. “The implementation of the seamless proxy-editing feature of EFSv unlocks the power of the cloud as the last piece of the content supply chain.”

With EFSv, high-resolution original files can be stored in economical object storage yet read through the EFSv native file system driver, while small, lightweight editing proxies are stored in standard block storage. Both sets of files are always available and accessible to the NLE application — streamlining color grading, effect creation and conforming. Additionally, for Adobe Premiere, the new Flow panel automates the clip import process to ensure Premiere understands that both versions exist and enables toggling between them at any time.

OpenDrives launches scalable NAS storage platform

OpenDrives is offering a new storage solution called the Ultra Hardware Platform that allows users to scale their storage solutions in multiple ways while preserving high performance, low latency and intelligent data integrity. The Ultra platform includes three different NAS hardware series to meet changing business requirements, all controlled by OpenDrives’ Atlas centralized management software, which will soon have some new features.

Designed with speed and low latency in mind, the Ultimate series incorporates an all-flash NVMe design in the form of the F and FD capacity modules to handle the most demanding workflows.

For companies that require a balance between performance and flexibility, the Optimum Series was designed to provide freedom of choice and is configurable with either all-flash NVMe (F, FD) or SAS HDD (H, HD) capacity modules, or both. The workhorse of the Ultra Hardware Platform, the Optimum can handle resource-intensive workflows while also managing data integrity with ease.

The Momentum Series uses OpenDrives’ HDD (H, HD) capacity modules to deliver high performance at a cost-effective price. Designed to excel at write-intensive workflows, such as camera-heavy security surveillance, the Momentum provides power at a price to meet any budget.

OpenDrives will also release the latest version of its Atlas software platform in the fourth quarter as a free upgrade for all OpenDrives users. The next version will combine new intelligence technology with automation in an easy-to-use, single-pane-of-glass centralized management solution for the OpenDrives Ultra hardware family. With the ability to manage all OpenDrives systems from a single interface, companies can manage their entire storage infrastructure no matter how geographically distributed it may be. Plus, Atlas incorporates intelligent data integrity features to guard against data corruption and loss. New features include inline and proactive caching, bandwidth throttling and actionable analytics to support business intelligence.

OpenDrives’ Ultra Hardware Platform and the existing Atlas Operating System are available now. All users will be able to upgrade to the new Atlas software in the fourth quarter.

Lenovo intros next-gen ThinkPad mobile workstations

Lenovo has launched the next generation of its ThinkPad P Series: the ThinkPad P15, ThinkPad P17 and ThinkPad P1 Gen 3; the new ThinkPad P15v; and the ThinkPad X1 Extreme Gen 3. Equipped with high-performance 10th Gen Intel H series mobile processors, these new ThinkPads are available in a variety of configurations.

ThinkPad P1 Gen 3

The ThinkPad P Series and the ThinkPad X1 Extreme Gen 3 feature the new Ultra Performance Mode, exclusive to these systems, which allows users to take full control of their performance settings. Users can now dial up the system, ensuring peak performance when they need to complete a render as fast as possible or demo high-fidelity VR content while maintaining a stable frame rate.

Enabled by default as a setting in BIOS, Ultra Performance Mode relaxes restrictions on acoustics and temperature, allowing users to tap into the GPU and CPU and leverage an improved thermal design to maintain the integrity of the machine and deliver increased performance.

A complete reengineering of the thermal design optimizes performance on the ThinkPad P15 and P17 over their predecessors, resulting in what Lenovo says is 13% more airflow, a 30% larger CPU heat sink, larger vents and a new thermal mesh that dissipates heat faster.

Lenovo has also moved to a new daughter-card design instead of relying on a soldered solution. The ThinkPad P15 and P17 will feature this modular design, offering four times as many GPU and CPU configurations as previous generations. With Nvidia Quadro RTX GPUs on board, the ThinkPad P15 and P17 support higher-wattage graphics than their predecessors, increasing from 80 watts to 90 watts and from 90 watts to 110 watts, respectively. This increase allows users to select the right configuration for their needs — optimizing performance for their workflow directly out of the box and enabling more complex graphics on a mobile workstation.

ThinkPad P15 and P17

The ThinkPad P15 and P17 share additional features – including a new 94Whr battery, up to 4TB of storage, up to 128GB of DDR4 memory and UHD Dolby Vision HDR displays.

The ThinkPad P15 and P17 will be available in July starting at $1,979 and $2,119, respectively.

Lenovo’s thinnest and lightest 15-inch mobile workstation – the ThinkPad P1 Gen 3 – has been updated with additional usability features, including a new anti-smudge coating, upgraded speakers and a new UHD LCD display option with a 600-nit panel. For mobile workstation users in areas without reliable Wi-Fi access, the ThinkPad P1 Gen 3 also offers optional LTE WWAN for fast connectivity on the go.

The ThinkPad P1 Gen 3 will be available in July starting at $2,019.

ThinkPad X1 Extreme Gen 3

The latest ThinkPad X1 Extreme Gen 3 is designed for advanced users seeking a high-performance Windows 10 laptop with 10th Gen Intel H series vPro mobile processors up to Core i9 and optional Nvidia GeForce GTX 1650 Ti graphics. This combination of processing power and high-performance graphics, along with a 15.6-inch display offering up to 600 nits of brightness, gives users advanced productivity and collaboration capabilities.

New Wi-Fi 6 and optional Cat 16 LTE-A wireless WAN provide reliable high-speed data transfers for a highly efficient remote working experience. Modern Standby helps ensure emails, messages and updates are received, even when the lid is closed, and allows rapid resume.

The ThinkPad X1 Extreme Gen 3 will be available in July. Pricing is to be announced.

Rounding out the mobile workstation portfolio is the new ThinkPad P15v. Powered by 10th Gen Intel H series mobile processors, the 15-inch P15v offers a UHD 600-nit LCD display and the Nvidia Quadro P620 GPU.

The ThinkPad P15v will be available in July starting at $1,349.

Tom Kendall

Picture Shop VFX and Ghost merge, Tom Kendall named president

Ghost artists at work in the Copenhagen studio.

Streamland Media (formerly Picture Head Holdings) has consolidated its visual effects offerings under the Ghost VFX brand. Picture Shop’s visual effects division will merge with Ghost VFX to service feature film, television and interactive media clients. LA-based Picture Shop, as part of the Streamland Media Group, acquired Denmark’s Ghost VFX in January.

Tom Kendall, who headed Picture Shop VFX, will move into the role of president of Ghost VFX, based out of the Los Angeles facility. Jeppe Nygaard Christensen, Ghost co-founder and EVP, and Phillip Prahl, Ghost SVP, will continue to operate out of the Copenhagen studio.

“I’m very excited about combining both teams,” says Kendall. “It strengthens our award-winning VFX services worldwide, while concentrating our growing team of talent and expertise under one global brand. With strategic focus on the customer experience, we are confident that Ghost VFX will continue to be a partner of choice for leading storytellers around the world.”

Over the years, Ghost has contributed to more than 70 feature films and series. Some of Ghost’s work includes Star Wars: The Rise of Skywalker, The Mandalorian, The Walking Dead, See, Black Panther and Star Trek: Discovery. Recent Picture Shop VFX credits include Hawaii Five-0, Magnum P.I., The Walking Dead and Fear the Walking Dead.

The Streamland Media Group includes Picture Shop, Formosa Group, Picture Head, Ghost VFX, The Farm and Finalé, with locations in the US, Denmark, Canada and the UK.

AJA upgrades Ki Pro Go H.264 recorder/player

In response to user requests, AJA Video Systems has released Ki Pro Go v2.0 firmware for its portable multi-channel H.264 recorder and player. The update introduces enhancements for improved H.264 recording quality and reliability, including support for bit rates up to 25Mb/s, 10-bit color and 4:2:2 chroma sampling, in addition to new expanded timecode capabilities with LTC, enhanced super out and front-panel audio monitoring, in-system drive formatting, network file downloads and gang recording support.

Ki Pro Go offers up to four channels of simultaneous HD or SD recording from HDMI or SDI sources direct to off-the-shelf USB drives. The new firmware also provides 4:2:2 chroma sampling and 10-bit options for capturing richer imagery. Ki Pro Go now offers five bit rates — 5Mb/s (Low), 10Mb/s (Med-Low), 15Mb/s (Medium), 20Mb/s (Med-High) and 25Mb/s (High) — providing users with increased flexibility to choose the bit rate that suits their production needs.
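Those bit rates translate directly into record time per drive, which is the practical planning question. A back-of-the-envelope sketch (video essence only; audio tracks and container overhead, not modeled here, would trim these numbers slightly):

```python
# Approximate record time per 1TB USB drive at each Ki Pro Go bit rate.
# Video essence only -- audio and file-container overhead are ignored.
DRIVE_GB = 1000

for label, mbps in [("Low", 5), ("Med-Low", 10), ("Medium", 15),
                    ("Med-High", 20), ("High", 25)]:
    gb_per_hour = mbps / 8 * 3600 / 1000   # Mb/s -> GB recorded per hour
    hours = DRIVE_GB / gb_per_hour
    print(f"{label:8s} {mbps:2d}Mb/s ≈ {gb_per_hour:5.2f}GB/hr, "
          f"~{hours:3.0f} hours per 1TB drive")
# At the 25Mb/s High setting that works out to ~11.25GB/hr, or roughly
# 89 hours of material on a single terabyte.
```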

New enhanced super out and front-panel audio monitoring also display the remaining media percentage and audio meters for all four video channels for improved user monitoring. Ki Pro Go v2.0 further expands timecode choices by offering a new option for LTC on one of the analog audio inputs, enabling the second analog audio input channel to function as a mono input.

Additionally, the firmware introduces new in-system media formatting, eliminating the need for a separate PC. Network file downloading allows for more streamlined use in critical live production environments, giving the user the option to move recorded files to a central server on the LAN.

Gang support has also been added: Users can connect multiple Ki Pro Go devices via Ethernet and control the entire group from a single unit using Ki Pro Go’s web-based UI or front-panel button controls.

Ki Pro Go v2.0 firmware is available now as a free download from the AJA website. Ki Pro Go is available through AJA’s reseller network for $3,995.

Review: Sound Devices MixPre-3 II portable 5-track audio recorder

By Brady Betzel

Even though things are opening up slowly, many of us are still spending the majority of our time at home. Some of us are lucky enough to be working, some still furloughed and some unemployed. Many are using the time to try new things.

Here is one idea: While podcasts might not be a moneymaker out of the gate, they are a great way to share your knowledge with the community. Whether you’re making video or audio, there is one constant: You need high-quality audio recording equipment. In this review, I am going to be covering Sound Devices’ MixPre-3 II three-preamp, five-track, 32-bit float audio recorder.

While at Sundance this past January, I saw someone using this portable recorder. It seemed easy to use and very durable, and I was intrigued enough to reach out to Sound Devices about a review; they sent me the MixPre-3 II, their smallest and most portable recorder. The box can run off AC power, USB-C or four AA batteries. The MixPre-3 II has several advancements over the original MixPre-3, including 32-bit float recording, USB audio streaming, recording at up to 192kHz, faster hardware, internal LTC timecode generation and output, adjustable limiters, auto-copy to a USB drive and a pre-roll buffer increased to 10 seconds. But really, the MixPre-3 II is a rugged field audio recorder, voiceover recorder, podcast recorder and more. It currently retails for around $680 from retailers like Sweetwater and B&H.
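The headline feature there is 32-bit float recording, and the reason it matters is headroom. The quick dynamic-range arithmetic below is the standard 6.02dB-per-bit calculation for PCM plus the IEEE-754 float exponent range; it is generic math, not a Sound Devices spec:

```python
import math

# Theoretical dynamic range of fixed-point PCM: ~6.02dB per bit.
for bits in (16, 24):
    print(f"{bits}-bit fixed: ~{20 * math.log10(2 ** bits):.0f} dB")

# IEEE-754 32-bit float keeps 24 bits of mantissa precision but adds an
# 8-bit exponent, spanning normal values from 2**-126 up to nearly 2**128.
# That is ~2**253 of representable range, so a signal recorded too hot
# can be pulled down in post instead of clipping at capture.
span_db = 20 * math.log10(2.0 ** 127 / 2.0 ** -126)
print(f"32-bit float: ~{span_db:.0f} dB of representable range")
# -> 16-bit ~96dB, 24-bit ~144dB, 32-bit float ~1523dB
```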

One of my goals for this review was to see how easy this recorder was to set up and use with relatively little technical know-how. It was really simple. In my mind, I wanted to plug in the MixPre-3 II and begin recording — and to my surprise, I was up and running within 10 minutes.

Up and Running
To test it, I grabbed an old AKG microphone (which I got when I purchased an entire Avid Nitris offline edit bay after Matchframe went out of business), an XLR cable, my Android phone and a spare TRRS cable to plug my phone into the MixPre-3 II for audio. I accessed the menus using the touch screen and the gain knobs. I was able to adjust the XLR mic on Input 1 and the phone on Input 2, which I set by pushing the gain knob to assign the input to the aux/mic input, and I plugged my headphones into the headphone jack to monitor the audio.

The levels on the on-screen display, used in conjunction with my headphones, let me dial in my gain without raising the noise floor too much. I was actually impressed by how quiet it was; I attribute the clean audio to my AKG mic and the MixPre’s Kashmir microphone preamps. The audio was surprisingly clean, even when recording in a noisy garage. I used Spotify on my Android phone to mix in songs while I was talking on the AKG (like a podcast), and within 10 minutes, I was ready to record.

Digging Deeper
Once I was up and running, I dove a little deeper and discovered that the MixPre-3 II can connect to my phone using Sound Devices’ Wingman app. The Wingman app can trigger recording as well as monitor your inputs. I then remembered I had a spare Timecode Systems Ultra Sync One timecode generator from a previous review. One essential tool when working with backup audio or field recording during a video shoot is sync.

Without too much work, I plugged in the Ultra Sync One using a Mini BNC-to-3.5mm cable connector to send mic level LTC timecode to the MixPre-3 II via the aux/mic input. I then enabled external timecode through the menus and had timecode running to the MixPre-3 II. The only caveat when using the 3.5mm plug for timecode from the Ultra Sync One is that you lose the ability to feed something like a 3.5mm mic or phone into the MixPre-3 II. But still, it was easy to get external timecode into the recorder.

It is really amazing that the MixPre-3 II gives users the ability to be up and running in minutes, not hours. Beyond the simplicity of use, you can dive deeper into the Advanced Menu to assign different inputs to different gain knobs, control the MixPre-3 II over USB, use timecode or HDMI signals to trigger recording and much more.

Summing Up
Sound Devices produces some great products. The MixPre-3 II costs under $700; while that might not be cheap, it’s definitely worth it. The high-quality casing and ease of use make it a must-buy if you are looking for a podcast recorder, field audio recorder or mixer.

In addition to its product line, Sound Devices is also one of those companies making a difference during the pandemic.

The past couple of months have been very eye-opening for our industry and the world. We are seeing the best from people and businesses. My wife began sewing masks from her own fabric for hospital workers (for free), people are donating their time and money to bring meals to children and the elderly, and we’ve seen so many more amazing acts of kindness.

Sound Devices recently began producing face shields. After we get through these hard times, I know that I and many others will remember the companies and people who tried to do their best for the community at large. Sound Devices is one of those companies.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


OWC intros Mercury Extreme Pro 6G SSD with 4TB capacity

OWC is offering the latest version of its Mercury Extreme Pro 6G SSD 2.5-inch SATA solid state drives. Available in capacities from 240GB ($79.75) up to 4TB ($899.75), the new device is engineered to bring older Macs and PCs up to current-model performance levels.

The Mercury Extreme Pro 6G SSD targets users who require sustained performance in audio, video and production applications when a drop in performance could lead to lost frames and production time.

The company says that adding a Mercury Extreme Pro 6G SSD to any Mac or PC can restore it to like-new performance levels and extend the longevity of the machine. Built with high-quality NAND flash memory and controller design, the Mercury Extreme Pro 6G SSD delivers sustained, real-world-tested read/write speeds of over 500MB/s across its full storage capacity, resulting in faster boot and application launch times and greatly improved system responsiveness.


Telestream’s Prism waveform monitor upgraded for SDI, IP workflows

Telestream has upgraded its Prism waveform monitor with new functionality and software-based features. Since acquiring the monitor technology, Telestream has developed it into a single solution suited to both SDI and IP workflow applications.

Prism can now be configured for all the traditional SDI waveform monitoring tools required in operations, compliance, quality control and post workflows up to 8K resolution. Simultaneously, the same product offers a comprehensive suite of IP-based waveform monitoring tools up to 4K resolution on 25G Ethernet. Prism includes enhanced high dynamic range and wide color gamut reports and tools to increase efficiency.
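The 25G figure is easy to sanity-check with generic uncompressed-video-over-IP math (a SMPTE ST 2110-style active-picture payload calculation; the numbers below are a standard exercise, not Telestream’s spec sheet):

```python
# Why monitoring uncompressed UHD over IP wants a 25G Ethernet port.
# Active-picture payload only; RTP/UDP/IP headers add a few percent more.
width, height, fps = 3840, 2160, 60
bits_per_pixel = 20  # 10-bit 4:2:2 = 10 bits luma + 10 bits shared chroma

payload_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"UHD {fps}p 10-bit 4:2:2 ≈ {payload_gbps:.2f} Gb/s before overhead")
# ≈ 9.95Gb/s -- enough to saturate a 10GbE link once packet overhead is
# added, which is why a UHD flow lands on 25G Ethernet in practice.
```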

Prism is a software-based solution, which means it can be installed on one physical device that’s used to support a complete range of applications and features. The Prism user interface and API are remotely accessible, enabling remote work and socially distanced production environments, which are especially relevant during the pandemic. Prism enables multi-user flexibility, where the operators do not need to be at the same physical location as the device.

In addition to remote working, touch-screen and dual-screen options are supported, allowing users to adapt to their preferred working environment. All functionality is available on the same user interface whether working remotely or using a touch screen.

Estudios GGM to resume production, open new soundstages

Estudios GGM in Mexico is unveiling three new soundstages as it prepares to resume production activity later this month. Ranging from 10,000 to 13,000 square feet, the new stages will be the studio’s largest and give it a total of nine shooting spaces. Construction of one stage is already complete, while work on the other two will be finished by November, when the studio expects to be supporting a full slate of television and feature productions.

Planned before the coronavirus outbreak, the new stages are meant to serve Mexico’s accelerating boom in television and film production. Launched in 2016, Estudios GGM was operating at capacity prior to the lockdown, providing stages, production offices, casting, editing, visual effects and other services to projects from Telemundo, Netflix, Amazon, Viacom, MGM and other producers. Enemigo Intimo, Falsa Identidad, El Club, Luis Miguel: The Series and Ingobernable are among the streaming series recently shot in whole or in part at the studio.

Francisco Bonilla

“We expect production activity to pick up rapidly beginning in June,” says Estudios GGM CTIO Francisco Bonilla. “We built these stages to increase capacity and meet the needs of producers from around the world who want to shoot in Mexico. They are large shooting spaces with high ceilings, supported by many other resources to accommodate a cinematic style of production.”

In addition to the social distancing guidelines mandated by the Mexican government, the studio will apply a variety of health and safety measures to protect cast and crew, including culture changes and hygiene training for work and everyday life; thermal CCTV monitoring; periodic chemical, ozone and UV sanitization; and restricted access to facilities, sets and offices. The new stages are complemented by modular, multi-purpose spaces that will allow directors, cinematographers, control room crew and other personnel to work in isolation. Other steps will include regular sanitizing of cameras, lighting, wardrobe and props; the use of masks and gloves; and modifications to craft and catering services. All the studio’s stages are equipped with HVAC systems that draw fresh air from outdoors to reduce the risk of spreading infection.

“We are working with local health officials and medical advisors to develop appropriate protocols,” notes Bonilla. “We are also monitoring the situations in Spain, Italy, Germany, Iceland, Australia and other countries where production has resumed. We are gathering as much information as possible to allow production to ramp up quickly but safely.”

While production has been curtailed during the lockdown, other work has continued. The studio has been using Bebop remote collaboration technology and Adobe tools to allow sound and picture editors, visual effects artists and others to carry on their work remotely. It has also been serving as a beta site for Avid On-Demand, a cloud-based editing platform. Similarly, post finishing has continued at Cinematic Media, the post facility located within the studio complex, with most staff working off site.

Estudios GGM is also expanding its visual effects department. It is hiring artists and adding new capabilities, including high-end motion capture and virtual set technology. Demand for visual effects services has risen dramatically along with the broader push to elevate production value. The studio expects the need for sophisticated visual effects to grow as productions look to limit travel and location production.

For producers eager to get back to production, Estudios GGM wants to make the process simple by providing one-stop solutions. “We provide everything necessary to produce premium television and cinema,” Bonilla says. “That includes experienced talent and crew to reduce the need to travel or bring people from outside the country.”

Review: Dell UltraSharp monitors — 49-inch curved and 27-inch 4K

By Brady Betzel

I’ve been using dual monitors for most of my life. When I was a kid learning Avid, Premiere, Photoshop and more, I had two separate monitors. This was so I could watch tutorials on one monitor while working in the app on the other. What I love about using two separate monitors is the ability to section off different parts of my monitoring. So basically, OCD.

When I started seeing giant, curved screens at Fry’s and Best Buy, I was very interested but also worried that I wouldn’t like one big fluid monitor. Weird, I know. So when Dell sent me a 49-inch curved monitor for review, I jumped at the opportunity. Dell also sent a 27-inch 4K UltraSharp monitor to check out. I figured it would be a good output monitor to use with my Blackmagic DeckLink 4K Extreme for watching full-screen video when working in Resolve or Avid Media Composer.

Up first is the monster Dell 49-inch U4919DW curved monitor, which currently retails for $1,439.99. It really is a monster; it is so wide. Full disclosure: I have always been a fan of Dell monitors. They always look sleek, have a clean image and last a long time.

The U4919DW not only looks good but hosts lots of inputs and a very handy picture-by-picture (PBP) feature. It’s an LED-backlit LCD with in-plane switching (IPS), meaning a wide viewing angle. It offers a native resolution of 5120x1440 at 60Hz, 350cd/m2 brightness, an 8ms response time in normal mode and 5ms in fast mode, a matte anti-glare coating, a 1000:1 contrast ratio, 99% sRGB coverage and a security slot, and it weighs about 58 pounds.

This monitor also comes with a three-year Advanced Exchange Service and Premium Panel Guarantee, which is amazing. The Premium Panel Guarantee will replace the panel for free if it has even one dead pixel, and the Advanced Exchange Service will ship a replacement the next business day if the monitor needs to be replaced. And while I can’t imagine using two of these, there is even a mounting pole available for stacking them. And for those wondering, the stand that comes with this monitor doesn’t pivot to portrait.

Real-World Testing
I’ve been working on this curved monitor for the last few months, and I love it. If I have one critique, it might be that it’s too big for me. I like to keep my monitors within 2 to 3 feet, and sometimes I find myself panning my head just to see everything. To solve this, the U4919DW can be set up as essentially two separate 27-inch monitors on the one 49-inch monitor.

When working in Blackmagic Resolve, Avid Media Composer and Adobe Premiere, this is how I like to work, and I did it using the picture-by-picture option. PBP requires two separate inputs, one feeding each half of the screen. However, when the kids play Roblox and Fortnite, they like the full 49-inch-wide viewing angle. Dang kids are spoiled with some of the gear I get to test — when I hear, “Dad, this monitor is too small,” I have to give the “back in my day” speech. But I digress.

The Dell 49-inch U4919DW is 99% sRGB-accurate, which is nice for the interface, but it can’t be relied on to be “color-accurate” the way an external display fed by an I/O card like the Blackmagic DeckLink can. There are a lot of inputs on this beast: two HDMI 2.0 (HDCP 2.2) (10-bit color at 60Hz), one DisplayPort 1.4 (HDCP 2.2) (10-bit color at 60Hz), five USB 3.0 downstream ports, two USB 3.0 upstream ports, and one USB Type-C with alternate mode with DP 1.4, power delivery and USB 2.0 (8-bit color at 60Hz).

This is a phenomenal monitor when working in multiple apps or on multiple computers. Sometimes I have two computers hooked up so I can watch a progress bar or render on one while I work on the other via the PBP feature. The built-in KVM feature was something I didn’t think I would care about, but now I use it all the time. At around $1,440, the Dell U4919DW is not cheap, but it is feature-packed and gigantic. I would love it if the monitor had a higher-than-60Hz refresh rate, but it is still a great multimedia monitor.

Dell UltraSharp 27-Inch
If you are looking for a smaller monitor but also need HDR output, DCI-P3 color coverage and an overall higher-quality panel, then the Dell UltraSharp 27-inch 4K USB-C is a great option. The Dell U2720Q is currently priced at $579.99. There is also a 43-inch version, which currently retails for $839.99.

Technically, the U2720Q has similar specs to the U4919DW: a 27-inch IPS LED-backlit display, 3840x2160 (UHD) resolution, 350cd/m2 brightness, a 1300:1 contrast ratio, an 8ms response time in normal mode and 5ms in fast mode, 1.07 billion colors and an anti-glare matte coating. It can be rotated vertically, and it weighs 21.16 pounds.

For us multimedia folks, the color accuracy is also a great feature. It covers 99% of the sRGB color gamut, 99% of Rec. 709 and 95% of the DCI-P3 wide color gamut. The U2720Q has a ton of inputs as well: a security lock slot, HDMI, full-size DisplayPort, USB-C/DisplayPort, audio line out and two USB downstream ports. On the side are an additional USB 3.0 port and a USB-C port. This monitor has the same three-year warranty as the curved monitor, with the Premium Panel Guarantee and the Advanced Exchange Service.

Summing Up
In the end, the Dell UltraSharp monitors are phenomenal. Dell monitors always look sleek, with very minimal bezels. From the curved, ultra-wide 49-inch U4919DW to the 27-inch U2720Q, Dell continues to put out quality products. These monitors are a solid choice for anyone looking to upgrade their current monitor.

There may be monitors with higher refresh rates, but they will either cost more or give up perks like the 95% DCI-P3 gamut coverage.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Pixelworks hires post vets to lead TrueCut motion grading initiatives

Pixelworks, which provides video and display processing solutions, has beefed up its team with the hire of industry veterans Sarah Priestnall and Bruce Harris. The duo will lead TrueCut motion grading technology initiatives in North America, specifically targeting Hollywood.

Introduced in 2019, Pixelworks TrueCut motion grading provides filmmakers with the ability to deliver at a cinematically tuned high frame rate while filming at any frame rate. This allows for a broader set of motion appearances. The platform takes advantage of current cinema and home-entertainment displays while also ensuring a consistent motion appearance across different devices and screens that is faithful to the original artistic intent.

Priestnall joins as the director of product marketing, heading up the Burbank office and advancing the TrueCut commercial rollout. Harris joins as creative engineer and artist, responsible for usability design, customer training and both technical and artistic support. Priestnall and Harris both report directly to executive vice president of technology Richard Miller, who will continue to lead strategic development of the TrueCut platform.

“We’re excited to bring Sarah and Bruce on board as we ramp up our efforts in Hollywood and continue the success of our TrueCut technology around the globe,” Miller says. “We are working with studios, streaming services, post production facilities and creatives to ensure that the original intent of artists and filmmakers is displayed on screens of all sizes, from the cinema to the home and beyond.”

Priestnall has been evangelizing new technologies for movie and television production and post throughout her career. She was deeply involved in the development of the digital intermediate process at Cinesite, working closely with Roger Deakins, ASC, BSC, and the Coen brothers on O Brother, Where Art Thou? She also led Codex’s worldwide marketing efforts. Priestnall is a board member of the Colorist Society International and an associate member of the American Society of Cinematographers, and she wrote the chapter on digital post production in the latest ASC manual.

Harris began his career working on movie sets as a propmaker, including on major motion pictures such as Pulp Fiction and A River Runs Through It. He then transitioned into visual effects, becoming a longstanding member of the Visual Effects Society. Working all over the world, he has used his artistic talents as a compositor on productions such as The Aviator and Guardians of the Galaxy.

5th Kind targets post with updates to Core DAM

5th Kind has updated its Core platform with optimized post workflows. Core, 5th Kind’s cloud-based digital asset management (DAM) and realtime collaboration platform, is used in post and dailies workflows across desktop, mobile and Apple TV devices.

5th Kind’s upcoming Core 6.5 release will feature the new Real-Time Review Player, which allows post teams to securely review videos, images, documents and audio files. By the end of the year, 5th Kind says, the tool will also allow review of 3D models in real time. With annotation tools and accurate synchronization down to the page and video frame, all watermarked and encrypted, the Real-Time Review Player lets teams make quick decisions in workflows surrounding scripts, contracts, dailies, VFX, previz, marketing and distribution.

Other new Core features include seamless integration with Box.com, so users can link their Box accounts into the 5th Kind platform to browse, view and ingest their Box files into the system. Using the folder path and file name, users can associate their files to their organizational taxonomies. Those files can then be viewed in the Box folder structure and the Core meta-structure.

Additionally, the ALE importer is a new feature for post workflows. Users will be able to import any format ALE into Core and map the files to any tag in the system. This allows complete flexibility when building playlists and importing meta tags.
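ALE (Avid Log Exchange) is a plain tab-delimited text format with Heading, Column and Data sections, which is what makes this kind of tag mapping tractable. Here is a minimal parser sketch to show the shape of the data; it is a generic illustration of the format, not 5th Kind’s importer, and the tag-mapping lines at the end are hypothetical:

```python
def parse_ale(path):
    """Parse an Avid Log Exchange (ALE) file into heading metadata
    and a list of per-clip dictionaries keyed by column name."""
    heading, columns, clips = {}, [], []
    section = None
    with open(path, encoding="utf-8") as f:
        for raw in f:
            line = raw.rstrip("\n")
            if line in ("Heading", "Column", "Data"):
                section = line          # section marker lines
                continue
            if not line.strip():
                continue                # skip blank separator lines
            if section == "Heading":
                key, _, value = line.partition("\t")
                heading[key] = value    # e.g. FIELD_DELIM, VIDEO_FORMAT, FPS
            elif section == "Column":
                columns = line.split("\t")
            elif section == "Data":
                clips.append(dict(zip(columns, line.split("\t"))))
    return heading, clips

# Hypothetical mapping of an ALE column onto an organizational tag:
# heading, clips = parse_ale("dailies_day01.ale")
# scene_tags = {c["Name"]: c.get("Scene", "") for c in clips}
```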

Core’s mobile, Apple TV and TV screener apps have also been further enhanced. The mobile app now supports Face ID and has streamlined offline downloads, while the Apple TV app offers multi-user login to streamline conference room usage. As with all builds, 5th Kind has also made a range of performance and stability improvements across the Core platform and its device-based apps.

Colorist Chat: Light Iron’s Nick Hasson

Photography plays a big role in this colorist’s world; he often finds inspiration through others’ images.

Name: Nick Hasson

Company: Light Iron

Can you describe what Light Iron does?
Light Iron is a full-service post company providing end-to-end solutions — including dailies and finishing in both HDR and SDR.

The L Word

As a colorist, what would surprise people the most about what falls under that title?
Colorists are one of the select few creatives that touch every frame of a project. Working with the cinematographer and director, we help shape the tone of a project. It’s very collaborative.

Are you often asked to do more than just color on projects?
Almost every project I do has a visual effects component to it. I have a background in visual effects and online editing, so I am comfortable in those disciplines. I also tend to do a lot of sky replacements, beauty and cleanup work.

What’s your favorite part of the job?
Being creative on a daily basis. Problem solving is another fun aspect of the job. I love finding solutions and making the client smile.

What’s your least favorite?
I like being outside. The long days in a dark room can be a challenge.

Queen of the South

If you weren’t a colorist, what would you be doing instead?
Electrical engineering or network infrastructure. I’m a big geek and love to build computers and push technology.

How did you choose color grading as a profession?
I was originally heading to a career in music. After a year of touring, I decided it was not for me and got a job at a post house. I was lucky enough to work in both VFX and telecine at the time. Photography was always my first love, so color grading just felt right and fell into place for me.

What are some recent projects you’ve worked on?
I’m lucky to work in both episodic and feature film. Recent movies include Corporate Animals, Sweetheart, Boss Level, Wander Darkly and Like a Boss. On the episodic side, I have been working on The L Word, Room 104, Queen of the South, Greenleaf, Exhibit A and The Confession Tapes.

Room 104

What is the project you are most proud of?
Room 104 is a big challenge. Not many projects get to Season 4. Coming up with looks that aid the storytelling and are different every episode has been exciting and creatively challenging. We do a lot of the look design in pre-production, and I love seeing what the cinematographers come back with.

Where do you find inspiration?
I love photography! I like to seek out interesting photographers and see how they are pushing the limits of what can be done digitally. I shoot black-and-white film every week. It is a great way to study composition and lighting.

Name three pieces of technology you can’t live without.
My phone, air-conditioned car seats and Amazon.

What social media channels do you follow?
I only use Instagram, and I tend to follow hashtags rather than specific outlets. It gives my feed a broader reach and keeps things fresh.

How do you de-stress from it all?
Spending time with my family. Working on my old cars and playing guitar. I also ride mountain bikes and love to cook in a wood-fired oven.

Hulu’s The Great: Creator and showrunner Tony McNamara

By Iain Blair

Aussie writer/director Tony McNamara is the creator, showrunner and executive producer of Hulu’s The Great, the new 10-episode series starring Elle Fanning as Catherine the Great and Nicholas Hoult as Russian Emperor Peter III. The Great is a comedy-drama about the rise of Catherine the Great — from German outsider to the longest reigning female ruler in Russia’s history (from 1762 until 1796).

Season 1 is a fictionalized and anachronistic story of an idealistic, romantic young girl who arrives in Russia for an arranged marriage to Emperor Peter. Hoping for love and sunshine, she finds instead a dangerous, depraved, backward world that she resolves to change. All she has to do is kill her husband, beat the church, baffle the military and get the court on her side. A very modern story about the past, which incorporates historical facts occasionally, it encompasses the many roles she played over her lifetime — as lover, teacher, ruler, friend and fighter.

L-R: Tony McNamara and cinematographer John Brawley

McNamara most recently wrote the Oscar-winning film The Favourite, for which he received an Academy Award nomination for Best Original Screenplay. His other feature film credits include The Rage in Placid Lake, which he wrote and directed, and Ashby.

McNamara has written some of Australia’s most memorable television series, including The Secret Life of Us, Love My Way, Doctor Doctor and Spirited. He also served as showrunner of the popular series Puberty Blues.

I recently spoke with McNamara, who was deep in post, about making the show and his love of editing and post.

When you wrote the stage play this is based on, did you also envision it as a future TV series?
Not at all. I was just a playwright, and I’d worked a bit in TV, but I never thought of adapting it. But then Marian Macgowan, my co-producer on this, saw it and suggested making a movie of it, and I began thinking about that.

What did the stage version teach you?
That it worked for an audience, that the characters were funny, and that it was just too big a story for a play or a film.

It’s like a Dickensian novel with so many periods and great characters and multiple storylines.
Exactly, and as I worked more and more in TV, it seemed like the perfect medium for this massive story with so many periods and great characters. So once the penny dropped about TV, it all went very fast. I wrote the pilot and off we went.

I hear you’re not a fan of period pieces, despite this and all the success you had with The Favourite. So what was the appeal of Catherine and what sort of show did you set out to make?
I love period films like Amadeus and Barry Lyndon, but I don’t like the dry, polite, historically accurate, by-the-numbers ones. So I write my things thinking, “What would I want to watch?” And Catherine’s life and story are so amazing, and anything but polite.

What did Elle Fanning and Nicholas Hoult bring to their roles?
They’re both great actors and really funny, and that was important. The show’s a drama in terms of narrative, but it also feels like a comedy, but then it also gets very dark in places. So they had to be able to do both — bring a comic force to it but also be able to put emotional boots on the ground… and move between the two very easily, and they can do that. They just got it and knew the show I wanted to make before we even got going. I spent time with them discussing it all, and they were great partners.

Where do you shoot?
We did a lot of it on stages at 3 Mills Studios in London and shot some exteriors around London. We then went to this amazing palace near Naples, Italy, where we shot exteriors and interiors for a couple of weeks. We really tried to give the show a bit more of a cinematic feel and look than most TV shows, and I think the production design is really strong. We all worked very hard to not make it feel at all like sets. We planned it out so we could move between a lot of rooms so you didn’t feel trapped by four walls in just one set. So even though it’s a very character-driven story, we also wanted to give it that big epic sweep and scope.

Do you like being a showrunner?
(Laughs) It depends what day it is. It’s a massive job and very demanding.

What are the best parts of the job and the worst?
I love the writing and working with the actors and the director. Then I love all the editing and all the post — that’s really my favorite thing in the whole process after the writing. I’ve always loved editing, as it’s just another version of writing. And I love editors, and ours are fun to hang out with, and it’s fun to try and solve problems. The worst parts are having to deal with all the scheduling and the nuts and bolts of production. That’s not much fun.

Where do you post?
We do it all in London, with all the editing at Hireworks and all the sound at Encore. When we’re shooting at the studios we set up an edit suite on site, so we start working on it all right away. You have to really, as the TV schedule doesn’t allow much time for post compared with film.

Talk about editing. You have several editors, I assume because of the time factor. How does that work?
We had three editors, who are all so creative and inventive. I love getting all the material and then editing and tweaking things, particularly in comedy. There’s often a very fine line in how you make something funny and how you give the audience permission to laugh.

I think the main editing challenges were usually the actual storytelling, as we tell a lot of stories really fast, so it’s managing how much story you tell and how quickly. It’s a 10-hour story; you’re also picking off moments in an early episode that will pay off far later in the series. Plus you’re dealing with the comedy factor, which can take a while to get up and running in terms of tone and pace. And if there’s a darker episode, you still want to keep some comedy to warm it up a bit.

But I don’t micro-manage the editors. I watch cuts, give some notes and we’ll chat if there are big issues. That way I keep fresh with the material. And the editors don’t like coming on set, so they keep fresh too.

How involved are you with the sound?
I’m pretty involved, especially with the pre-mix. We’ll do a couple of sessions with our sound designer, Joe Fletcher, and Marian will come in and listen, and we’ll discuss stuff and then they do the fixes. The sound team really knows the style of the soundscape we want, and they’ll try various things, like using tones instead of anything naturalistic. They’re very creative.

Tony McNamara and Elle Fanning on set

There’s quite a lot of VFX. 
BlueBolt and Dneg did them all — and there are a lot, as period pieces always need a ton of small fixes. Then in the second half, we had a lot of stuff like dogs getting thrown off roofs, carriages in studios that had to be running through forests, and we have a lot of animals — bears, butterflies and so on. There’s also a fair whack of violence, and all of it needed VFX.

Where do you do the DI?
We did the grading at Encore, and we spent a lot of time with DP John Brawley setting the basic look early on when we did the pilot, so everyone got it. We had the macro look early, and then we’d work on specific scenes and the micro stuff.

Are you already planning Season 2?
I have a few ideas and a rough arc worked out, but with the pandemic we’re not sure when we’ll even be able to shoot again.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Posting Michael Jordan’s The Last Dance — before and during lockdown

By Craig Ellenport

One thing viewers learned from watching The Last Dance — ESPN’s 10-part documentary series about Michael Jordan and the Chicago Bulls — is that Jordan might be the most competitive person on the planet. Even the slightest challenge led him to raise his game to new heights.

Photo by Andrew D. Bernstein/NBAE via Getty Images

Jordan’s competitive nature may have rubbed off on Sim NY, the post facility that worked on the docuseries. Since only the first three of the 10 episodes were posted at Sim before the COVID-19 shutdown, the post house had to manage a work-from-home plan in addition to dealing with an accelerated timeline that moved the deadline up a full two months.

The Last Dance, which chronicles Jordan’s rise to superstardom and the Bulls’ six NBA title runs in the 1990s, was originally set to air on ESPN after this year’s NBA Finals ended in June. With the sports world starved for content during the pandemic, ESPN made the decision to begin the show on April 19 — airing two episodes a night on five consecutive Sunday nights.

Sim’s New York facility offers edit rooms, edit systems and finishing services. Projects that rent these rooms will then rely on Sim’s artists for color correction and sound editing, ADR and mixing. Sim was involved with The Last Dance for two years, with ESPN’s editors working on Avid Media Composer systems at Sim.

When it became known that the 1997-98 season was going to be Jordan’s last, the NBA gave a film crew unprecedented access to the team. They compiled 500 hours of 16mm film from the ‘97-’98 season, which was scanned at 2K for mastering. The Last Dance used a combination of the rescanned 16mm footage, other archival footage and interviews shot with Red and Sony cameras.
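Some rough arithmetic shows the scale of that scanning job. The per-frame size is an assumption (a 2K 10-bit DPX frame runs on the order of 12MB; the actual figure depends on the scan’s resolution, bit depth and file format):

```python
# Back-of-the-envelope size of 500 hours of 16mm film scanned at 2K.
hours = 500
fps = 24                # film frame rate
mb_per_frame = 12       # assumed ~12MB per 2K 10-bit DPX frame

frames = hours * 3600 * fps
total_tb = frames * mb_per_frame / 1e6
print(f"{frames:,} frames ≈ {total_tb:,.0f} TB of scans")
# -> 43,200,000 frames, on the order of 500TB before any transcoding
```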

Photo by Andrew D. Bernstein/NBAE via Getty Images

“The primary challenge posed in working with different video formats is conforming the older standard definition picture to the high definition 16:9 frame,” says editor Chad Beck. “The mixing of formats required us to resize and reposition the older footage so that it fit the frame in the ideal composition.”

One of the issues with positioning the archival game footage was making sure that viewers could focus when shifting their attention between the ball and the score graphics.

“While cutting the scenes, we would carefully play through each piece of standard definition game action to find the ideal frame composition. We would find the best position to crop broadcast game graphics, recreate our own game graphics in creative ways, and occasionally create motion effects within the frame to make sure the audience was catching all the details and flow of the play,” says Beck. “We discovered that tracking the position of the backboard and keeping it as consistent as possible became important to ensuring the audience was able to quickly orient themselves with all the fast-moving game footage.”

From a color standpoint, the trick was taking all that footage, which was shot over a span of decades, and creating a cohesive look.

Rob Sciarratta

“One of the main goals was to create a filmic, dramatic natural look that would blend well with all the various sources,” says Sim colorist Rob Sciarratta, who worked with Blackmagic DaVinci Resolve 15. “We went with a rich, slightly warm feeling. One of the more challenging parts of color correction was blending the archival work into the interview and film scans. The older video footage tended to be of varying quality and resolution and would often have very little black detail remaining from all the transcoding throughout the years. We would add a filmic texture and soften the blacks so it would blend into the 16mm film scans and interviews seamlessly. … We wanted everything to feel cohesive and flow so the viewer could immerse themselves in the story and characters.”

On the sound side, senior re-recording mixer/supervising sound editor Keith Hodne used Avid Pro Tools. “The challenge was to create a seamless woven sonic landscape from 100-plus interviews and locations, 500 hours of unseen raw behind-the-scenes footage, classic hip-hop tracks, beautifully scored instrumentation and crowd effects, along with the prerecorded live broadcasts,” he says. “Director Jason Hehir and I wanted to create a cinematic blanket of a basketball game wrapped around those broadcasts. What it sounds like to be at the basketball game, feel the game, feel the crowd — the suspense. To feel the weight of the action — not just what it sounds like to watch the game on TV. We tried to capture nostalgia.”

When ESPN made the call to air the first two episodes on April 19, Sim’s crew still had the final seven episodes to finish while dealing with a work-from-home environment. Expectations were only heightened after the first two episodes of The Last Dance averaged more than 6 million viewers. Sim was now charged with finishing what would become the most watched sports documentary in ESPN’s history — and they had to do this during a pandemic.

Stacy Chaet

When the shutdown began in mid-March, Sim’s staff needed to figure out the best way to finish the project remotely.

“I feel like we started the discussions of possible work from home before we knew it was pushed up,” says Stacy Chaet, Sim’s supervising workflow producer. “That’s when our engineering team and I started testing different hardware and software and figuring out what we thought would be the best for the colorist, what’s the best for the online team, what’s the best for the audio team.”

Sim ended up using Teradici to get Sciarratta connected to a machine at the facility. “Teradici has become a widely used solution for remote at home work,” says Chaet. “We were easily able to acquire and install it.”

A Sony X300 monitor was hand-delivered to Sciarratta’s apartment in lower Manhattan and connected to his machine at Sim through an Evercast stream. Sim also shipped him other computer monitors, a Mac mini and Resolve panels. Sciarratta’s living room became a makeshift color bay.

“It was during work on the promos that Jason and Rob started working together, and they locked in pretty quickly,” says David Feldman, Sim’s senior VP, film and television, East Coast. “Jason knows what he wants, and Rob was able to quickly show him a few color looks to give him options.

David Feldman

“So when Sim transitioned to a remote workflow, Sciarratta was already in sync with what the director, Jason Hehir, was looking for. Rob graded each of the remaining seven episodes from his apartment on his X300 unsupervised. Sim then created watermarked QTs with final color and audio. Rob reviewed each QT to make sure his grade translated perfectly when reviewed on Jason’s retina display MacBook. At that point, Sim provided the director and editorial team access for final review.”

The biggest remote challenge, according to producer Matt Maxson, was that the rest of the team couldn’t see Sciarratta’s work on the X300 monitor.

“You moved from a facility with incredible 4K grading monitors and scopes to the more casual consumer-style monitors we all worked with at home,” says Maxson. “In a way, it provided a benefit because you were watching it the way millions of people were going to experience it. The challenge was matching everyone’s experience — Jason’s, Rob’s and our editors’ — to make sure they were all seeing the same thing.”

Keith Hodne

For his part, Hodne had enough gear in his house in Bay Ridge, Brooklyn. At Sim he worked in Pro Tools on Mac Pro computers; at home, he had to make do with a pared-down version of that setup in his home studio. It was a challenge, but he got the job done.

Hodne says he actually had more back-and-forth with Hehir on the final episode than any of the previous nine. They wanted to capture Jordan’s moments of reflection.

“This episode contains wildly loud, intense crowd and music moments, but we counterbalance those with haunting quiet,” says Hodne. “We were trying to achieve what it feels like to be a global superstar with all eyes on Jordan, all expectations on Jordan. Just moments on the clock to write history. The buildup of that final play. What does that feel and sound like? Throughout the episode, we stress that one of his main strengths is the ability to be present. Jason and I made a conscious decision to strip all sound out to create the feeling of being present and in the moment. As someone whose main job it is to add sound, sometimes there is more power in having the restraint to pull back on sound.”

ESPN Films/Netflix/Mandalay Sports Media/NBA Entertainment

Even when they were working remotely, the creatives were able to communicate in real time via phone, text or Zoom sessions. Still, as Chaet points out, “you’re not getting the body language from that newly official feedback.”

From a remote post production technology standpoint, Chaet and Feldman both say one of the biggest challenges the industry faces is securing sufficient, consistent internet bandwidth; residential ISPs often do not guarantee the speeds needed for flawless functionality. “We were able to get ahead of the situation and put systems in place that made things just as smooth as they could be,” says Chaet. “Some things may have taken a bit longer due to the remote situation, but it all got done.”

One thing they didn’t have to worry about was their team’s dedication to the project.

“Whatever challenges we faced after the shutdown, we benefitted from having lived together at the facility for so long,” says Feldman. “There was this trust that, somehow, we were going to figure out a way to get it done.”


Craig Ellenport is a veteran sports writer who also covers the world of post production. 

Invisible VFX on Hulu’s Big Time Adolescence

By Randi Altman

Hulu’s original film Big Time Adolescence is a coming-of-age story that follows 16-year-old Mo, who is befriended by his sister’s older and sketchy ex-boyfriend Zeke. This aimless college dropout happily introduces the innocent-but-curious Mo to drink and drugs and a poorly thought-out tattoo.

Big Time Adolescence stars Pete Davidson (Zeke), Griffin Gluck (Mo) and Machine Gun Kelly (Nick) and features Jon Cryer as Mo’s dad. This irony will not be lost on those who know Cryer from his own role as disenfranchised teen Duckie in Pretty in Pink.

Shaina Holmes

While this film doesn’t scream visual effects movie, they are there — 29 shots — and they are invisible, created by Syracuse, New York-based post house Flying Turtle. We recently reached out to Flying Turtle’s Shaina Holmes to find out about her work on the film and her process.

Holmes served as VFX supervisor, VFX producer and lead VFX artist on Big Time Adolescence, creating things like flying baseballs, adding smoke to a hotboxed car, removals, replacements and more. In addition to owning Flying Turtle Post, she is a teacher at Syracuse University, where she mentors students who often end up working at her post house.

She has over 200 film and television credits, including The Notebook, Tropic Thunder, Eternal Sunshine of the Spotless Mind, Men in Black 3, Swiss Army Man and True Detective.

Let’s find out more…

How early did you get involved on Big Time Adolescence?
This was our fifth project in a year with production company American High. With all projects overlapping in various stages of production, we were in constant contact with the client to help answer any questions that arose in the early stages of pre-production and production.

Once the edit was picture-locked, we bid all the VFX shots in October/November 2018, VFX turnovers were received in November, and we had a few short weeks to complete all VFX in time for the premiere at the Sundance Film Festival in January 2019.

What direction were you given from your client?
Because this was our fifth feature with American High and each project has similar basic needs, we already had plans in place for how to shoot certain elements.

For example, most of the American High projects deal with high school, so cell phones and computer screens are a large part of how the characters communicate. Production has been really proactive about hiring an on-set graphics artist to design and create phone and computer screen graphics that can be used either during the shoot or provided to my team to add in VFX.

Having these graphics prebuilt has saved a lot of design time in post. While we still occasionally need to change times and dates, remove the carrier, change photos, replace text and make other editorial changes, we end up only needing to do a handful of shots instead of all the screen replacements. We really encourage communication during the entire process to come up with alternatives and solutions that can be shot practically, and that usually makes our jobs more efficient later on.

Were you on set?
I was not physically needed on set for this film. However, after filming completed, we realized in post that we were missing some footage from the batting cages scene. The post supervisor and I, along with my VFX coordinator, rented a camera and braved the freezing Syracuse, New York, winter to go to the same batting cages and shoot the missing elements. These plates became essential, as production had turned off the pitching machine during the filming.

Before and After: Digital baseballs

To recreate the baseball in CG, we needed more information for modeling, texture and animation within this space to create more realistic interaction with the characters and environment in VFX. After shoveling snow and ice, we were able to set the camera up at the batting cage and create the reference footage we needed to match our CG baseball animation. Luckily, since the film shot so close to where we all live and work, this was not a problem… besides our frozen fingers!

What other effects did you provide?
We aren’t reinventing the wheel here in the work we do. We work on features where invisible VFX play a supporting role, fixing technical imperfections and revising graphics so the audience gets a seamless, distraction-free experience and the story can unfold properly. I work with the production team to advise on ways to shoot that save on post production costs, and I use creative problem solving to cut down VFX costs to satisfy their budget and achieve their intended vision.

That being said, we were able to do some fun sequences including CG baseballs, hotboxing a car, screen replacements, graphic animation and alterations, fluid morphs and artifact cleanup, intricate wipe transitions, split screens and removals (tattoos, equipment, out-of-season nature elements).

Can you talk about some of those more challenging scenes/effects?
Besides the CG baseball, the most difficult shots are the fluid morphs. These usually consist of split screens where one side of the split has a speed change effect to editorially cut out dialogue or revise action/reactions.

They seem simple, but to seamlessly morph two completely different actions together over a few frames and create all the in-betweens takes a lot of skill. These are often more advanced than our entry-level artists can handle, so they usually end up on my plate.

What was the review and approval process like?
All the work starts with me receiving plates from the clients and ends with me delivering final versions to the clients. As I am the compositing supervisor, we go through many internal reviews and versions before I approve shots to send to the client for feedback, which is a role I’ve done for the bulk of my career.

For most of the American High projects, the clients are spread out between Syracuse, LA and NYC. No reviews were done in person, although if needed, I could go to Syracuse Studios at any time to review dailies if there was any footage I thought could help with some fix-it-in-post VFX requests.

All shots were sent online for review and final delivery. We worked closely with the executive producer, post supervisor, editor and assistant editor for feedback, notes, design and revisions. Most review sessions were collaborative as far as feedback and what’s possible.

What tools did you use on the film?
Blackmagic’s Fusion is the main compositing software. Artists were trained on Fusion by me when they were in college, so it is an easy and affordable transition for them to use for professional-quality work. Since everyone has their own personal computer setup at home, it’s been fairly easy for artists to send comp files back to me and I render on my end after relinking. That has been a much quicker process for internal feedback and deliveries as we’re working on UHD and 4K resolutions.

For Big Time Adolescence specifically, we also needed to use Adobe After Effects for some of the fluid morph shots, plus some final clean-up in Fusion. For the CG baseball shots, we used Autodesk Maya and Substance Painter, rendered with Arnold and comped in Fusion.

Your company is female-owned and based in Syracuse, New York. Not something you hear about every day.
Yes, we are definitely set up in a great up-and-coming area here in Ithaca and Syracuse. I went to film school at Ithaca College. From there, I worked in LA and NYC for 20 years as a VFX artist and producer. In 2016, I was offered the opportunity to teach VFX back at Ithaca College, so I came back to the Central New York area to see if teaching was the next chapter for me.

The timing worked out perfectly: some of my former co-workers were helping create American High, taking advantage of the Central New York tax incentives, and they were prepping to shoot feature films in Syracuse. They brought me on as the local VFX support since we had already been working together off and on since 2010 in NYC. When I found myself both teaching and working on feature films, that gave me the idea to create a company to combine forces.

Teaching at Syracuse University and focusing on VFX and post for live-action film and TV, I am based at The Newhouse School, which is very closely connected with American High and Syracuse Studios. I was already integrated into their productions, so this was just a really good fit all around to bring our students into the growing Central New York film industry, aiming to create a sustainable local talent pool.

Our team is made up of artists who started with me in post mentorship groups I created at both Ithaca College (Park Post) and Syracuse University (SU Post). I teach them in class, they join these post group collaborative learning spaces for peer-to-peer mentorship, and then a select few continue to grow at Flying Turtle Post.

What haven’t I asked that’s important?
When most people hear visual effects, they think of huge blockbusters, but that was never my thing. I love working on invisible VFX and the fact that it blows people’s minds — how so much attention is paid to every single shot, let alone frame, to achieve complete immersion for the audience, so they’re not picking out the boom mic or dead pixels. So much work goes on to create this perfect illusion. It’s odd to say, but there is such satisfaction when no one notices the work you did. That’s the sign of doing your job right!

Every show relies on invisible VFX these days, even the smallest indie film with a tiny budget. These are the projects I really like to be involved in as that’s where creativity and innovation are at their best. It’s my hope that up-and-coming filmmakers who have amazing stories to tell will identify with my company’s mentorship-focused approach and feel they also are able to grow their vision with us. We support female and underrepresented filmmakers in their pursuit to make change in our industry.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Arch platform launches for cloud-based visual effects

Arch Platform Technologies, a provider of cloud-based infrastructure for content creation, has made its secure, scalable, cloud-based visual effects platform available commercially. The Arch platform is designed for movie studios, productions and VFX companies and enables them to leverage a VFX infrastructure in the cloud from anywhere in the world.

An earlier iteration of the Arch platform was only available to those companies who were already working with Hollywood-based Vitality VFX, where the technology was created by Guy Botham. Now, Arch is making its next-generation version of its “rent vs. own” cloud-based VFX platform commercially available broadly to movie studios, productions and VFX companies. This version was well along in its development when COVID-19 arrived, making it a very timely offering.

By moving VFX to the cloud, the platform lets VFX teams scale up and down quickly from anywhere and build and manage capacity with cloud-based workstations, renderfarms, storage and workflow management – all in a secure environment.

“We engineered a robust Infrastructure as a Service (IaaS), which now enables a group of VFX artists to collaborate on the same infrastructure as if they were using an on-premises system,” says Botham. “Networked workstations can be added in minutes nearly anywhere in the world, including at an artist’s home, to create a small to large VFX studio environment running all the industry-standard software and plugins.”

Recently, Solstice Studios, a Hollywood distribution and production studio, used the Arch platform for the VFX work on its upcoming first movie, Unhinged. The platform has also been used by VFX companies Track VFX and FatBelly VFX.

Dell intros redesigned Precision mobile workstation line

Dell Technologies has introduced new mobile workstations in its Precision line, which targets professional content creators. The Precisions feature the Dell Optimizer, an automated AI-based optimization technology that learns how each person works and adapts to their behavior. It is designed to improve overall application performance; enable faster log-in and secure lock outs; eliminate echoes and reduce background noise on conference calls; and extend battery run time.

The reengineered Precision workstation portfolio is designed to handle demanding workloads, such as intensive graphics processing, data analysis and CAD modeling. With smaller footprints and thermal innovations, the new Precision mobile workstations offer increased performance and ISV certifications with professional graphics from Nvidia and the latest 10th Gen Intel Core vPro and Xeon processors.

Designed for creators and pros, the Dell Precision 5550 and 5750 are small and thin 15-inch and 17-inch mobile workstations, respectively, and offer a 16:10, four-sided InfinityEdge (up to HDR 400) display.

The new Precision 5750 is also VR/AR- and AI-ready to handle fast rendering, detailed visualizations and complex simulations. Targeting media and entertainment pros, the 5750 comes with the option of an Nvidia Quadro RTX 3000 GPU and weighs in at only 4.7 pounds. It is available with a UHD+ (3840 x 2400) HDR400 screen, dual Thunderbolt (four ports total) and up to two M.2 NVMe drives.

The Dell Precision 5550 is available now starting at $1,999. The Dell Precision 5750 is available in early June starting at $2,399.

The Precisions are designed for sustainability with recycled materials, sustainable packaging, energy efficient designs and EPEAT Gold registrations.

 

Color grading Togo with an Autochrome-type look

Before principal photography began on the Disney+ period drama Togo, the film’s director and cinematographer, Ericson Core, asked Company 3 senior colorist Siggy Ferstl to help design a visual approach for the color grade that would give the 1920s-era drama a unique look. Based on a true story, Togo is named for the lead sled dog on Leonhard Seppala’s (Willem Dafoe) team and tells the story of their life-and-death relay through Alaska’s tundra to deliver diphtheria antitoxin to the desperate citizens of Nome.

Siggy Ferstl

Core wanted a look that was reminiscent of the early color photography process called Autochrome, as well as an approach that evoked an aged, distressed feel. Ferstl, who recently colored Lost in Space (Netflix) and The Boys (Amazon), spent months — while not working on other projects — developing new ways of building this look using Blackmagic’s Resolve 16.

Many of Ferstl’s ideas were realized using the new Fusion VFX tab in Resolve 16. It allowed him to manipulate images in ways that took his work beyond the normal realm of color grading and into the arena of visual effects.

By the time he got to work grading Togo, Ferstl had already created looks that had some of the visual qualities of Autochrome melded with a sense of age, almost as if the images were shot in that antiquated format. Togo “reflects the kind of style that I like,” explains Ferstl. “Ericson, as both director and cinematographer, was able to provide very clear input about what he wanted the movie to look like.”

In order for this process to succeed, it needed to go beyond the appearance of a color effect seemingly just placed “on top” of the images. It had to feel organic and interact with the photography, to seem embedded in the picture.

A Layered Approach
Ferstl started this large task by dividing the process into a series of layers that would work together to affect the color, of course, but also to create lens distortion, aging artifacts and all the other effects. A number of these operations would traditionally be sent to Company 3’s VFX department or to an outside vendor to be created by their artists and returned as finished elements. But that kind of workflow would have added an enormous amount of time to the post process. And, just as importantly, all these effects and color corrections needed to work interactively during grading sessions at Company 3 so Ferstl and Core could continuously see and refine the overall look. Even a slight tweak to a single layer could affect how other layers performed, so Ferstl needed complete, realtime control of every layer for every fine adjustment.

Likewise, the work of Company 3 conform artist Paul Carlin could not be done the way conform has typically been done. It couldn’t be sent out of Resolve and into a different conform/compositing tool, republished to the company network and then returned to Ferstl’s Resolve timeline. This would have taken too long and wouldn’t have allowed for the interactivity required in grading sessions.

Carlin needed to be able to handle the small effects that are part of the conform process — split screens, wire removals, etc. — quickly, and that meant working from the same media Ferstl was accessing. Carlin worked entirely in Resolve using Fusion for any cleanup and compositing effects — a practice becoming more and more common among conform artists at Company 3. “He could do his work and return it to our shared timeline,” Ferstl says. “We both had access to all the original material.”


Most of the layers actually consisted of multiple sublayers. Here is some detail:
Texture: This group of sublayers was based on overlaid textures that Ferstl created to have a kind of “paper” feel to the images. There were sublayers based on photographs of fabrics and surfaces that all play together to form a texture over the imagery.
Border: This was an additional texture that darkened portions of the edges of the frame. It inserts a sense of a subtle vignette or age artifact that frames the image. It isn’t consistent throughout; it continually changes. Sublayers bring to the images a bit of edge distortion that resembles the look of diffraction that can happen to lenses, particularly lenses from the early 20th century, under various circumstances.
Lens effects: DP Core shot with modern lenses built with very evolved coatings, but Ferstl was interested in achieving the look of uncoated and less-refined optics of the day. This involved the creation of sublayers of subtle distortion and defocus effects.
Stain: Ferstl applied a somewhat sepia-colored stain to parts of the image to help with the aging effect. He added a hint of additional texture and brought some sepia to some of the very bluish exterior shots, introducing hints of warmth into the images.
Grain-like effect: “We didn’t go for something that exactly mimicked the effect of film grain,” Ferstl notes. “That just didn’t suit this film. But we wanted something that has that feel, so using Resolve’s Grain OFX, I generated a grain pattern, rendered it out and then brought it back into Resolve and experimented with running the pattern at various speeds. We decided it looked best slowed to 6fps, but then it had a steppiness to it that we didn’t like. So I went back and used the tool’s Optical Flow in the process of slowing it down. That blends the frames together, and the result provided just a hint of old-world filmmaking. It’s very subtle and more part of the overall texture.”
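
For the technically curious, the retiming idea is easy to sketch in code. Below is a rough Python illustration in which a simple crossfade stands in for Resolve’s Optical Flow blending; the real tool interpolates along motion vectors, so treat this only as a picture of why blending beats a straight frame hold (all names and numbers are ours):

```python
import numpy as np

def retime_grain(frames: list, hold: int = 4) -> list:
    # Each grain frame persists for `hold` output frames (24fps playback
    # with grain updating at 6fps). A straight hold produces the
    # "steppiness" Ferstl describes; crossfading into the next grain
    # frame smooths it out. (Optical Flow does this far more cleverly.)
    out = []
    for i, a in enumerate(frames):
        b = frames[(i + 1) % len(frames)]  # loop back around at the end
        for step in range(hold):
            t = step / hold
            out.append(a * (1 - t) + b * t)
    return out

# Example: 24 small random "grain" frames become 96 blended frames.
grain = [np.random.rand(270, 480).astype(np.float32) for _ in range(24)]
print(len(retime_grain(grain)))  # 96
```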

Combining Elements
“It wasn’t just a matter of stacking one layer on top of the other and applying a regular blend. I felt it needed to be more integrated and react subtly with the footage in an organic-looking way,” Ferstl recalls.

One toolset he used for this was a series of customized lens flares built with Resolve’s OFX, not for their actual purpose but as the basis of a matte. “The effect is generated based on highlight detail in the shot,” explains Ferstl. “So I created a matte shape from the lens flare effect and used that shape as the basis to integrate some of the texture layers into the shots. It’s the textures that become more or less pronounced based on the highlight details in the photography and that lets the textures breathe more.”
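
The principle here, texture opacity driven by the photography’s highlights, can be sketched outside Resolve. Below is a hedged numpy illustration; the function names, Rec.709 luma weights and thresholds are ours, and Ferstl derived his matte from a lens flare OFX rather than a plain luma key:

```python
import numpy as np

def highlight_matte(plate: np.ndarray, threshold: float = 0.7,
                    softness: float = 0.2) -> np.ndarray:
    # Rec.709 luma, then ramp values above the threshold toward 1.0.
    luma = plate @ np.array([0.2126, 0.7152, 0.0722])
    return np.clip((luma - threshold) / softness, 0.0, 1.0)

def breathe_texture(plate: np.ndarray, texture: np.ndarray,
                    strength: float = 0.3) -> np.ndarray:
    # Texture opacity rises and falls with the plate's highlights.
    matte = highlight_matte(plate)[..., None]
    return plate * (1 - strength * matte) + texture * (strength * matte)

# Example with random float images (H x W x 3, values 0-1).
plate = np.random.rand(270, 480, 3)
texture = np.random.rand(270, 480, 3)
print(breathe_texture(plate, texture).shape)  # (270, 480, 3)
```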

Ferstl also made use of the Tilt-Shift effect in Fusion that alters the image in the way movements within a tilt/shift lens would. He could have used a standard Power Window to qualify the portion of the image to apply blur to, but that method applied the effect more evenly and gave a diffused look, which Ferstl felt wasn’t like a natural lens effect. Again, the idea was to avoid having any of these effects look like some blanket change merely sitting on top of the image.

“You can adjust a window’s softness,” he notes, “but it just didn’t look like something that was optical… it looked too digital. I was desperate to have a more optical feel, so I started playing around with the Tilt-Shift OFX and applying that just to the defocus effect.

“But that only affected the top and bottom of the frame, and I wanted more control than that,” he continues. “I wanted to draw shapes to determine where and how much the tilt/shift effect would be applied. So I added the Tilt-Shift in Fusion and fed a poly mask into it as an external matte. I had the ability to use the mask like a depth map to add dimensionality to the effect.”
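
As a rough picture of what feeding an external matte into a defocus effect does, here is a simple two-level approximation in Python using scipy. A true tilt/shift simulation varies the blur kernel continuously with the mask; this sketch only mixes one blurred pass back in by the mask:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def masked_defocus(img: np.ndarray, mask: np.ndarray,
                   max_sigma: float = 8.0) -> np.ndarray:
    # Blur each channel once at full strength, then mix sharp and
    # blurred by the mask, which acts like a rough depth map:
    # 0 = sharp, 1 = fully defocused.
    blurred = np.stack(
        [gaussian_filter(img[..., c], max_sigma) for c in range(3)],
        axis=-1,
    )
    m = mask[..., None]
    return img * (1 - m) + blurred * m

img = np.random.rand(270, 480, 3)
mask = np.tile(np.linspace(0.0, 1.0, 270)[:, None], (1, 480))  # sharp top, soft bottom
print(masked_defocus(img, mask).shape)  # (270, 480, 3)
```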

As Ferstl moved forward with the look development, the issue that continually came up was that while he and Core were happy with the way these processes affected any static image in the show, “as soon as the camera moves,” Ferstl explains, “you’d feel like the work went from being part of the image to just a veil stuck on top.”

He once again made use of Fusion’s compositing capabilities: The delivery spec was UHD, and he graded the actual photography in that resolution. But he built all the effects layers at the much larger 7K. “With the larger layers,” he says, “if the camera moved, I was able to use Fusion to track and blend the texture with it. It didn’t have to just seem tacked on. That really made an enormous difference.”
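
A small sketch shows why the oversized layers matter: if the texture plate is larger than the delivery frame, a 2D camera track can slide the sampling window around without the overlay ever running out of pixels. All numbers below are illustrative:

```python
import numpy as np

TEX_W, TEX_H = 7168, 4032   # oversized 7K-ish texture plate
OUT_W, OUT_H = 3840, 2160   # UHD delivery frame

texture = np.random.rand(TEX_H, TEX_W).astype(np.float32)

def texture_window(track_x: int, track_y: int) -> np.ndarray:
    """Crop a UHD window out of the texture, offset by the 2D track."""
    x0 = (TEX_W - OUT_W) // 2 + track_x   # start centered...
    y0 = (TEX_H - OUT_H) // 2 + track_y   # ...then follow the camera move
    x0 = max(0, min(x0, TEX_W - OUT_W))   # clamp inside the plate
    y0 = max(0, min(y0, TEX_H - OUT_H))
    return texture[y0:y0 + OUT_H, x0:x0 + OUT_W]

print(texture_window(220, -80).shape)  # (2160, 3840)
```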

Firepower
Fortunately for Ferstl, Company 3’s infrastructure provided the enormous throughput, storage and graphics/rendering capabilities to work with all these elements (some of which were extremely GPU-intensive) playing back in concert in a color grading bay. “I had all these textured elements and external mattes all playing live off the [studio’s custom-built] SAN and being blended in Resolve. We had OpenFX plugins for border and texture and flares generated in real time with the swing/tilt effect running on every shot. That’s a lot of GPU power!”

Ferstl found this entire experience artistically rewarding, and looks forward to similar challenges. “It’s always great when a project involves exploring the tools I have to work with and being able to create new looks that push the boundaries of what my job of colorist entails.”

Quantum upgrades StorNext to v6.4

Quantum has made upgrades to its StorNext file system and data management software that the company says will make cloud content more accessible, with significantly improved read and write speeds for any cloud and object store-based storage solution. The new StorNext 6.4 software features enable hybrid-cloud and multi-cloud storage use cases, for greater flexibility in media and entertainment and other data-intensive workflows.

StorNext 6.4 software incorporates self-describing objects to make cloud content more easily accessible, enabling new hybrid-cloud workflows. The client writes files into the StorNext file system, and then, based on policy, StorNext 6.4 software copies the files to the public or private cloud, with the option to include additional object metadata. Non-StorNext software clients and cloud-resident processes can now access those objects directly, making use of the new extended metadata.

Multi-threaded put/get operations improve transfer speeds to and from large object stores and the cloud. Quantum says users can expect to see a 5x to 7x performance increase with StorNext 6.4 software, depending on the size of their objects and other factors. This feature is most valuable in cases where single-stream object performance is limited.
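
Quantum hasn’t published the internals, but the underlying technique is familiar: issuing many puts or gets in parallel hides per-object round-trip latency. Here is a generic illustration (not StorNext code) using Python’s ThreadPoolExecutor and boto3 against any S3-compatible object store; the bucket name and paths are hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

import boto3

s3 = boto3.client("s3")  # credentials/endpoint come from the environment

def put_object(path: Path) -> str:
    # One PUT per file; each thread waits on its own round trip.
    s3.upload_file(str(path), "my-archive-bucket", f"dailies/{path.name}")
    return path.name

files = list(Path("/mnt/stornext/dailies").glob("*.mxf"))

# Running, say, 16 puts concurrently hides per-object latency, which is
# where most of the gain for many small objects comes from.
with ThreadPoolExecutor(max_workers=16) as pool:
    for name in pool.map(put_object, files):
        print("uploaded", name)
```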

Among the other features in the StorNext upgrade is dynamic library pooling. This feature improves resiliency for large tape archives, enabling the use of multiple libraries for performance and redundancy, including scale-out tape with vertical libraries like Quantum’s Scalar i6 tape library. Customers can rotate file stores to different libraries to increase availability.

StorNext 6.4 also adds support for Amazon Glacier Deep Archive to StorNext’s integration with Amazon Web Services, Microsoft Azure, Google Cloud Platform and others.

Epic Games offers first look at Unreal Engine 5

Epic Games has offered a first look at Unreal Engine 5 — the next generation of its technology designed to create photorealistic images on par with movie CG and real life. Designed for development teams of all sizes, it offers productive tools and content libraries.

Unreal Engine 5 will be available in preview in early 2021, and in full release late in 2021, supporting next-generation consoles, current-generation consoles, PC, Mac, iOS and Android.

The reveal was introduced with Lumen in the Land of Nanite, a realtime demo running live on PlayStation 5, to showcase Unreal Engine technologies that can allow creators to reach the highest level of realtime rendering detail in the next generation of games and beyond.

New core technologies in Unreal Engine 5
Nanite virtualized micropolygon geometry will allow artists to create as much geometric detail as the eye can see. Nanite virtualized geometry means that film-quality source art comprising hundreds of millions or billions of polygons can be imported directly into Unreal Engine — anything from ZBrush sculpts to photogrammetry scans to CAD data. Nanite geometry is streamed and scaled in real time, so there are no more polygon count budgets, polygon memory budgets, or draw count budgets. Users won’t need to bake details to normal maps or manually author LODs, and, according to Epic, there is no loss in quality.

Lumen is a fully dynamic global illumination solution that reacts to scene and light changes. The system renders diffuse interreflection with infinite bounces and indirect specular reflections in detailed environments, at scales ranging from kilometers to millimeters. Artists can create more dynamic scenes using Lumen, for example, changing the sun angle for time of day, turning on a flashlight or blowing a hole in the ceiling; indirect lighting will adapt accordingly. Lumen erases the need to wait for lightmap bakes to finish and to author lightmap UVs — a big time savings when an artist can move a light inside the Unreal Editor and lighting looks the same as when the game is run on console.

To build large scenes with Nanite geometry technology, Epic’s team made heavy use of the Quixel Megascans library, which provides film-quality objects up to hundreds of millions of polygons. To support vastly larger and more detailed scenes than previous generations, PlayStation 5 provides a dramatic increase in storage bandwidth.

The demo also showcases existing engine systems such as Chaos physics and destruction, Niagara VFX, convolution reverb and ambisonics rendering.

Unreal Engine 4 and 5 Timeline
Unreal Engine 4.25 already supports next-generation console platforms from Sony and Microsoft, and Epic is working closely with console manufacturers and dozens of game developers and publishers using Unreal Engine 4 to build next-gen games. Epic is designing for forward compatibility, so developers can get started with next-gen development now in UE4 and move projects to UE5 when ready.

Epic will release Fortnite, built with UE4, on next-gen consoles at launch and, in keeping with the team’s commitment to prove out industry-leading features through internal production, migrate the game to UE5 in mid-2021.

Waiving Unreal Engine Royalties: First $1 Million in Game Revenue
Game developers can still download and use Unreal Engine for free, but now royalties are waived on the first $1 million in gross revenue per title. The new Unreal Engine license terms are retroactive to January 1, 2020.
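
In concrete terms (Unreal Engine’s standard royalty rate is 5 percent of gross revenue; the figures below are illustrative):

```python
ROYALTY_RATE = 0.05   # Unreal Engine's standard 5% of gross revenue
WAIVED = 1_000_000    # first $1 million per title is now royalty-free

def royalty_due(gross_revenue: float) -> float:
    # Only revenue above the waived amount is subject to the royalty.
    return max(0.0, gross_revenue - WAIVED) * ROYALTY_RATE

print(royalty_due(800_000))    # 0.0 (under the waiver, nothing owed)
print(royalty_due(3_000_000))  # 100000.0 (5% of the $2M above the waiver)
```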

Epic Online Services
Friends, matchmaking, lobbies, achievements, leaderboards and accounts: Epic built these services for Fortnite, and launched them across seven major platforms — PlayStation, Xbox, Nintendo Switch, PC, Mac, iOS, and Android. Now Epic Online Services are opened up to all developers for free in a simple multiplatform SDK.

Developers can mix and match these services together with their own account services, platform accounts, or Epic Games accounts, which reach the world’s largest social graph with over 350 million players and their 2.2 billion friend connections across half a billion devices.

Review: Boris FX Continuum 2020.5 and Sapphire 2020

By Brady Betzel

The latest Boris FX 2020 plugin releases like Continuum, Sapphire and Mocha, as well as the addition of the Silhouette software (and paint plugin!), have really changed the landscape of effects and compositing.

Over the course of two reviews I will be covering all four of Boris FX’s 2020 offerings — Continuum 2020.5 and Sapphire 2020 now, and Mocha Pro 2020.5 and Silhouette 2020.5 to come soon — for NLE applications like Avid Media Composer, Adobe Premiere and Blackmagic Resolve. Silhouette is a bit different in that it comes as a stand-alone or a compatible plugin for Adobe Premiere or After Effects (just not Avid Symphony/Media Composer at the moment).

Because they are comparable, and editors tend to use both or choose between the two, Continuum 2020.5 and Sapphire 2020 are first. In an upcoming review, I will cover Mocha 2020.5 and Silhouette 2020.5; they have a similar feature set from the outside but work symbiotically on the inside.

While writing this review, Boris FX released the 2020.5 updates for everything but Sapphire; that update will come eventually, but they are still dialing it in. You’ll see that I jump back and forth between 2020 and 2020.5 a little bit. Sorry if it’s confusing, but 2020 has some great updates, and 2020.5 has even more improvements.

All four Boris FX plugins could have a place in your editing tool kit, and I will point out the perks of each as well as how all of them can come together to make the ultimate Voltron-like plugin package for editors, content creators, VFX artists and more.

Boris FX has standardized the naming of each plugin and app with release 2020. Beyond that, Continuum and Sapphire 2020 continue to offer the same high-quality effects you know, continue to integrate Mocha tracking, and have added even more benefits to what I always thought was an endless supply of effects.

You have a few pricing options called Annual Subscription, Permanent, Renewals (upgrades), Units and Enterprise. While I have always been a fan of outright owning the products I use, I do like the yearly upgrades to the Boris FX products and think the Annual Subscription price (if you can afford it) is probably the sweet spot. Continuum alone ranges from $295 per year for Adobe-only to $695 per year for Avid, Adobe, Apple and OFX (Resolve). Sapphire alone ranges from $495 to $895 per year, Mocha Pro ranges from $295 to $595 per year, and Silhouette goes for $995 per year. You can bundle Continuum, Sapphire and Mocha Pro from $795 to $1,195 per year. If the entire suite of plugins is too expensive for your wallet, you can purchase individual categories of plugins called “units,” and you can find more pricing options here.

Ok, let’s run through some updates …

Continuum 2020.5
Boris FX Continuum 2020.5 has a few updates that make the 2020 and 2020.5 releases very valuable. At its base level, I consider Continuum to be more of an image restoration, utility and online editor tool kit. In comparison, Sapphire is more of a motion graphics, unicorn poop, particle emitter sparkle-fest. I mean unicorn poop in the most rainbow-colored and magnanimous way possible. I use Continuum and Sapphire every day, and Continuum is the go-to for keying, tracking, roto, film grain and more. Sapphire can really spice up a scene, main title or motion-graphics masterpiece.

My go-to Continuum tools are Gaussian Blur, Primatte Keyer (which has an amazing new secondary spill suppressor update) and Film Grain — all of which use the built-in Mocha planar tracking. There are more new tools to look at in Continuum 2020, including the new BCC Corner Pin Studio, BCC Cast Shadow and BCC Reflection. BCC Corner Pin Studio is a phenomenal addition to the Continuum plugin suite, particularly inside of NLEs such as Media Composer, which don’t have great built-in corner pinning abilities.

As an online editor, I often have to jump out of the NLE I’m using to do title work. After Effects is my tool of choice because I’m familiar with it, but that involves exporting QuickTime files, doing the work and re-exporting either QuickTime files with alpha channels or QuickTime files with the effect baked into the footage. If possible, I like to stay as “un-baked” as possible (feel free to make your own joke about that).

BCC Corner Pin Studio is another step forward in keeping us inside of one application. Using Corner Pin Studio with Mocha planar tracking is surprisingly easy. Inside of Media Composer, place the background on v2 and foreground on v1 of the timeline, apply BCC Corner Pin Studio, step into Effects Mode, identify the foreground and background, use Mocha to track the shot, adjust compositing elements inside of Avid’s Effect window, and you’re done. I’ve over-simplified this process, but it works pretty quickly, and with a render, you will be playing back a rock-solid corner pin track inside of the same NLE you are editing in.

Avid has a few quirks when working with alpha channels, to say the least. When using BCC Corner Pin Studio along with the Avid title tool, you will have to “remove” the background when compositing the text. To do this, you click and drag (DO NOT Alt + Drag) a plugin like BCC Brightness and Contrast on top of the Avid title tool layer, enable “Apply to Title Matte” and set background to “None.”

It’s a little cumbersome, but once you get the workflow down, it gets mechanical. The only problem with this method is that when you replace the matte key on the Avid title tool layer, you lose the ability to change, alter or reposition the title natively inside of the Avid title effect or title tool itself. Just make sure your title is “final,” whatever final means these days. But corner pinning with this amount of detail inside of Media Composer can save hours of time, which in my mind equals saving money (or making more money with all your newly found free time). You can find a great six-minute tutorial on this by Vin Morreale on Boris FX’s YouTube page.

Two more great new additions to Continuum in release 2020 are BCC Cast Shadow and Reflection. What’s interesting is that all three — Corner Pin Studio, Cast Shadow and Reflection — can be used simultaneously. Well, maybe not all three at once, but Corner Pin Studio with Shadow or Reflection can be used together when putting text into live-action footage.

Life Below Zero, a show I online edit for Nat Geo, uses this technique. Sometimes I composite text in the snow or in the middle of a field with a shadow. I don’t typically do this inside of Media Composer, but after seeing what Corner Pin Studio can do, I might try it. It would save a few exports and round trips.

To ramp up text inserted into live-action footage, I like to add shadows or reflections. The 2020 Continuum update with Cast Shadow and Reflection makes it easy to add these effects inside of my NLE instead of having to link layers with pick whips or having special setups. Throw the effect onto my text (pre-built graphic in After Effects with an alpha channel) and boom: immediate shadow and/or reflection. To sell the effect, just feather off the edge, enable a composite-mode overlay, or knock the opacity down and you are done. Go print your money.

One of my most prized online editing tools, BCC Remover, has been updated in Continuum 2020.5. I use BCC Remover daily to remove camera people, drones in the sky, stray people in the background of shots and more, and the 2020.5 release adds some great new features that make one of the most important tools even more useful.

From an ease-of-use standpoint, BCC Remover now has Clone Color and Clone Detail sliders. Clone Color can be used to clone only the color from the source, whereas Clone Detail can be used to take the actual image from the source. You can mix back and forth to get the perfect clone. Inside of Media Composer, the Paint Effect has always been a go-to tool for me, mainly for its blurring and cloning abilities. Unfortunately, it is not robust — you can’t brighten or darken a clone; you can only clone color or clone the detail. But you can do both in BCC Remover in Continuum 2020.5.

In addition, you can now apply Mocha Tracking data to the Clone Spot option and specify relative offset or absolute offset under the “Clone” dropdown menu when Clone Spot is selected. Relative offset allows you to set the destination (in the GUI or Effects panel), then set the source (where you want to clone from), and when you move the destination widget, the source widget will be locked at the same distance it was set at. Absolute offset allows both the source and destination to be moved independently and tracked independently inside of Mocha.

There are a lot more Continuum 2020 updates that I couldn’t get into in this space, and even more for the 2020.5 update. More new transitions were added, like the trendy spin blur dissolve, the area brush in Mocha (which I now use all the time to make a quick garbage matte), huge Particle Illusion improvements (including additional shapes) and Title Studio improvements.

In 2020.5, Particle Illusion now has force and turbulence options, and Title Studio has the ability to cast shadows directly inside the plugin. Outside of Title Studio (and back inside of an NLE like Avid), you have direct access to Composite modes and Transformations, letting you easily adjust parameters directly inside of Media Composer instead of jumping back and forth.

Title Studio is really becoming a much more user-friendly plugin. But I like to cover what I actually use in my everyday editing work, and Corner Pin Studio, Cast Shadow/Reflection and Remover are what I use consistently.

And don’t forget there are hundreds of effects and presets including BCC Flicker Fixer, which is an easy fix to iris shifts in footage (I’m looking at you, drone footage)!

Sapphire 2020
I’ve worked in post long enough to remember when Boris FX merged with GenArts and acquired Sapphire. Even before the merger, every offline editor used Sapphire for its unmistakable S_Glow, Film Looks and more.

It’s safe to say that Sapphire is more of an artsy-look plugin. If you are wondering how it compares to Continuum, Sapphire takes over after you are done performing image restoration and technical improvements in Continuum, adding glows, blurs, dissolves, flutter cuts and more. Sapphire is more “video candy” than technical toolkit. But Sapphire also has technical plugins like Math Ops, Z Depth and more, so each package has its own perks. Ideally, the two work very well together if you can afford both.

What’s new in Sapphire 2020? There are a few big ones that might not be considered sexy, but they are necessary. One is OCIO support and the ability to apply Mocha-based tracking to 10 parameter-driven effects: S_LensFlare, S_EdgeRays, S_Rays, S_Luna, S_Grunge, S_Spotlight, S_Aurora, S_Zap, S_MuzzleFlash and S_FreeLens.

In addition, there are some beauty updates, like the new S_FreeLens. And one of the biggest under-the-hood updates is the faster GPU rendering. A big hurdle with third-party effects apps like Continuum and Sapphire is the render times when using effects like motion blur and edge rays with Mocha tracking. In Sapphire 2020 there is a 3x speed and performance increase (depending on the host app you are using it on). Boris FX has a great benchmark comparison.

Up first, I want to cover the new OCIO support inside of Sapphire 2020. OCIO is an acronym for “OpenColorIO,” which was created by Sony Pictures Imageworks. It’s essentially a way to use Sapphire effects, like lens flares, in high-end production workflows. For example, for final deliverables Netflix asks the colorist to work in an ACES environment, but the footage may be HLG-based. The OCIO options can be configured in the effect editor. So just choose the color space of the video/image you are working on and what the viewing color space is. That’s it.

If you are in an app without OpenColorIO, you can apply the effect S_OCIOTransform. This will allow you to use the OCIO workflow even inside apps that don’t have OCIO built in. If you aren’t worried about color space, this stuff can make your eyes glaze over, but it is very important when delivering a show or feature and definitely something to remember if you can.
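
For readers who want to see the OCIO idea in the concrete, here is a minimal sketch using OpenColorIO’s own Python bindings (API names per OCIO v2; the config path and color space names are assumptions borrowed from the ACES 1.2 config, not anything Sapphire exposes):

```python
import PyOpenColorIO as OCIO

# Load a project OCIO config (the path here is hypothetical).
config = OCIO.Config.CreateFromFile("/shows/mydoc/aces_1.2/config.ocio")

# Build a processor from the working space to the viewing space,
# conceptually what Sapphire's color space menus set up for you.
processor = config.getProcessor("ACES - ACEScg", "Output - sRGB")
cpu = processor.getDefaultCPUProcessor()

# Push one RGB pixel (18% gray) through the transform.
print(cpu.applyRGB([0.18, 0.18, 0.18]))
```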

On top of the tried-and-true Sapphire beauty plugins like S_Glow or S_Luna (to add a moon), Boris FX has added S_FreeLens to its beauty arsenal. Out in the “real world,” free lensing or “lens whacking” is when you take your lens off of your camera, hold it close to where it would normally mount and move the lens around to create dream-like images. It can add a sort of blur-flare dreamy look; it’s actually pretty great when you need it, but you are locked into the look once you do it in-camera. That’s why S_FreeLens is so great; you can now adjust these looks after you shoot instead of baking in a look. There are a lot of parameters to adjust, but if you load a preset, you can get to a great starting point. From defocus to the light leak color, you can animate and dial in the exact look you are going for.

Parameter tracking has been the next logical step in tying Mocha, Continuum and Sapphire together. Finally, in Sapphire 2020, you can use Mocha to track individual parameters. Like in S_LensFlare, you can track the placement of the hotspot and separately track its pivot.

It’s really not too hard once you understand how it correlates inside the Mocha interface. Sapphire sets up two trackers inside of Mocha: 1) the hotspot search area and position of the actual flare, and 2) the pivot search area and position of the pivot point. The search area gathers the tracking data, while the position crosshair is the actual spot on which the parameter will be placed.

While I’m talking about Mocha, in the Sapphire 2020 update, Mocha has added the Area Brush tool. At first, I was skeptical of the Area Brush tool — it seemed a little too easy — but once I gave in, I realized the Area Brush tool is a great way to make a rough garbage matte. Think of a magnetic lasso but with less work. It’s something to check out when you are inside of Mocha.

Summing Up
Continuum and Sapphire continue to be staples for broadcast TV editors for a reason. You can even save presets between NLEs and swap them (for instance, Media Composer to Resolve).

Are the Boris FX plugins perfect? No, but they will get you a lot further faster in your Media Composer projects without having to jump into a bunch of different apps. One thing I would love to see Boris FX add to Continuum and Sapphire is the ability to individually adjust your Mocha shapes and tracks in the Avid Effects editor.

For instance, if I use Mocha inside of BCC Gaussian Blur to track and blur 20 license plates on one shot — I would love to be able to adjust each “shape’s” blur amount, feather, brightness, etc., without having to stack additional plugin instances on top.

But Boris FX has put in a lot of effort over the past few updates of Continuum and Sapphire. Without a doubt, I know Continuum and Sapphire have saved me time, which saves me and my clients money. With the lines between editor, VFX artist and colorist being more and more blurred, Continuum and Sapphire are necessary tools in your arsenal.

Check out the many tutorials Boris FX has put up, and go update Continuum and Sapphire to the latest versions.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

AMD’s new Radeon Pro VII graphics card for 8K workflows

AMD has introduced the AMD Radeon Pro VII workstation graphics card designed for those working in broadcast and media in addition to CAE and HPC applications. According to AMD, the Radeon Pro VII graphics card offers 16GB of extreme speed HBM2 (high bandwidth memory) and support for six synchronized displays and high-bandwidth PCIe 4.0 interconnect technology.

AMD says the new card considerably speeds up 8K image processing performance in Blackmagic’s DaVinci Resolve in addition to performance speed updates in Adobe’s After Effects and Photoshop and Foundry’s Nuke.

The AMD Radeon Pro VII introduces AMD Infinity Fabric Link technology to the workstation market, which speeds application data throughput by enabling high-speed GPU-to-GPU communications in multi-GPU system configurations. The new workstation graphics card provides the high performance and advanced features that enable post teams and broadcasters to visualize, review and interact with 8K content.

The AMD Radeon Pro VII graphics card is expected to be available beginning mid-June for $1,899. AMD Radeon Pro VII-equipped workstations are expected to be available in the second half of 2020 from OEM partners.

Key features include:
– 16GB of HBM2 with 1TB/s memory bandwidth and full ECC capability to handle large and complex models and datasets smoothly with low latency.
– A high-bandwidth, low-latency connection that allows memory sharing between two AMD Radeon Pro VII GPUs, enabling users to increase project workload size and scale, develop more complex designs and run larger simulations to drive scientific discovery. AMD Infinity Fabric Link delivers up to 5.25x PCIe 3.0 x16 bandwidth with a communication speed of up to 168GB/s peer-to-peer between GPUs (a quick sanity check of that figure follows this list).
– Users can access their physical workstation from virtually anywhere with the remote workstation IP built into AMD Radeon Pro Software for Enterprise driver.
– PCIe 4.0 delivers double the bandwidth of PCIe 3.0 to enable smooth performance for 8K, multichannel image interaction.
– Enables precise synchronized output for display walls, digital signage and other visual displays (AMD FirePro S400 synchronization module required).
– Supports up to six synchronized display panels, full HDR and 8K screen resolution (single display) combined with ultra-fast encode and decode support for enhanced multi-stream workflows.
– Optimized and certified with pro applications for stability and reliability. The list of Radeon Pro Software-certified ISV applications can be found here.
– ROCm open ecosystem, an open software platform for accelerated compute, provides an easy GPU programming model with support for OpenMP, HIP and OpenCL and for ML and HPC frameworks.
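
That Infinity Fabric Link figure is easy to sanity-check. A back-of-the-envelope sketch, assuming the commonly quoted figure of roughly 32GB/s of bidirectional bandwidth for a PCIe 3.0 x16 slot:

```python
PCIE3_X16_GBPS = 32.0       # approx. bidirectional bandwidth, PCIe 3.0 x16
IF_LINK_MULTIPLIER = 5.25   # AMD's stated advantage for Infinity Fabric Link

# 32 GB/s x 5.25 lands on AMD's quoted peer-to-peer figure.
print(PCIE3_X16_GBPS * IF_LINK_MULTIPLIER)  # 168.0 GB/s
```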

AMD Radeon Pro workstation graphics cards are supported by the Radeon Pro Software for Enterprise driver, offering enterprise-grade stability, performance, security, image quality and other features, including high-resolution screen capture, recording and video streaming. The company says the latest release offers up to a 14 percent year-over-year performance improvement for current-generation AMD Radeon Pro graphics cards. The new software driver is now available for download from AMD.com.

AMD also released updates for AMD Radeon ProRender, a physically-based rendering engine built on industry standards that enables accelerated rendering on any GPU, any CPU and any OS. The updates include new plugins for Side Effects Houdini and Unreal Engine and updated plugins for Autodesk Maya and Blender.

For developers, an updated AMD Radeon ProRender SDK is now available on the redesigned GPUOpen.com site and is now easier to implement with an Apache License 2.0. AMD also released a beta SDK of the next-generation Radeon ProRender 2.0 rendering engine with enhanced CPU and GPU rendering support with open-source versions of the plugins.

Production begins again on New Zealand’s Shortland Street series

By Katie Hinsen

The current global pandemic has shut down production all over the world. Those who can have moved to working from home, and there’s speculation about how and when we’ll get back to work again.

New Zealand, a country with a significant production economy, has announced that it will soon reopen for shoots. The most popular local television show, Shortland Street, was the first to resume production after an almost six-week break. It’s produced by Auckland’s South Pacific Pictures.

Dylan Reeve

I am a native New Zealander who has worked in post there on and off over the years. Currently I live in Los Angeles, where I am an EP for dailies and DI at Nice Shoes, so taking a look at how New Zealand is rolling things out interests me. With that in mind, I reached out to Dylan Reeve, head of post production at Shortland Street, to find out how it looked the week they went back to work under Level 3 social distancing restrictions.

Shortland Street is a half-hour soap that runs five nights a week on prime-time television. It has been on air for around 28 years and has been consistently among the highest-rated shows in the nation. It’s a cultural phenomenon. While the cast and crew take a single three-week annual break from production during the Christmas holiday season, the show has never really stopped production … until the pandemic hit.

Shortland Street’s production crew is typically made up of about 100 people; the post department consists of two editors, two assistants, a composer and Reeve, who is also the online editor. Sound mixes and complex VFX are done elsewhere, but everything else for the production is done at the studio.

New Zealand responded to COVID-19 early, instituting one of the harshest lockdowns in the world. Reeve told me that they went from alert Level 1 — basic social distancing, more frequent handwashing — to Level 3 as soon as the first signs of community transmission were detected. They stayed at this level for just two days before going to Level 4: complete lockdown. New Zealanders had 48 hours to get home to their families, shop for supplies and make sure they were ready.

“On a Monday afternoon at about 1:30pm, the studio emptied out,” explains Reeve. “We were shut down, but we were still on air, and we had about five or six weeks’ worth of episodes in various stages of production and post. I then had two days to figure out and prepare for how we were going to finish all of those and make sure they got delivered so that the show could continue to be on air.”

Shortland Street’s main production building dressed as the exterior of the hospital where the show is set, with COVID workplace safety materials on the doors.

The nature of the show’s existing workflow meant that Reeve had to copy all the media to drives and send Avids and drives home with the editors. The assistant editors logged in remotely for any work they needed to do, and Reeve took what he needed home as well to finish onlining, prepping and delivering those already-shot episodes to the broadcaster. They used Frame.io for review and approval with the audio team and with the directors, producers and network.

“Once we knew we were coming back into Level 3, and the government put out more refined guidelines about what that required, we had a number of HoD meetings — figuring out how we could produce the show while maintaining the restrictions necessary.”

I asked Reeve whether he and his crew felt safe going back to work. He reminded me that New Zealand only went back down to Level 3 once there had been a period with no remaining evidence of community transmission. Daily new infections in New Zealand had been in single digits for two weeks, including two days when no new cases were reported.

Starting Up With Restrictions
My conversation with Reeve took place on May 4, right after his first few days back at work. I asked him to explain some of the conditions under which the production was working while the rest of the country was still in isolation. Level 3 in New Zealand is almost identical to the lockdown restrictions put in place in US cities like New York and Los Angeles.

“One of the key things that has changed in terms of how we’re producing the show is that we physically have way less crew in the building. We’re working slower, and everyone’s having to do a bit more, maybe, than they would normally.

Shortland Street director Ian Hughes and camera operator Connagh Heath discussing blocking with a one-metre guide.

“When crew are in a controlled workspace where we know who everyone is,” he continues, “that allows us to keep track of them properly — they’re allowed to work within a meter of one another physically (three feet). Our policy is that we want staff to stay two meters (six feet) apart from one another as much as possible. But when we’re shooting, when it’s necessary, they can be a meter from one another.”

Reeve says the virus has certainly changed the nature of what can be shot. There are no love scenes, no kissing and no hugs. “We’re shooting to compensate for that; staging people to make them seem closer than they are.

“Additionally, everything stays within the production environment. Parts of our office have been dressed; parts of our building have been dressed. We’ll do a very low-profile exterior shoot for scenes that take place outside, but we’re not leaving the lot.”

Under Level 3, everyone is still under isolation at home. This is why, explains Reeve, social distancing has to continue at work. That way any infection that comes into the team can be easily traced and contained and affect as few others as possible. Every department maintains what they call a “bubble,” and very few individuals are allowed to cross between them.

Actors are doing their own hair and makeup, and there are no kitchen or craft services available. The production is using and reusing a small number of regular extras, with crew stepping in occasionally as well. Reeve noted that Australia was also resuming production on Neighbours, with crew members acting as extras.

“Right now in our studio, our full technical complement consists of three camera operators, just one boom operator and one multi-skilled person who can be the camera assist, the lighting assist and the second boom op if necessary. I don’t know how a US production would get away with that. There’s no chance that someone who touches lights on a union production can also touch a boom.”

Post Production
Shortland Street’s post department is still working from home. Now that they are back in production, they are starting to look at more efficient ways to work remotely. While there are a lot of great tools out there for remote post workflows, Reeve notes that for them it’s not that easy, especially when hardware and support are halfway across the world, borders are closed and supply chains are disrupted.

There are collaboration tools that exist, but they haven’t been used “simply because the pace and volume of our production means it’s often hard to adapt for those kinds of products,” he says. “Every time we roll camera, we’re rolling four streams of DNxHD 185, so nearly 800Mb/s each time we roll. We record that media directly into the server to be edited within hours, so putting that in the cloud or doing anything like that was never the best workflow solution. When we wanted feedback, we just grabbed people from the building and dragged them into the edit suite when we wanted them to look at something.”
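
The arithmetic behind Reeve’s “nearly 800Mb/s” is easy to check; the per-hour storage figure below is our extrapolation, not a number from the interview:

```python
# Four simultaneous DNxHD 185 streams, as described above.
streams = 4
mbps_per_stream = 185       # DNxHD 185 runs at roughly 185 megabits per second

total_mbps = streams * mbps_per_stream
print(f"aggregate: {total_mbps} Mb/s")                     # 740 Mb/s, "nearly 800"
print(f"per hour: {total_mbps / 8 * 3600 / 1000:.0f} GB")  # about 333 GB per hour rolled
```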

Ideally, he says, they would have tested and invested in these tools six months ago. “We are in what I call a duct tape stage. We’re taking things that exist, that look useful, and we’re trying to tape them together to make a solution that works for us. Coming out of this, I’m going to have to look at the things we’ve learned and the opportunities that exist and decide whether or not there might be some ways we can change our future production. But at the moment, we’re just trying to make it through.”

Because Shortland Street has only just resumed shooting, they haven’t reached the point yet where they need to do what Reeve calls “the first collaborative director/editor thing” from start to finish. “But there are two plans that we’re working toward. The easy, we-know-it-works plan is that we do an output, we stick it on Frame.io, the director watches it, puts notes on it, sends it back to us. We know that works, and we do that sometimes with directors anyway.

“The more exciting idea is that we have the directors join us on a remote link and watch the episodes as they would if they were in the room. We’ve experimented with a few things and haven’t found a solution that makes us super-happy. It’s tricky because we don’t have an existing hardware solution in place that’s designed specifically for streaming a broadcast output signal over an internet connection. We can do a screen-share, and we’ve experimented with Zoom and AnyDesk, but in both those cases, I’ve found that sometimes the picture will break up unacceptably, or sync will drift — especially using desktop-sharing software that’s not really designed to share full-screen video.”

Reeve and crew are just about to experiment with a tool used for gaming called Parsec. It’s designed to share low-latency, in-sync, high-frame-rate video. “This would allow us to share an entire desktop at, theoretically, 60fps with half-second latency or less. Very brief tests looked good. Plan A is to get the directors to join us on Parsec and screen-share a full-screen output off Avid. They can watch it down and discuss with the editor in real time or just make their own notes and work through it interactively. If that experience isn’t great, or if the directors aren’t enjoying it, or if it’s just not working for some reason, we’ll fall back to outputting a video, uploading it to Frame.io and waiting for notes.”

What’s Next?
What are the next steps for other productions returning to work? Shortland Street is the only production that chose to resume under Level 3. The New Zealand Film Commission has said that filming will resume eventually under Level 2, which is being rolled out in several stages beginning this week. Shortland Street’s production company has several other shows, but none have plans to resume yet.

“I think it’s a lot harder for them to stay contained because they can’t shoot everything in the studio,” explains Reeve. “Our production has an added advantage because it is constantly shooting and the core cast and crew are mostly the same every day. I think these types of productions will find it easiest to come back.”

Reeve says that anyone coming into their building has to sign in and deliver a health declaration — recent travel, contact with any sick person, other work they’ve been engaged in. “I think if you can do some of that reasonable contact tracing with the people in your production, it will be easier to start again. The more contained you can keep it, the better. It’s going to be hard for productions that are on location, have high turnover or a large number of extras — anything where they can’t keep within a bubble.

“From a post point of view, I think we’re going to get a lot more comfortable working remotely,” he continues. “And there are lots of editors who already do that, especially in New Zealand. If that can become the norm, and if there are tools and workflows that are well established to support that, it could be really good for post production. It offers a lot of great opportunities for people to essentially broaden their client base or the geographic regions in which they can work.”

Productions are going to have to make their own health and safety liability decisions, according to Reeve. “All of the things we are doing are effectively responding to New Zealand government regulation, but that won’t be the case for everyone else.”

He sees some types of production finding an equilibrium. “Love Island might be the sort of reality show you can make. You can quarantine everyone going into that show for 14 days, make sure they’re all healthy, and then shoot the show because you’re basically isolated from the world. Survivor as well, things like that. But a reality show where people are running around the streets isn’t happening anymore. There’s no Amazing Race, that’s for sure.”


After a 20-year career talent-side, Katie Hinsen turned her attention to building, developing and running post facilities with a focus on talent, unique business structures and innovative use of technology. She has worked on over 90 major feature and episodic productions, founded the Blue Collar Post Collective, and currently leads the dailies & DI department at Nice Shoes.

Posting John Krasinski’s Some Good News

By Randi Altman

Need an escape from a world filled with coronavirus and murder hornets? You should try John Krasinski’s weekly YouTube show, Some Good News. It focuses on the good things that are happening during the COVID-19 crisis, giving people a reason to smile with things such as a virtual prom, Krasinski’s chat with astronauts on the ISS and bringing the original Broadway cast of Hamilton together for a Zoom singalong.

L-R: Remy, Olivier, Josh and Lila Senior

Josh Senior, owner of Leroi and Senior Post in Dumbo, New York, is providing editing and post to SGN. His involvement began when he got a call from a mutual friend of Krasinski’s, asking if he could help put something together. They sent him clips via Dropbox, and a workflow was born.

While the show is shot at Krasinski’s house in New York at different times during the week, Senior’s Fridays, Saturdays and Sundays are spent editing and posting SGN.

In addition to his post duties, Senior is an EP on the show, along with his producing partner Evan Wolf Buxbaum at their production company, Leroi. The two work in concert with Allyson Seeger and Alexa Ginsburg, who executive produce for Krasinski’s company, Sunday Night Productions. Production meetings are held on Tuesday, and then shooting begins. After footage is captured, it’s still shared via Dropbox or good old iMessage.

Let’s find out more…

What does John use for the shoot?
John films on two iPhones. A good portion of the show is screen-recorded on Zoom, and then there’s the found footage user-generated content component.

What’s your process once you get the footage? And, I’m assuming, it’s probably a little challenging getting footage from different kinds of cameras?
Yes. In the alternate reality where there’s no coronavirus, we run a pretty big post house in Dumbo, Brooklyn. And none of the tools of the trade that we have there are really at play here, outside of our server, which exists as the ever-present backend for all of our remote work.

The assets are pulled down from wherever they originate. The masters are then housed behind an encrypted firewall, like we do for all of our TV shows at the post house. Our online editor is the gatekeeper. All the editors, assistant editors, producers, animators, sound folks — they all get a mirrored drive that they download, locally, and we all get to work.
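
The interview doesn’t detail Senior Post’s actual tooling, but the mirrored-drive model is easy to picture. Here is a minimal, hypothetical Python sketch that mirrors a masters folder onto a local drive and verifies every copy with a checksum; both paths are invented:

```python
# Hypothetical sketch of a mirrored-drive handoff: every collaborator gets
# an identical, checksum-verified local copy of the masters. Not Senior
# Post's actual tooling; paths are invented for illustration.
import hashlib
import shutil
from pathlib import Path

MASTERS = Path("/server/sgn/masters")   # hypothetical server mount
LOCAL = Path("/Volumes/Editor01/sgn")   # hypothetical local mirror

def md5(path: Path, chunk: int = 1 << 20) -> str:
    """Hash a file in 1MB chunks so large media never loads fully into RAM."""
    h = hashlib.md5()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def mirror() -> None:
    """Copy every master to the local drive, verifying each copy by checksum."""
    for src in MASTERS.rglob("*"):
        if not src.is_file():
            continue
        dst = LOCAL / src.relative_to(MASTERS)
        if dst.exists() and md5(dst) == md5(src):
            continue  # already mirrored and verified
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
        if md5(dst) != md5(src):
            raise IOError(f"checksum mismatch after copy: {src.name}")

if __name__ == "__main__":
    mirror()
```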

Do you have a style guide?
We have a bible, which is a living document that we’ve made week over week. It has music cues, editing style, technique, structure, recurring themes and a living archive of all the notes that we’ve received and how we’ve addressed them. Also, any style that’s specific to segments, post processing and any phasing or audio adjustments we make all live within a document that we give to whoever we onboard to the show.

Evan Wolf Buxbaum

Our post producers made this really elegant workflow that’s a combination of Vimeo and Slack where we post project files and review links and share notes. There’s nothing formal about this show, and that’s really cool. I mean, at the same time, as we’re doing this, we’re rapidly finishing and delivering the second season of Ramy on Hulu. It comes out on May 29.

I bet that workflow is a bit different than SGN’s.
It’s like bouncing between two poles. That show has a hierarchy, it’s formalized, there’s a production company, there’s a network, there’s a lot of infrastructure. This show is created in a group text with a bunch of friends.

What are you using to edit and color Some Good News?
We edit in Adobe Premiere, and that helps mitigate some of the challenges of the mixed media that comes in. We typically color inside of Adobe, and we use Pro Tools for our sound mix. We online and deliver out of Resolve, which is pretty much how we work on most of our things. Some of our shows edit in Avid Media Composer, but on our own productions we almost always post in Premiere — so when we can control the full pipeline, we tend to prefer Adobe software.

Are review and approvals with John and the producers done through iMessage and Dropbox too?
Yes, and we post links on Vimeo. Thankfully we actually produce Some Good News as well as post it, so that intersection is really fluid. With Ramy it’s a bit more formalized. We do notes together and, usually internally, we get a cut that we like. Then it goes to John, and he gives us his thoughts and we retool the edit; it’s like rapid prototyping rather than a gated milestone. There are no network cuts or anything like that.

Joanna Naugle

For me, what’s super-interesting is that everyone’s ideas are merited and validated. I feel like there’s nothing that you shouldn’t say because this show has no agenda outside of making people happy, and everybody’s uniquely qualified to speak to that. With other projects, there are people who have an experience advantage, a technical advantage or some established thought leadership. Everybody knows what makes people happy. So you can make the show, I can make the show, my mom can make the show, and because of that, everything’s almost implicitly right or wrong.

Let’s talk about specific episodes, like the ones featuring the prom and Hamilton. What were some of the challenges of working with all of that footage? Maybe start with Hamilton?
That one was a really fun puzzle. My partner at Senior Post, Joanna Naugle, edited that. She drew on a lot of her experience editing music videos, performance content, comedy specials, multicam live tapings. It was a lot like a multicam live pre-taped event being put together.

We all love Hamilton, so that helps. This was a combination of performers pre-taping the entire song and a live performance. The editing technique really dissolves into the background, but it’s clear that there’s an abundance of skill that’s been brought to that. For me, that piece is a great showcase of the aesthetic of the show, which is that it should feel homemade and lo-fi, but there’s this undercurrent of a feat to the way that it’s put together.

Getting all of those people into the Zoom, getting everyone to sound right, having the ability to emphasize or de-emphasize different faces. To restructure the grid of the Zoom, if we needed to, to make sure that there’s more than one screen worth of people there and to make sure that everybody was visible and audible. It took a few days, but the whole show is made from Thursday to Sunday, so that’s a limiting factor, and it’s also this great challenge. It’s like a 48-hour film festival at a really high level.

What about the prom episode?
The prom episode was fantastic. We made the music performances the day before and preloaded them into the live player so that we could cut to them during the prom. Then we got to watch the prom. To be able to participate as an audience member in the content that you’re still creating is such a unique feeling and experience. The only agenda is happiness, and people need a prom, so there’s a service aspect of it, which feels really good.

John Krasinski setting up his shot.

Any challenges?
It’s hard to put things together that are flat, and I think one of the challenges that we found at the onset was that we weren’t getting multiple takes of anything, so we weren’t getting a lot of angles to play with. Things are coming in pretty baked from a production standpoint, so we’ve had to find unique and novel ways to be nonlinear when we want to emphasize and de-emphasize certain things. We want to present things in an expositional way, which is not that common. I couldn’t even tell you another thing that we’ve worked on that didn’t have any subjectivity to it.

Let’s talk sound. Is he just picking up audio from the iPhones or is he wearing a mic?
Nope. No mic. Audio from the iPhones, which we just run through a few filters in Pro Tools. Nobody mics themselves. We do spend a lot of time balancing out the sound, but there’s not a lot of effect work.

Other than SGN and Ramy, what are some other shows you guys have worked on?
John Mulaney & the Sack Lunch Bunch, 2 Dope Queens, Random Acts of Flyness, Julio Torres: My Favorite Shapes and others.

Anything that I haven’t asked that you think is important?
It’s really important for me to acknowledge that this is something that is enabling a New York-based production company and post house to work fully remotely. In doing this week over week, we’re really honing what we think are tangible practices that we can then turn around and evangelize out to the people that we want to work with in the future.

I don’t know when we’re going to get back to the post house, so being able to work on a show like this is providing this wonderful learning opportunity for my whole team to figure out what we can modulate from our workflow in the office to be a viable partner from home.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Video Chat: Posting Late Night With Seth Meyers from home

By Randi Altman

For many, late-night shows have been offering up laughs during a really tough time, with hosts continuing to shoot from dens, living rooms, backyards and country houses, often with spouses and kids pitching in as crew.

NBC’s Late Night With Seth Meyers is one of those shows. They had their last in-studio taping on March 13, followed by a scheduled hiatus week, followed by the news they wouldn’t be able to come back to the studio. That’s when his team started preproduction and workflow testing to figure out questions like “How are we going to transfer files?” and “How are we going to get it on the air?”

I recently interviewed associate director and lead editor Dan Dome about their process and how that workflow has been allowing Meyers to check in daily from his wasp-ridden and probably haunted attic.

(Watch our original Video Interview here or below.)

How are you making this remote production work?
We’re doing a combination of things. We are using our network laptops to edit footage that’s coming in for interviews or comedy pieces. That’s all being done locally, meaning on our home systems and without involving our SAN or anything like that. So we’re cutting interviews and comedy pieces and then sending links out for approval via Dropbox. Why Dropbox? The syncing features are really great when uploading and downloading footage to all the various places we need to send it.

Once a piece is approved and ready to go into the show — we know the timings are right, we know the graphics are right, we know the spelling is correct, audio levels look good, video levels look good — then we upload that back to Dropbox and back to our computers at 30 Rock where our offices are located. We’re virtually logging into our machines there to compile the show. So, yeah, there are a few bits and pieces to building stuff remotely. And then there are a few bits and pieces to actually compiling the show on our systems back at home base.

What do you use for editing?
We’re still on Adobe Premiere. We launched on Premiere when the show started in February of 2014, and we’re still using that version — it’s solid and stable, and doing a daily show, we don’t necessarily get a ton of time to test new versions. So we have a stable version that we like for doing the show composite aspect of things.

Back at 30 Rock, where we compile the show, we’re on Adobe Premiere Pro CC 2015.2 (9.2.0, Build 41). At home, editing the remote pieces, we’re using the newer Premiere Pro CC 2020 (14.0.4, Build 18).

Let’s talk about how Seth’s been shooting. What’s his main camera?
Some of the home studio recording has been on iPads and iPhones. Then we’re using Zoom to do interviews, and there are multiple records of that happening. The files are then uploaded and downloaded between the edit team, and our director is in on the interviews, setting up cameras and trying to get it to look the best it can.

Once those interviews are done, the different records get uploaded to Dropbox. On my home computer, I use a 6TB CalDigit drive for Dropbox syncing and media storage. (Devon Schwab and Tony Dolezal, who are also editing pieces, use 4TB G-RAID drives with Thunderbolt 3.) So as soon as they tell me the file is up, I sync locally on the folder I know it’s going to, the media automatically downloads, and we simultaneously download it to our systems at 30 Rock. So it syncs there as well. We have multiple copies of it, and if we need to, we can hand off a project between me, Devon or Tony; we can do that pretty easily.
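
A sync-and-mirror step like the one Dome describes could, in principle, be automated with a small folder watcher. The sketch below is a hypothetical illustration using the open-source watchdog package, not the show’s actual tooling; the folder names and drive paths are invented:

```python
# Hypothetical sketch: watch a Dropbox-synced folder and mirror finished
# uploads onto the local media drive. Requires the `watchdog` package.
import shutil
import time
from pathlib import Path

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

SYNC_DIR = Path.home() / "Dropbox" / "LateNight_incoming"   # invented name
MEDIA_DIR = Path("/Volumes/CalDigit6TB/LateNight_media")    # invented path

def wait_until_stable(path: Path, interval: float = 2.0) -> None:
    """Naive guard: wait until the sync client has finished writing the file."""
    size = -1
    while size != path.stat().st_size:
        size = path.stat().st_size
        time.sleep(interval)

class IncomingMedia(FileSystemEventHandler):
    def on_created(self, event):
        if event.is_directory:
            return
        src = Path(event.src_path)
        wait_until_stable(src)
        dst = MEDIA_DIR / src.name
        shutil.copy2(src, dst)   # keep a local working copy on the media drive
        print(f"mirrored {src.name} -> {dst}")

if __name__ == "__main__":
    MEDIA_DIR.mkdir(parents=True, exist_ok=True)
    observer = Observer()
    observer.schedule(IncomingMedia(), str(SYNC_DIR), recursive=True)
    observer.start()
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        observer.stop()
    observer.join()
```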

Have you discovered any challenges or happy surprises working this way?
It has been a nice happy surprise that it’s like, “Oh wow, this is working pretty well.” We did have a situation where we thought we might lose power on the East Coast because of rains and winds and things like that. So we had safeguards in place for that, as far as having an evergreen show that was ready to go for that night in case we did end up losing power. It would have been terrible, but everything held up, and it worked pretty well.

So there are certainly some challenges to working this way, but it’s amazing that we are working and we can keep our mind on other things and just try to help entertain people while this craziness is going on.

You can watch our original Video Interview with Dome here:

Chimney Group: Adapting workflows in a time of crisis

By Dana Bonomo

In early March, Chimney delivered a piece for TED, created to honor women on International Women’s Day, featuring Reshma Saujani, founder of Girls Who Code. This was in the early days of coronavirus taking hold in the United States. We had little comprehension at that point of the true extent to which we would be impacted as a country and as an industry. As the situation grew and awareness around the severity of the COVID-19 health crisis sunk in, we started to realize that it would be animated projects like this one that we would come to rely upon.

TED & Ultimate Software: International Women’s Day

This film showcases the use of other creative solutions when live-action projects can’t be shot. But the real function of work like this is that, on an emotional level, it feels good to make something with a socially actionable message.

In just the last few weeks, platforms have been saturated with COVID-19-related content: salutes to healthcare workers, PSAs from federal, state and local authorities and brands sharing messages of unity. Finding opportunities that include some form of social purpose helps provide hope to our communities while also raising the spirits of those creating the work. We are currently in production on two of these projects, and they help us feel like we’re contributing in some small way with the resources we have.

As a global company, Chimney is always highlighting our worldwide service capabilities, with 12 offices on four continents, and our ability to work together. We’ve routinely used portals such as Zoho and Slack in the past, yet now I’m enjoying the shift in how we’re communicating with each other in a more connected and familiar way. Just a short time ago we might have used a typical workflow, and today we’re sharing and exchanging ideas and information at an exponential rate.

As a whole, we prefer to video chat, have more follow-ups and create more opportunities to work on internal company goals in addition to project pipelines and calendars. There’s efficiency in brainstorming and solving creative challenges in real time, either as a virtual brainstorm or an idea exchange in PM software and project communication channels. So at the end of a meeting, internal review or project kickoff, we have action items in place, ready to facilitate on a global scale.

Our company’s headquarters is in Stockholm, Sweden. You may have heard that Sweden’s health officials have taken a different approach to handling COVID-19 than most countries, resulting in less drastic social distancing and isolation measures while still being quite mindful of safety. Small shoots are still possible with crews of 10 or fewer — so we can shoot in Sweden with a fully protected crew, executing safe and sanitary protocols — and we can livestream to clients worldwide from set.

This is Chimney editor Sam O’Hare’s work-from-home setup.

Our CEO North America Marcelo Gandola is encouraging us individually to schedule personal development time, whether it’s for health and wellness, master classes on subjects that interest us, certifications for our field of expertise, or purely creative and expressive outlets. Since many of us used our commute time for that before the pandemic, we can still use that time for emotional recharging in different ways. By setting aside time for this, we regain some control of our situation. It lifts our morale and it can be very self-affirming, personally and professionally.

While most everyone has remote work capabilities these days, there’s a level of creative energy in the air, driven by the need to employ different tactics — either by working with what you have (optimizing existing creative assets, produced content, captured content from the confines of home) or replacing what was intended to be live-action with some form of animation or graphics. For example, Chimney’s Creative Asset Optimization has been around for some time now. Using Edisen, our customization platform, we can scale brands’ creative content on any platform, in any market at any time, without spending more. From title changes to language versioning and adding incremental design elements, clients get bigger volumes of content with high-quality creative for all channels and platforms. So a campaign that might have had a more limited shelf life on one platform can now stretch to an umbrella campaign with a variety of applications depending on its distribution.

Dana Bonomo

They say that necessity is the mother of invention, and it’s exciting to see how brands and makers are creatively solving current challenges. Our visual effects team recently worked on a campaign (sorry we can’t name this yet) that took existing archival footage and — with the help of VFX — generated content that resonated with audiences today. We’re also helping clients figure out remote content capture solutions in lieu of their live events getting canceled.

I was recently on a Zoom call with students at my alma mater, SUNY Oneonta, in conversation with songwriter and producer John Mayer. He said he really feels for students and younger people during this time, because there’s no point of reference for them to approach this situation. The way the younger generation is adapting — reacting by living so fully despite so many limitations — they are the ones building that point of reference for the future. I think that holds true for all generations… there will always be something to be learned. We don’t fully know what the extent of our learning will be, but we’re working creatively to make the most of it.

Main Image: Editor Zach Moore’s cat is helping him edit


Dana Bonomo is managing director at Chimney Group in NYC.

New ZBooks and Envy offerings from HP

A couple of weeks ago, HP introduced the HP ZBook Studio and HP ZBook Create mobile workstations as well as the HP Envy 15. All are the latest additions to the HP Create ecosystem, an initiative introduced during last year’s Adobe Max.

ZBook Studio

These Z by HP solutions are small-form-factor devices for resource-intensive tasks and target professional content creators working in design and modeling. The HP Envy portfolio, including the newest Envy 15, is built for editing video, stills, graphics and web design.

The systems are accelerated by Nvidia Quadro and GeForce RTX GPUs and backed by Nvidia Studio drivers. HP’s ZBook Studio, ZBook Create and Envy 15 laptops with RTX GPUs are members of Nvidia’s RTX Studio program, featuring acceleration for demanding raytraced rendering and AI creative workloads and Studio drivers for reliability.

HP says that the latest additions to the Z by HP portfolio are different from other thin and light mobile workstations and 15-inch notebooks in that they are built specifically for use in the most demanding creative workflows, which call for pro applications, graphics and color accuracy.

The ZBook Studio and ZBook Create, which target visual effects artists, animators and colorists, have all-day battery life. And HP’s DreamColor display accurately represents content on screen thanks to a built-in colorimeter for automatic self-calibration and 100% sRGB and Adobe RGB coverage.

The Z Power Slider gives users control over the type of performance and acoustics for specific workflows. At the same time, the Z Predictive Fan Algorithm intelligently manages fan behavior based on the kind of work and applications used by creatives.

HP Envy 15

The systems feature vapor chamber cooling and liquid crystal polymer for gaming-class thermals. The custom advanced cooling system pushes air away from the CPU and GPU in two-dimensional paths, unlocking power density that the company says is 2.8 times higher gen-to-gen in a laptop design that is up to 22% smaller.

HP says the highly recyclable and lightweight aluminum exterior provides five times the abrasion resistance of painted carbon fiber and still complies with MIL-STD 810G testing.

The HP Envy offers a minimalist design with a sophisticated aluminum chassis and diamond-cut design and is the first Envy with a layer of glass on top of the touchpad for a smooth-touch experience. The HP Envy 15 features an all-aluminum chassis with 82.6% screen-to-body ratio, up to a 4K OLED Vesa DisplayHDR True Black display with touch interface, 10th Gen Intel Core processors, Nvidia GeForce RTX 2060 with Max-Q, and gaming-class thermals for the ultimate experience in creator performance.

Framestore adds three to London film team

During these difficult times, it’s great to hear that Framestore in London has further beefed up its staff. The three new hires join the VFX studio’s workforce of approximately 2,500 people, all of whom are currently working from home.

Two-time VES Award-winner Graham Page joins the company as VFX supervisor after 14 years with Dneg, where he supervised the company’s work on titles such as Avengers: Endgame, Captain Marvel and Avengers: Infinity War. He brings Framestore’s tally of VFX supervisors to 24, all of whom can take projects from pre-production and on-set supervision through to final delivery.

Mark Hodgkins, who rejoins the company after a 12-year stint with Dneg, will serve as Framestore’s global head of FX, film, and brings with him technical knowledge and extensive experience working on properties from Marvel, DC and J.K. Rowling.

Anna Ford joins Framestore as head of business development, film. Formerly sales and bidding manager at Cinesite, Ford brings knowledge of the global production industry and a passion for emerging technologies that will help identify and secure exciting projects that will push and challenge Framestore’s team of creative thinkers.

“While working in different areas of the company’s business, Graham, Anna and Mark all share the kind of outlook and attitude we’re always looking for at Framestore. They’re forward-thinking, creative in their approaches and never shy away from the kind of challenges that will bring out the best in themselves and those they work with,” says Fiona Walkinshaw, Framestore’s global managing director, film.

Director/EP Lenny Abrahamson on Hulu’s Normal People

By Iain Blair

Irish director Lenny Abrahamson first burst onto the international scene in 2015 with the harrowing drama Room, which picked up four Oscar nominations, including for Best Adapted Screenplay and Best Director. Abrahamson’s latest project is Hulu’s Normal People, based on Sally Rooney’s best-selling novel of the same name.

(Photo by Enda Bowe)

Lenny Abrahamson

The series focuses on the passionate, tender and complicated relationship of Marianne and Connell — from the end of their school days in a small town in the west of Ireland to their undergraduate years at Trinity College. At school, he’s a popular sports hero, while she’s upper class, lonely, proud and intimidating. But when Connell comes to pick up his mother from her cleaning job at Marianne’s house, a strange connection grows between the two teenagers… one they are determined to conceal. A year later, they’re both studying in Dublin and Marianne has found her feet in a new social world, but Connell hangs on the sidelines, shy and uncertain, as the tables are turned.

The series stars Daisy Edgar-Jones (War of the Worlds, Cold Feet) as Marianne and Paul Mescal, in his first television role, as Connell. Adapted by Sally Rooney alongside writers Alice Birch and Mark O’Rowe, Normal People is a 12-episode 30-minute drama series produced by Element Pictures for Hulu and BBC Three. Rooney and Abrahamson also serve as executive producers and Endeavour Content is the international distributor.

I spoke with Abrahamson — whose credits also include The Little Stranger, Frank, Garage, What Richard Did and Adam & Paul — about making the show, his workflows and his love of editing.

You’ve taken on quite a few book projects in the past. What was the appeal of this one?
It’s always an instinctual thing — something chimes with me. Yeah, I’ve done a number of literary adaptations, and I wasn’t really looking to do another. In fact, I was setting out not to do another one, but in this case the novel just struck me so much, with such resonance, and it’s very hard not to do it when that happens. And it’s an Irish project and I hadn’t shot in Ireland for some seven years, and it was great to go back and do something that felt so fresh, so all of that was very attractive to me.

(Photo by Enda Bowe/Hulu)

Rooney co-wrote the script with Alice Birch, but translating any novel to a visual medium is always tricky, especially this book with all its inner psychological detail. As a director, how challenging was it to translate the alternating sections of the book while maintaining forward motion of the narrative?
It was pretty challenging. The writing is so direct and honest, yet deep, which is a rare combination. And Sally’s perspective is so fresh and insightful, and all that was a challenge I tried to take on and capture in the filming. How do you deal with something so interior? When you really care about the characters as I did, how do you do justice to them and their extraordinary relationship? But I relished the challenge.

Obviously, casting the right actors was crucial. What did Daisy Edgar-Jones and Paul Mescal bring to their roles and the project?
I feel very lucky to have found them. We actually found Paul first, very early on. He’d been making some waves in theater in Ireland, but he’d never been on screen in anything. What I saw in him was a combination of intelligence, which both characters had to have, and brilliant choices in playing Connell. He really captured that mix of masculinity and anxiety which is so hard to do. There is a sensitivity but also an inarticulateness, and he has great screen presence. Daisy came later, and it was harder in that you had to find someone who works well with Paul. She’s brilliant too, as she found a way of playing Marianne’s spikiness in a very un-clichéd and delicate way that allows you to see past it. They ended up working so well together and became good friends, too.

You co-directed with Hettie Macdonald (Doctor Who, Howards End), with you directing the first six episodes and Macdonald directing the final six. How did that work in terms of maintaining the same naturalistic tone and feel you set?
We spoke a lot at the beginning when she came on board. The whole idea was for her to bring her own sensibility to it. We’d already cast and shot the first half, and we knew a director of her caliber wasn’t going to break that. We had two DPs: I had Suzie Lavelle, and she had Kate McCullough. During the shooting I had the odd note, like, “It looks great,” but I was more involved with her material during editing, which is natural as the EP. We had a great relationship.

Tell us about post and your approach.
We did it all — the editing, sound and VFX — at Outer Limits, which is on the coast about 30 minutes outside Dublin. It’s run by two guys who used to be at Screen Scene, where I posted my last five or six films. I followed them over there as I like them so much. It’s a lovely place, very quiet. The editor and I were based out there for the whole thing.

Our VFX supervisor was Andy Clarke, and it’s all pretty invisible stuff, like rain and all the fixes. I also did all the grading and picture finishing at Outer Limits with my regular colorist Gary Curran, who’s done nearly all my projects. He knows what I like, but also when to push me into bolder looks. I tend toward very low-contrast, desaturated looks, but over the years he’s nudged me into more saturated, vivid palettes, which I now really like. And we’ll be doing a 4K version.

I love post, as after all the stress of the shoot and all the instant decisions you have to make on the set, it’s like swimming ashore. You reach ground and can stand up and get all the water out of your lungs and just take your time to actually make the film. I love all the creative possibilities you get in post, particularly in editing.

You edited with your go-to editor Nathan Nugent. Was he on set?
No, we sent him dailies. On a film, he might be cutting next door if we’re in a studio, but not on this. He’s very fast and I’d see an assembly of stuff within 24 hours of shooting it. We like to throw everything up in the air again during the edit. Whatever we thought as we shot, it’s all up for grabs.

What were the main editing challenges?
I think choosing to work with short episodes was really good as it takes away some of the pressure to have lots of plot and story, and it allows you to look really closely at the shifts in their relationship. But there’s nowhere to hide, and you have to absolutely deeply care about the two of them. But if you do, then all the losses and gains, the highs and lows, become as big a story as any you could tell. That’s what gives it momentum. But if you don’t get that right, or you miscast it, then the danger is that you do lose that momentum.

So it’s a real balancing act… to feel that you’re spending time with them but also letting the story move forward in a subtle way. It’s the challenge of all editing — maintaining the tension and pace while letting an audience get a deep and close enough look at the characters.

Lenny Abrahamson

Can you talk about the importance of music and sound in the show?
I’ve had the same team ever since What Richard Did, including my supervising sound designer and editor Steve Fanagan and sound mixer Niall O’Sullivan. They’re so creative. Then I had composer Stephen Rennicks, who’s also done all my projects. What was different this time was that we also licensed some tracks, as it just felt right. Our music supervisors Juliet Martin and Maggie Phillips were great with that.

So it was a core team of five, and I did what I always like to do — get all of them involved far earlier than you normally would. We don’t just lock picture and hand it over, so this way you have sound constantly interacting with editorial, and they both develop organically at the same time.

What’s next?
Another collaboration with Sally on her first novel, “Conversations With Friends,” with the same team I had on this. But with the COVID-19 pandemic, who knows when we’ll be able to start shooting.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Frame.io offers beta version of Transfer, updates app to v3.6

Frame.io has launched Frame.io v3.6 along with a beta version of a new application called Frame.io Transfer. Frame.io v3.6’s new features are designed for the evolving needs of remote workflows, with a particular focus on speed and security. The expanded toolset addresses the need for fast project downloads with the Frame.io Transfer app, boosts security with features like Watermark ID for enterprise accounts and improves collaboration with new features like iOS Offline Mode and Folder Sharing.

Frame.io Transfer works on both Mac and Windows OS. Transfer lets users download large files and sophisticated folder structures — even entire projects — with one click. It supports EDL and XML formats so users can identify specific files, accelerating the process of relinking to original camera files for final conforms, color grading or sharing assets for VFX. Transfer allows users to monitor active downloads and to drag and drop to reprioritize their order. Finally, Transfer facilitates fast and secure downloads even over unstable internet connections; if there’s a disruption mid-download, Transfer pauses and automatically resumes once reconnected.
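
To make the relink idea concrete: a CMX3600-style EDL lists each event’s source clip in a “FROM CLIP NAME” comment, which is enough to build the shortlist of camera files a conform actually needs. The sketch below is illustrative only and is not Frame.io’s code; the EDL filename is hypothetical:

```python
# Illustrative sketch: pull the unique source clip names out of a
# CMX3600-style EDL to narrow a download to just the files a conform needs.
import re
from pathlib import Path

# CMX3600 EDLs carry the source clip name as a comment line after each event.
CLIP_NAME = re.compile(r"^\*\s*FROM CLIP NAME:\s*(.+)$", re.IGNORECASE)

def clips_in_edl(edl_path: str) -> set:
    """Return the unique source clip names referenced by an EDL."""
    clips = set()
    for line in Path(edl_path).read_text(errors="ignore").splitlines():
        match = CLIP_NAME.match(line.strip())
        if match:
            clips.add(match.group(1).strip())
    return clips

if __name__ == "__main__":
    for name in sorted(clips_in_edl("conform.edl")):   # hypothetical EDL file
        print(name)
```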

“Transfer was originally slated for release later this year, but as a response to the profound shift in the way we’re working and the tools our customers need immediately, we’re releasing it in beta today,” says Emery Wells, cofounder/CEO of Frame.io. (Check out our video interview with him below.)

For secure sharing, enterprise users can now secure Presentation and Review links using login-only access, which means that only specified recipients can view Share Links. Recipients will see a list of everything that’s been shared with them in Frame.io’s new Inbox, which offers a clean and focused view.

Frame.io v3.6 also has Watermark ID, which gives customers the ultimate layer of visible security. When any viewer presses “Play,” Frame.io completes a realtime, on-demand transcode of the video with that viewer’s personal identifying information burned into every frame. A two-hour video starts playing back in less than two seconds.
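
Frame.io hasn’t published how Watermark ID is implemented, but the underlying idea, burning a viewer’s identity into every frame at transcode time, can be illustrated with FFmpeg’s drawtext filter. This is a conceptual sketch, not Frame.io’s pipeline; it assumes an ffmpeg build with libfreetype on the PATH:

```python
# Conceptual illustration of per-viewer watermark burn-in, not Frame.io's code.
import subprocess

def burn_watermark(src: str, dst: str, viewer_id: str) -> None:
    """Transcode src to dst with the viewer's ID rendered onto every frame."""
    drawtext = (
        f"drawtext=text='{viewer_id}':"
        "fontsize=24:fontcolor=white@0.4:"   # semi-transparent overlay
        "x=(w-text_w)/2:y=h-(2*text_h)"      # centered near the bottom edge
    )
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-vf", drawtext, "-c:a", "copy", dst],
        check=True,
    )

if __name__ == "__main__":
    burn_watermark("review.mp4", "review_marked.mp4", "jane@example.com")
```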

Offline Mode

Users can now add folders to Review Links, allowing them to easily organize and share assets across teams or projects. Any changes made to folders after they are shared are dynamically updated in the Review Link. This is especially useful for teams that produce episodic content or programming that relies on a library of media. Frame.io also made it faster, easier and more intuitive to organize assets with an improved “Move-to” and “Copy-to” flow.

With this update, Frame.io users will now be able to see all the Presentations they’ve shared and access their settings from one organized list. The display shows folder sizes so users can see at a glance which are the heaviest projects, making it easier to optimize projects and storage.

Frame.io v3.6 consolidates notifications made on the same video within short periods of time, grouping them together into one notification. Users can filter by read or unread, see comment previews and scrub asset thumbnails to easily spot what needs to be reviewed or addressed.

Offline Mode for Frame.io’s iOS apps allows users to work from anywhere. They can tap a file to make it available offline, then review and leave comments. As soon as the app comes back online, comments are automatically synced to the project.

COVID-19: How our industry is stepping up

We’ve been using this space to talk about how companies are discounting products, raising money and introducing technology to help with remote workflows, as well as highlighting how pros are personally pitching in.

Here are the latest updates, followed by what we’ve gathered to date:

Adobe
Adobe has made a $4.5 million commitment to trusted organizations that are providing vital assistance to those most in need.

• Adobe is joining forces with other tech leaders in the Bay Area to support the COVID-19 Coronavirus Regional Response Fund of the Silicon Valley Community Foundation, a trusted foundation that serves a network of local nonprofits. Adobe’s $1 million donation will help provide low-income people in Santa Clara County with immediate financial assistance, through The Santa Clara County Homelessness Prevention System Financial Assistance Program, to help pay rent or meet other basic needs. Additionally, Adobe is donating $250,000 to the Valley Medical Center Foundation to purchase life-saving ventilators for Bay Area hospitals.
• Adobe has donated $1 million to the COVID-19 Fund of the International Federation of Red Cross and Red Crescent Societies, the recognized global leader in providing rapid disaster relief and basic human and medical services. Adobe’s support will help aid vulnerable communities impacted by COVID-19 around the world. This is in addition to the $250,000 the company is donating to Direct Relief as a part of Adobe’s #HonorHeroes campaign.
• To support the community in India, Adobe is donating $1 million towards the American India Foundation (AIF) and the Akshaya Patra Foundation. The donation will help AIF source much-needed ventilators for hospitals, while the grant for Akshaya Patra will provide approximately 5 million meals to impacted families.

Harbor
Harbor is releasing Inspiration in Isolation, a new talk series that features filmmakers in candid conversation about their creative process during this unprecedented time and beyond. The web series aims to reveal the ideas and rituals that contribute to their creative process. The premiere episode features celebrated cinematographer Bradford Young and senior colorist Joe Gawler. The two, who are collaborators and friends, talk community, family, adapting to change and much more.

The full-length episodes will be released on Harbor’s new platform, HarborPresents, with additional content on Harbor’s social media (@HarborPictureCo).

HPA
The HPA has formed the HPA Industry Recovery Task Force, which will focus on sustainably resuming production and post services, with the aim of understanding how to enable content creation in an evolving world impacted by the pandemic.

The task force’s key objectives are:
• To serve as a forum for collaboration, communication and thought leadership regarding how to resume global production and post production in a sustainable fashion.
• To understand and influence evolving technical requirements, such as the impact of remote collaboration, work from home and other workflows that have been highlighted by the current crisis.
• To provide up-to-date information and access to emerging health and safety guidelines that will be issued by various governments, municipalities, unions, guilds, industry organizations and content creators.
• To provide collaborative support and guidance to those impacted by the crisis.

Genelec
Genelec is donating a percentage of every sale of its new Raw loudspeaker range to the Audio Engineering Society (AES) for the remainder of this year. Additionally, Genelec will fund 10 one-year AES memberships for those whose lives have been impacted by the COVID-19 crisis. A longtime sustaining member of AES, Genelec is making the donation to help sustain the society’s cash flow, which has been significantly affected by the coronavirus situation.

OWC
OWC has expanded its safety protocols, as they continue to operate as an essential business in Illinois. They have expanded their already strong standard operating practice in terms of cleanliness with additional surface disinfection actions, as well as both gloves and masks being used by their warehouse and build teams. Even before recent events, manufacturing teams used gloves to prevent fingerprinting units during build, but those gloves have new importance now. In addition, OWC has both MERV air filters in place and a UV air purifier, which combined are considered to be 99.999% effective in killing/capturing all airborne bacteria and viruses.

Red

For a limited time, existing DSMC2 and Red Ranger Helium and Gemini customers can purchase a Red Extended Warranty at a discounted price. Existing customers who are in their second year of warranty can pay the standard pricing they would receive within their first year instead of the markup price. For example, instead of paying $1,740 (the 20% markup), a DSMC2 Gemini owner who is within the second year of warranty can purchase an Extended Warranty for $1,450.

This promotion has been extended to June 30. Adding the Red Extended Warranty not only increases the warranty coverage period but also provides benefits such as priority repair, expedited shipping, and premium technical support directly from Red. Customers also have access to the Red Rapid Replacement Program. Extended Warranty is also transferable to new owners if completing a Transfer of Ownership with Red.

DejaSoft
DejaSoft has extended its offering of giving editors 50% off all their DejaEdit licenses — it now goes through the end of June. In addition, the company will help users implement DejaEdit in the best way possible to suit their workflow. DejaEdit allows editors to share media files and timelines automatically and securely with remote co-workers around the world, without having to be online continuously. It helps editors working on Avid Nexis, Media Composer and EditShare workflows across studios, production companies and post facilities ensure that media files, bins and timelines are kept up to date across multiple remote edit stations.

Assimilate
Assimilate is offering all of its products — including Scratch 9.2, Scratch VR 9.2, PlayPro 9.2, Scratch Web and the recently released Live Looks and Live Assist — for free through October 31. Users can register for free licenses. Online tutorials are here and free access to Lowepost online Scratch training is here.

B&H
B&H is partnering with suppliers to donate gear to the teams at Mount Sinai and other NYC hospitals to help health care professionals and first responders stay in touch with their loved ones. Some much-needed items are chargers, power sources, battery packs and mobile accessories. B&H is supporting the Mayor’s Fund to Advance New York City and Direct Relief.

FXhome
FXhome last month turned the attention of its “Pay What You Want” initiative to directing proceeds toward the fight against COVID-19. This month, in an effort to teach the community new skills and inspire them with ideas to help them reinvent themselves, FXhome has launched a new, entirely free Master Class series designed to teach everything from basic editing to creating flashy title sequences, editing audio and, of course, basic VFX and compositing.

Nugen Audio 
Nugen Audio has a new “Staying Home, Staying Creative” initiative aimed at promoting collaboration and creativity in a time of social distancing. Included are a variety of videos, interviews and articles that will inspire new artistic approaches for post production workflows. The company is also providing temporary replacement licenses for any users who do not have access to their in-office workstations.

Already available on the Staying Creative web page is a special interview with audio post production specialist Keith Alexander. Building from his specialty in remote recording and sound design for broadcast, film and gaming, Alexander shares some helpful tips on how to work efficiently in a home-based setting and how to manage audio cleanup and broadcast-audio editing projects from home. There’s also an article focused on three ways to improve lo-fi drum recording in a less-than-ideal space.

Nugen is also offering temporary two-month licenses for current iLok customers, along with one additional Challenge Response license code authorization. The company has also reduced the prices of all products in its web store.

Tovusound 
Tovusound has extended its 20% discount until the end of the month and has added some new special offers.

The Spot Edward Ultimate Suite expansion, regularly $149, is now $79 with coupon. It adds the Spot creature footstep and movement instrument to the Edward footstep, cloth and props designer. Customers also get free WAV files with the purchase of all Edward instruments and expansions and with all Tovusound bundles. Anyone who purchased one of the applicable products after April 1 also has free access to the WAV files.

Tovusound will continue to donate an additional 10% of the sales price to CleanOceanProject.org. Customers may claim their discounts by entering STAYHOME in the “apply coupon” field at checkout. All offers end on April 30.

 

Previous Updates

Object Matrix and Cinesys-Oceana
Object Matrix and Cinesys-Oceana are hosting a series of informal online Beer Roundtable events in the coming months. The series will discuss the various challenges with implementing hybrid technology for continuity, remote working and self-serve access to archive content. You can register for the next Beer Roundtable here. The sessions will be open, fun and relaxed. Participants are asked to grab a drink and simply raise their glass when they wish to ask a question.

During the first session, Cinesys-Oceana CTO Brent Angle and Object Matrix CEO Jonathan Morgan will introduce what they believe to be the mandatory elements of the ultimate hybrid technology stack. This will be followed by a roundtable discussion hosted by Harry Skopas, director M&E solutions architecture and technical sales at Cinesys-Oceana, with guest appearances from the media and sports technology communities.

MZed
MZed, an online platform for master classes in filmmaking, photography and visual storytelling, is donating 20% of all sales to the Los Angeles Food Bank throughout April. For every new MZed Pro membership, $60 is donated, equating to 240 meals to feed hungry children, seniors and families. MZed serves the creative community, a large portion of which lives in the LA area and is being hit hard by the lockdown due to the coronavirus. MZed hopes to help play a role in keeping high-risk members of the community fed during a time of extreme uncertainty.

MZed has also launched a “Get One, Gift One” initiative. When someone purchases an MZed Pro membership, that person will not only be supporting the LA Food Bank but will instantly receive a Pro membership to give to someone else. MZed will email details upon purchase.

MZed offers hundreds of hours of training courses covering everything from photography and filmmaking to audio and lighting in courses like “The Art of Storytelling” with Alex Buono and Philip Bloom’s Cinematic Masterclass.

NAB Show
NAB Show’s new digital experience, NAB Show Express, will take place May 13-14. The platform is free and offers 24-hour access to three educational channels, on-demand content and a Solutions Marketplace featuring exhibitor product information, announcements and demos. Registration for the event will open on April 20 at NABShowExpress.com. Each channel will feature eight hours of content streamed daily and available on-demand to accommodate the global NAB Show audience. NAB Show Express will also offer NAB Show’s signature podcast, exploring relevant themes and featuring prominent speakers.

Additionally, NAB Show Express will feature three stand-alone training and executive leadership events for which separate registrations will be available soon. These include:
• Executive Leadership Summit (May 11), produced in partnership with Variety
• Cybersecurity & Content Protection Summit (May 12), produced in partnership with Content Delivery & Security Association (CDSA) and Media & Entertainment Services Alliance (MESA) – registration fees apply
• Post | Production World Online (May 17-19), produced in partnership with Future Media Conferences (FMC) – registration fees apply.

Atto 
Atto Technology is supporting content producers who face new workflow and performance challenges by making Atto Disk Benchmark for macOS more widely available and by updating Atto 360 tuning, monitoring and analytics software. Atto 360 for macOS and Linux have been updated for enhanced stability and include an additional tuning profile. The current Windows release already includes these updates. The software is free and can be downloaded directly from Atto.

Sigma
Sigma has launched a charitable giving initiative in partnership with authorized Sigma lens dealers nationwide. From now until June 30, 2020, 5% of all Sigma lens sales made through participating dealers will be donated to a charitable organization of the dealers’ choice. Donations will be made to organizations working on COVID-19 relief efforts to help ease the devastation many communities are feeling as a result of the global crisis. A full list of participating Sigma dealers and benefiting charities can be found here.

FXhome 
To support those who are putting their lives on the line to provide care and healing to those impacted by the global pandemic, FXhome is adding Partners In Health, Doctors Without Borders and the Center for Disaster Philanthropy as new beneficiaries of the FXhome “Pay What You Want” initiative.

Pay What You Want is a goodwill program inspired by the HitFilm Express community’s desire to contribute to the future development of HitFilm Express, the company’s free video editing and VFX software. Through the initiative, users can contribute financially, and those funds will be allocated for future development and improvements to HitFilm. Additionally, FXhome is contributing a percentage of the proceeds to organizations dedicated to global causes important to the company and its community. The larger the contribution from customers, the more FXhome will donate.

Besides adding the three new health-related beneficiaries, FXhome has extended its campaign to support each new cause from one month to three months, beginning in April and running through the end of June. A percentage of all revenue generated during this period will be donated to each cause.

Covid-19 Film and TV Emergency Relief Fund
Created by The Film and TV Charity in close partnership with the BFI, the new COVID-19 Film and TV Emergency Relief Fund provides support to the many thousands of active workers and freelancers who have been hit hardest by the closure of productions across the UK. The fund has received initial donations totaling £2.5 million from Netflix, the BFI, BBC Studios, BBC Content, WarnerMedia and several generous individuals.

It is being administered by The Film and TV Charity, with support from BFI staff. The Film and TV Charity and the BFI are covering all overheads, enabling donations to go directly to eligible workers and freelancers across film, TV and cinema. One-off grants of between £500 and £2,500 will be awarded based on need. Applications for the one-off grants can be made via The Film and TV Charity’s website. The application process will remain open for two weeks.

The Film and TV Charity also has a new COVID-19 Film and TV Repayable Grants Scheme offering support for industry freelancers waiting for payments under the Government’s Self-employment Income Support Scheme. Interest-free grants of up to £2,000 will be offered to those eligible for Self-employment Income Support but who are struggling with the wait for payments in June. The Covid-19 Film and TV Repayable Grants Scheme opens April 15. Applicants will have one week to make a claim via The Film and TV Charity’s website.

Lenovo
Lenovo is offering a free 120-day license of Mechdyne’s TGX Remote Desktop software, which uses Nvidia Quadro GPUs and a built-in video encoder to compress information on the host workstation and send it to the end-point device for decoding. This eliminates lag on complex and detailed application files.

Teams can share powerful, high-end workstation resources across the business, easily dialing up performance and powerful GPUs from their standard workstation to collaborate remotely with coworkers around the world.

Users keep data and company IP secure on-site, reducing the risk of data breaches, and can remotely administer computer hardware assets from anywhere, anytime. Users install the trial on their host workstations and the receiver software on their local devices to access their applications and projects as if they were in the office.

Ambidio 
To help sound editors, mixers and other post pros who suddenly find themselves working from home, Ambidio is making its immersive sound technology, Ambidio Looking Glass, available for free. Sound professionals can apply for a free license through Ambidio’s website. Ambidio is also waiving its per-title releasing fee for home entertainment titles during the current cinema shutdown. It applies to new titles that haven’t previously been released through Blu-ray, DVD, digital download or streaming. The free offer is available through May 31.

Ambidio Looking Glass can be used as a monitoring tool for theatrical and television projects requiring immersive sound. Ambidio Looking Glass produces immersive sound that approximates what can be achieved on a studio mix stage, except it is playable through standard stereo speaker systems. Editors and mixers working from home studios can use it to check their work and share it with clients, who can also hear the results without immersive sound playback systems.

“The COVID-19 pandemic is forcing sound editors and mixers to work remotely,” says Ambidio founder Iris Wu. “Many need to finish projects that require immersive sound from home studios that lack complex speaker arrays. Ambidio Looking Glass provides a way for them to continue working with dimensional sound and meet deadlines, even if they can’t get to a mix stage.”

Qumulo
Through July 2020, Qumulo is offering its cloud-native file software for free to public and private-sector medical and health care research organizations that are working to minimize the spread and impact of the COVID-19 virus.

“Research and health care organizations across the world are working tirelessly to find answers and collaborate faster in their COVID-19 vaccine mission,” said Matt McIlwain, chairman of the board of trustees of the Fred Hutchinson Cancer Research Center and managing partner at Madrona Venture Group. “It will be through the work of these professionals, globally sharing and analyzing all available data in the cloud, that a cure for COVID-19 will be discovered.”

Qumulo’s cloud-native file and data services allow organizations to use the cloud to capture, process, analyze and share data with researchers distributed across geographies. Qumulo’s software works seamlessly with the applications medical and health care researchers have been using for decades, as well as with artificial intelligence and analytics services more recently developed in the cloud.

Medical organizations can register to use Qumulo’s file software in the cloud, which will be deployable through the Amazon Web Services and Google Cloud marketplaces.

Goldcrest Post
Goldcrest Post has established the capability to conduct most picture and sound post production work remotely. Colorists, conform editors and other staff are now able to work from home or a remote site and connect to the facility’s central storage and technical resources via remote collaboration software. Clients can monitor work through similar secure, fast and reliable desktop connections.

The service allows Goldcrest to ensure theatrical and television projects remain on track while allowing clients to oversee work in as normal a manner as possible under current circumstances.

Goldcrest has set up a temporary color grading facility at a remote site convenient for its staff colorists. The site includes a color grading control panel, two color-calibrated monitors and a high-speed connection to the main Goldcrest facility. The company has also installed desktop workstations and monitors in the homes of editors and other staff involved in picture conforming and deliverables. Sound mixing is still being conducted on-site, but sound editorial and ancillary sound work is being done from home. In taking these measures, the facility has reduced its on-site staff to a bare minimum while keeping workflow disruption to a minimum.

Ziva Dynamics
Ziva Dynamics is making Ziva VFX character simulation software free for students and educators. The same tools used on Game of Thrones, Hellboy and John Wick: Chapter 3 are now available for noncommercial projects, offering students the chance to learn physics-based character creation before they graduate. Ziva VFX Academic licenses are fully featured and receive the same access and support as other Ziva products.

In addition to the software, Ziva Academic users will now receive free access to Ziva Dynamics’ simulation-ready assets Zeke the Lion (previously $10,000) and Lila the Cheetah. Thanks to Ziva VFX’s Anatomy Transfer feature, the Zeke rig has helped make squirrels, cougars, dogs and more for films like John Wick 3, A Dog’s Way Home and Primal.

Ziva Dynamics will also be providing a free Ziva Academic floating lab license to universities so students can access the software in labs across campuses whenever they want. Ziva VFX Academic licenses are free and open to any fully accredited institution, student, professor or researcher (an $1,800 value). New licenses can be found in the Ziva store and are provided following a few eligibility questions. Academic users on the original paid plan can now increase their license count for free.

OpenDrives 
OpenDrives’ OpenDrives Anywhere is an in-place private cloud model that enables customers with OpenDrives to work on the same project from multiple locations without compromising performance. With existing office infrastructure, teams already have an in-place private cloud and can extend its power to each of their remote professionals. No reinvestment in storage is needed.

Nothing changes from a workflow perspective except physical proximity. With simple adjustments, remote control of existing enterprise workstations can be extended over a secure connection: HP’s ZCentral Remote Boost (formerly RGS) software facilitates remote access to workstations, or Teradici can provide both dedicated external hardware and software solutions for this purpose, giving teams the ability to support collaborative workflows at low cost. OpenDrives can also get teams set up quickly: in under two hours on a corporate VPN, and in under 24 hours without one.

Prime Focus Technologies 
Prime Focus Technologies (PFT), the technology arm of Prime Focus, has added new features and advanced security enhancements to Clear to help customers embrace the virtual work environment. In terms of security, Clear now has a new-generation HTML 5 player enabled with Hollywood-grade DRM encryption. There’s also support for just-in-time visual watermarking embedded within the stream for streaming through Clear as a secure alternative to generating watermarking on the client side.

Clear also has new features that make it easier to use, including direct and faster downloads from S3 and Azure storage, easier partner onboarding and an enhanced admin module with condensed permissions to easily handle custom user roles. A host of new functionalities simplify content acquisition processes and reduce dependencies as much as possible. Likewise, for easier content servicing, there is now automation in content localization, making it easier to perform and review tasks in Clear. For content distribution, PFT has enabled on-demand cloud distribution on Clear through the most commonly used cloud technologies.

Brady and Stephenie Betzel
Many of you know postPerspective contributor and online video editor Brady Betzel from his great reviews and tips pieces. During this crisis, he is helping his wife, Stephenie, make masks for her sister (a nurse) and colleagues working at St. John’s Regional Medical Center in Oxnard, California, in addition to anyone else who works on the “front lines.” She’s sewn over 300 masks so far and is not stopping. Creativity and sewing are not new to her; her day job is also creative. You can check out her work on Facebook and Instagram.

Object Matrix 
Object Matrix co-founder Nick Pearce has another LinkedIn dispatch, this time launching Good News Friday, where folks from around the globe check in with good news! You can also watch it on YouTube. Pearce and crew are also offering video tips for surviving working from home. The videos, hosted by Pearce, are updated weekly. Check them out here.

Conductor
Conductor is waiving charges for orchestrating renders in the cloud. Updated pricing is reflected in the cost calculator on Conductor’s Pricing page. These changes will last at least through May 2020. To help expedite any transition needs, the Conductor team will be on call for virtual render wrangling of cloud submissions, from debugging scenes and scripts to optimizing settings for cost, turnaround time, etc. If you need this option, then email support@conductortech.com.

Conductor is working with partners to set up online training sessions to help studios quickly adopt cloud strategies and workflows. The company will send out further notifications as the sessions are formalized. Conductor staff is also available for one-on-one studio sessions as needed for those with specific pipeline considerations.

Conductor’s president and CEO Mac Moore said this: “The sudden onset of this pandemic has put a tremendous strain on our industry, completely changing the way studios need to operate virtually overnight. Given Conductor was built on the ‘work from anywhere’ premise, I felt it our responsibility to help studios to the greatest extent possible during this critical time.”

Symply
Symply is providing as many remote workers in the industry as possible with a free 90-day license to SymplyConveyor, its secure, high-speed transfer and sync software. Symply techs will be available to remotely install SymplyConveyor on any PC, Mac or Linux workstation pair, or on a server and workstation.

The no-obligation offer is available at gosymply.com. Users sign up, and as long as they are in the industry and have a need, Symply techs will install the software. The number of free 90-day licenses is limited only by Symply’s ability to install them given its limited resources.

Foundry
Foundry has reset its trial database so that users can access a new 30-day trial for all products regardless of the date of their last trial. The company continues to offer unlimited non-commercial use of Nuke and Mari. On the educational side, students who are unable to access school facilities can get a year of free access to Nuke, Modo, Mari and Katana.

They have also announced virtual events, including:

• Foundry LiveStream – a series of talks around projects, pipelines and tools.
• Foundry Webinars – 30- to 40-minute technical deep dives into Foundry products, workflows and third-party tools.
• Foundry Skill-Ups – 30-minute guides to improving your skills as a compositor, lighter or texture artist to get to that next level in your career.
• Foundry Sessions – special conversations with Foundry customers sharing insights, tips and tricks.
• Foundry Workflow Wednesdays – 10-minute weekly videos posted on social media showing Nuke tips and tricks from Foundry experts.

Alibi Music Library
Alibi Music Library is offering free whitelisted licensing of its Alibi Music and Sound FX catalogs to freelancers, agencies and production companies needing to create or update their demo reels during this challenging time.

Those who would like to take advantage of this opportunity can choose Demo Reel 2020 Gratis from the shopping cart feature on Alibi’s website next to any desired track(s). For more info, click here.

2C Creative
Caleb & Calder Sloan’s Awesome Foundation, the charity of 2C Creative founders Chris Sloan and Carla Kaufman Sloan, is running a campaign that will match individual donations (up to $250 each) to charities supporting first responders, organizations and those affected by COVID-19. 2C is a creative agency & production company serving the TV/streaming business with promos, brand integrations, trailers, upfront presentations and other campaigns. So far, the organization’s “COVID-19 Has Met Its Match” campaign has raised more than $50,000. The initial deadline to participate was April 6, but it has been extended to April 13. To participate, visit ccawesomefoundation.org for a list of charities already vetted by the foundation or choose your own. Then simply email a copy of your donation receipt to cncawesomefoundation@gmail.com, and they will match it!

Red Giant 
For the filmmaking education community, Red Giant is offering Red Giant Complete — the full set of tools including Trapcode Suite, Magic Bullet Suite, Universe, VFX Suite and Shooter Suite — free for students and faculty members of a university, college or high school. Instead of buying separate suites or choosing which tools best suit one’s educational needs or budget, students and teachers can get every tool Red Giant makes completely free of charge. All that’s required is a simple verification.

How to get a free Red Giant Complete license if you are a student, teacher or faculty member:
1. Use school or organization ID or any proof of current employment or enrollment for verification. More information on academic verification is available here.
2. Send your academic verification to academic@redgiant.com.
3. Wait for approval via email before purchasing.
4. Once you get approval, go to the Red Giant Complete Product Page and “buy” your free version. You will only be able to buy the free version if you have been pre-approved.

The free education subscription will last 180 days. When that time period ends, users will need to reverify their academic status to renew their free subscription.

Flanders Scientific
Remote collaboration and review benefit greatly from having the same type of display, calibrated the same way, in both locations. To help facilitate such workflow consistency, FSI is launching a limited-time “buy one, get one for $1,000 off” special on its most popular monitor, the DM240.

Nvidia
For those pros needing to power graphics workloads without local hardware, cloud providers, such as Amazon Web Services and Google Cloud, offer Nvidia Quadro Virtual Workstation instances to support remote, graphics-intensive work quickly without the need for any on-prem infrastructure. End-users only need a connected laptop or thin client, as the virtual workstations support the same Nvidia Quadro drivers and features as the physical Quadro GPUs used by pro artists and designers in local workstations.

Additionally, Nvidia last week expanded its free virtual GPU software evaluation to 500 licenses for 90 days to help companies support their remote workers with their existing GPU infrastructure. Nvidia vGPU software licenses — including Quadro Virtual Workstation — enable GPU-accelerated virtualization so that content creators, designers, engineers and others can continue their work. More details are available here. Nvidia has also posted a separate blog on virtual GPUs to help admins who are working to support remote employees.

Harman
Harman is offering a free e-learning program called Learning Sessions in conjunction with Harman Pro University.

The Learning Sessions and the Live Workshop Series provide a range of free on-demand and instructor-led webinars hosted by experts from around the world. The Industry Expert workshops feature tips and tricks from front of house engineers, lighting designers, technicians and other industry experts, while the Harman Expert workshops feature in-depth product and solution webinars by Harman product specialists.

• April 7—Lighting for Churches: Live and Video with Lucas Jameson and Chris Pyron
• April 9—Audio Challenges in Esports with Cameron O’Neill
• April 15—Special Martin Lighting Product Launch with Markus Klüesener
• April 16—Lighting Programming Workshop with Susan Rose
• April 23—Performance Manager: Beginner to Expert with Nowell Helms

Apple
Apple is offering free 90-day trials of Final Cut Pro X and Logic Pro X apps for all in order to help those working from home and looking for something new to master, as well as for students who are already using the tools in school but don’t have the apps on their home computers.

Avid
For its part, Avid is offering free temp licenses for remote users of the company’s creative tools. Commercial customers can get a free 90-day license for each registered user of Media Composer | Ultimate, Pro Tools, Pro Tools | Ultimate and Sibelius | Ultimate. For students whose school campuses are closed, any student of an Avid-based learning institution that uses Media Composer, Pro Tools or Sibelius can receive a free 90-day license for the same products.

Aris
Aris, a full-service production and post house based in Los Angeles, is partnering with ThinkLA to offer free online editing classes for those who want to sharpen their skills while staying close to home during this worldwide crisis. The series will be taught by Aris EP/founder Greg Bassenian, an award-winning writer and director who has edited numerous projects for clients including Coca-Cola, Chevy and Zappos.

mLogic
mLogic is offering a 15% discount on its mTape Thunderbolt 3 LTO-7 and LTO-8 solutions. The discount applies to orders placed on the mTape website through April 20th. Use discount code mLogicpostPerspective15%.

Xytech
Xytech has launched “Xytech After Dark,” a podcast focusing on trends in the media and broadcasting industries. The first two episodes are now available on iTunes, Spotify and all podcasting platforms.

Xytech’s Greg Dolan says the podcast “is not a forum to sell, but instead to talk about why we create the functionality in MediaPulse and the types of things happening in our industry.”

Hosted by Xytech’s Gregg Sandheinrich, the podcast will feature Xytech staff, along with special guests. The first two episodes cover topics including the recent HPA Tech Retreat (featuring HPA president Seth Hallen), as well as the cancellation of the NAB Show, the value of trade shows and the effects of COVID-19 on the industry.

Adobe
Adobe shared a guide to best practices for working from home. It’s meant to support creators and filmmakers who might be shifting to remote work and need to stay connected with their teams and continue to complete projects. You can find the guide here.

Adobe’s principal Creative Cloud evangelist, Jason Levine, hosted a live stream, Video Workflows With Team Projects, that focuses on remote workflows.

Additionally, Karl Soule, senior technical business development manager, hosted a stream focusing on remote video workflows and collaboration in the enterprise. If you sign up on this page, you can see his presentation.

Streambox
Streambox has introduced a pay-as-you-go software plan for video professionals who use its Chroma 4K, Chroma UHD, Chroma HD and Chroma X streaming encoder/decoder hardware. Since the software has been “decoupled” from the hardware platform, those who own the hardware can rent the software on a monthly basis, pause the subscription between projects and reinstate it as needed. By renting software for a fixed period, creatives can take on jobs without having to pay outright for technology that might otherwise have been impractical.

Frame.io 
Through the end of March, Frame.io is offering 2TB of free extra storage capacity for 90 days. Those who could use the additional storage to accommodate work-from-home workflows should email rapid-response@frame.io to get it set up.

Frame.io is also offering free Frame.io Enterprise plans for the next 90 days to support educational institutions, nonprofits and health care organizations that have been impacted. Please email rapid-response@frame.io to set up this account.

To help guide companies through this new reality of remote working, Frame.io is launching a new “Workflow From Home” series on YouTube, hosted by Michael Cioni, with the first episode launching Monday, March 23rd. Cioni will walk through everything artists need to keep post production humming as smoothly as possible. Subscribe to the Frame.io YouTube channel to get notified when it’s released.

EditShare
EditShare has made its web-based remote production and collaboration tool, Flow Media Management, free through July 1st. Flow enables individuals as well as large creative workgroups to collaborate on story development, with capabilities to perform extensive review and approval from anywhere in the world. Those interested can complete this form, and one of EditShare’s Flow experts will follow up.

Veritone 
Veritone will extend free access to its core applications — Veritone Essentials, Attribute and Digital Media Hub — for 60 days. Targeted to media and entertainment clients in radio, TV, film, sports and podcasting, Veritone Essentials, Attribute, and Digital Media Hub are designed to make data and content sharing easy, efficient and universal. The solutions give any workforce (whether in the office or remote) tools that accelerate workflows and facilitate collaboration. The solutions are fully cloud-based, which means that staff can access them from any home office in the world as long as there is internet access.

More information about the free access is here. Certain limitations apply. Offer is subject to change without notice.

SNS
In an effort to quickly help EVO users who are suddenly required to work on editing projects from home, SNS has released Nomad for on-the-go, work-from-anywhere, remote workflows. It is a simple utility that runs on any Mac or Windows system that’s connected to EVO.

Nomad helps users repurpose their existing ShareBrowser preview files into proxy files for offline editing. These proxy files are much smaller versions of the source media files, and therefore easier to use for remote work. They take up less space on the computer, take less time to copy and are easier to manage. Users can edit with these proxy files, and after they’re finished putting the final touches on the production, their NLE can export a master file using the full-quality, high-resolution source files.
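
Nomad handles that conversion automatically, but for readers who want a feel for what a proxy-generation pass involves, below is a minimal, hypothetical sketch in Python using FFmpeg. It is a generic illustration, explicitly not SNS’s implementation, and the paths and encoding settings are invented for the example.

```python
# Hypothetical sketch of a proxy-generation pass (a generic illustration,
# not SNS Nomad's actual code): walk a source media folder and write
# lightweight H.264 proxies that mirror the source directory tree, so an
# NLE can cut with small files offline and relink to the full-resolution
# originals for the final export.
import subprocess
from pathlib import Path

SOURCE = Path("/Volumes/EVO/project/media")   # assumed shared-storage path
PROXIES = Path.home() / "project_proxies"     # assumed local proxy tree

for src in SOURCE.rglob("*.mov"):
    dst = PROXIES / src.relative_to(SOURCE)
    dst.parent.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        [
            "ffmpeg", "-i", str(src),
            "-vf", "scale=1280:-2",           # downscale, keep aspect ratio
            "-c:v", "libx264", "-crf", "23",  # small, edit-friendly H.264
            "-c:a", "aac",
            str(dst),
        ],
        check=True,
    )
```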

Nomad is available immediately and free to all EVO customers.

Ftrack
Remote creative collaboration tool ftrack Review is free for all until May 31. This date might extend as the global situation continues to unfold. ftrack Review is an out-of-the-box remote review and approval tool that enables creative teams to collaborate on, review and approve media via their desktop or mobile browser. Contextual comments and annotations eliminate confusion and reduce reliance on email threads. ftrack Review accepts many media formats as well as PDFs. Every ftrack Review workspace receives 250 GB of storage.

Cinedeck 
Cinedeck’s cineXtools allows editing and correcting file deliveries from home. From now until April 3rd, pros can get a one-month license of cineXtools free of charge.

Posting Everest VR: Journey to the Top of the World

While preparing to climb both Mount Everest and Mount Lhotse without the use of bottled oxygen, renowned climber Ueli Steck fell to his death in late April of 2017. VR director and alpine photographer Jonathan Griffith and mountain guide Tenji Sherpa, both friends of Steck, picked up the climber’s torch, and the result was the 8K 3D documentary Everest VR: Journey to the Top of the World, produced by Facebook’s Oculus.

Over the course of three years, Griffith shot footage following Tenji and some of the world’s most accomplished climbers in some of the world’s most extreme locations. The series also includes footage that lets viewers witness what it is like to be engulfed in a Himalayan avalanche, cross a crevasse and stare deep into its depths, take a huge rock-climbing fall, camp under the stars and soak in the view from the top of the world.

For the post part of the doc, Griffith called on veteran VR post pro Matthew DeJohn for editing and color correction, VR stitching expert Keith Kolod for stitching and Brendan Hogan for sound design.

“It really was amazing how a small crew was able to get all of this done,” says Griffith. “The collaboration between myself as the cameraman and Matt and Keith was a huge part of being able to get this series done — and done at such a high quality.

“Matt and Keith would give suggestions on how to capture for VR, how camera wobbling impacted stitching, how to be aware of the nadir and zenith in each frame and to think about proximity issues. The efficient post process helped in letting us focus on what was needed, and I am incredibly happy with the end result.”

DeJohn was tasked with bringing together a huge amount of footage from a number of different high-end camera systems, including the Yi Halo and Z Cam V1 Pro.

DeJohn called on Blackmagic Resolve for this part of the project, saying that using one tool for everything helped speed up the process. “A VR project usually has different teams of multiple people for editing, grading and stitching, but with Resolve, Keith and I handled everything,” he explains.

Within Resolve, DeJohn cut the series at 2Kx2K, relinked to the 8Kx8K source and then changed the timeline resolution to 8Kx8K for final color and rendering. He used the Fairlight audio editing tab to make fine adjustments, manage different narration takes with audio layers and manage varied source files such as mono narration, stereo music and four-channel ambisonic spatial audio.
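
For those curious how that offline-to-online resolution switch can be scripted, here is a minimal sketch using the Python scripting API that ships with DaVinci Resolve Studio. The setting keys come from Resolve’s scripting documentation, but the 8192x8192 values and the assumption that the current timeline follows project settings are illustrative, not DeJohn’s actual setup.

```python
# Minimal sketch: bump a Resolve project's timeline resolution from an
# offline 2Kx2K cut to 8Kx8K for final color and rendering. Assumes
# Resolve Studio is running with scripting enabled; the values are
# illustrative, not the actual Everest VR settings.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()

# Setting values are passed as strings in Resolve's scripting API.
project.SetSetting("timelineResolutionWidth", "8192")
project.SetSetting("timelineResolutionHeight", "8192")

# Timelines that follow the project settings now conform to 8Kx8K.
print(project.GetSetting("timelineResolutionWidth"))
```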

In terms of color grading, DeJohn says, “I colored the project from the very first edit, so when it came time to finalize the color, it was just a process of touching things up.”

Fusion Studio was used for stereoscopic alignment fixes, motion graphics, rig removal, nadir patches, stabilization, stereo correction of the initial stitch, re-orienting 360 imagery, viewing the 360 scenes in a VR headset and controlling focal areas. More intense stitching work was done by Kolod using Fusion Studio.

Footage of such an extreme environment, as well as the closeness of climbers to the cameras, provided unique challenges for Kolod, who had to rebuild portions of images from individual cameras. He also had to manually ramp down the stereo at the images’ north and south poles to ensure easy viewing, fix stereo misalignment and distance issues between the foreground and background, and calm excessive movement in images.

“A regular fix I had to make was adjusting incorrect vertical alignments, which create huge problems for viewing. Even if a camera is a little bit off, the viewer can tell,” says Kolod. “The project used a lot of locked-off tripod cameras, and you would think that the images coming from them would be completely steady. But a little bit of wind or slight movement in what is usually a calm frame makes a scene unwatchable in VR. So I used Fusion for stabilization on a lot of shots.”

“High-quality VR work should always be done with manual stitching with an artist making sure there are no rough areas. The reason why this series looks so amazing is that there was an artist involved in every part of the process — shooting, editing, grading and stitching,” concludes Kolod.

RuckSackNY: Branding, targeted videos and high-quality masks

By Randi Altman

Fred Ruckel got his start in post at New York’s Post Perfect in the ’90s. From there he grew his skills and experience before opening his own shop, Stitch. While Ruckel spent his days as a Flame artist, in his spare time he and his wife, Natasha, invented something called the Ripple Rug. They’ve since moved to upstate New York, where they built an extensive post suite and studio under the name RuckSackNY.

Fred Ruckel at work.

What is the Ripple Rug, you ask? It’s essentially a cat playground in a rug, but their site describes it as “a multifunction pet enrichment system mainly geared toward house cats.”

Fred and Natasha (whose own career includes stints at creative agencies as well as Autodesk) felt strongly about manufacturing the Ripple Rug in the US, and they wanted to use recycled materials. After a bit of research, they found a factory in Georgia and used recycled plastic water bottles in the process. To date they have recycled over 3 million bottles.

To help promote the Ripple Rug, the Ruckels leveraged their creative capabilities from years of working in advertising and post to create a brand from scratch.

When the COVID-19 crisis hit, the Ruckels realized they were in a unique position — they could repurpose the Georgia factory to make masks and face shields for health workers and the general population. While reformatting the factory to this type of manufacturing is still ongoing, the Ruckels wanted to make sure that, in the meantime, people would have access to high-quality face masks. So they sourced masks via their textile production partners, had them tested in a US lab, and have already sold over 40,000 masks under their new brand, SnugglyMask.

Many have taken to making their own masks, so the factory will also be making filters to help beef up that protection, allowing people to buy filter packs for their homemade masks. Check out their video showing people how to make their own masks. “We should have that part functional this week or next,” says Fred. “Our mask supplier is quickly trying to put together the production pipeline so we can make masks here, but those machines are automated and take a bit of engineering to make them work properly.”

These materials will be both sold to the general public and donated to those on the frontlines. The Ruckels have once again used their creative backgrounds to build a brand and tell a story. Let’s find out more from Fred…

With the recent COVID-19 crisis, you realized that your factory could be used to make masks — both for civilians and for medical professionals and those on the frontline. How did you come to that realization, and what were your first steps?
When the pandemic broke out, we immediately took action to help the cause. Our factory makes many textile products, and we knew we could set up an assembly line to make masks, shields and gowns, and with some funding, we could pretty much make anything. We have the know-how and ability, as well as 60,000 square feet of space, which we are cutting a chunk out of to make a clean room to handle the process in as sterile an environment as possible.

I reached out to New York Governor Andrew Cuomo’s office, our local congressman and Empire State Development. At the same time, I was communicating with Georgia (we are a registered business in both states) and worked with the Department of Economic Development and the National Association of Manufacturers. That led us to the Global Center for Medical Innovation.

Natasha Ruckel

So while that was happening, you decided to sell and donate masks?
Yes. While waiting for responses to help us retool our factory, we had to do something to be an immediate help. We did not want to wait on the sidelines for red tape to be cut; we had to come up with Plan B while waiting for government help.

Plan B meant using our resources to purchase masks without several levels of middlemen raising the prices. We still ended up with two levels of middlemen, but it’s better than five! In manufacturing, it is all about pennies. This is a lesson I learned from a mentor early on with our Ripple Rug project. Middlemen make pennies, and pennies add up: a nickel per unit becomes $50,000 in profit on 1 million units, and middlemen capitalize on that. My goal is to remove middlemen and get directly sourced goods to people in need at the best price possible.

Can you describe both masks and the materials used?
In our PSA, we demonstrate the use of a cloth bandana versus a basic medical mask. We are looking to filter particulate matter down to the micron level, smaller than the human eye can see. For reference, the human eye can only see particles as small as 50 to 60 microns (think about a fleck of dust caught in sunlight). The particles we are looking to “arrest” are down to .3 microns, smaller than red blood cells.

The mechanical weaving of cloth masks makes them porous. This allows particulate matter to pass right through, as the holes are enormous in scale. The key component is the middle layer, called “melt-blown.” The outer layer is a polypropylene spun-bond fiber, and the inside layer is an acrylic spun-bond fiber. Sandwiched between them is the melt-blown layer, which is the fine particulate catcher. Each layer captures a different size particle. Think of it as a video production — it would be like adding multiple scrims to lights to block light, except we are blocking particles in this case.

You recently created a PSA detailing the differences in the masks people are using to protect themselves. What did you use to shoot and post?
The PSA was shot using a Canon EOS 5D Mark IV. We have some great Fiilex LED lighting kits with a ring light and a 7-foot white shooting tent. My intent wasn’t to make a full-on video. I was shooting elements to make animated GIFs to show the testing process. When I loaded the footage into Adobe Premiere and made a selects reel, I realized we had the elements of a PSA … and so a spot was born.

Natasha looked at my selects and quickly switched into producer mode and pieced together a storyline. We then had to shoot more elements. Fortunately, our shooting studio is in our home, so there were no delays. I shot an element, loaded it, shot another and so on until we had the pieces to make it work.

Natasha created graphic elements in Adobe Illustrator while I worked on the edit in Premiere. We also took product pics in raw mode for the packaging and demos, which we developed in Camera Raw within Photoshop. We shot the video portion in 4K, which allowed us to punch in for closeups and pull back to reframe as if it were a multi-cam shoot.

We filmed on a stainless steel table to give it a clinical feel while blowing it out a little bit to feel ethereal and hazy. My favorite shot is the water dripping on the table; the lighting and table make it feel like mercury.

Why was it so important for you to turn your business into the mask business?
There are so many reasons that it is hard to pinpoint. I knew we had the capability, and our pipeline was efficient enough to pull it off from start to finish. As an inventor I’ve seen people take advantage of situations for financial gain — like knocking off products — and that means making fake masks, which cause more harm than good.

I saw an opportunity to protect everyone I know by supplying quality masks they can trust. On internet sites, fake masks can look identical. In fact, the pics might be of the real mask, but they ship you a cheap version that’s missing some key elements.

I do not cut corners. As a Flame artist, I continually dealt with clients saying, “It’s good enough, let’s move to the next shot.” Good enough is not what I do; I do not have a halfway button. I’d look like a bad Flame artist if I didn’t go all the way.

Knowing that we can play an active part in protecting my friends and family and colleagues in the post community by taking on this single effort made me pull the trigger. With that, SnugglyMask.com was born.

Are you guys selling and donating masks? How is that working?
We are both selling and donating masks. One of our RuckSackNY clients is a philanthropist named Josh Malone. As his marketing agency, we created a mask donation program. The first hospitals we shipped to were Montefiore Medical Center in the Bronx and Westchester Medical Center. We will be donating to hospitals nationwide and also selling masks to hospitals and the public via our site, https://www.snugglymask.com/. This is a place people can go for a mask they can trust and that has been lab tested. We built a brand in just a week, and sales simply exploded due to our honest content and demand.

Why is it important for you to make sure your products are being made in the US?
We make the Ripple Rug in the US to provide jobs for US workers. There are more than 100 people working at 10 companies in five states for Ripple Rug. I order carpet 100,000 square feet at a time and cannot imagine shipping it from overseas with the demand we must meet. Shipping from China takes weeks, if not months.

Making it in the USA means continual production to meet demand while reinvesting to grow along the way. Sure, I could produce my products in China and make a lot more money, but I am proud to say American workers put food on the table and children go to school because we make our products in the USA. That alone makes it worth it to me.

Do you feel the videos you create help get more people to pay attention to the product?
We feel effective videos engage viewers and build intrigue about our product. We create a range of videos, not just the regular polished spots. Consumers appreciate the feeling of user-generated content, as it adds to the authenticity of the product. If every spot is beautiful, it feels staged.

We have a series called “Cats Gone Wild” in which all of the videos are made solely of user-generated content sourced from YouTube, Facebook and Instagram. I edit them to a stock music track and create a theme for each video. We add titles to call out the social media names to give credit to the person who posted the video and to give them a little spotlight on our show reel. This, in turn, creates engagement, as it encourages them to share the video on their social media channels.

I keep my edits to around a minute for this series to “get in and get out” before losing the viewer’s attention. The original content is cut to a whimsical track and is fun to watch — who doesn’t love cute cat videos? We share these on social media, and that helps grow our sales. Our customers love it, they get acknowledgement, our brand grows, and we are able to show our product in action.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Hecho Studios: Mobilizing talent and pipeline to keep working

By Ryan Curtis

When Hecho first learned of the possibility of a shutdown due to COVID-19, we started putting together a game plan to maintain the level of production quality and collaboration that we are all used to, but this time remotely. Working closely with our chief content officer Tom Dunlap, our post production workflow manager Nathan Fleming and senior editor Stevo Chang, we first identified the editors, animators, colorists, Flame artists, footage researchers and other post-related talent who work with us regularly. We then built a standing army of remote talent who were ready to embrace the new normal and get to work.

Ryan Curtis

It was a formidable challenge to get the remote editorial stations up and running. We had relatively short notice that we were going to have to finalize and enact a WFH game plan in LA. In order to keep productions running smoothly, we teamed with our equipment vendor, VFX Technologies, to give our IT team the ability to remote in and fully outfit each workstation with software. They also scheduled a driver to make contact-free drop-offs at the homes of our artists. We’ve deployed over 15 iMacs for editorial, animation and finishing needs. We can scale as needed and only need two to three days’ notice to get a new artist fully set up at home with the appropriate tools. Our remote edit bay workstations are mainly iMac Pros, running the Adobe suite of tools, Maxon Cinema 4D, Blackmagic DaVinci Resolve and Autodesk Flame.

We have outfitted each member of our team with Signiant, which allows for rapid transfers of larger files. If an artist’s home internet is not up to snuff for their project, we have been boosting their internet speeds. To maintain file integrity, we are rolling out the same file structure as you would find on our server, allowing us to archive projects back to the server remotely once delivered. We’ve also designated key people who can access the in-office stations and server virtually, retrieve assets and migrate them to remote teams to refresh existing campaigns.
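
As a rough illustration of what rolling out a mirrored file structure can look like in practice, here is a hypothetical Python sketch that stamps a house-standard project tree onto a remote artist’s machine. The folder names are invented for the example, not Hecho’s actual template.

```python
# Hypothetical sketch: recreate a house-standard project folder tree on
# a remote workstation so delivered work archives back to the server
# cleanly. Folder names are invented, not Hecho's actual template.
from pathlib import Path

TEMPLATE = [
    "01_footage/raw",
    "01_footage/proxies",
    "02_audio",
    "03_graphics",
    "04_project_files",
    "05_exports/review",
    "05_exports/deliverables",
]

def make_project(root: str, job_name: str) -> None:
    """Create every template folder for a new job under the given root."""
    base = Path(root) / job_name
    for sub in TEMPLATE:
        (base / sub).mkdir(parents=True, exist_ok=True)

make_project("/Users/artist/jobs", "example_campaign")
```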

The need to review during each phase of production has never been stronger. We tested a wide variety of review solutions, and have currently settled on the following:

• For Animation/Design-Based Projects:
Frankie – Export-based interactive reviews
• For Editorial Projects:
Evercast – Live plug-and-play sessions
Wiredrive (oftentimes paired with Google Hangouts or Zoom)
• For Finishing:
Vimeo Review – Export-based color reviews
Streambox – Live color collaboration (paired with Google Hangouts or Zoom)
Frankie – Export-based interactive reviews
Wiredrive for deliverables (oftentimes paired with Google Hangouts or Zoom)

Our collective of talent remains our contracted veteran Hecho crew, well over 50 people who know our shorthand and in-office workflows and can easily be onboarded to our new remote workflow. If needed to satisfy a specific creative challenge, we bring in new talent and quickly onboard them into the Hecho family.

In terms of how we deal with approvals, it depends on the team and the project. If you have a team dedicated to a project, it can be even more efficient than working in the office. Overcommunication is key, and transparency with feedback and workflows is paramount to a successful project. However, in many cases, efficiencies can be lost, and projects currently move about 20 percent slower than if we were in the office. To combat this, some teams have structured themselves a little differently, as it can be hard to wrangle busy individuals with fast deadlines remotely. Having approved backup approvers on board has been immensely helpful in keeping projects moving along on time. And without clients in the bay, we lean even more on our post producers to funnel all questions and feedback from clients, ensuring clear back and forth with artists.

NFL #stayhomestaystrong

Challenges Solved
Aside from the lack of in-person interaction and the efficiency of quick catch-ups in the hall or in the bay, the biggest challenge has been home internet speeds. This affects everything else involved with a WFH setup. In some cases we had to upgrade current ISP contracts in order to reach an acceptable baseline for getting work done: streaming reviews, file sharing, etc.

The other challenge was quickly testing and evaluating new tools and then getting everybody up to speed on how to use them. Evercast was probably the trickiest new product because it involves live streaming from an editor’s machine (using Adobe Premiere) while multiple “reviewers” watch them work in real time. As you can imagine, there are many factors that can affect live streaming: the CPU of the streaming computer, the bitrate you’re streaming, etc. Luckily, once we had gone through a couple of setups and reviews (trial and error), things got much easier. Also, the team at Evercast (thanks Brad, Tyrel and Robert!) was great in helping us figure out some of the issues we ran into early on.

Our First WFH Projects
For our first COVID-19 response project, we worked with agency 72andSunny and the NFL to share the uplifting message #Stayhomestaystrong. Behind the scenes, our post team produced a complete offline-to-online workflow in record time and went from brief to live in six days while everyone transitioned to working entirely remotely. #Stayhomestaystrong also helped bring in $35 million in donations toward COVID relief groups. Credits include editors Amanda Tuttle and Andrew Leggett; assistant editors Max Pankow and Stephen Shirk; animator Lawrence Wyatt; Flame artists Rachel Moorer, Gurvand Tanneau and Paul Song; and post producer Song Cho.

Stay INspired

Another project we worked on with 72andSunny was the COVID-19 response ad Pinterest Stay INspired, which involved heavy motion graphics and a large number of assets, ranging from stock photos to raw video files from remote shoots to licensed UGC assets. The designers, motion graphics artists, writers and clients used a Google Slides deck to link thumbnail images directly to the stock photo or UGC asset. Notes were sent directly to their emails via tags in the comments section of the slides.

Our team shared storyboards, frequently jumped on video conference calls and even sent recorded hand gestures to indicate the kind of motion graphic movement they were looking for. Credits for this one include editor/motion designer Stevo Chang, motion designer Sierra Hunkins, associate editor Josh Copeland and, once again, post producer Cho.

What We Learned
WFH reinforced the need for the utmost transparency in team structures and for super-clear communication. Each and every member of our team has needed to embrace the change and take on new challenges and responsibilities. What worked before in the office doesn’t necessarily work in a remote situation.

The shutdown also forced us to discover new technologies, like Evercast, and we likely wouldn’t have signed up for Signiant for a while otherwise. Moving forward, these tools have been great additions to what we can offer our clients. These new technologies also open up future opportunities for us to work with clients we didn’t have access to before (out of state and overseas). We can do live remote sessions without the client having to physically be in a bay, which is a game changer.


Ryan Curtis is head of post production at two-time Emmy-nominated Hecho Studios, part of MDC’s Constellation collective of companies.

Atomos Ninja V to record 5.9K raw from Panasonic S1H

Atomos and Panasonic are making updates to the Ninja V HDR monitor-recorder and Panasonic Lumix S1H mirrorless digital camera that will make it possible to record 5.9K Apple ProRes raw files directly from the camera’s sensor. The free updates will be available May 25.

The Ninja V captures highly detailed 12-bit raw files from the S1H over HDMI at up to 5.9K/29.97p in full frame or 4K/59.94p in Super35. These clean, unprocessed files preserve the maximum dynamic range, color accuracy and detail from the S1H. The resulting ProRes raw files offer perfect skin tones and easily matched colors ideal for both HDR and SDR (Rec. 709) workflows.

With the new 3.5K Super35 Anamorphic 4:3 raw mode, the Ninja V and S1H combination caters to cinematographers who shoot with anamorphic lenses. The Ninja V and S1H can now be used as an A camera or a smaller B camera on an anamorphic raw production.

Each frame recorded in ProRes raw has metadata supplied by the S1H. Apple’s Final Cut Pro X and other NLEs will automatically recognize ProRes raw files as coming from the S1H and set them up for editing and display in either SDR or HDR projects. Additional information will also allow other software to perform extensive parameter adjustments.

Dolores McGinley heads Goldcrest London’s VFX division

London’s Goldcrest Post, a picture and audio post studio, has launched a visual effects division at its Lexington Street location. It will be led by VFX vet Dolores McGinley, whose first task is to assemble a team of artists that will provide services for both new and existing clients.

During the COVID-19 crisis, all Goldcrest staff is working from home except the colorists, who are coming in as needed and working alone in the grading suites. McGinley and her team will move into the Goldcrest facility when lockdown has ended.

“Having been immersed in such a diverse range of projects over the past five years, we identified the need to expand into VFX some time ago,” explains Goldcrest MD Patrick Malone. “We know how essential an integrated VFX service is to our continued success as a leading supplier of creative post solutions to the film and broadcast community.

“As a successful VFX artist in her own right, Dolores is positioned to interpret the client’s brief and offer constructive creative input throughout the production process. She will also draw upon her considerable experience working with colorists to streamline the inclusion of VFX into the grade and guarantee we are able to meet the specific creative requirements of our clients.”

With over two decades of creative experience, McGinley joins Goldcrest having held various senior roles within the London VFX community. Recent examples of her work include The Crown, Giri/Haji and Good Omens.

Colorist Chat: I Am Not Okay With This’ Toby Tomkins

Colorist Toby Tomkins, co-founder of London color grading and finishing boutique Cheat, collaborated once again with The End of the F***ing World director Jonathan Entwistle on another Charles Forsman adaptation, I Am Not Okay With This. The now-streaming Netflix show is produced by Entwistle alongside Stranger Things EPs Shawn Levy and Dan Cohen, as well as Josh Barry. The director also once again called on DP Justin Brown.

Toby Tomkins

Adapted from Forsman’s graphic novel of the same name, the series follows a teenage girl named Sydney as she navigates school, crushes, her sexuality and sudden-onset superpowers. You know, the typical teenage experience.

Here, Tomkins talks about collaborating with the director and DP as well as his workflow.

How early did you get involved on I Am Not Okay With This?
Jon Entwistle had reached out to DP Justin Brown about his interest in adapting this graphic novel after working on The End of the F***ing World. When the series then got commissioned and Justin was on board, he and Jon convinced production company 21 Laps that they could do the grade in London with Cheat. There were some discussions about grading in LA, but we managed to convince them that it could be a quick and easy process back here, and that’s how I got involved.

I was on board quite early in the production, getting involved with camera tests and reviewing all the material with Justin. We worked together to evaluate the material, and after Justin chose the camera and lenses, we built a color pipeline that informed how the show would be shot and how the footage would pass through to the grade. From there, we started building off the work we did on The End of the F***ing World. (Check out our coverage of The End of the F***ing World, which includes an interview with Tomkins.)

What kind of look did Jon and Justin want, and how did they express that look to you? Film or show references? Pictures?
There were quite a few visual references, which I already knew from previously working with Jon and Justin. They both gravitate toward a timeless American cinema look — something photochemical but also natural. I knew it would be similar to The End of the F***ing World, but we were obviously using different locations and a slightly different light, so there was a little bit of playing around at the beginning.

We’re all fans of American cinema, especially the look of old film stock. We wanted the look of the show to feel a little rough around the edges, like when things used to be shot on film and you had limited control over changes. Films weren’t corrected to a perfect level, and we wanted to keep those imperfections for this show, making it feel authentic and not overly polished. Although it was produced by the same people who did Stranger Things, we wanted to stray slightly from that style, making it feel a bit different.

We were really aiming for a timeless American look, with a vintage aesthetic that played into a world that was slightly out of place and not really part of reality. During the grade, Justin liked to put a line through it, keeping it all very much in the same space, with perhaps a little pop on the reds and key “American” colors.

Personally, I wanted to evoke the style of teen films from the late 20th century — slightly independent-looking and minimally processed. Films like 10 Things I Hate About You and She’s All That certainly influenced me.

You have all worked together in the past. How did that help on this show? Was there a kind of shorthand?
We learned a lot doing The End of the F***ing World together, and Justin and I definitely developed a shorthand. It’s like having a head start because we were all on the same page from the get-go. That mattered especially as I was grading remotely with Justin, and Jon just trusted us to know exactly what he wanted.

Tomkins works on Resolve

At the end of the first day, we shared our work with Jon in LA and he’d watch and add his notes. There were only three notes of feedback from him, which is always nice! They were notes on richness in some scenes and a question on matching between two shots. As we’d already tested the cameras and had conversations about it before, we were always on the same page with feedback and I never disagreed with a single note. And Jon only had to watch the work through once, which meant he was always looking at it with clean, fresh eyes.

What was the show shot on, and what did you use for color grading?
It was shot on ARRI Alexa, and I graded in DaVinci Resolve Studio.

Any particular challenges on this one for you?
It was actually quite smooth for me! Because Justin and I have worked together for so long, and because we did the initial testing around cameras and LUTs, we were very prepared. Justin had a couple of challenges due to unpredictable weather in Pittsburgh, but he likes to do as much as possible in-camera. So once it got to me, we were already aligned and prepared.

How did you find your way to being a colorist?
I started off in the art department on big studio features but wanted to learn more about filmmaking in general, so I went to film school in Bournemouth, back when it was called the Arts University College Bournemouth. I quickly realized my passion was post and gleaned what I could from an exceptional VFX tutor there called Jon Turner. I started specializing in editing and then VFX.

I loved the wizardry and limitless possibilities of VFX but missed the more direct relationship with storytelling, so when I found out about color grading — which seemed like the perfect balance of both — I fell in love. Once I started grading, I didn’t stop. I even bribed the cleaners to get access to the university grading suite at night.

My first paid gig was for N-Dubz, and after I graduated and they became famous, they kept me on. And that gave me the opportunity to work on bigger music videos with other artists. I set up a suite at home (way before anyone else was really doing this) and convinced clients to come 30 minutes out of London to my parents’ house in a little village called Kings Langley.

I then got asked to set up a color department for a sound studio called Tate Post, where I completed lots of commercials, a few feature films — notably Ill Manors — and some shorts. These included one for Jon called Human Beings, which is where our relationship began! After that, I went it alone again and eventually set up Cheat. The rest is history.

What, in addition to the color, do you provide on projects? Small VFX, etc.?
For I Am Not Okay With This, we did some minor VFX work, plus online and delivery, in-house at Cheat. I just do color, however. I think it’s best to leave each department to do its own work and trust the knowledge and experience of experts in the field. We worked with LA-based VFX company Crafty Apes on the show; they were really fantastic.

Where do you get inspiration? Photographs, museums?
Mostly from films — both old and new — and definitely photography and the work of other colorists.

Finally, any advice you’d give your younger self about working as a colorist?
Keep at it! Experience is everything.

Digital Nirvana releases Trance 3.0 for captions, transcriptions

Digital Nirvana has released version 3.0 of its Trance cloud-based application for closed captioning and transcription, which combines speech-to-text (STT) technology and other AI-driven processes with cloud-based architecture. By implementing cloud-based metadata generation and closed captioning as part of their existing operations, media companies can reduce the time and cost of delivering accurate, compliant content worldwide. Users can enrich and classify content, enabling more effective repurposing of media libraries and more intelligent targeting of advertising spots.

“Trance 3.0 includes a new transcript correction window, a text translation engine that simplifies and speeds captioning in additional languages, and automated caption conformance to accelerate delivery of content to new platforms and geographic regions,” says Russell Wise, SVP at Digital Nirvana. “Even now, with the widespread need to work from home, Trance 3.0 users can maintain their productivity in prepping content for distribution on platforms such as Quibi, Netflix, Hulu, HBO Max, and others.”

A new transcript correction window simplifies the process of reviewing and correcting the transcript used to generate closed captions. The user interface shows time-synced video and captions side by side in a window along with tools for editing text and adding visual cues, music tags and speaker tags. Dictionaries, scripts, rosters and other text resources ingested into Trance help to boost the accuracy of a transcript and, ultimately, the closed captions applied to video. Source text can be automatically translated into one or more additional languages, with the resulting text displayed in a dual-pane window for review and correction.
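As a rough illustration of the kind of time-synced unit such a correction window works with, consider the sketch below; the structure is an assumption for illustration, not Digital Nirvana’s actual data model:

```python
# Hypothetical time-synced caption cue with speaker and music tags;
# this models the concepts described above, not Trance's internals.
from dataclasses import dataclass, field

@dataclass
class Cue:
    start: float                 # start time in seconds
    end: float                   # end time in seconds
    text: str                    # corrected transcript text
    speaker: str | None = None   # optional speaker tag
    tags: list[str] = field(default_factory=list)  # e.g. ["music"]

cue = Cue(start=12.0, end=14.5, text="[tense music] Who's there?",
          speaker="Speaker 1", tags=["music"])
print(f"{cue.start:7.2f} --> {cue.end:7.2f}  {cue.text}")
```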

New caption conformance and quality assurance capabilities within Trance 3.0 allow users to configure captions according to the style guidelines of each distribution platform, ensuring streaming services do not reject content simply because the captioning doesn’t match their internal style guides. Users configure and apply presets for target platforms, and Trance 3.0 automates caption formatting — number of characters, number of lines and caption placement — in accordance with policies defined in the appropriate preset. The resulting captions are displayed in a captioner window for final comparison to video. Once captions have been reviewed and approved, the file is used to generate the multiple output formats required for distribution.
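A minimal sketch of what preset-driven conformance might look like follows; the 42-character, two-line limits are illustrative of the sort of constraints platform style guides impose, not an official specification:

```python
# Illustrative preset-driven caption conformance; preset values and the
# CaptionPreset structure are assumptions, not Trance's actual presets.
import textwrap
from dataclasses import dataclass

@dataclass
class CaptionPreset:
    max_chars_per_line: int
    max_lines: int
    placement: str               # e.g. "bottom-center"

def conform(text: str, preset: CaptionPreset) -> list[str]:
    """Wrap caption text to the preset's line-length and line-count limits."""
    lines = textwrap.wrap(text, width=preset.max_chars_per_line)
    if len(lines) > preset.max_lines:
        raise ValueError("Caption exceeds preset limits; split the cue.")
    return lines

preset = CaptionPreset(max_chars_per_line=42, max_lines=2,
                       placement="bottom-center")
print(conform("Captions must match each platform's style guide before delivery.",
              preset))
```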

Trance 3.0 completes the closed-captioning process from end to end, adding a project management layer that centralizes tasks and minimizes the need for manual intervention. The project manager can configure roles and priorities for different users and then set up individual projects by identifying necessary tasks, outputs and deadlines. Trance automatically handles the movement and processing of content, transcription, translation and captioning and tracks the productivity, workload and availability of different staff members. It also identifies the most appropriate person to task with a particular job and delivers notifications and alerts as needed to drive each project through to completion.
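In spirit, that assignment step might look something like the following sketch; the staff data model and least-loaded selection rule are assumptions for illustration, not Digital Nirvana’s actual logic:

```python
# Hypothetical task-assignment sketch: pick the least-loaded available
# person with the required skill. Not Trance's actual data model.
from dataclasses import dataclass

@dataclass
class StaffMember:
    name: str
    workload: int                # open tasks currently assigned
    available: bool              # free to take new work
    skills: set[str]             # e.g. {"captioning", "translation"}

def assign(task_skill: str, staff: list[StaffMember]) -> StaffMember:
    """Return the least-loaded available person with the required skill."""
    candidates = [s for s in staff if s.available and task_skill in s.skills]
    if not candidates:
        raise LookupError(f"No available staff for: {task_skill}")
    return min(candidates, key=lambda s: s.workload)

team = [
    StaffMember("Ana", workload=3, available=True, skills={"captioning"}),
    StaffMember("Ben", workload=1, available=True,
                skills={"captioning", "translation"}),
]
print(assign("captioning", team).name)  # Ben
```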