Author Archives: Randi Altman

Blackmagic updates Ursa Mini Pro 4.6K with 3D LUTs, more

Blackmagic Design’s new software update for the original Blackmagic Ursa Mini Pro 4.6K camera adds localization for 11 languages, the ability to embed custom 3D LUTs in Blackmagic raw clips, new customizable frame guides and improved behavior for function buttons. The update also adds improved compatibility for embedded audio on SDI outputs, better jam sync timecode functionality and much more.

Blackmagic Camera 6.9.3 Update is available now as a free download from the Blackmagic website.

Users can now embed custom 3D LUTs in Blackmagic raw clips, which helps to streamline production workflow. 3D LUTs can be directly embedded as metadata within the Braw file, making it easy to display the images with the intended “look” to maintain creative consistency through post. When working with software such as Davinci Resolve, it’s easy to access the 3D LUTs for editing and color correction, as they always travel with the Braw file.
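Conceptually, a 3D LUT is just an N×N×N lookup table that maps an input RGB value to an output RGB value. As a rough illustration of the idea only (not Blackmagic's actual Braw metadata format or SDK), a minimal sketch in Python might look like this:

```python
import numpy as np

# Illustrative sketch only: how a 3D LUT maps colors. The LUT embedded in a
# Braw file is stored as metadata in Blackmagic's own format; this just shows
# the underlying idea. Real LUT application uses trilinear interpolation;
# nearest-neighbor lookup is used here for brevity.

def make_identity_lut(n=17):
    """Build an n x n x n identity LUT: every color maps to itself."""
    grid = np.linspace(0.0, 1.0, n)
    r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
    return np.stack([r, g, b], axis=-1)  # shape (n, n, n, 3)

def apply_lut(lut, rgb):
    """Look up an RGB triple (components in [0, 1]) with nearest-neighbor."""
    n = lut.shape[0]
    idx = np.clip(np.round(np.asarray(rgb) * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[0], idx[1], idx[2]]

lut = make_identity_lut()
# An identity LUT returns (approximately) the input color; a creative "look"
# LUT would return shifted values instead.
print(apply_lut(lut, [0.5, 0.25, 1.0]))
```

A grading application like DaVinci Resolve reads the table from the clip's metadata and applies it the same way, which is why the look travels with the file.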

This update also adds new features that Ursa Mini Pro customers have requested, such as the ability to disable the HFR button to prevent accidentally changing the frame rate between shots. The HFR button can now also be remapped, as can the VTR and return buttons on cine servo PL and B4 lenses. In addition, customers can now lock timecode to the SDI input, jam sync timecode is maintained between power cycles, and the ND filter icon is always visible on status text when in use.

Blackmagic Camera 6.9.3 Update adds support for 11 different languages so it can be used internationally. The camera can be set to English, German, French, Spanish, Portuguese, Italian, Russian, Turkish, Chinese, Japanese or Korean. It’s easy to switch to another language so users can share their camera with other crew. When customers switch languages, all on-screen overlays, setup menus and monitoring information will be displayed in the selected language instantly.

Audio improvements mean users can now choose between -18 or -20dB audio alignment levels. Sidetone adjustment for talkback allows camera operators to speak loudly in noisy environments while keeping their voices mixed down in their earpiece. Line level XLR input signals can now also be adjusted, and audio meters have improved ballistics.
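For context on what those alignment levels mean: a level in dB relative to digital full scale (dBFS) converts to linear amplitude as 10^(dB/20). A quick sketch of that arithmetic (my own illustration, not anything from the camera's firmware):

```python
# Illustration of what a -18 or -20 dBFS alignment level corresponds to in
# linear amplitude relative to digital full scale: amplitude = 10 ** (dB / 20).
def dbfs_to_amplitude(dbfs):
    return 10 ** (dbfs / 20)

print(round(dbfs_to_amplitude(-18), 4))  # about 0.1259 of full scale
print(round(dbfs_to_amplitude(-20), 4))  # 0.1 of full scale
```

The 2 dB difference reflects the two common broadcast conventions (EBU-style -18 dBFS vs. SMPTE-style -20 dBFS reference level).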

Blackmagic OS has been updated and now adds ⅓ stop ISO increments to the ISO slider, as well as the ability to quickly switch media from the heads-up display. In addition, new 2:1, 1:1 and 5:4 frame guides have been added, along with the option for custom frame guides. An improved, more intuitive media formatting interface means users are less likely to accidentally format a card.
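For reference, the arithmetic behind 1/3-stop increments: each full stop doubles ISO, so each 1/3-stop step multiplies by the cube root of 2. A quick sketch (the base value of 800 is just an example, not a claim about the camera's native ISO):

```python
# Each full stop doubles ISO; a 1/3 stop multiplies by 2 ** (1/3) (~1.26).
# Cameras round the results to standard marked values (1000, 1250, ...).
base = 800  # example starting ISO
steps = [round(base * 2 ** (i / 3)) for i in range(4)]
print(steps)  # [800, 1008, 1270, 1600] before rounding to standard marks
```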

The Blackmagic Camera 6.9.3 Update also adds general performance improvements and greater stability when using external reference. In addition, auto exposure speed and performance has been improved, and so has playback of renamed clips.

DP Chat: Cinematographer John Grillo talks Westworld, inspiration

By Randi Altman

HBO’s Westworld ended its third season in early May, and it was quite a ride. There was anarchy, rioting, robots, humans, humans who are really robots, robots who had other robots’ brains. Let’s just say there was a lot going on. This season took many of our characters — including Dolores, Maeve, Bernard and the Man in Black — out of the Westworld park and into the real world, meaning the look of the show needed to feel different.

John Grillo on set

Cinematographer John Grillo has shot eight Westworld episodes spanning Seasons 2 and 3. In fact, he earned an Emmy nomination for his work on Season 2. His resume is full of episodic work and includes TNT’s new series Snowpiercer, The Leftovers and Preacher, among many others.

We reached out to Grillo to find out about his process on Westworld and how he found his way to cinematography.

The most current season of Westworld has completely different locations from previous years — we are now in the outside world. How did this change the look?
We introduced more LED practical fixtures in both interior and exterior sets. The idea was to create more linear patterns of illumination. Our production designer Howard Cummings created sets that incorporated this futuristic motif, whether building them on stage or adding them to existing locations.

We relied much more on the art department and post VFX to help us eliminate certain elements in the background that would bump against the story. Beyond that, we endeavored to find locations in Los Angeles, Spain and Singapore that either already had a futuristic vibe about them or could be touched up with VFX. Some needed no extra work, like the City of Arts and Sciences in Valencia, Spain, which became Delos Headquarters, and La Fabrica in Barcelona, a former cement factory that the architect Ricardo Bofill converted into his offices and living quarters. This would become the character Serac's home. In the near future, not everything has changed so dramatically, so we focused on key elements like vehicles and buildings.

DP Paul Cameron, who shot the show’s pilot, directed Episode 4 this season, which you shot for him. Tell us about working together on that episode.
I’ve known Paul Cameron for many years and assisted him on a few occasions, most notably on Collateral. I’ve always admired his lighting, so needless to say there was a healthy mix of excitement and fear on my part when I heard I’d be shooting his directorial debut!

I have shot episodic TV for other DP-directors and I’ve been in that situation myself — recently directing episodes for Preacher — so I came in with a new appreciation of how difficult it is to direct.

Working with a fellow cinematographer makes the communication a lot smoother; if he asked me for a specific look or feel we were able to speak in shorthand. He was very respectful of my opinions and let me do my thing, and at the same time I was able to help him like I would any director. He came up with some great ideas that were not in the script, particularly for the opening sequence with Ed Harris. Anybody directing for the first time with actors of the caliber that we have in Westworld would be a nervous wreck, but Paul was very much in control and we managed to have fun in the process.

What camera was Westworld shot on? What about lenses?
We shot on Kodak 35mm stock with ARRICAM ST and LT, 435 and 235 cameras using ARRI Master Primes serviced by Keslow Camera. They were very helpful in securing HD video taps for us, which were invaluable. We also shot anamorphic sequences with Cooke Anamorphic primes. We did shoot a little bit of digital here and there for wide-angle night exteriors of skylines just to make the buildings pop. For that we used the Sony Venice camera with ARRI Signature Primes. We also used the Rialto extension on the Venice to create a camera rig we mounted on a DJI Ronin S that we called the Hobo Cam. This allowed us to shoot in the crowded streets of Singapore unnoticed. The idea was that it would be a one-man operation, with the body of the camera in a backpack and the sensor module mounted on the Ronin. We used Zeiss Super Speeds to keep the weight down.

Tell us about the color grade. How do you work with the colorist?
I worked remotely with Kostas Theodosiou, who was our final colorist at FotoKem. He is new to the show this season, so we had some conversations over the phone early on. I would send reference stills to the dailies colorist Jon Rocke after each shoot day in an effort to lock down the look we were going for ahead of time.

We were tweaking as we went along, even re-transferring some dailies when we didn’t feel they were right. For me skin tones are very important. We spent a lot of time correcting them. Film is amazing in that respect, but when you transfer it to the digital domain it takes a lot of know-how from the colorist to dig for them.

You also shot TNT’s new Snowpiercer series. Both shows feature a lot of visual effects. How does that affect your work?
It’s like working with a ghost. You know it’s there but you can’t see it. I’ve worked with some great VFX artists, so I depend on them to keep me on the right track. Sometimes I affect what they do by suggesting a certain look or vice versa, but it’s all worked out in prep so usually we are on the same page when it comes time to shoot.

It used to be more complicated when I was coming up in terms of the execution, locking down cameras with 20 C-stands and such. Now they’ve come a long way and there’s nothing they can’t do. I usually don’t even see their work until the show comes out, so it’s always a pleasant surprise.

How did you become interested in cinematography?
It was by happenstance. My dad is a jazz guitarist and my mother is a painter. Growing up, I was surrounded by music and painting. There were plenty of art books in my mom’s house in Acapulco, where I was raised, so early on I had an interest in the visual arts.

When I was living in Mexico City, I got a job on an American film that was shooting in town. After working as a PA in various departments I ended up with the VFX crew and that was my first time being near a film camera. The assistants began teaching me how to load the old Mitchell and VistaVision cameras and after principal photography was done they offered me a job in LA as a loader.

After that I worked as an assistant for many years and was lucky to work for some of the best cinematographers around. What really turned me on to the art of cinematography was discovering the connection to my childhood interests and seeing how certain cinematographers were in fact painting with light. Vittorio Storaro, Sven Nykvist, Nestor Almendros and Conrad Hall were channeling Rembrandt, Vermeer and Caravaggio. I began paying more attention to the craft as I continued assisting DPs and then decided to make the leap.

What inspires you artistically?
I draw a lot of inspiration from the other arts. Paintings, photography, music and dance are great tools for learning about color, composition, rhythm and movement. For example, music is very helpful for camera choreography: how slow or how fast the dolly moves, or how long a focus rack takes, is always linked to the rhythm of a scene, so it becomes a beautiful dance with the actors. That's why we always talk about beats in a scene like we do with music.

Looking back over the past few years, what new technology has changed the way you work?
Probably the advent of LED lights. It’s been a game-changer, particularly on tight schedules. Having a dimmer board able to control not just the intensity but also color and angle has freed up time to think about the other dozen things that go into creating an image.

What are some of your best practices or rules you try to follow on each job?
For me it’s about paying attention. Having your antennas up. Listening to the director. Working on a film is a group effort and I like being involved in the process and want my crew to feel the same way. We spend more time with each other than with our families, so it’s important that everyone is inspired to do their best work but also have fun doing it. The rule is to always serve the story and the director’s vision.

What’s your go-to gear — things you can’t live without?
It all depends on the project. Lately I’ve been impressed with the Sony Venice camera. I love the high ISO setting for low light scenes. Also, I’ve grown quite dependent on the Astera Titan tubes for lighting. They are like Kino Flos but wireless, battery-powered and color-controlled. They can quickly get you out of a jam.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Hulu’s The Great: Creator and showrunner Tony McNamara

By Iain Blair

Aussie writer/director Tony McNamara is the creator, showrunner and executive producer of Hulu’s The Great, the new 10-episode series starring Elle Fanning as Catherine the Great and Nicholas Hoult as Russian Emperor Peter III. The Great is a comedy-drama about the rise of Catherine the Great — from German outsider to the longest reigning female ruler in Russia’s history (from 1762 until 1796). Season 1 is a fictionalized and anachronistic story of an idealistic, romantic young girl who arrives in Russia for an arranged marriage to Emperor Peter. Hoping for love and sunshine, she finds instead a dangerous, depraved, backward world that she resolves to change. All she has to do is kill her husband, beat the church, baffle the military and get the court on her side. A very modern story about the past, which incorporates historical facts occasionally, it encompasses the many roles she played over her lifetime — as lover, teacher, ruler, friend and fighter.

L-R: Tony McNamara and cinematographer John Brawley

McNamara most recently wrote the critically acclaimed, Oscar-winning film The Favourite, for which he received an Academy Award nomination for Best Original Screenplay. His other feature film credits include The Rage in Placid Lake, which he wrote and directed, and Ashby.

McNamara has written some of Australia's most memorable television series, including The Secret Life of Us, Love My Way, Doctor Doctor and Spirited. He also served as showrunner of the popular series Puberty Blues.

I recently spoke with McNamara, who was deep in post, about making the show and his love of editing and post.

When you wrote the stage play this is based on, did you also envision it as a future TV series?
Not at all. I was just a playwright and I'd worked a bit in TV, but I never thought of adapting it. But then Marian Macgowan, my co-producer on this, saw it and suggested making a movie of it, and I began thinking about that.

What did the stage version teach you?
That it worked for an audience, that the characters were funny, and that it was just too big a story for a play or a film.

It’s like a Dickensian novel with so many periods and great characters and multiple storylines.
Exactly, and as I worked more and more in TV, it seemed like the perfect medium for this massive story with so many periods and great characters. So once the penny dropped about TV, it all went very fast. I wrote the pilot and off we went.

I hear you’re not a fan of period pieces, despite this and all the success you had with The Favourite. So what was the appeal of Catherine and what sort of show did you set out to make?
I love period films like Amadeus and Barry Lyndon, but I don’t like the dry, polite, historically accurate, by-the-numbers ones. So I write my things thinking, “What would I want to watch?” And Catherine’s life and story are so amazing, and anything but polite.

What did Elle Fanning and Nicholas Hoult bring to their roles?
They’re both great actors and really funny, and that was important. The show’s a drama in terms of narrative, but it also feels like a comedy, but then it also gets very dark in places. So they had to be able to do both — bring a comic force to it but also be able to put emotional boots on the ground… and move between the two very easily, and they can do that. They just got it and knew the show I wanted to make before we even got going. I spent time with them discussing it all, and they were great partners.

Where do you shoot?
We did a lot of it on stages at 3 Mills Studios in London and shot some exteriors around London. We then went to this amazing palace near Naples, Italy, where we shot exteriors and interiors for a couple of weeks. We really tried to give the show a bit more of a cinematic feel and look than most TV shows, and I think the production design is really strong. We all worked very hard to not make it feel at all like sets. We planned it out so we could move between a lot of rooms so you didn’t feel trapped by four walls in just one set. So even though it’s a very character-driven story, we also wanted to give it that big epic sweep and scope.

Do you like being a showrunner?
(Laughs) It depends what day it is. It’s a massive job and very demanding.

What are the best parts of the job and the worst?
I love the writing and working with the actors and the director. Then I love all the editing and all the post — that’s really my favorite thing in the whole process after the writing. I’ve always loved editing, as it’s just another version of writing. And I love editors, and ours are fun to hang out with, and it’s fun to try and solve problems. The worst parts are having to deal with all the scheduling and the nuts and bolts of production. That’s not much fun.

Where do you post?
We do it all in London, with all the editing at Hireworks and all the sound at Encore. When we’re shooting at the studios we set up an edit suite on site, so we start working on it all right away. You have to really, as the TV schedule doesn’t allow much time for post compared with film.

Talk about editing. You have several editors, I assume because of the time factor. How does that work?
We had three editors, who are all so creative and inventive. I love getting all the material and then editing and tweaking things, particularly in comedy. There’s often a very fine line in how you make something funny and how you give the audience permission to laugh.

I think the main editing challenges were usually the actual storytelling, as we tell a lot of stories really fast, so it’s managing how much story you tell and how quickly. It’s a 10-hour story; you’re also picking off moments in an early episode that will pay off far later in the series. Plus you’re dealing with the comedy factor, which can take a while to get up and running in terms of tone and pace. And if there’s a darker episode, you still want to keep some comedy to warm it up a bit.

But I don’t micro-manage the editors. I watch cuts, give some notes and we’ll chat if there are big issues. That way I keep fresh with the material. And the editors don’t like coming on set, so they keep fresh too.

How involved are you with the sound?
I’m pretty involved, especially with the pre-mix. We’ll do a couple of sessions with our sound designer, Joe Fletcher, and Marian will come in and listen, and we’ll discuss stuff and then they do the fixes. The sound team really knows the style of the soundscape we want, and they’ll try various things, like using tones instead of anything naturalistic. They’re very creative.

Tony McNamara and Elle Fanning on set

There’s quite a lot of VFX. 
BlueBolt and DNEG did them all — and there are a lot, as period pieces always need a ton of small fixes. Then in the second half, we had a lot of stuff like dogs getting thrown off roofs, carriages in studios that had to be running through forests, and we have a lot of animals — bears, butterflies and so on. There's also a fair whack of violence, and all of it needed VFX.

Where do you do the DI?
We did the grading at Encore, and we spent a lot of time with DP John Brawley setting the basic look early on when we did the pilot, so everyone got it. We had the macro look early, and then we’d work on specific scenes and the micro stuff.

Are you already planning Season 2?
I have a few ideas and a rough arc worked out, but with the pandemic we’re not sure when we’ll even be able to shoot again.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Barking Owl adds first NYC hire, sound designer/mixer Dan Flosdorf

Music and sound company Barking Owl has made its first New York hire with sound designer/mixer Dan Flosdorf. His addition comes as the LA-based company ramps up to open a studio in New York City this summer.

Flosdorf’s resume includes extensive work in engineering, mixing and designing sound and music for feature films, commercials and other projects. Prior to his role at Barking Owl, Flosdorf was mixer and sound designer at NYC-based audio post studio Heard City for seven years.

In addition to his role at Heard City, Flosdorf has had a long collaboration with director Derek Cianfrance creating sound design for his films The Place Beyond the Pines and Blue Valentine. He also worked with Ron Howard on his feature, Made in America. Flosdorf’s commercial work includes working with brands such as Google, Volvo, Facebook and Sprite and multiple HBO campaigns for Watchmen, Westworld and Game of Thrones.

Leading up to the opening of Barking Owl's NYC studio, Flosdorf is four-walling in NYC and working on projects for both East and West Coast clients.

Even with the COVID crisis, Barking Owl’s New York studio plans continue. “COVID-19 has been brutal for our industry and many others, but you have to find the opportunity in everything,” says Kelly Bayett, founder/creative director at Barking Owl. “Since we were shut down two weeks into New York construction, we were able to change our air systems in the space to ones that bring in fresh air three times an hour, install UV systems and design the seating to accommodate the new way of living. We have been working consistently on a remote basis with clients and so in that way, we haven’t missed a beat. It might take us a few months longer to open there, but it affords us the opportunity to make relevant choices and not rush to open.”

Posting Michael Jordan’s The Last Dance — before and during lockdown

By Craig Ellenport

One thing viewers learned from watching The Last Dance — ESPN’s 10-part documentary series about Michael Jordan and the Chicago Bulls — is that Jordan might be the most competitive person on the planet. Even the slightest challenge led him to raise his game to new heights.

Photo by Andrew D. Bernstein/NBAE via Getty Images

Jordan’s competitive nature may have rubbed off on Sim NY, the post facility that worked on the docuseries. Since they were only able to post the first three of the 10 episodes at Sim before the COVID-19 shutdown, the post house had to manage a work-from-home plan in addition to dealing with an accelerated timeline that pushed up the deadline a full two months.

The Last Dance, which chronicles Jordan’s rise to superstardom and the Bulls’ six NBA title runs in the 1990s, was originally set to air on ESPN after this year’s NBA Finals ended in June. With the sports world starved for content during the pandemic, ESPN made the decision to begin the show on April 19 — airing two episodes a night on five consecutive Sunday nights.

Sim’s New York facility offers edit rooms, edit systems and finishing services. Projects that rent these rooms will then rely on Sim’s artists for color correction and sound editing, ADR and mixing. Sim was involved with The Last Dance for two years, with ESPN’s editors working on Avid Media Composer systems at Sim.

When it became known that the 1997-98 season was going to be Jordan’s last, the NBA gave a film crew unprecedented access to the team. They compiled 500 hours of 16mm film from the ‘97-’98 season, which was scanned at 2K for mastering. The Last Dance used a combination of the rescanned 16mm footage, other archival footage and interviews shot with Red and Sony cameras.

Photo by Andrew D. Bernstein/NBAE via Getty Images

“The primary challenge posed in working with different video formats is conforming the older standard definition picture to the high definition 16:9 frame,” says editor Chad Beck. “The mixing of formats required us to resize and reposition the older footage so that it fit the frame in the ideal composition.”

One of the issues with positioning the archival game footage was making sure that viewers could focus when shifting their attention between the ball and the score graphics.

“While cutting the scenes, we would carefully play through each piece of standard definition game action to find the ideal frame composition. We would find the best position to crop broadcast game graphics, recreate our own game graphics in creative ways, and occasionally create motion effects within the frame to make sure the audience was catching all the details and flow of the play,” says Beck. “We discovered that tracking the position of the backboard and keeping it as consistent as possible became important to ensuring the audience was able to quickly orient themselves with all the fast-moving game footage.”

From a color standpoint, the trick was taking all that footage, which was shot over a span of decades, and creating a cohesive look.

Rob Sciarratta

“One of our main goals was to create a filmic, dramatic natural look that would blend well with all the various sources,” says Sim colorist Rob Sciarratta, who worked with Blackmagic DaVinci Resolve 15. “We went with a rich, slightly warm feeling. One of the more challenging events in color correction was blending the archival work into the interview and film scans. The older video footage tended to vary in resolution and quality and would often have very little black detail left from all the transcoding throughout the years. We would add a filmic texture and soften the blacks so it would blend into the 16mm film scans and interviews seamlessly. … We wanted everything to feel cohesive and flow so the viewer could immerse themselves in the story and characters.”

On the sound side, senior re-recording mixer/supervising sound editor Keith Hodne used Avid Pro Tools. “The challenge was to create a seamless woven sonic landscape from 100-plus interviews and locations, 500 hours of unseen raw behind-the-scenes footage, classic hip-hop tracks, beautifully scored instrumentation and crowd effects, along with the prerecorded live broadcasts,” he says. “Director Jason Hehir and I wanted to create a cinematic blanket of a basketball game wrapped around those broadcasts. What it sounds like to be at the basketball game, feel the game, feel the crowd — the suspense. To feel the weight of the action — not just what it sounds like to watch the game on TV. We tried to capture nostalgia.”

When ESPN made the call to air the first two episodes on April 19, Sim’s crew still had the final seven episodes to finish while dealing with a work-from-home environment. Expectations were only heightened after the first two episodes of The Last Dance averaged more than 6 million viewers. Sim was now charged with finishing what would become the most watched sports documentary in ESPN’s history — and they had to do this during a pandemic.

Stacy Chaet

When the shutdown began in mid-March, Sim’s staff needed to figure out the best way to finish the project remotely.

“I feel like we started the discussions of possible work from home before we knew it was pushed up,” says Stacy Chaet, Sim’s supervising workflow producer. “That’s when our engineering team and I started testing different hardware and software and figuring out what we thought would be the best for the colorist, what’s the best for the online team, what’s the best for the audio team.”

Sim ended up using Teradici to get Sciarratta connected to a machine at the facility. “Teradici has become a widely used solution for remote at home work,” says Chaet. “We were easily able to acquire and install it.”

A Sony X300 monitor was hand-delivered to Sciarratta’s apartment in lower Manhattan and connected to his machine at Sim through an Evercast stream. Sim shipped him other computer monitors, a Mac mini and Resolve panels. Sciarratta’s living room became a makeshift color bay.

“It was during work on the promos that Jason and Rob started working together, and they locked in pretty quickly,” says David Feldman, Sim’s senior VP, film and television, East Coast. “Jason knows what he wants, and Rob was able to quickly show him a few color looks to give him options.

David Feldman

“So when Sim transitioned to a remote workflow, Sciarratta was already in sync with what the director, Jason Hehir, was looking for. Rob graded each of the remaining seven episodes from his apartment on his X300 unsupervised. Sim then created watermarked QTs with final color and audio. Rob reviewed each QT to make sure his grade translated perfectly when reviewed on Jason’s retina display MacBook. At that point, Sim provided the director and editorial team access for final review.”

The biggest remote challenge, according to producer Matt Maxson, was that the rest of the team couldn’t see Sciarratta’s work on the X300 monitor.

“You moved from a facility with incredible 4K grading monitors and scopes to the more casual consumer-style monitors we all worked with at home,” says Maxson. “In a way, it provided a benefit because you were watching it the way millions of people were going to experience it. The challenge was matching everyone’s experience — Jason’s, Rob’s and our editors’ — to make sure they were all seeing the same thing.”

Keith Hodne

For his part, Hodne had enough gear in his house in Bay Ridge, Brooklyn. At Sim he worked in Pro Tools on Mac Pro computers; at home he had to work with a pared-down version of that setup in his home studio. It was a challenge, but he got the job done.

Hodne says he actually had more back-and-forth with Hehir on the final episode than any of the previous nine. They wanted to capture Jordan’s moments of reflection.

“This episode contains wildly loud, intense crowd and music moments, but we counterbalance those with haunting quiet,” says Hodne. “We were trying to achieve what it feels like to be a global superstar with all eyes on Jordan, all expectations on Jordan. Just moments on the clock to write history. The buildup of that final play. What does that feel and sound like? Throughout the episode, we stress that one of his main strengths is the ability to be present. Jason and I made a conscious decision to strip all sound out to create the feeling of being present and in the moment. As someone whose main job it is to add sound, sometimes there is more power in having the restraint to pull back on sound.”

ESPN Films/Netflix/Mandalay Sports Media/NBA Entertainment

Even when they were working remotely, the creatives were able to communicate in real time via phone, text or Zoom sessions. Still, as Chaet points out, “you’re not getting the body language from that kind of remote feedback.”

From a remote post production technology standpoint, Chaet and Feldman both say one of the biggest challenges the industry faces is sufficient and consistent Internet bandwidth. Residential ISPs often do not guarantee speeds needed for flawless functionality. “We were able to get ahead of the situation and put systems in place that made things just as smooth as they could be,” says Chaet. “Some things may have taken a bit longer due to the remote situation, but it all got done.”

One thing they didn’t have to worry about was their team’s dedication to the project.

“Whatever challenges we faced after the shutdown, we benefitted from having lived together at the facility for so long,” says Feldman. “There was this trust that, somehow, we were going to figure out a way to get it done.”


Craig Ellenport is a veteran sports writer who also covers the world of post production. 

Invisible VFX on Hulu’s Big Time Adolescence

By Randi Altman

Hulu’s original film Big Time Adolescence is a coming-of-age story that follows 16-year-old Mo, who is befriended by his sister’s older and sketchy ex-boyfriend Zeke. This aimless college dropout happily introduces the innocent-but-curious Mo to drink and drugs and a poorly thought-out tattoo.

Big Time Adolescence stars Pete Davidson (Zeke), Griffin Gluck (Mo) and Machine Gun Kelly (Nick) and features Jon Cryer as Mo’s dad. This irony will not be lost on those who know Cryer from his own role as disenfranchised teen Duckie in Pretty in Pink.

Shaina Holmes

While this film doesn’t scream visual effects movie, they are there — 29 shots — and they are invisible, created by Syracuse, New York-based post house Flying Turtle. We recently reached out to Flying Turtle’s Shaina Holmes to find out about her work on the film and her process.

Holmes served as VFX supervisor, VFX producer and lead VFX artist on Big Time Adolescence, creating things like flying baseballs, adding smoke to a hotboxed car, removals, replacements and more. In addition to owning Flying Turtle Post, she is a teacher at Syracuse University, where she mentors students who often end up working at her post house.

She has over 200 film and television credits, including The Notebook, Tropic Thunder, Eternal Sunshine of the Spotless Mind, Men in Black 3, Swiss Army Man and True Detective.

Let’s find out more…

How early did you get involved on Big Time Adolescence?
This was our fifth project in a year with production company American High. With all projects overlapping in various stages of production, we were in constant contact with the client to help answer any questions that arose in the early stages of pre-production and production.

Once the edit was picture-locked, we bid all the VFX shots in October/November 2018, VFX turnovers were received in November, and we had a few short weeks to complete all VFX in time for the premiere at the Sundance Film Festival in January 2019.

What direction were you given from your client?
Because this was our fifth feature with American High and each project has similar basic needs, we already had plans in place for how to shoot certain elements.

For example, most of the American High projects deal with high school, so cell phones and computer screens are a large part of how the characters communicate. Production has been really proactive about hiring an on-set graphics artist to design and create phone and computer screen graphics that can be used either during the shoot or provided to my team to add in VFX.

Having these graphics prebuilt has saved a lot of design time in post. While we still need to occasionally change times and dates, remove the carrier, change photos, replace text and other editorial changes, we end up only needing to do a handful of shots instead of all the screen replacements. We really encourage communication during the entire process to come up with alternatives and solutions that can be shot practically, and that usually makes our jobs more efficient later on.

Were you on set?
I was not physically needed on set for this film. However, after filming was completed, we realized in post that we were missing some footage for the batting cages scene. The post supervisor and I, along with my VFX coordinator, rented a camera and braved the freezing Syracuse, New York, winter to go to the same batting cages and shoot the missing elements. These plates became essential, as production had turned off the pitching machine during filming.

Before and After: Digital baseballs

To recreate the baseball in CG, we needed more information for modeling, texture and animation within this space to create more realistic interaction with the characters and environment in VFX. After shoveling snow and ice, we were able to set the camera up at the batting cage and create the reference footage we needed to match our CG baseball animation. Luckily, since the film was shot so close to where we all live and work, this was not a problem… besides our frozen fingers!

What other effects did you provide?
We aren’t reinventing the wheel here in the work we do. We work on features where invisible VFX play a supporting role, helping create a seamless experience for the audience by removing distracting technical imperfections and revising graphics so the story can unfold properly. I work with the production team to advise on ways to shoot that save on costs in post production, and I use creative problem solving to cut down VFX costs to satisfy their budget and achieve their intended vision.

That being said, we were able to do some fun sequences including CG baseballs, hotboxing a car, screen replacements, graphic animation and alterations, fluid morphs and artifact cleanup, intricate wipe transitions, split screens and removals (tattoos, equipment, out-of-season nature elements).

Can you talk about some of those more challenging scenes/effects?
Besides the CG baseball, the most difficult shots are the fluid morphs. These usually consist of split screens where one side of the split has a speed change effect to editorially cut out dialogue or revise action/reactions.

They seem simple, but to seamlessly morph two completely different actions together over a few frames and create all the in-betweens takes a lot of skill. These are often more advanced than our entry-level artists can handle, so they usually end up on my plate.

What was the review and approval process like?
All the work starts with me receiving plates from the clients and ends with me delivering final versions to the clients. As I am the compositing supervisor, we go through many internal reviews and versions before I approve shots to send to the client for feedback, which is a role I’ve done for the bulk of my career.

For most of the American High projects, the clients are spread out between Syracuse, LA and NYC. No reviews were done in person, although if needed, I could go to Syracuse Studios at any time to review dailies if there was any footage I thought could help with some fix-it-in-post VFX requests.

All shots were sent online for review and final delivery. We worked closely with the executive producer, post supervisor, editor and assistant editor for feedback, notes, design and revisions. Most review sessions were collaborative as far as feedback and what’s possible.

What tools did you use on the film?
Blackmagic’s Fusion is our main compositing software. I trained our artists on Fusion when they were in college, so it’s an easy and affordable transition for them to use for professional-quality work. Since everyone has their own personal computer setup at home, it’s been fairly easy for artists to send comp files back to me, and I render on my end after relinking. That has been a much quicker process for internal feedback and deliveries, as we’re working at UHD and 4K resolutions.

For Big Time Adolescence specifically, we also needed to use Adobe After Effects for some of the fluid morph shots, plus some final clean-up in Fusion. For the CG baseball shots, we used Autodesk Maya and Substance Painter, rendered with Arnold and comped in Fusion.

You are female-owned and you are in Syracuse, New York. Not something you hear about every day.
Yes, we are definitely set up in a great up-and-coming area here in Ithaca and Syracuse. I went to film school at Ithaca College. From there, I worked in LA and NYC for 20 years as a VFX artist and producer. In 2016, I was offered the opportunity to teach VFX back at Ithaca College, so I came back to the Central New York area to see if teaching was the next chapter for me.

The timing worked out perfectly: some of my former co-workers were helping create American High, using the Central New York tax incentives, and they were prepping to shoot feature films in Syracuse. They brought me on as the local VFX support, since we had already been working together off and on since 2010 in NYC. When I found myself both teaching and working on feature films, it gave me the idea to create a company to combine forces.

Teaching at Syracuse University and focusing on VFX and post for live-action film and TV, I am based at The Newhouse School, which is very closely connected with American High and Syracuse Studios. I was already integrated into their productions, so this was just a really good fit all around to bring our students into the growing Central New York film industry, aiming to create a sustainable local talent pool.

Our team is made up of artists who started with me in post mentorship groups I created at both Ithaca College (Park Post) and Syracuse University (SU Post). I teach them in class, they join these post group collaborative learning spaces for peer-to-peer mentorship, and then a select few continue to grow at Flying Turtle Post.

What haven’t I asked that’s important?
When most people hear visual effects, they think of huge blockbusters, but that was never my thing. I love working on invisible VFX and the fact that it blows people’s minds — how so much attention is paid to every single shot, let alone frame, to achieve complete immersion for the audience, so they’re not picking out the boom mic or dead pixels. So much work goes on to create this perfect illusion. It’s odd to say, but there is such satisfaction when no one noticed the work you did. That’s the sign of doing your job right!

Every show relies on invisible VFX these days, even the smallest indie film with a tiny budget. These are the projects I really like to be involved in as that’s where creativity and innovation are at their best. It’s my hope that up-and-coming filmmakers who have amazing stories to tell will identify with my company’s mentorship-focused approach and feel they also are able to grow their vision with us. We support female and underrepresented filmmakers in their pursuit to make change in our industry.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Adobe CC updates include ProRes Raw support for Premiere and AE

Adobe updated the tools within its Creative Cloud offerings, including for Premiere Pro, After Effects, Audition, Character Animator, Media Encoder and Premiere Rush. The updates are available now and offer support for Apple ProRes Raw, new creative tools in After Effects, workflow refinements in Character Animator and performance improvements, such as faster Auto Reframe in Premiere Pro.

Here are some details of the updates:

– ProRes Raw support in Premiere Pro and After Effects provides a cross-platform solution for Apple ProRes workflows from camera media through to delivery.
– More streamlined graphics workflows in Premiere Pro include an improved pen tool with better support for Bezier curves, enabling greater precision for creating lines and shapes. Filter effects show attributes that only have keyframes or adjusted parameters so you can focus on the currently active effects.
– Auto Reframe in Premiere is now faster. Powered by Adobe Sensei, the company’s artificial intelligence (AI) and machine learning technology, Auto Reframe automatically reformats and repositions video within different aspect ratios, such as square and vertical video, accelerating workflows for social media and content platforms, such as Quibi.
– Hardware encoding on Windows for H.264 and H.265 (HEVC) is available for Nvidia and AMD GPUs, providing consistently faster exports for these widely used formats with Premiere and Media Encoder.
– Support for audio files in Creative Cloud Libraries enables Premiere users to save, organize and share frequently used audio assets for easy access right from the CC Libraries panel.
– Tapered Shape Strokes in After Effects give motion graphics artists new creative options for animation and design. They can make tapered, wavy, pointed or rounded strokes on shape layers and animate the strokes for stylized looks and motion designs.
– Concentric Shape Repeater in After Effects offers new parameters in the Offset Paths shape effect to create copies of a path that radiate outward or inward for funky designs with a cool retro vibe.
– Mask and Shape Cursor Indicators in After Effects show which tool someone is using and help avoid unnecessary un-dos when drawing shapes and masks.
– Improvements to Audio Triggers and Timeline filtering in Character Animator increase efficiency in animation workflows. A new collection of background puppets let users trigger animated elements within a scene behind their character.
– Automatic audio hardware switching is now available on macOS for After Effects, Media Encoder, Audition, Character Animator, Prelude, Premiere and Premiere Rush. When changing audio devices or simply plugging in headphones, the OS recognizes the hardware, and the Adobe application automatically switches to the current hardware.
– Premiere Rush users can now automatically resize projects to the 4:5 aspect ratio to match formats for Facebook and Instagram videos. Also, back camera switching on an iOS device (requires iOS 13 and a current iPhone) enables capture within Premiere Rush from the selected back camera (ultra-wide, wide or telephoto).
– Lastly, users can now import media from the Files app directly from the Premiere Rush media browser on iOS devices, simplifying access to files stored on device or different cloud services.

Arch platform launches for cloud-based visual effects

Arch Platform Technologies, a provider of cloud-based infrastructure for content creation, has made its secure, scalable, cloud-based visual effects platform available commercially. The Arch platform is designed for movie studios, productions and VFX companies and enables them to leverage a VFX infrastructure in the cloud from anywhere in the world.

An earlier iteration of the Arch platform was available only to companies already working with Hollywood-based Vitality VFX, where the technology was created by Guy Botham. Now, Arch is making the next-generation version of its “rent vs. own” cloud-based VFX platform broadly available to movie studios, productions and VFX companies. This version was well along in development when COVID-19 arrived, making it a very timely offering.

By moving VFX to the cloud, the platform lets VFX teams scale up and down quickly from anywhere and build and manage capacity with cloud-based workstations, renderfarms, storage and workflow management – all in a secure environment.

“We engineered a robust Infrastructure as a Service (IaaS), which now enables a group of VFX artists to collaborate on the same infrastructure as if they were using an on-premises system,” says Botham. “Networked workstations can be added in minutes nearly anywhere in the world, including at an artist’s home, to create a small to large VFX studio environment running all the industry-standard software and plugins.”

Recently, Solstice Studios, a Hollywood distribution and production studio, used the Arch platform for the VFX work on the studio’s upcoming first movie, Unhinged. The platform has also been used by VFX companies Track VFX and FatBelly VFX and is now commercially available to the industry.

Due to COVID, The Blacklist turns to live-action/animation season finale

By Daniel Restuccio

When the COVID-19 crisis shut down production in New York City, necessity became the mother of invention for The Blacklist showrunners. They took the 21 minutes of live-action footage they had shot for Episode 19, “The Kazanjian Brothers,” and combined it with 21 minutes of graphic-novel-style animation to give viewers the season finale they deserved.

Adam Coglan

Thanks to previs/visual effects company Proof, the producers were able to transition from scenes that were shot traditionally to a world where FBI agent Elizabeth Keane and wanted fugitive Raymond Reddington lived as animated characters.

The Blacklist team reached out to Proof the week everyone at the studio was asked to start working from home. In London, artists were given workstations as needed; in the US, all the computers were set up in the office and the team remoted into those workstations, both to comply with the different proprietary and confidentiality rules and to keep everything on the same servers.

Over six weeks, 29 people in London, including support staff and asset people, worked on the show, while in the US the numbers varied between 10 and 15 people. As you can imagine, it was a big undertaking.

Patrice Avery

We reached out to Adam Coglan and Matt Perrin, Proof animation supervisors based in London, and Patrice Avery, Proof’s global head of production, about the production and post workflow.

How did you connect with the producers on the show?
Patrice Avery: Producers Jon Bokenkamp and John Eisendrath knew Proof’s owner and president, Ron Frankel. After The Blacklist shut down, they brainstormed ideas, and animation was one they thought might make sense.

Adam Coglan: The Proof US offices tend to work using toon shaders on models, and the producers had seen our previs work on The Hunger Games, Guardians of the Galaxy and A Wrinkle in Time.

Can you walk us through the workflow?
Coglan: Everybody was working in parallel. There was no time to wait for the models to get built, then start texturing, then start rigging, then start animating. Animation had to start from the beginning, so we started out using proxy geometry for the characters.

Matt Perrin

Character build and animation was going on at the same time. We were building the sequences, blocking them out, staging them. We had a client call every day, so we’d be getting notes every day. The clients wouldn’t be looking at anything that resembled their main actors until a good three or four weeks into the actual project.

Avery: It meant they had to run blind a bit with their animation. They got scratch dialog, and then they got final. They didn’t get the real dialogue until almost the end because they still were trying to figure out how to get the best quality dialogue recorded from their actors.

Obviously, the script had been written, so you essentially animated the existing script?
Coglan: Yes. We were given the script early on and it did evolve a little bit, but not wildly. We managed to stick to the sequences that we’d actually blocked out from the start. There were no storyboards; we basically just interpreted what their script gave us.

Can you talk a bit about some of the scenes you enhanced?
Coglan: There’s a helicopter sequence at the end of the show that they hadn’t planned to shoot with live action, because of safety issues with the helicopter. They brought it back when they realized they could do all of those shots in animation. So there are big aerials over the helicopter landing pad, with the rotors going as the main villain approaches.

Matt Perrin: There are shots peppered throughout the whole thing that would have been tricky to fit into the timescales that they normally shoot the show in. Throughout the show, there are angles and camera work that were easier in animation. In the pilot episode, they visited Washington Mall and the Capitol building, so we got to return to that.

You used Autodesk Maya as your main tool? What else was used?
Coglan: Yes, predominantly Maya. Character heads were built in Z-Brush and then brought into Maya and textured using Substance and Photoshop. The toon shader is a set of proprietary shaders that Proof developed in Maya with filters on the textures to give them the toon shaded look.

Your networks are obviously connected?
Coglan: Absolutely. We’ve been using Teradici, which has really saved our skin on this show. It’s been a godsend and offers really good remote access.

Aside from the truncated production schedule, what were some of the other challenges that you had?
Coglan: Working completely remotely with a team of 20-odd people was a big challenge.

Perrin: Yes. Everything slows down. Coordinating the work from home with all the artists is harder; the face-to-face communication you have when the team is in the same room as you is, obviously, stretched. We would communicate over Zoom chats daily, multiple times a day, with the team and with the producers.

On the flip side, it felt like we had more access to the producers of the show than we might under normal circumstances, because we had a scheduled meeting with them every day as well. It was great to tap directly into their taste and get their feedback so immediately.

Can you describe how the animation was done? Keyframe, rotoscoping, motion capture, or some combination of those?
Perrin: We started with very simple blockouts of action and cameras for each scene. This allowed us to get the layout and timing approved fast, saving as much time as possible for the keyframe animation stage. There are a couple of transitions between live action and animation that required some rotoanimation. We also did a little mocap (mostly for background character motions). On the whole, though, it was a lot of keyframe animation.

How were the editors involved?
Perrin: Chris Brookshire and Elyse Holloway “got it” from the beginning. They gave us the cut of the live-action show with placeholders slugged in for the scenes we would be animating. Between watching that and the script, which was already pretty tight, it gave us an idea of what the scope of our role was going to be.

We decided to go straight into a very basic blocking pass rendered in grayscale 3D so they could see the space and start testing angles with faster iterations. It allowed them to start cutting earlier and give us those edits back. They never got an excess of footage from us.

When they shoot the show, they’ve got reels of footage to go through, whereas with us they get the shots we created and not many spare. But the editors and showrunners got the idea that they could actually start calling out for shots too. They’d ask us to change the layout in some instances because they want to shuffle the shots around from what we’d initially intended.

From that point it allowed our asset makers and R&D teams to be looking into what the character should look like in the environments and building that parallel with us. Then we were ready to go into animation.

How did you set the style for the final look of the piece?
Perrin: The client had a strong idea. They had already done a spinoff comic book series, we’d seen what they’d done with that, and they talked about the styling of The Blacklist being quite noir.

Coglan: They gave us the current episode and past episodes, so they could always reference scenes that were similar to other episodes. As soon as one of the showrunners started talking about leaning into the graphic-novel styling, a light went off and we thought, okay, we know exactly what they’re after now. It gave us a really good creative direction.

That was the biggest development on the project — getting the fidelity of the toon shaders to stand up to broadcast quality, better than we’ve been used to in the past, because normally our work doesn’t go past producers and directors when we’re doing previs.

When you say better quality, what does that actually mean?
Coglan: There are some extreme close-ups in this where we are right on the main characters’ faces. The detail was hand-painted in Photoshop and then Substance; a lot of the lines that define features were painted by hand.

Avery: When using the toon shading in previs, we didn’t really do much to the backgrounds; it was all character-line toon shading. On this one, we created a process for the background sets to make them look toon shaded as well.

Did you recreate any of the existing sets?
Coglan: They gave us blueprints for a lot of their set builds, so, yes, some of the sets were straight from the show.

Perrin: One set, a medical facility, we’d already built from their blueprints, so when we transition out of the live action into the animation, it’s kind of seamless.

What other things did you accomplish that you’re proud of that you haven’t mentioned yet?
Coglan: For me, the big takeaway really was the amount of work that we did in such a compressed amount of time. I just didn’t know whether dealing with people totally remotely could work, and we made it work.

Perrin: The whole way through was very exciting, because of the current situation everybody’s in, and the time constraints. It was very liberating for us. We didn’t have the multi-tiered approval stages or the normal infrastructure. It was immediate feedback and fast results.

Avery: What was cool for me was watching the creative discussions. There was a point a few weeks in when the client was giving more notes about the comic book style and leaning into that. Our teams are so used to the constraints of live action and the rules they need to follow. There was this switch when they finally said, “Oh, we can do really cool comic book angles. Oh, we could do this and that.” Seeing the team really embrace that, untethered a bit, and just go for it was really cool.

What would you do if you got a call, “Hey, we’ve got an entire series that wants to go this route?”
Perrin: I think I’d jump at it really. Although I don’t think I could do it in the same timescale for everything. I think there needs to be slightly more planning involved, but it’s been pretty enjoyable.


Dan Restuccio is a writer/director with Realwork Entertainment and part of the Visual Arts faculty at California Lutheran University. He is a former Disney Imagineer. You can reach him at dansweb451@gmail.com.

Unit9 signs trilingual director Maya Albanese  

Production studio Unit9 has signed director Maya Albanese to its roster. She specializes in emotional and comedic stories that highlight diverse characters and under-represented social themes. Albanese brings experience in filmmaking, branded entertainment and screenwriting.

As a trilingual director, she has crafted work in English, Spanish and French for brands including Disney, Warner Bros, Chevrolet, L’Oréal, IBM, Visa and Google. Albanese is fresh off directing a series of magical-realism spots for Bic and comedy spots for Visa, all of which combine her expertise in directing talent, visual effects and animation.

Earlier this year, she finished Freeze, a dark surrealist comedy about women’s fertility that she wrote and directed. It stars Chris Parnell, Adrian Grenier, Mindy Sterling, Nora Zehetner, Rick Overton, Kel Mitchell and Queen Jazzmun. Freeze is an official selection of the Diversity in Cannes Short Film Showcase at the 73rd Cannes Film Festival.

Albanese has shot and directed three documentaries: Cuba’s Violin (2014), which screened at festivals worldwide; Blind Date (2015), which premiered at Doc NYC; and Bigger Than Us (2020), an intimate behind-the-scenes story of the first-ever SAG-registered feature film made by a cast and crew more than half of whom have a disability.

Albanese found her way to Unit9 through the Commercial Directors Diversity Program fellowship. In 2018, she was one of six directors chosen by the Directors Guild of America and Association of Independent Commercial Producers for the competitive fellowship.

“Telling moving stories that create more inclusivity in front of and behind the camera, whether that’s about women, minorities or people with disabilities, is what drives me,” says Albanese. “I believe we need to continue pushing for this now more than ever. I’m really looking forward to working with the Unit9 team to make bold new stories come to life on screen. Together, I believe we can make sure that all kinds of people get to see themselves reflected in media and advertising.”

Caption: Maya Albanese is pictured left, on set.

Dell intros redesigned Precision mobile workstation line

Dell Technologies has introduced new mobile workstations in its Precision line, which targets professional content creators. The Precisions feature the Dell Optimizer, an automated AI-based optimization technology that learns how each person works and adapts to their behavior. It is designed to improve overall application performance; enable faster log-in and secure lock outs; eliminate echoes and reduce background noise on conference calls; and extend battery run time.

The reengineered Precision workstation portfolio is designed to handle demanding workloads, such as intensive graphics processing, data analysis and CAD modeling. With smaller footprints and thermal innovations, the new Precision mobile workstations offer increased performance and ISV certifications with professional graphics from Nvidia and the latest 10th Gen Intel Core vPro and Xeon processors.

Designed for creators and pros, the Dell Precision 5550 and 5750 are small and thin 15-inch and 17-inch mobile workstations, respectively, and offer a 16:10, four-sided InfinityEdge (up to HDR 400) display.

The new Precision 5750 is also VR/AR- and AI-ready to handle fast rendering, detailed visualizations and complex simulations. Targeting media and entertainment pros, the 5750 comes with the option of an Nvidia Quadro RTX 3000 GPU and weighs in at only 4.7 pounds. It is available with a UHD+ (3840 x 2400) HDR400 screen, dual Thunderbolt (four ports total) and up to two M.2 NVMe drives.

The Dell Precision 5550 is available now starting at $1,999. The Dell Precision 5750 is available in early June starting at $2,399.

The Precisions are designed for sustainability with recycled materials, sustainable packaging, energy efficient designs and EPEAT Gold registrations.


Color grading Togo with an Autochrome-type look

Before principal photography began on the Disney+ period drama Togo, the film’s director and cinematographer, Ericson Core, asked Company 3 senior colorist Siggy Ferstl to help design a visual approach for the color grade that would give the 1920s-era drama a unique look. Based on a true story, Togo is named for the lead sled dog on Leonhard Seppala’s (Willem Dafoe) team and tells the story of their life-and-death relay through Alaska’s tundra to deliver diphtheria antitoxin to the desperate citizens of Nome.

Siggy Ferstl

Core wanted a look that was reminiscent of the early color photography process called Autochrome, as well as an approach that evoked an aged, distressed feel. Ferstl, who recently colored Lost in Space (Netflix) and The Boys (Amazon), spent months — while not working on other projects — developing new ways of building this look using Blackmagic DaVinci Resolve 16.

Many of Ferstl’s ideas were realized using the new Fusion VFX tab in Resolve 16. It allowed him to manipulate images in ways that took his work beyond the normal realm of color grading and into the arena of visual effects.

By the time he got to work grading Togo, Ferstl had already created looks that had some of the visual qualities of Autochrome melded with a sense of age, almost as if the images were shot in that antiquated format. Togo “reflects the kind of style that I like,” explains Ferstl. “Ericson, as both director and cinematographer, was able to provide very clear input about what he wanted the movie to look like.”

In order for this process to succeed, it needed to go beyond the appearance of a color effect seemingly just placed “on top” of the images. It had to feel organic and interact with the photography, to seem embedded in the picture.

A Layered Approach
Ferstl started this large task by dividing the process into a series of layers that would work together to affect the color, of course, but also to create lens distortion, aging artifacts and all the other effects. A number of these operations would traditionally be sent to Company 3’s VFX department or to an outside vendor to be created by their artists and returned as finished elements. But that kind of workflow would have added an enormous amount of time to the post process. And, just as importantly, all these effects and color corrections needed to work interactively during grading sessions at Company 3 so Ferstl and Core could continuously see and refine the overall look. Even a slight tweak to a single layer could affect how other layers performed, so Ferstl needed complete, realtime control of every layer for every fine adjustment.

Likewise, the work of Company 3 conform artist Paul Carlin could not be done the way conform has typically been done. It couldn’t be sent out of Resolve into a different conform/compositing tool, republished to the company network and then returned to Ferstl’s Resolve timeline. That would have taken too long and wouldn’t have allowed for the interactivity required in grading sessions.

Carlin needed to be able to handle the small effects that are part of the conform process — split screens, wire removals, etc. — quickly, and that meant working from the same media Ferstl was accessing. Carlin worked entirely in Resolve using Fusion for any cleanup and compositing effects — a practice becoming more and more common among conform artists at Company 3. “He could do his work and return it to our shared timeline,” Ferstl says. “We both had access to all the original material.”


Most of the layers actually consisted of multiple sublayers. Here is some detail:
Texture: This group of sublayers was based on overlaid textures that Ferstl created to have a kind of “paper” feel to the images. There were sublayers based on photographs of fabrics and surfaces that all play together to form a texture over the imagery.
Border: This was an additional texture that darkened portions of the edges of the frame, inserting a sense of a subtle vignette or age artifact that frames the image. It isn’t consistent throughout; it continually changes. Sublayers bring to the images a bit of edge distortion that resembles the diffraction that can happen with lenses, particularly lenses from the early 20th century, under various circumstances.
Lens effects: DP Core shot with modern lenses built with very evolved coatings, but Ferstl was interested in achieving the look of uncoated and less-refined optics of the day. This involved the creation of sublayers of subtle distortion and defocus effects.
Stain: Ferstl applied a somewhat sepia-colored stain to parts of the image to help with the aging effect. He added a hint of additional texture and brought some sepia to some of the very bluish exterior shots, introducing hints of warmth into the images.
Grain-like effect: “We didn’t go for something that exactly mimicked the effect of film grain,” Ferstl notes. “That just didn’t suit this film. But we wanted something that has that feel, so using Resolve’s Grain OFX, I generated a grain pattern, rendered it out and then brought it back into Resolve and experimented with running the pattern at various speeds. We decided it looked best slowed to 6fps, but then it had a steppiness to it that we didn’t like. So I went back and used the tool’s Optical Flow in the process of slowing it down. That blends the frames together, and the result provided just a hint of old-world filmmaking. It’s very subtle and more part of the overall texture.”
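The slowed-grain approach Ferstl describes, stepping the grain down to 6fps and then blending frames to smooth the steppiness, can be illustrated with a minimal sketch. The values and functions below are hypothetical stand-ins; real Optical Flow retiming interpolates along motion vectors, while this simply cross-fades neighboring frames.

```python
# Sketch of retiming a grain sequence down to a lower frame rate.
# A plain "hold" repeats each slowed frame (steppy); a blended retime
# cross-fades toward the next source frame, a rough stand-in for the
# Optical Flow processing described above.

def hold_retime(frames, factor):
    """Repeat each source frame `factor` times (steppy result)."""
    out = []
    for f in frames:
        out.extend([f] * factor)
    return out

def blended_retime(frames, factor):
    """Cross-fade toward the next source frame across each hold."""
    out = []
    for i, f in enumerate(frames):
        nxt = frames[min(i + 1, len(frames) - 1)]
        for step in range(factor):
            t = step / factor  # 0.0 .. just under 1.0
            out.append(f * (1 - t) + nxt * t)
    return out

grain = [0.2, 0.8, 0.4]          # hypothetical per-frame grain values
print(hold_retime(grain, 4))     # steppy: each value repeated 4x
print(blended_retime(grain, 4))  # smooth ramps between values
```

The blended version is why the slowed pattern reads as "just a hint of old-world filmmaking" rather than a visible 6fps stutter.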

Combining Elements
“It wasn’t just a matter of stacking one layer on top of the other and applying a regular blend. I felt it needed to be more integrated and react subtly with the footage in an organic-looking way,” Ferstl recalls.

One toolset he used for this was a series of customized lens flares built with Resolve’s OFX plugins, not for their intended purpose but as the basis of a matte. “The effect is generated based on highlight detail in the shot,” explains Ferstl. “So I created a matte shape from the lens flare effect and used that shape as the basis to integrate some of the texture layers into the shots. The textures become more or less pronounced based on the highlight details in the photography, and that lets the textures breathe more.”
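The highlight-driven matte idea can be sketched in a few lines. Everything here is a hypothetical simplification: Ferstl built his matte from Resolve's lens flare OFX, while this sketch simply keys on luminance and mixes a flat texture value in proportion to the matte.

```python
def highlight_matte(luma, threshold=0.7, softness=0.2):
    """Map luminance to a 0..1 matte that ramps up in the highlights."""
    return max(0.0, min(1.0, (luma - threshold) / softness))

def blend_texture(base, texture, matte, strength=0.5):
    """Mix the texture layer in more strongly where the matte is high,
    so the texture 'breathes' with the highlight detail in the shot."""
    k = matte * strength
    return base * (1 - k) + texture * k

shot_luma = [0.2, 0.65, 0.9]   # hypothetical per-pixel luminance
texture   = [0.5, 0.5, 0.5]    # flat "paper" texture value
result = [blend_texture(l, t, highlight_matte(l))
          for l, t in zip(shot_luma, texture)]
```

Shadows pass through untouched while bright areas pick up the texture, which is what keeps the effect from reading as a uniform overlay.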

Ferstl also made use of the Tilt-Shift effect in Fusion that alters the image in the way movements within a tilt/shift lens would. He could have used a standard Power Window to qualify the portion of the image to apply blur to, but that method applied the effect more evenly and gave a diffused look, which Ferstl felt wasn’t like a natural lens effect. Again, the idea was to avoid having any of these effects look like some blanket change merely sitting on top of the image.

“You can adjust a window’s softness,” he notes, “but it just didn’t look like something that was optical… it looked too digital. I was desperate to have a more optical feel, so I started playing around with the Tilt-Shift OFX and applying that just to the defocus effect.

“But that only affected the top and bottom of the frame, and I wanted more control than that,” he continues. “I wanted to draw shapes to determine where and how much the tilt/shift effect would be applied. So I added the Tilt-Shift in Fusion and fed a poly mask into it as an external matte. I had the ability to use the mask like a depth map to add dimensionality to the effect.”
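Using a mask as a depth map to drive defocus, as Ferstl describes, amounts to scaling the blur radius per pixel by the mask value. A toy one-dimensional version follows, with hypothetical values; Fusion's Tilt-Shift node does far more, but the core idea is the same.

```python
def variable_blur(row, mask, max_radius=2):
    """Box-blur each sample with a radius scaled by the mask (0..1),
    so masked-in regions defocus while masked-out regions stay sharp."""
    out = []
    for i, m in enumerate(mask):
        r = round(m * max_radius)
        lo, hi = max(0, i - r), min(len(row), i + r + 1)
        window = row[lo:hi]
        out.append(sum(window) / len(window))
    return out

row  = [0.0, 1.0, 0.0, 1.0, 0.0]   # hypothetical image scanline
mask = [0.0, 0.0, 1.0, 1.0, 1.0]   # poly mask fed in as a depth map
print(variable_blur(row, mask))
```

Because the mask is drawn freely rather than limited to a top/bottom gradient, the defocus can follow any shape in the frame, giving the dimensionality Ferstl was after.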

As Ferstl moved forward with the look development, the issue that continually came up was that while he and Core were happy with the way these processes affected any static image in the show, “as soon as the camera moves,” Ferstl explains, “you’d feel like the work went from being part of the image to just a veil stuck on top.”

He once again made use of Fusion’s compositing capabilities: The delivery spec was UHD, and he graded the actual photography in that resolution. But he built all the effects layers at the much larger 7K. “With the larger layers,” he says, “if the camera moved, I was able to use Fusion to track and blend the texture with it. It didn’t have to just seem tacked on. That really made an enormous difference.”
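Building the texture layers at 7K while delivering UHD gives the track room to work: the frame-sized window can slide around inside the larger texture, following the camera move, without ever running off the edge. A simplified sketch, with hypothetical tracking offsets and a tiny grid standing in for the real images:

```python
def tracked_crop(texture, frame_w, frame_h, track_x, track_y):
    """Cut a frame-sized window out of an oversized texture layer,
    offset by per-frame tracking data so the texture moves with the shot."""
    tex_h, tex_w = len(texture), len(texture[0])
    # Clamp so the window never falls off the edge of the texture.
    x = max(0, min(tex_w - frame_w, track_x))
    y = max(0, min(tex_h - frame_h, track_y))
    return [row[x:x + frame_w] for row in texture[y:y + frame_h]]

# Tiny stand-ins: a 6x6 "7K" texture and a 4x4 "UHD" frame.
texture = [[(r, c) for c in range(6)] for r in range(6)]
frame0 = tracked_crop(texture, 4, 4, 0, 0)   # camera at rest
frame1 = tracked_crop(texture, 4, 4, 2, 1)   # camera has moved
```

The oversize margin is the whole trick: with a texture built at delivery resolution there would be nothing to slide into when the camera moves.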

Firepower
Fortunately for Ferstl, Company 3’s infrastructure provided the enormous throughput, storage and graphics/rendering capabilities to work with all these elements (some of which were extremely GPU-intensive) playing back in concert in a color grading bay. “I had all these textured elements and external mattes all playing live off the [studio’s custom-built] SAN and being blended in Resolve. We had OpenFX plugins for border and texture and flares generated in real time with the swing/tilt effect running on every shot. That’s a lot of GPU power!”

Ferstl found this entire experience artistically rewarding and looks forward to similar challenges. “It’s always great when a project involves exploring the tools I have to work with and being able to create new looks that push the boundaries of what my job as a colorist entails.”

Behind the Title: Gretel designer/animator Johannes Geier

Back in his home country of Germany, Johannes Geier started building cars in Photoshop when he was 11. Now he’s working on campaigns for IFC and The New York Times.

Name: Johannes Geier

Company: NYC’s Gretel

Can you describe what Gretel does?
Gretel is a design and branding studio. We work with clients to get to the heart of who they are, then we express their brand through image and language. Intuition is a crucial part of every step of the process.

What’s your job title?
Designer and animator

Can you talk more about your role?
Working in branding means a focus on systems and longer-lasting solutions. Even campaigns get systematized here. My job requires the ability to switch quickly between both micro and macro — in the details and at a higher level. Everything we create must accurately express a brand’s unique truths, so keeping strategy in mind is important. This starts with an awareness of what else is out there before quickly jumping into sketches, style frames, first proofs of concept and thinking about motion languages.

How are you handling the shutdown and working remotely?
It took some time to properly break up the day, but other than that, it works pretty well with the same workflow we always had. Something I miss about being in the studio together is seeing different projects on other screens.

New York Times

What’s your favorite part of the job?
The beginning, when everything is open and possible. The fear and respect of the blank page, where a short abstract sketch can define a whole direction and has the potential to spark the imagination to create greater things on top of it.

What’s your least favorite?
Joining a project after everything is defined.

What is your most productive time of day?
Early in the morning and at night. No meetings, no nothing.

If you didn’t have this job, what would you be doing instead?
I could imagine teaching at a university, designing workshop formats and such. Starting my own faculty has always been a dream, especially in my hometown of Passau in Germany, where I grew up. When you offer programs to educate young people with solid design skills, cities look and think differently. When the awareness of design is present in a place, it can have a huge impact. Even when it’s bad, people see that and get inspired to make it better.

How early on did you know this would be your path?
I started Photoshopping cars at the age of 11 and editing basketball mix tapes at 14. Suddenly, those areas came together. I always did what interested me at the moment, and this led organically to what I do now. I went to a secondary school where we had art history and practical art taught early. Working with wood, clay and all things fun. One day we built rockets that could fly 1,000 feet high and had their own parachute.

My first intention was to learn classic drawing there. While others were practicing realistic drawing, I was working on a short film. When my teacher discovered that I could edit in Apple Final Cut, I ended up doing short films every year.

After that I went to a technical school for design and arts, but this time we built things like chairs and jewelry. It was about getting a feeling for every kind of material. During this time, I also joined a new magazine launched in my hometown in Germany and did editorial work there. Then I got to know an artist who exposed me to new philosophies about design and the understanding of abstraction. I supported him for a book release with a sculpture.

When I was interested in studying animation and visual effects, I joined a secret international team for a channel rebrand. As films became interesting again, I went to film school (Baden-Württemberg Film Academy) where I attended a motion design class. The focus was the interplay between sound, image and much more experimental approaches, called “visual music.”

During this time, I was an artist-in-residence in the remote Bavarian forest and collaborated with a musicologist. The university was very open-minded and free — a platform to do anything you wanted once you found the right people to work with. I somehow found them and did a graduation film about the 100m sprint using inflated, metaphorical worlds with light installations to stretch time. Then it was enough film for me. I was looking for something that could channel all my interests into one thing, and the result is branding.

Can you name some recent projects you have worked on?
Work for the New York Times, Amazon Prime Video and MoMA.

IFC

What is the project that you are most proud of?
IFC is one. I worked on it at the beginning of my internship at Gretel with a very small dream team. I still look back at earlier frames we did then for ongoing projects today. I like that although it’s strictly type and color on a screen, the motion behavior and design can feel so unique.

Name three pieces of technology you can’t live without.
Apple Pay, Uber, Google Maps

What social media channels do you follow?
The ones that stand out for me are:
@rentalmag: It’s an aesthetic I like, and every single post has a strong visual impact full of derivations.
@alv_alv: When you scroll down there’s a really nice archive of interesting indie film title sequences.
@RIPstreets: It’s a great resource for street races around NJ, burnouts and stuff, you know?

Do you listen to music while you work?
Yes, and I very much prefer loud music. Mostly electro and minimal. Philip Glass and Steve Reich are also in my top five. For other music, we have a Gretel playlist on Spotify.

What do you do to de-stress from it all?
(Answered before the shutdown) To free my mind, I take long walks on Fridays and see where it leads me. I mostly land at Washington Square Park and go to venues around that area. The Oculus at World Trade Center is a good place on weekends. Seeing people from far above moving like one wave, the same rhythm over and over, is like meditation.

How has animation changed since you first started your career?
At the beginning, there was this era of homemade VFX and DSLR cinematography with integrated motion graphics. Highly saturated Vimeo videos are a good example of that. The quality didn’t matter that much; it was more about the idea.

The awareness and accessibility of it changed rapidly. In 2016, Google Creative Lab developed a job application page to find the next class of “The Five,” a one-year paid program in the lab. Applicants had to set keyframes to animate the Google logo as a first task. This was a sign for me that people recognized the craft and knew what keyframes were. We now see it on Instagram, how images and graphics are starting to move within one swipe.

Today, the graphic approach feels cleaner. More precise and on point. Animation exists everywhere now; everything needs the ability to live on a screen. This simpler, clearer approach translates easily, so the focus on animation has never felt so important.

To be fair, animation is a much bigger, complex field, and this answer is directed to commercial animation. What people do at Gobelins in Paris or Eastern European animation like in Lodz, Poland, is a whole different story. It’s more artful and doesn’t necessarily need to be tied to trends.

Quantum upgrades StorNext to v6.4

Quantum has made upgrades to its StorNext file system and data management software that the company says will make cloud content more accessible, with significantly improved read and write speeds for any cloud and object store-based storage solution. The new StorNext 6.4 software features enable hybrid-cloud and multi-cloud storage use cases, for greater flexibility in media and entertainment and other data-intensive workflows.

StorNext 6.4 software incorporates self-describing objects to make cloud content more easily accessible, enabling new hybrid-cloud workflows. The client writes files into the StorNext file system, and then, based on policy, StorNext 6.4 copies the files to a public or private cloud, with the option to include additional object metadata. Non-StorNext clients and cloud-resident processes can now access those objects directly and take advantage of the extended metadata.

Multi-threaded put/get operations improve transfer speeds to and from large object stores and the cloud. Quantum says users can expect a 5x to 7x performance increase with StorNext 6.4 software, depending on the size of their objects and other factors. This feature is especially helpful where single-stream object performance is limited.
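The multi-threaded get approach is what any client pulling many objects from an object store would reach for: overlap the per-object latency rather than fetching serially. A generic sketch follows; the fetch function is a hypothetical stand-in, not StorNext's API.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_object(key):
    """Stand-in for a single object-store GET; in a real client this
    would be an HTTP request whose latency dominates the transfer."""
    return f"data:{key}"

def fetch_all(keys, workers=8):
    """Fetch many objects in parallel; throughput scales with workers
    until bandwidth or the store's request limits become the bottleneck."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map preserves the input order of the keys.
        return list(pool.map(fetch_object, keys))

results = fetch_all([f"obj_{i:04d}" for i in range(32)])
```

The 5x to 7x figure Quantum quotes is consistent with this pattern: small objects are latency-bound, so parallel requests multiply effective throughput.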

Among the other features in the StorNext upgrade is dynamic library pooling. This feature improves resiliency for large tape archives, enabling the use of multiple libraries for performance and redundancy, including scale-out tape with vertical libraries like Quantum’s Scalar i6 tape library. Customers can rotate file stores to different libraries to increase availability.

StorNext 6.4 also adds support for Amazon Glacier Deep Archive to StorNext’s integration with Amazon Web Services, Microsoft Azure, Google Cloud Platform and others.

Epic Games offers first look at Unreal Engine 5

Epic Games has offered a first look at Unreal Engine 5, the next generation of its technology designed to create photorealistic images on par with movie CG and real life. Designed for development teams of all sizes, it offers productive tools and content libraries.

Unreal Engine 5 will be available in preview in early 2021, and in full release late in 2021, supporting next-generation consoles, current-generation consoles, PC, Mac, iOS and Android.

The reveal was introduced with Lumen in the Land of Nanite, a realtime demo running live on PlayStation 5, to showcase Unreal Engine technologies that can allow creators to reach the highest level of realtime rendering detail in the next generation of games and beyond.

New core technologies in Unreal Engine 5
Nanite virtualized micropolygon geometry will allow artists to create as much geometric detail as the eye can see. Nanite virtualized geometry means that film-quality source art comprising hundreds of millions or billions of polygons can be imported directly into Unreal Engine — anything from ZBrush sculpts to photogrammetry scans to CAD data. Nanite geometry is streamed and scaled in real time, so there are no more polygon count budgets, polygon memory budgets, or draw count budgets. Users won’t need to bake details to normal maps or manually author LODs, and, according to Epic, there is no loss in quality.

Lumen is a fully dynamic global illumination solution that reacts to scene and light changes. The system renders diffuse interreflection with infinite bounces and indirect specular reflections in detailed environments, at scales ranging from kilometers to millimeters. Artists can create more dynamic scenes using Lumen, for example, changing the sun angle for time of day, turning on a flashlight or blowing a hole in the ceiling; indirect lighting adapts accordingly. Lumen erases the need to wait for lightmap bakes to finish and to author lightmap UVs, a big time savings: an artist can move a light inside the Unreal Editor, and the lighting looks the same as when the game runs on console.

To build large scenes with Nanite geometry technology, Epic’s team made heavy use of the Quixel Megascans library, which provides film-quality objects up to hundreds of millions of polygons. To support vastly larger and more detailed scenes than previous generations, PlayStation 5 provides a dramatic increase in storage bandwidth.

The demo also showcases existing engine systems such as Chaos physics and destruction, Niagara VFX, convolution reverb and ambisonics rendering.

Unreal Engine 4 and 5 Timeline
Unreal Engine 4.25 already supports next-generation console platforms from Sony and Microsoft, and Epic is working closely with console manufacturers and dozens of game developers and publishers using Unreal Engine 4 to build next-gen games. Epic is designing for forward compatibility, so developers can get started with next-gen development now in UE4 and move projects to UE5 when ready.

Epic will release Fortnite, built with UE4, on next-gen consoles at launch and, in keeping with the team’s commitment to prove out industry-leading features through internal production, migrate the game to UE5 in mid-2021.

Waiving Unreal Engine Royalties: First $1 Million in Game Revenue
Game developers can still download and use Unreal Engine for free, but now royalties are waived on the first $1 million in gross revenue per title. The new Unreal Engine license terms are retroactive to January 1, 2020.
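Under the new terms, the royalty applies only to gross revenue above the $1 million waiver. The function below assumes Epic's standard 5% rate (a rate not stated in this article; confirm against the current license terms):

```python
def ue_royalty(gross_revenue, rate=0.05, waiver=1_000_000):
    """Royalty owed on a title: nothing until gross revenue passes the
    $1M waiver, then `rate` on everything above it. The 5% default is
    an assumption here; check Epic's published license terms."""
    return max(0.0, gross_revenue - waiver) * rate

ue_royalty(800_000)     # under the waiver, nothing owed
ue_royalty(1_500_000)   # 5% of the $500k above the waiver
```

Because the terms are retroactive to January 1, 2020, revenue already earned in 2020 counts toward the waiver rather than accruing royalties.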

Epic Online Services
Friends, matchmaking, lobbies, achievements, leaderboards and accounts: Epic built these services for Fortnite, and launched them across seven major platforms — PlayStation, Xbox, Nintendo Switch, PC, Mac, iOS, and Android. Now Epic Online Services are opened up to all developers for free in a simple multiplatform SDK.

Developers can mix and match these services together with their own account services, platform accounts, or Epic Games accounts, which reach the world’s largest social graph with over 350 million players and their 2.2 billion friend connections across half a billion devices.

Review: Boris FX Continuum 2020.5 and Sapphire 2020

By Brady Betzel

The latest Boris FX 2020 plugin releases like Continuum, Sapphire and Mocha, as well as the addition of the Silhouette software (and paint plugin!), have really changed the landscape of effects and compositing.

Over the course of two reviews I will be covering all four of Boris FX’s 2020 offerings — Continuum 2020.5 and Sapphire 2020 now, and Mocha Pro 2020.5 and Silhouette 2020.5 to come soon — for NLE applications like Avid Media Composer, Adobe Premiere and Blackmagic Resolve. Silhouette is a bit different in that it comes as a stand-alone or a compatible plugin for Adobe Premiere or After Effects (just not Avid Symphony/Media Composer at the moment).

Because they are comparable, and editors tend to use both or choose between the two, Continuum 2020.5 and Sapphire 2020 are first. In an upcoming review, I will cover Mocha 2020.5 and Silhouette 2020.5; they have a similar feature set from the outside but work symbiotically on the inside.

While I was writing this review, Boris FX released the 2020.5 updates for everything but Sapphire; that update is still being dialed in and will come later. You’ll see that I jump back and forth between 2020 and 2020.5 a little bit. Sorry if that’s confusing, but 2020 has some great updates, and 2020.5 has even more improvements.

All four Boris FX plugins could have a place in your editing tool kit, and I will point out the perks of each as well as how all of them can come together to make the ultimate Voltron-like plugin package for editors, content creators, VFX artists and more.

Boris FX has standardized the naming of each plugin and app with release 2020. Beyond that, Continuum and Sapphire 2020 continue to offer the same high-quality effects you know, continue to integrate Mocha tracking, and have added even more benefits to what I always thought was an endless supply of effects.

You have a few pricing options called Annual Subscription, Permanent, Renewals (upgrades), Units and Enterprise. While I have always been a fan of outright owning the products I use, I do like the yearly upgrades to the Boris FX products and think the Annual Subscription price (if you can afford it) is probably the sweet spot. Continuum alone ranges from $295 per year for Adobe-only to $695 per year for Avid, Adobe, Apple and OFX (Resolve). Sapphire alone ranges from $495 to $895 per year, Mocha Pro ranges from $295 to $595 per year, and Silhouette goes for $995 per year. You can bundle Continuum, Sapphire and Mocha Pro from $795 to $1,195 per year. If the entire suite of plugins is too expensive for your wallet, you can purchase individual categories of plugins called “units,” and you can find more pricing options here.

Ok, let’s run through some updates …

Continuum 2020.5
Boris FX Continuum 2020.5 has a few updates that make the 2020 and 2020.5 releases very valuable. At its base level, I consider Continuum to be more of an image restoration, utility and online editor tool kit. In comparison, Sapphire is more of a motion graphics, unicorn poop, particle emitter sparkle-fest. I mean unicorn poop in the most rainbow-colored and magnanimous way possible. I use Continuum and Sapphire every day, and Continuum is the go-to for keying, tracking, roto, film grain and more. Sapphire can really spice up a scene, main title or motion-graphics masterpiece.

My go-to Continuum tools are Gaussian Blur, Primatte Keyer (which has an amazing new secondary spill suppressor update) and Film Grain — all of which use the built-in Mocha planar tracking. There are more new tools to look at in Continuum 2020, including the new BCC Corner Pin Studio, BCC Cast Shadow and BCC Reflection. BCC Corner Pin Studio is a phenomenal addition to the Continuum plugin suite, particularly inside of NLEs such as Media Composer, which don’t have great built-in corner pinning abilities.

As an online editor, I often have to jump out of the NLE I’m using to do title work. After Effects is my tool of choice because I’m familiar with it, but that involves exporting QuickTime files, doing the work and re-exporting either QuickTime files with alpha channels or QuickTime files with the effect baked into the footage. If possible, I like to stay as “un-baked” as possible (feel free to make your own joke about that).

BCC Corner Pin Studio is another step forward in keeping us inside of one application. Using Corner Pin Studio with Mocha planar tracking is surprisingly easy. Inside of Media Composer, place the background on v2 and foreground on v1 of the timeline, apply BCC Corner Pin Studio, step into Effects Mode, identify the foreground and background, use Mocha to track the shot, adjust compositing elements inside of Avid’s Effect window, and you’re done. I’ve over-simplified this process, but it works pretty quickly, and with a render, you will be playing back a rock-solid corner pin track inside of the same NLE you are editing in.

Avid has a few quirks when working with alpha channels, to say the least. When using BCC Corner Pin Studio along with the Avid title tool, you will have to “remove” the background when compositing the text. To do this, you click and drag (DO NOT Alt+Drag) a plugin like BCC Brightness and Contrast on top of the Avid title tool layer, enable “Apply to Title Matte” and set the background to “None.”

It’s a little cumbersome, but once you get the workflow down, it gets mechanical. The only problem with this method is that when you replace the matte key on the Avid title tool layer, you lose the ability to change, alter or reposition the title natively inside of the Avid title effect or title tool itself. Just make sure your title is “final,” whatever final means these days. But corner pinning with this amount of detail inside of Media Composer can save hours of time, which in my mind equals saving money (or making more money with all your newly found free time). You can find a great six-minute tutorial on this by Vin Morreale on Boris FX’s YouTube page.

Two more great new additions to Continuum in release 2020 are BCC Cast Shadow and BCC Reflection. What’s interesting is that they pair with Corner Pin Studio: Corner Pin Studio plus Cast Shadow or Reflection can be used together when putting text into live-action footage.

Life Below Zero, a show I online edit for Nat Geo, uses this technique. Sometimes I composite text in the snow or in the middle of a field with a shadow. I don’t typically do this inside of Media Composer, but after seeing what Corner Pin Studio can do, I might try it. It would save a few exports and round trips.

To ramp up text inserted into live-action footage, I like to add shadows or reflections. The 2020 Continuum update with Cast Shadow and Reflection makes it easy to add these effects inside of my NLE instead of having to link layers with pick whips or having special setups. Throw the effect onto my text (pre-built graphic in After Effects with an alpha channel) and boom: immediate shadow and/or reflection. To sell the effect, just feather off the edge, enable a composite-mode overlay, or knock the opacity down and you are done. Go print your money.

In the Continuum 2020.5 update, one of my most prized online editing tools that has been updated is BCC Remover. I use BCC Remover daily to remove camera people, drones in the sky, stray people in the background of shots and more. In the 2020.5 update, BCC Remover added some great new features that make one of the most important tools even more useful.

From an ease-of-use standpoint, BCC Remover now has Clone Color and Clone Detail sliders. Clone Color can be used to clone only the color from the source, whereas Clone Detail can be used to take the actual image from the source. You can mix back and forth to get the perfect clone. Inside of Media Composer, the Paint Effect has always been a go-to tool for me, mainly for its blurring and cloning abilities. Unfortunately, it is not robust — you can’t brighten or darken a clone; you can only clone color or clone the detail. But you can do both in BCC Remover in Continuum 2020.5.

In addition, you can now apply Mocha Tracking data to the Clone Spot option and specify relative offset or absolute offset under the “Clone” dropdown menu when Clone Spot is selected. Relative offset allows you to set the destination (in the GUI or Effects panel), then set the source (where you want to clone from), and when you move the destination widget, the source widget will be locked at the same distance it was set at. Absolute offset allows both the source and destination to be moved independently and tracked independently inside of Mocha.
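The difference between the two offset modes can be sketched like this (hypothetical coordinates; in relative mode the source stays locked at a fixed distance from the destination, while in absolute mode each point moves on its own):

```python
def move_destination(dest, source, delta, mode="relative"):
    """Move the clone destination by `delta`; in relative mode the
    source widget follows so their spacing never changes, while in
    absolute mode the source stays where it was put."""
    dx, dy = delta
    new_dest = (dest[0] + dx, dest[1] + dy)
    if mode == "relative":
        new_source = (source[0] + dx, source[1] + dy)
    else:  # absolute
        new_source = source
    return new_dest, new_source

# Destination at (100, 100), cloning from (40, 100):
move_destination((100, 100), (40, 100), (10, 0), "relative")
# -> ((110, 100), (50, 100)): the 60px gap is preserved
move_destination((100, 100), (40, 100), (10, 0), "absolute")
# -> ((110, 100), (40, 100)): source untouched
```

Relative mode suits painting out an object against nearby clean plate, while absolute mode lets Mocha track source and destination independently.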

There are a lot more Continuum 2020 updates that I couldn’t get into in this space, and even more for the 2020.5 update. More new transitions were added, like the trendy spin blur dissolve, the area brush in Mocha (which I now use all the time to make a quick garbage matte), huge Particle Illusion improvements (including additional shapes) and Title Studio improvements.

In 2020.5, Particle Illusion now has force and turbulence options, and Title Studio has the ability to cast shadows directly inside the plugin. Outside of Title Studio (and back inside of an NLE like Avid), you have direct access to Composite modes and Transformations, letting you easily adjust parameters directly inside of Media Composer instead of jumping back and forth.

Title Studio is really becoming a much more user-friendly plugin. But I like to cover what I actually use in my everyday editing work, and Corner Pin Studio, Cast Shadow/Reflection and Remover are what I use consistently.

And don’t forget there are hundreds of effects and presets including BCC Flicker Fixer, which is an easy fix to iris shifts in footage (I’m looking at you, drone footage)!

Sapphire 2020
I’ve worked in post long enough to remember when Boris FX merged with GenArts and acquired Sapphire. Even before the merger, every offline editor used Sapphire for its unmistakable S_Glow, Film Looks and more.

It’s safe to say that Sapphire is more of an artsy-look plugin. If you are wondering how it compares to Continuum, Sapphire takes over after you are done performing image restoration and technical improvements in Continuum, adding glows, blurs, dissolves, flutter cuts and more. Sapphire is more “video candy” than technical improvement, though it also has technical plugins like Math Ops, Z Depth and more, so each package has its own perks. Ideally, the two work together very well if you can afford both.

What’s new in Sapphire 2020? There are a few big additions that might not be considered sexy, but they are necessary. Among them are OCIO support and the ability to apply Mocha-based tracking to 10 parameter-driven effects: S_LensFlare, S_EdgeRays, S_Rays, S_Luna, S_Grunge, S_Spotlight, S_Aurora, S_Zap, S_MuzzleFlash and S_FreeLens.

In addition, there are some beauty updates, like the new S_FreeLens. And one of the biggest under-the-hood updates is the faster GPU rendering. A big hurdle with third-party effects apps like Continuum and Sapphire is the render times when using effects like motion blur and edge rays with Mocha tracking. In Sapphire 2020 there is a 3x speed and performance increase (depending on the host app you are using it on). Boris FX has a great benchmark comparison.

So up first, I want to cover the new OCIO support inside of Sapphire 2020. OCIO stands for OpenColorIO, an open color management framework created by Sony Pictures Imageworks. It’s essentially a way to use Sapphire effects, like lens flares, in high-end production workflows. For example, Netflix asks colorists to work in an ACES environment for final deliverables, but the footage may be HLG-based. The OCIO options can be configured in the effect editor: just choose the color space of the video/image you are working on and the viewing color space. That’s it.

If you are in an app without OpenColorIO, you can apply the effect S_OCIOTransform. This will allow you to use the OCIO workflow even inside apps that don’t have OCIO built in. If you aren’t worried about color space, this stuff can make your eyes glaze over, but it is very important when delivering a show or feature and definitely something to remember if you can.

On top of the tried-and-true Sapphire beauty plugins like S_Glow or S_Luna (to add a moon), Boris FX has added S_FreeLens to its beauty arsenal. Out in the “real world,” free lensing or “lens whacking” is when you take your lens off of your camera, hold it close to where it would normally mount and move the lens around to create dream-like images. It can add a sort of blur-flare dreamy look; it’s actually pretty great when you need it, but you are locked into the look once you do it in-camera. That’s why S_FreeLens is so great; you can now adjust these looks after you shoot instead of baking in a look. There are a lot of parameters to adjust, but if you load a preset, you can get to a great starting point. From defocus to the light leak color, you can animate and dial in the exact look you are going for.

Parameter tracking has been the next logical step in tying Mocha, Continuum and Sapphire together. Finally, in Sapphire 2020, you can use Mocha to track individual parameters. Like in S_LensFlare, you can track the placement of the hotspot and separately track its pivot.

It’s really not too hard once you understand how it correlates inside the Mocha interface. Sapphire sets up two trackers inside of Mocha: 1) the hotspot search area and position of the actual flare, and 2) the pivot search area and position of the pivot point. The search area gathers the tracking data, while the position crosshair is the actual spot on which the parameter will be placed.

While I’m talking about Mocha, in the Sapphire 2020 update, Mocha has added the Area Brush tool. At first, I was skeptical of the Area Brush tool — it seemed a little too easy — but once I gave in, I realized the Area Brush tool is a great way to make a rough garbage matte. Think of a magnetic lasso but with less work. It’s something to check out when you are inside of Mocha.

Summing Up
Continuum and Sapphire continue to be staples for broadcast TV editors for a reason. You can even save presets and swap them between NLEs (for instance, from Media Composer to Resolve).

Are the Boris FX plugins perfect? No, but they will get you a lot further faster in your Media Composer projects without having to jump into a bunch of different apps. One thing I would love to see Boris FX add to Continuum and Sapphire is the ability to individually adjust your Mocha shapes and tracks in the Avid Effects editor.

For instance, if I use Mocha inside of BCC Gaussian Blur to track and blur 20 license plates on one shot — I would love to be able to adjust each “shape’s” blur amount, feather, brightness, etc., without having to stack additional plugin instances on top.

But Boris FX has put in a lot of effort over the past few updates of Continuum and Sapphire. Without a doubt, I know Continuum and Sapphire have saved me time, which saves me and my clients money. With the lines between editor, VFX artist and colorist being more and more blurred, Continuum and Sapphire are necessary tools in your arsenal.

Check out the many tutorials Boris FX has put up, and go update Continuum and Sapphire to the latest versions.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

DP Chat: Jeffrey Waldron on Hulu’s Little Fires Everywhere

By Randi Altman

Hulu’s Little Fires Everywhere stars Reese Witherspoon and Kerry Washington as, well, frenemies? Maybe? It’s hard to describe their relationship, other than a powder keg covered with a fake smile. From the minute Washington’s Mia, a wandering artist and single mom, pulls into the upper-class Cleveland suburb of Shaker Heights and meets Witherspoon’s Elena, an uptight and OCD mother of four, you know the close-knit town won’t ever be the same.

Jeffrey Waldron

Set in the 1990s, this limited series, based on Celeste Ng’s book of the same name, puts a microscope on just-under-the-surface racism combined with your everyday, run-of-the-mill mother’s remorse.

We reached out to DP Jeffrey Waldron to find out about the look of the show and how he worked with his alternating DP Trevor Forrest, the directors and the EPs.

How early did you get involved on the show?
I was brought on with the other DP Trevor Forrest several weeks before principal photography began. We hit it off in an initial meeting that included the pilot director and the executive producers.

Can you talk about the look they wanted for the show?
My initial instincts for the show, based on the book and the first two scripts, turned out to be really in sync with what they’d been discussing. In the early stages we were all bringing visual references to the table and deciding on the overall visual arc of the eight episodes.

Ultimately, the visual ideas we landed on held that the character of Elena represented a sense of order — tight control of her life and family — and that Mia represented chaos. She’s a strong mother and a wandering artist, but there’s a lot we don’t know about her, and that’s where much of the early tension comes from.

Can you talk about the different looks?
Sure. The other key visual idea was the changing of the seasons, from August to December — which we represented through color in lighting and LUTs — warmer to cooler. In the late summer we see warmer highlights and maintain a bit of cyan in the shadows. But as the days grow shorter, they also grow bluer in the shadows, and the lighting becomes darker and more edgy, and the camerawork starts to loosen and become more reactive.

You mentioned the novel earlier. Did the look described in the book translate to the show?
There’s nothing super-specific to the novel that plays a big role in our approach to the look. But I have now done a couple of book adaptations as limited series, and I do try to absorb the author’s prose style to see if there is a visual equivalent. The “voice” of the dialogue is most easily brought to life in a script, but the descriptive style of the novel is an interesting place for the cinematographer to look for tonal cues. What’s the tense? Whose point of view? Is the narrator omniscient? Is it told loosely? Formally? Beautifully? Gritty? I feel that we did bring a sense of this voice to the show’s look — even if subconsciously.

It takes place in the 1990s. How did that affect how you shot it?
The ‘90s played into every aspect inside of the frames — the amazing production design, cars, costumes, hair and makeup — so there wasn’t a huge responsibility for the look of the show to scream 1990s.

That said, the combination of all of these elements and a more neutral, soft, formal style for Elena — especially in the first couple of episodes — does result in an overwhelmingly ‘90s vibe that we start to undo as her life gets tangled up with Mia’s. You’ll start to see darker, moodier, bluer lighting and a more handheld sense in the camera work as the plot thickens!

Where was it shot?
The show is set in Shaker Heights, Ohio, but we shot in Los Angeles. This brought the challenge of finding locations that looked right for the written scene, but also for the 1990s, and Ohio. As mentioned, the story begins in August but ends in December, so this meant bringing in rain and snow and giant silks to create overcast-feeling skies.

How did you go about choosing the right camera and lenses for this project?
The DP for Episode 1, Trevor Forrest, and Panavision magician Dan Sasaki had customized a few versions of Panavision’s Sphero 65 primes to differing strengths of flare, veiling glare and diffusion characteristics.

We tested them on the Alexa LF and ended up really loving the heaviest strength — tasking Panavision to create us an extended set. As we ventured into flashback sequences and ultimately the sixth episode, we brought on a mix of Panavision T and C series anamorphic primes to establish a different feel for seeing “memory.”

Were there any scenes you found especially challenging, or that you’re particularly proud of?
Every day is truly a challenge on a television schedule — even scenes that don’t look technically difficult can be Herculean to pull off when you see what else has to be shot on the same day.

All of the homecoming dance scenes in the third episode had to be shot in one day. When you consider the size of the space, the number of extras and the limited hours of the minor actors, it was quite a task. We built two giant soft boxes that we could control on the lighting board, allowing us to create contrast by turning sides of the boxes on and off as we moved the camera around the gymnasium. We had a Technocrane that allowed us to move the camera around the gym quite easily. Even if we weren’t using it for a high-angle crane-style shot, we’d use it like a Steadicam for moving masters. These tools definitely help to make the day.

Who did the color grade on the show?
Company 3’s Stefan Sonnenfeld, using Blackmagic DaVinci Resolve.

Now, moving away from the show, how did you become interested in cinematography?
As a kid I was obsessed with animation and wanted to be a hand-drawn animator. In fifth grade I joined an animation club. My mom would drop me off at their studio, you’d pay like $20 a month and you could use their 16mm camera and everything. This got me wanting to know more about film, which led me to still photography and ultimately to live-action film cinematography. I still love animation!

What inspires you artistically?
The work of other great cinematographers in the episodic realm really does inspire me. I know we share limitations of budget and time, and when I see incredibly beautiful and consistent imagery across a show, I’m excited to learn from them. Right now in episodic, it’s Igor Martinovic’s work on The Outsider. I’m also always excited to see whatever Christian Sprenger or Tim Ives is up to.

What about keeping up with new technology?
I’m not the most camera tech-y DP — my world is generally more affected by advances in the lighting realm. A new set of lenses is always interesting to me, but how it affects my work? A super-versatile new lighting instrument is far more likely to help my craft.

What new technology has changed the way you work over the past few years?
The cameras have obviously changed a bit, but these advances don’t usually change the way I work. The Sony Venice has to some extent — the ability to swap NDs so quickly and the dual native ISO, allowing a 2500 base ISO, have impacted how I put scenes together on set.

Lighting instruments like LiteMats, Astera tubes and SkyPanels have probably had the bigger effect on how I work, making it easier (and cooler) to create soft sources, choose colors without the need for gel and dim without affecting color. The ability to dim everything alone is such a game changer versus Kinos and hot lights.

Jeffrey Waldron on set

Care to share some of your best practices?
I have an ever-growing personal list of best practices — many super-specific to situations I’ve found a good solution for in the past or things to avoid. But the big thing for me is starting the day with some personal time at home to journal about the previous day’s work and the work ahead, and taking some deep breaths and thinking about the challenges coming up.

A film set can be a stressful place, so I try to set a calm and fun vibe and let people know I appreciate them. I honestly have so much fun on a film set; it’s long hours, but the time just flies for me. I love my crews, and I love creating with them.

Does your process change at all when working on a film versus an episodic or vice versa?
On a show like Little Fires Everywhere, with alternating DPs, my process is exactly the same as it would be on a feature. You prep intimately with your director, visiting locations together and crafting your specific approach; you work with your AD to ensure the timing makes sense and meet with the production designer to look at plans.

On a show without an alternating DP, it’s quite different. You treat the first episode or episode block like a feature, but then you start shooting… and you’re juggling future episodes as you shoot. You’re generally not visiting locations with the director, instead relying on your crew to help relay the pertinent information. It’s different, it requires a bit more improvisation to pull things together on the day, but I actually love it too.

Explain your ideal collaboration with the director when setting the look of a project?
The ideal collaboration is one where it quickly becomes apparent that you have similar instincts — this allows you to run with your best and most ambitious ideas, knowing that they’ll be met with a sense of co-ownership and excitement.

With solid pre-production, shared instincts give way to a shorthand on set that just makes things come together more easily and more beautifully. It’s a lot harder when you’re not on the same page. You may read the same scene and have opposite ideas for what it should look like or how the blocking might look or what focal lengths might feel right. In these situations, it’s all about remaining flexible. I’ve had great collaborations with wonderful directors where we weren’t in sync visually, and we did wonderful work by finding a balance, but it’s harder.

What’s your go-to gear — things you can’t live without?
I don’t have any go-tos really. I love prime lenses of all sorts — getting to know their inherent strengths and flaws and exploiting them to tell the story is an exciting part of the job. I love ARRI cameras, and I’m also a big fan of Sony’s Venice camera. There isn’t anything I can’t live without camera-wise — I really feel that amazing images can be made on any of the common cameras and lenses that people are using.

I feel there are lighting tools I can’t live without: LiteMats, Astera tubes, SkyPanels — easy tweaks of color, everything is dimmable. It’s truly amazing the extra trouble we used to go through in lighting. I wouldn’t want to go back.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Cloud workflow vets launch M&E cloud migration company

A team of cloud workflow experts has launched Fusion Workflows, a new media-workflow design company that helps M&E companies migrate their infrastructures to scalable cloud platforms.

The company says that moving to cloud-based workflows represents an opportunity to fix the inefficiencies in current systems. However, many companies struggle to understand what a cloud migration involves, what it costs and how it will affect their workforce, processes and tools. Fusion Workflows aims to remedy these issues by providing clients with a customized “Workflow Migration Guide,” which acts as a design blueprint for rebuilding their operations on scalable cloud infrastructure and software-defined processes.

Mark Turner

Fusion takes a holistic approach, working with clients from inception through to deployment. It begins with a comprehensive initial analysis of workflow processes and customizes business operations during the migration. The work continues post-migration to include training and onboarding, new software, security and documentation. This one-stop-shop approach is designed to keep internal teams and systems working in sync and without interruption.

“The COVID crisis has forced media companies to create temporary hacks and interim cloud workflows but also exposed the need for them to develop a long-term cloud migration vision,” says Mark Turner, Fusion Workflows’ managing partner. “Every company now needs a plan to effectively operate their business without ties to physical locations, on-premises storage or hardware processing. At Fusion we look forward to helping companies design their own cloud migrations.”

Fusion’s team comprises domain experts from the US and Europe, who have designed and implemented cloud-based workflows and created first-in-market re-engineering standards. Fusion’s team has worked across all media industries including major movie and TV production, visual FX, animation, sports, live broadcast, digital cinema, music and OTT streaming.

In addition to Turner, who co-authored the 2020 MovieLabs paper “The Evolution of Media Creation: A 10-Year Vision for the Future of Media Production, Post and Creative Technologies,” the team includes ITV vet Emma Clifford, OTT engineer Andrew Ioannou, Lionsgate vet Thomas Hughes, former Sony Pictures chief digital strategy officer Mitch Singer, former Technicolor data systems engineer Daryll Strauss, former Sony Pictures CTO Spencer Stephens, Autodesk vet Chris Vienneau and ETC’s Erik Weaver.

AMD’s new Radeon Pro VII graphics card for 8K workflows

AMD has introduced the AMD Radeon Pro VII workstation graphics card designed for those working in broadcast and media in addition to CAE and HPC applications. According to AMD, the Radeon Pro VII graphics card offers 16GB of extreme speed HBM2 (high bandwidth memory) and support for six synchronized displays and high-bandwidth PCIe 4.0 interconnect technology.

AMD says the new card considerably speeds up 8K image processing performance in Blackmagic’s DaVinci Resolve in addition to performance speed updates in Adobe’s After Effects and Photoshop and Foundry’s Nuke.

The AMD Radeon Pro VII introduces AMD Infinity Fabric Link technology to the workstation market, which speeds application data throughput by enabling high-speed GPU-to-GPU communications in multi-GPU system configurations. The new workstation graphics card provides the high performance and advanced features that enable post teams and broadcasters to visualize, review and interact with 8K content.

The AMD Radeon Pro VII graphics card is expected to be available beginning mid-June for $1,899. AMD Radeon Pro VII-equipped workstations are expected to be available in the second half of 2020 from OEM partners.

Key features include:
– 16GB of HBM2 with 1TB/s memory bandwidth and full ECC capability to handle large and complex models and datasets smoothly with low latency.
– A high-bandwidth, low-latency connection that allows memory sharing between two AMD Radeon Pro VII GPUs, enabling users to increase project workload size and scale, develop more complex designs and run larger simulations to drive scientific discovery. AMD Infinity Fabric Link delivers up to 5.25x PCIe 3.0 x16 bandwidth with a communication speed of up to 168GB/s peer-to-peer between GPUs.
– Users can access their physical workstation from virtually anywhere with the remote workstation IP built into AMD Radeon Pro Software for Enterprise driver.
– PCIe 4.0 delivers double the bandwidth of PCIe 3.0 to enable smooth performance for 8K, multichannel image interaction.
– Enables precise synchronized output for display walls, digital signage and other visual displays (AMD FirePro S400 synchronization module required).
– Supports up to six synchronized display panels, full HDR and 8K screen resolution (single display) combined with ultra-fast encode and decode support for enhanced multi-stream workflows.
– Optimized and certified with pro applications for stability and reliability. The list of Radeon Pro Software-certified ISV applications is available on AMD’s site.
– ROCm open ecosystem, an open software platform for accelerated compute, provides an easy GPU programming model with support for OpenMP, HIP and OpenCL and for ML and HPC frameworks.
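As a quick sanity check on the Infinity Fabric Link figure above, the quoted 168GB/s is consistent with the claimed 5.25x multiple, assuming the baseline is PCIe 3.0 x16’s roughly 32GB/s bidirectional throughput (that baseline is my assumption; AMD’s materials don’t spell it out):

```python
# Assumed baseline: PCIe 3.0 x16 moves ~16 GB/s in each direction, ~32 GB/s bidirectional.
pcie3_x16_bidirectional_gbs = 32.0
claimed_multiple = 5.25

# 5.25 x 32 GB/s = 168 GB/s, matching the peer-to-peer figure AMD quotes.
infinity_fabric_gbs = claimed_multiple * pcie3_x16_bidirectional_gbs
print(infinity_fabric_gbs)
```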

AMD Radeon Pro workstation graphics cards are supported by the Radeon Pro Software for Enterprise driver, offering enterprise-grade stability, performance, security, image quality and other features, including high-resolution screen capture, recording and video streaming. The company says the latest release offers up to a 14 percent year-over-year performance improvement for current-generation AMD Radeon Pro graphics cards. The new software driver is now available for download from AMD.com.

AMD also released updates for AMD Radeon ProRender, a physically-based rendering engine built on industry standards that enables accelerated rendering on any GPU, any CPU and any OS. The updates include new plugins for Side Effects Houdini and Unreal Engine and updated plugins for Autodesk Maya and Blender.

For developers, an updated AMD Radeon ProRender SDK is now available on the redesigned GPUOpen.com site and is now easier to implement under an Apache License 2.0. AMD also released a beta SDK of the next-generation Radeon ProRender 2.0 rendering engine with enhanced CPU and GPU rendering support and open-source versions of the plugins.

Production begins again on New Zealand’s Shortland Street series

By Katie Hinsen

The current global pandemic has shut down production all over the world. Those who can have moved to working from home, and there’s speculation about how and when we’ll get back to work again.

New Zealand, a country with a significant production economy, has announced that it will soon reopen for shoots. The most popular local television show, Shortland Street, was the first to resume production after an almost six-week break. It’s produced by Auckland’s South Pacific Pictures.

Dylan Reeve

I am a native New Zealander who has worked in post there on and off over the years. Currently I live in Los Angeles, where I am an EP for dailies and DI at Nice Shoes, so taking a look at how New Zealand is rolling things out interests me. With that in mind, I reached out to Dylan Reeve, head of post production at Shortland Street, to find out how it looked the week they went back to work under Level 3 social distancing restrictions.

Shortland Street is a half-hour soap that runs five nights a week on prime-time television. It has been on air for around 28 years and has been consistently among the highest-rated shows in the nation. It’s a cultural phenomenon. While the cast and crew take a single three-week annual break from production during the Christmas holiday season, the show has never really stopped production … until the pandemic hit.

Shortland Street’s production crew is typically made up of about 100 people; the post department consists of two editors, two assistants, a composer and Reeve, who is also the online editor. Sound mixes and complex VFX are done elsewhere, but everything else for the production is done at the studio.

New Zealand responded to COVID-19 early, instituting one of the harshest lockdowns in the world. Reeve told me that they went from alert Level 1 — basic social distancing, more frequent handwashing — to Level 3 as soon as the first signs of community transmission were detected. They stayed at this level for just two days before going to Level 4: complete lockdown. New Zealanders had 48 hours to get home to their families, shop for supplies and make sure they were ready.

“On a Monday afternoon at about 1:30pm, the studio emptied out,” explains Reeve. “We were shut down, but we were still on air, and we had about five or six weeks’ worth of episodes in various stages of production and post. I then had two days to figure out and prepare for how we were going to finish all of those and make sure they got delivered so that the show could continue to be on air.”

Shortland Street’s main production building dressed as the exterior of the hospital where the show is set, with COVID workplace safety materials on the doors.

The nature of the show’s existing workflow meant that Reeve had to copy all the media to drives and send Avids and drives home with the editors. The assistant editors logged in remotely for any work they needed to do, and Reeve took what he needed home as well to finish onlining, prepping and delivering those already-shot episodes to the broadcaster. They used Frame.io for review and approval with the audio team and with the directors, producers and network.

“Once we knew we were coming back into Level 3, and the government put out more refined guidelines about what that required, we had a number of HoD meetings — figuring out how we could produce the show while maintaining the restrictions necessary.”

I asked Reeve whether he and his crew felt safe going back to work. He reminded me that New Zealand only went back down to Level 3 once there had been a period with no remaining evidence of community transmission. Infection rates in New Zealand had spent two weeks in single digits, including two days when no new cases had been reported.

Starting Up With Restrictions
My conversation with Reeve took place on May 4, right after his first few days back at work. I asked him to explain some of the conditions under which the production was working while the rest of the country was still in isolation. Level 3 in New Zealand is almost identical to the lockdown restrictions put in place in US cities like New York and Los Angeles.

“One of the key things that has changed in terms of how we’re producing the show is that we physically have way less crew in the building. We’re working slower, and everyone’s having to do a bit more, maybe, than they would normally.

Shortland Street director Ian Hughes and camera operator Connagh Heath discussing blocking with a one-metre guide.

“When crew are in a controlled workspace where we know who everyone is,” he continues, “that allows us to keep track of them properly — they’re allowed to work within a meter of one another physically (three feet). Our policy is that we want staff to stay two meters (six feet) apart from one another as much as possible. But when we’re shooting, when it’s necessary, they can be a meter from one another.”

Reeve says the virus has certainly changed the nature of what can be shot. There are no love scenes, no kissing and no hugs. “We’re shooting to compensate for that; staging people to make them seem closer than they are.

“Additionally, everything stays within the production environment. Parts of our office have been dressed; parts of our building have been dressed. We’ll do a very low-profile exterior shoot for scenes that take place outside, but we’re not leaving the lot.”

Under Level 3, everyone is still under isolation at home. This is why, explains Reeve, social distancing has to continue at work. That way any infection that comes into the team can be easily traced and contained and affect as few others as possible. Every department maintains what they call a “bubble,” and very few individuals are allowed to cross between them.

Actors are doing their own hair and makeup, and there are no kitchen or craft services available. The production is using and reusing a small number of regular extras, with crew stepping in occasionally as well. Reeve noted that Australia was also resuming production on Neighbours, with crew members acting as extras.

“Right now in our studio, our full technical complement consists of three camera operators at the moment, just one boom operator and one multi-skilled person who can be the camera assist, the lighting assist and the second boom op if necessary. I don’t know how a US production would get away with that. There’s no chance that someone who touches lights on a union production can also touch a boom.”

Post Production
Shortland Street’s post department is still working from home. Now that they are back in production, they are starting to look at more efficient ways to work remotely. While there are a lot of great tools out there for remote post workflows, Reeve notes that for them it’s not that easy, especially when hardware and support are halfway across the world, borders are closed and supply chains are disrupted.

There are collaboration tools that exist, but they haven’t been used “simply because the pace and volume of our production means it’s often hard to adapt for those kinds of products,” he says. “Every time we roll camera, we’re rolling four streams of DNxHD 185, so nearly 800Mb/s each time we roll. We record that media directly into the server to be edited within hours, so putting that in the cloud or doing anything like that was never the best workflow solution. When we wanted feedback, we just grabbed people from the building and dragged them into the edit suite when we wanted them to look at something.”
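Reeve’s numbers are easy to verify. The arithmetic below is mine, not the show’s actual tooling, but it shows why four simultaneous DNxHD 185 streams make a cloud-first workflow hard at this pace:

```python
streams = 4
dnxhd_mbps = 185                        # DNxHD 185 runs at roughly 185 Mb/s per stream

total_mbps = streams * dnxhd_mbps       # 740 Mb/s hitting the server while cameras roll
mb_per_sec = total_mbps / 8             # 92.5 MB/s of actual data written
gb_per_hour = mb_per_sec * 3600 / 1000  # 333 GB for every hour of rolling

print(total_mbps, mb_per_sec, gb_per_hour)
```

At a third of a terabyte per rolling hour, editing directly off local server storage within hours of the shoot is far more practical than round-tripping through the cloud.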

Ideally, he says, they would have tested and invested in these tools six months ago. “We are in what I call a duct tape stage. We’re taking things that exist, that look useful, and we’re trying to tape them together to make a solution that works for us. Coming out of this, I’m going to have to look at the things we’ve learned and the opportunities that exist and decide whether or not there might be some ways we can change our future production. But at the moment, we’re just trying to make it through.”

Because Shortland Street has only just resumed shooting, they haven’t reached the point yet where they need to do what Reeve calls “the first collaborative director/editor thing” from start to finish. “But there are two plans that we’re working toward. The easy, we-know-it-works plan is that we do an output, we stick it on Frame.io, the director watches it, puts notes on it, sends it back to us. We know that works, and we do that sometimes with directors anyway.

“The more exciting idea is that we have the directors join us on a remote link and watch the episodes as they would if they were in the room. We’ve experimented with a few things and haven’t found a solution that makes us super-happy. It’s tricky because we don’t have an existing hardware solution in place that’s designed specifically for streaming a broadcast output signal over an internet connection. We can do a screen-share, and we’ve experimented with Zoom and AnyDesk, but in both those cases, I’ve found that sometimes the picture will break up unacceptably, or sync will drift — especially using desktop-sharing software that’s not really designed to share full-screen video.”

Reeve and crew are just about to experiment with a tool used for gaming called Parsec. It’s designed to share low-latency, in-sync, high-frame-rate video. “This would allow us to share an entire desktop at, theoretically, 60fps with half-second latency or less. Very brief tests looked good. Plan A is to get the directors to join us on Parsec and screen-share a full-screen output off Avid. They can watch it down and discuss with the editor in real time or just make their own notes and work through it interactively. If that experience isn’t great, or if the directors aren’t enjoying it, or if it’s just not working for some reason, we’ll fall back to outputting a video, uploading it to Frame.io and waiting for notes.”

What’s Next?
What are the next steps for other productions returning to work? Shortland Street is the only production that chose to resume under Level 3. The New Zealand Film Commission has said that filming will resume eventually under Level 2, which is being rolled out in several stages beginning this week. Shortland Street’s production company has several other shows, but none have plans to resume yet.

“I think it’s a lot harder for them to stay contained because they can’t shoot everything in the studio,” explains Reeve. “Our production has an added advantage because it is constantly shooting and the core cast and crew are mostly the same every day. I think these types of productions will find it easiest to come back.”

Reeve says that anyone coming into their building has to sign in and deliver a health declaration — recent travel, contact with any sick person, other work they’ve been engaged in. “I think if you can do some of that reasonable contact tracing with the people in your production, it will be easier to start again. The more contained you can keep it, the better. It’s going to be hard for productions that are on location, have high turnover or a large number of extras — anything where they can’t keep within a bubble.

“From a post point of view, I think we’re going to get a lot more comfortable working remotely,” he continues. “And there are lots of editors who already do that, especially in New Zealand. If that can become the norm, and if there are tools and workflows that are well established to support that, it could be really good for post production. It offers a lot of great opportunities for people to broaden their client base or the geographic regions in which they can work.”

Productions are going to have to make their own sort of health and safety liability decisions, according to Reeve. “All of the things we are doing are effectively responding to New Zealand government regulation, but that won’t be the case for everyone else.”

He sees some types of production finding an equilibrium. “Love Island might be the sort of reality show you can make. You can quarantine everyone going into that show for 14 days, make sure they’re all healthy, and then shoot the show because you’re basically isolated from the world. Survivor as well, things like that. But a reality show where people are running around the streets isn’t happening anymore. There’s no Amazing Race, that’s for sure.”


After a 20-year career talent-side, Katie Hinsen turned her attention to building, developing and running post facilities with a focus on talent, unique business structures and innovative use of technology. She has worked on over 90 major feature and episodic productions, founded the Blue Collar Post Collective, and currently leads the dailies & DI department at Nice Shoes.

Faceware Studio uses ML to create facial animation in realtime

Faceware Technologies, which provides markerless 3D facial motion capture solutions, has released Faceware Studio, a new platform for creating high-quality facial animation in realtime. Faceware Studio is built from the ground up to be a complete replacement for the company’s former Live product.

According to the company, Studio reimagines the realtime streaming workflow with a modern and intuitive approach to creating instant facial animation. Using single-click calibration, Studio can track and animate a face in real time by using machine learning and the latest neural network techniques. Artists can then tune and tailor the animation to an actor’s unique performance and build additive logic with Motion Effects. The data can then be streamed to Faceware-supported plugins in Unreal Engine, Unity, MotionBuilder and soon Maya for live streaming or recording in engine on an avatar.

Faceware Studio is available now. Pricing starts at just $195 per month or $2,340 annually, which includes support. Trial versions are available on their site.

New features include:
Realtime jaw positioning using deep learning: Faceware’s improved jaw positioning tech, which is currently used in Faceware Retargeter, is now available in Studio, giving users the ability to create fast and accurate lipsync animation in realtime.

Motion Effects and Animation Tuning: Studio offers users direct control over their final animation. They can visualize and adjust actor-specific profiles with Animation Tuning and build powerful logic into the realtime data stream using Motion Effects.

Realtime Animation Viewport and Media Timeline: Users can see their facial animation from any angle with Studio’s 3D animation viewport and use the timeline and media controls to pause, play and scrub through their media to find suitable frames for calibration and to focus on specific sections of their video.

Dockable, customizable interface: A customizable user interface with docking panels and saveable workspaces.

Posting John Krasinski’s Some Good News

By Randi Altman

Need an escape from a world filled with coronavirus and murder hornets? You should try John Krasinski’s weekly YouTube show, Some Good News. It focuses on the good things that are happening during the COVID-19 crisis, giving people a reason to smile with things such as a virtual prom, Krasinski’s chat with astronauts on the ISS and bringing the original Broadway cast of Hamilton together for a Zoom singalong.

L-R: Remy, Olivier, Josh and Lila Senior

Josh Senior, owner of Leroi and Senior Post in Dumbo, New York, is providing editing and post to SGN. His involvement began when he got a call from a mutual friend of Krasinski’s, asking if he could help put something together. They sent him clips via Dropbox, and a workflow was born.

While the show is shot at Krasinski’s house in New York at different times during the week, Senior’s Fridays, Saturdays and Sundays are spent editing and posting SGN.

In addition to his post duties, Senior is an EP on the show, along with his producing partner Evan Wolf Buxbaum at their production company, Leroi. The two work in concert with Allyson Seeger and Alexa Ginsburg, who executive produced for Krasinski’s company, Sunday Night Productions. Production meetings are held on Tuesday, and then shooting begins. After footage is captured, it’s still shared via Dropbox or good old iMessage.

Let’s find out more…

What does John use for the shoot?
John films on two iPhones. A good portion of the show is screen-recorded on Zoom, and then there’s the found footage user-generated content component.

What’s your process once you get the footage? And, I’m assuming, it’s probably a little challenging getting footage from different kinds of cameras?
Yes. In the alternate reality where there’s no coronavirus, we run a pretty big post house in Dumbo, Brooklyn. And none of the tools of the trade that we have there are really at play here, outside of our server, which exists as the ever-present backend for all of our remote work.

The assets are pulled down from wherever they originate. The masters are then housed behind an encrypted firewall, like we do for all of our TV shows at the post house. Our online editor is the gatekeeper. All the editors, assistant editors, producers, animators, sound folks — they all get a mirrored drive that they download, locally, and we all get to work.

Do you have a style guide?
We have a bible, which is a living document that we’ve made week over week. It has music cues, editing style, technique, structure, recurring themes, and a living archive of all the notes we’ve received and how we’ve addressed them. Any style that’s specific to segments, post processing, or any phasing or audio adjustments we make also lives within that document, which we give to whoever we onboard to the show.

Evan Wolf Buxbaum

Our post producers made this really elegant workflow that’s a combination of Vimeo and Slack where we post project files and review links and share notes. There’s nothing formal about this show, and that’s really cool. I mean, at the same time, as we’re doing this, we’re rapidly finishing and delivering the second season of Ramy on Hulu. It comes out on May 29.

I bet that workflow is a bit different than SGN’s.
It’s like bouncing between two poles. That show has a hierarchy, it’s formalized, there’s a production company, there’s a network, there’s a lot of infrastructure. This show is created in a group text with a bunch of friends.

What are you using to edit and color Some Good News?
We edit in Adobe Premiere, and that helps mitigate some of the challenges of the mixed media that comes in. We typically color inside of Adobe, and we use Pro Tools for our sound mix. We online and deliver out of Resolve, which is pretty much how we work on most of our things. Some of our shows edit in Avid Media Composer, but on our own productions we almost always post in Premiere — so when we can control the full pipeline, we tend to prefer Adobe software.

Are review and approvals with John and the producers done through iMessage and Dropbox too?
Yes, and we post links on Vimeo. Thankfully we actually produce Some Good News as well as post it, so that intersection is really fluid. With Ramy it’s a bit more formalized. We do notes together and, usually internally, we get a cut that we like. Then it goes to John, and he gives us his thoughts and we retool the edit; it’s like rapid prototyping rather than a gated milestone. There are no network cuts or anything like that.

Joanna Naugle

For me, what’s super-interesting is that everyone’s ideas are merited and validated. I feel like there’s nothing that you shouldn’t say because this show has no agenda outside of making people happy, and everybody’s uniquely qualified to speak to that. With other projects, there are people who have an experience advantage, a technical advantage or some established thought leadership. Everybody knows what makes people happy. So you can make the show, I can make the show, my mom can make the show, and because of that, everything’s almost implicitly right or wrong.

Let’s talk about specific episodes, like the ones featuring the prom and Hamilton. What were some of the challenges of working with all of that footage? Maybe start with Hamilton?
That one was a really fun puzzle. My partner at Senior Post, Joanna Naugle, edited that. She drew on a lot of her experience editing music videos, performance content, comedy specials, multicam live tapings. It was a lot like a multicam live pre-taped event being put together.

We all love Hamilton, so that helps. This was a combination of performers pre-taping the entire song and a live performance. The editing technique really dissolves into the background, but it’s clear that there’s an abundance of skill that’s been brought to that. For me, that piece is a great showcase of the aesthetic of the show, which is that it should feel homemade and lo-fi, but there’s this undercurrent of a feat to the way that it’s put together.

Getting all of those people into the Zoom, getting everyone to sound right, having the ability to emphasize or de-emphasize different faces. To restructure the grid of the Zoom, if we needed to, to make sure that there’s more than one screen worth of people there and to make sure that everybody was visible and audible. It took a few days, but the whole show is made from Thursday to Sunday, so that’s a limiting factor, and it’s also this great challenge. It’s like a 48-hour film festival at a really high level.

What about the prom episode?
The prom episode was fantastic. We made the music performances the day before and preloaded them into the live player so that we could cut to them during the prom. Then we got to watch the prom. To be able to participate as an audience member in the content that you’re still creating is such a unique feeling and experience. The only agenda is happiness, and people need a prom, so there’s a service aspect of it, which feels really good.

John Krasinski setting up his shot.

Any challenges?
It’s hard to put things together that are flat, and I think one of the challenges that we found at the onset was that we weren’t getting multiple takes of anything, so we weren’t getting a lot of angles to play with. Things are coming in pretty baked from a production standpoint, so we’ve had to find unique and novel ways to be nonlinear when we want to emphasize and de-emphasize certain things. We want to present things in an expositional way, which is not that common. I couldn’t even tell you another thing that we’ve worked on that didn’t have any subjectivity to it.

Let’s talk sound. Is he just picking up audio from the iPhones or is he wearing a mic?
Nope, no mic. It’s audio from the iPhones, which we just run through a few filters in Pro Tools. Nobody mics themselves. We do spend a lot of time balancing out the sound, but there’s not a lot of effects work.

Other than SGN and Ramy, what are some other shows you guys have worked on?
John Mulaney & the Sack Lunch Bunch, 2 Dope Queens, Random Acts of Flyness, My Favorite Shapes by Julio Torres and others.

Anything that I haven’t asked that you think is important?
It’s really important for me to acknowledge that this is something that is enabling a New York-based production company and post house to work fully remotely. In doing this week over week, we’re really honing what we think are tangible practices that we can then turn around and evangelize out to the people that we want to work with in the future.

I don’t know when we’re going to get back to the post house, so being able to work on a show like this is providing this wonderful learning opportunity for my whole team to figure out what we can modulate from our workflow in the office to be a viable partner from home.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Soundwhale app intros new editing features for remote audio collaboration

Soundwhale, which makes a Mac and iOS-based remote audio collaboration app, has introduced a new suite of editing capabilities targeting teams working apart but together during this COVID crisis. It’s a virtual studio that lets engineers match sound to picture and lets actors, with no audio experience, record their lines. The company says this is done with minimal latency and no new hardware or additional specialized software required. The app also allows pro-quality mixing, recording and other post tasks, and can work alongside a user’s DAW of choice.

“Production teams are scattered and in self-isolation all around the world,” says Soundwhale founder Ameen Abdulla, who is an audio engineer. “They can’t get expensive hardware to everyone. They have to get people without any access to, or knowledge of, a digital audio workspace like Pro Tools to collaborate. That’s why we felt some urgency to launch more stand-alone editing options within Soundwhale, specifically designed for tasks like ADR.”

Soundwhale allows users to:
– Record against picture
– Control another user’s timeline and playback
– Manage recorded takes
– Cope with slow connections thanks to improved compression
– Optimize stream settings
– Share takes in other users’ timelines
– Customize I/O for different setups
– Do basic copy, paste, and moving of audio files
– Share any file by drag and drop
– Share screens and video chat

Soundwhale stems from Abdulla’s own challenges trying to perfect the post process from his recording studio, Mothlab, in Minneapolis. His clients were often on the West Coast and he needed to work with them remotely. Nothing available at the time worked very well, and drawing on his technical background, he set out to fix the issues, which included frustrating lags.

“Asynchronous edits and feedback are hell,” Abdulla notes. “As the show goes on, audio professionals need ways to edit and work with talent in real time over the Internet. Everybody’s experiencing this same thing. Everyone needs the same thing at the same time.”

Video Chat: Posting Late Night With Seth Meyers from home

By Randi Altman

For many, late-night shows have been offering up laughs during a really tough time, with hosts continuing to shoot from dens, living rooms, backyards and country houses, often with spouses and kids pitching in as crew.

NBC’s Late Night With Seth Meyers is one of those shows. They had their last in-studio taping on March 13, followed by a scheduled hiatus week, followed by the news they wouldn’t be able to come back to the studio. That’s when his team started preproduction and workflow testing to figure out questions like “How are we going to transfer files?” and “How are we going to get it on the air?”

I recently interviewed associate director and lead editor Dan Dome about their process and how that workflow has been allowing Meyers to check in daily from his wasp-ridden and probably haunted attic.

(Watch our original Video Interview here or below.)

How are you making this remote production work?
We’re doing a combination of things. We are using our network laptops to edit footage that’s coming in for interviews or comedy pieces. That’s all being done locally, meaning on our home systems and without involving our SAN or anything like that. So we’re cutting interviews and comedy pieces and then sending links out for approval via Dropbox. Why Dropbox? The syncing features are really great when uploading and downloading footage to all the various places we need to send it.

Once a piece is approved and ready to go into the show — we know the timings are right, we know the graphics are right, we know the spelling is correct, audio levels look good, video levels look good — then we upload that back to Dropbox and back to our computers at 30 Rock where our offices are located. We’re virtually logging into our machines there to compile the show. So, yeah, there are a few bits and pieces to building stuff remotely. And then there are a few bits and pieces to actually compiling the show on our systems back at home base.

What do you use for editing?
We’re still on Adobe Premiere. We launched on Premiere when the show started in February of 2014, and we’re still using that version — it’s solid and stable, and doing a daily show, we don’t necessarily get a ton of time to test new versions. So we have a stable version that we like for doing the show composite aspect of things.

On our machines back at 30 Rock, where we compile the show, we’re on Adobe Premiere Pro CC 2015.2 (9.2.0, Build 41). At home, where we’re editing the remote pieces, we’re using the newer Premiere Pro CC 2020 (14.0.4, Build 18).

Let’s talk about how Seth’s been shooting. What’s his main camera?
Some of the home studio recording has been on iPads and iPhones. Then we’re using Zoom to do interviews, and there are multiple records of that happening. The files are then uploaded and downloaded between the edit team, and our director is in on the interviews, setting up cameras and trying to get it to look the best it can.

Once those interviews are done, the different records get uploaded to Dropbox. On my home computer, I use a 6TB CalDigit drive for Dropbox syncing and media storage. (Devon Schwab and Tony Dolezal, who are also editing pieces, use 4TB G-RAID drives with Thunderbolt 3.) So as soon as they tell me the file is up, I sync locally on the folder I know it’s going to, the media automatically downloads, and we simultaneously download it to our systems at 30 Rock. So it syncs there as well. We have multiple copies of it, and if we need to hand off a project between me, Devon or Tony, we can do that pretty easily.

Have you discovered any challenges or happy surprises working this way?
It has been a nice happy surprise that it’s like, “Oh wow, this is working pretty well.” We did have a situation where we thought we might lose power on the East Coast because of rain and wind and things like that. So we had safeguards in place, as far as having an evergreen show ready to go for that night in case we did end up losing power. It would have been terrible, but everything held up, and it worked pretty well.

So there are certainly some challenges to working this way, but it’s amazing that we are working and we can keep our mind on other things and just try to help entertain people while this craziness is going on.

You can watch our original Video Interview with Dome here:

Chimney Group: Adapting workflows in a time of crisis

By Dana Bonomo

In early March, Chimney delivered a piece for TED, created to honor women on International Women’s Day featuring Reshma Saujani, founder of Girls Who Code. This was in the early days of coronavirus taking hold in the United States. We had little comprehension at that point of the true extent to which we would be impacted as a country and as an industry. As the situation grew and awareness around the severity of the COVID-19 health crisis sunk in, we started to realize that it would be animated projects like this one that we would come to rely upon.

TED & Ultimate Software: International Women’s Day

This film showcases the use of other creative solutions when live-action projects can’t be shot. But the real function of work like this is that, on an emotional level, it feels good to make something with a socially actionable message.

In just the last few weeks, platforms have been saturated with COVID-19-related content: salutes to healthcare workers, PSAs from federal, state and local authorities, and brands sharing messages of unity. Finding opportunities that include some form of social purpose helps provide hope to our communities while also raising the spirits of those creating it. We are currently in production on two of these projects, and they help us feel like we’re contributing in some small way with the resources we have.

As a global company, Chimney is always highlighting our worldwide service capabilities, with 12 offices on four continents, and our abilities to work together. We’ve routinely used portals such as Zoho and Slack in the past, yet now I’m enjoying the shift in how we’re communicating with each other in a more connected and familiar way. Just a short time ago we might have used a typical workflow, and today we’re sharing and exchanging ideas and information at an exponential rate.

As a whole, we prefer to video chat, have more follow-ups and create more opportunities to work on internal company goals in addition to just project pipelines and calendars. There’s efficiency in brainstorming and solving creative challenges in real time, either as a virtual brainstorm or an idea exchange in PM software and project communication channels. So at the end of a meeting, internal review or project kickoff, we have action items in place, ready to facilitate on a global scale.

Our company’s headquarters is in Stockholm, Sweden. You may have heard that Sweden’s health officials have taken a different approach to handling COVID-19 than most countries, resulting in less drastic social distancing and isolation measures while still being quite mindful of safety. Small shoots are still possible with crews of 10 or fewer — so we can shoot in Sweden with a fully protected crew, executing safe and sanitary protocols — and we can livestream to clients worldwide from set.

This is Chimney editor Sam O’Hare’s work-from-home setup.

Our North America CEO, Marcelo Gandola, is encouraging us individually to schedule personal development time, whether it’s for health and wellness, master classes on subjects that interest us, certifications for our field of expertise, or purely creative and expressive outlets. Since many of us used our commute time for that before the pandemic, we can still use that time for emotional recharging in different ways. By setting aside time for this, we regain some control of our situation. It lifts our morale and it can be very self-affirming, personally and professionally.

While most everyone has remote work capabilities these days, there’s a level of creative energy in the air, driven by the need to employ different tactics — either by working with what you have (optimizing existing creative assets, produced content, captured content from the confines of home) or replacing what was intended to be live-action with some form of animation or graphics. For example, Chimney’s Creative Asset Optimization has been around for some time now. Using Edisen, our customization platform, we can scale brands’ creative content on any platform, in any market at any time, without spending more. From title changes to language versioning and adding incremental design elements, clients get bigger volumes of content with high-quality creative for all channels and platforms. So a campaign that might have had a more limited shelf life on one platform can now stretch to an umbrella campaign with a variety of applications depending on its distribution.

Dana Bonomo

They say that necessity is the mother of invention, and it’s exciting to see how brands and makers are creatively solving current challenges. Our visual effects team recently worked on a campaign (sorry we can’t name this yet) that took existing archival footage and — with the help of VFX — generated content that resonated with audiences today. We’re also helping clients figure out remote content capture solutions in lieu of their live events getting canceled.

I was recently on a Zoom call with students at my alma mater, SUNY Oneonta, in conversation with songwriter and producer John Mayer. He said he really feels for students and younger people during this time, because there’s no point of reference for them to approach this situation. The way the younger generation is adapting — reacting by living so fully despite so many limitations — they are the ones building that point of reference for the future. I think that holds true for all generations… there will always be something to be learned. We don’t fully know what the extent of our learning will be, but we’re working creatively to make the most of it.

Main Image: Editor Zach Moore’s cat is helping him edit


Dana Bonomo is managing director at Chimney Group in NYC.

Behind the Title: Squeak E. Clean executive producer Chris Clark

This executive producer combines his background as a musician with his 11 years working at advertising agencies to ensure clients get their audio post needs met.

Name: Chris Clark

Company: Squeak E. Clean Studios

Can you describe your company?
We are an international audio post and music company with a fun-loving, multi-talented crew of composers, producers, sound designers and engineers across six studios.

What’s your job title?
Executive Producer

What does that entail?
I work closely with our creative and production teams to ensure the highest quality is upheld across audio post production, original music and music supervision. I also take the lead role and responsibility for ensuring our agency and brand clients are satisfied with our work, and that the entire Chicago operation is seamlessly integrated with the other five studios on a daily basis.

Chicago Ideas Week “Put the Guns Down” music video

What would surprise people the most about what falls under that title?
I also take out the trash. Sometimes.

You have an agency background. How will that help you at Squeak E. Clean Studios?
I’ve had the privilege of working closely with creative teams and clients on a wild and wide array of inspired music treatments over the past 11 years at Leo Burnett and across various Publicis Groupe agencies.

I know what it’s like to be in those meetings when things go off the rails and, fortunately, I take pleasure in creating calm and restoring inspiration by laying out all the musical options available. I know this intimate knowledge of agency and brand challenges will help us at Squeak E. Clean Studios provide really smart, focused music and post audio solutions without any filler.

What’s your favorite part of the job?
The individual challenge of each project and the individual person at the other end of that request. It’s a small and very personal industry… and being able to help out creative friends with great music solutions just makes us all feel good.

What’s your least favorite?
When I kick myself after remembering I don’t have to do everything; we have many capable collaborative people across the company.

What is your most productive time of the day?
Morning. Coffee and excitement for the day’s challenges bring out the best in me, typically.

If you didn’t have this job, what would you be doing instead?
Something entrepreneurial in music marketing. Or working as a high school basketball coach.

Why did you choose this profession?
Music had always been my therapy, but it wasn’t until I moved to NYC and started making my own bedroom-produced music that I realized it had fully taken over as my passion. It suddenly surpassed creative writing, sports, comedy, etc. I was working in media communications and bored with my day-to-day challenges when it struck me that there must be some type of work in music and advertising/marketing. Then this whole world opened up just one Craigslist job search later.

You are an industry veteran. How have you seen the industry change over the years?
I worked in the media world when digital broke through to challenge broadcast for supremacy, worked in DJ music marketing when the DJ/producers came to the forefront of pop music, and I’ve been fortunate enough to benefit from the rise of music experts in large agency settings.

Somewhere in all of that you see the industry embracing more content, the individuality of the rightful creator and the importance of music in every aspect of development and production. I’m pretty happy with the changes.

Can you name some recent projects you have worked on?
I recently finalized some new Coors Light “Chill” campaign spots with Leo Burnett. I am also producing original music for three Beats by Dre spots for their creative team in Japan with the help of our awesome composer roster.

What is the project that you are most proud of?
Uniting Chicago rappers like Common, G Herbo, Saba, Noname and King Louie for the Chicago Ideas Week Put the Guns Down music video was really special and unprecedented.

Samsung

I also pitched and licensed a cover of “Across the Universe” for a Samsung global spot that featured a father and his newborn son as a main vignette; it came out shortly after the birth of my first son, Charlie, so that will always be a memorable one.

Name three pieces of technology you can’t live without.
Phone, TV, turntables!

What social media channels do you follow?
Instagram mainly, but Twitter and Facebook in moderation.

What do you do to de-stress from it all?
Playing in bands and writing music with no intention of ever tying it to anything professional is always a great release and escape from the day job. I’ve found it also helps me relate to artists and up-and-coming composers/producers who are trying to get their footing in the music industry.

New ZBooks and Envy offerings from HP

A couple of weeks ago, HP introduced the HP ZBook Studio and HP ZBook Create mobile workstations as well as the HP Envy 15. All are the latest additions to the HP Create ecosystem, an initiative introduced during last year’s Adobe Max.

ZBook Studio

These Z by HP solutions are small-form-factor devices for resource-intensive tasks and target professional content creators working in design and modeling. The HP Envy portfolio, including the newest Envy 15, is built for editing video, stills, graphics and web design.

The systems are accelerated by Nvidia Quadro and GeForce RTX GPUs and backed by Nvidia Studio drivers. HP’s ZBook Studio, ZBook Create and Envy 15 laptops with RTX GPUs are members of Nvidia’s RTX Studio program, featuring acceleration for demanding raytraced rendering and AI creative workloads and Studio drivers for reliability.

HP says that the latest additions to the Z by HP portfolio are different from other thin and light mobile workstations and 15-inch notebooks in that they are built specifically for use in the most demanding creative workflows, which call for pro applications, graphics and color accuracy.

The ZBook Studio and ZBook Create, which target visual effects artists, animators and colorists, have all-day battery life. And HP’s DreamColor display accurately represents content on screen thanks to a built-in colorimeter for automatic self-calibration and 100% sRGB and Adobe RGB coverage.

The Z Power Slider gives users control over the type of performance and acoustics for specific workflows. At the same time, the Z Predictive Fan Algorithm intelligently manages fan behavior based on the kind of work and applications used by creatives.

HP Envy 15

The systems feature a vapor cooling chamber and liquid crystal polymer for gaming-class thermals. The custom advanced cooling system pushes air away from the CPU and GPU in two-dimensional paths, unlocking power density that the company says is 2.8 times higher gen-to-gen in a laptop design that is up to 22% smaller.

HP says the highly recyclable and lightweight aluminum exterior provides five times the abrasion resistance of painted carbon fiber and still complies with MIL-STD 810G testing.

The HP Envy offers a minimalist design with a sophisticated aluminum chassis and diamond-cut design and is the first Envy with a layer of glass on top of the touchpad for a smooth-touch experience. The HP Envy 15 features an all-aluminum chassis with 82.6% screen-to-body ratio, up to a 4K OLED Vesa DisplayHDR True Black display with touch interface, 10th Gen Intel Core processors, Nvidia GeForce RTX 2060 with Max-Q, and gaming-class thermals for the ultimate experience in creator performance.

Framestore adds three to London film team

During these difficult times, it’s great to hear that Framestore in London has further beefed up its staff. The three new hires join the VFX studio, whose entire workforce (approximately 2,500 people) is working from home at the moment.

Two-time VES Award-winner Graham Page joins the company as VFX supervisor after 14 years with Dneg, where he supervised the company’s work on titles such as Avengers: Endgame, Captain Marvel and Avengers: Infinity War. He brings Framestore’s tally of VFX supervisors to 24, all of whom see projects through from pre-production and on-set supervision to final delivery.

Mark Hodgkins, who rejoins the company after a 12-year stint with Dneg, will serve as Framestore’s global head of FX, film, and brings with him technical knowledge and extensive experience working on properties from Marvel, DC and J.K. Rowling.

Anna Ford joins Framestore as head of business development, film. Formerly sales and bidding manager at Cinesite, Ford brings knowledge of the global production industry and a passion for emerging technologies that will help identify and secure exciting projects that will push and challenge Framestore’s team of creative thinkers.

“While working in different areas of the company’s business, Graham, Anna and Mark all share the kind of outlook and attitude we’re always looking for at Framestore. They’re forward-thinking, creative in their approaches and never shy away from the kind of challenges that will bring out the best in themselves and those they work with,” says Fiona Walkinshaw, Framestore’s global managing director, film.

Director/EP Lenny Abrahamson on Hulu’s Normal People

By Iain Blair

Irish director Lenny Abrahamson first burst onto the international scene in 2015 with the harrowing drama Room, which picked up four Oscar nominations, including for Best Adapted Screenplay and Best Director. Abrahamson’s latest project is Hulu’s Normal People, based on Sally Rooney’s best-selling novel of the same name.

 (Photo by: Enda Bowe)

Lenny Abrahamson

The series focuses on the passionate, tender and complicated relationship of Marianne and Connell — from the end of their school days in a small town in the west of Ireland to their undergraduate years at Trinity College. At school, he’s a popular sports hero, while she’s upper class, lonely, proud and intimidating. But when Connell comes to pick up his mother from her cleaning job at Marianne’s house, a strange connection grows between the two teenagers… one they are determined to conceal. A year later, they’re both studying in Dublin and Marianne has found her feet in a new social world but Connell hangs on the sidelines, shy and uncertain as the tables are turned.

The series stars Daisy Edgar-Jones (War of the Worlds, Cold Feet) as Marianne and Paul Mescal, in his first television role, as Connell. Adapted by Sally Rooney alongside writers Alice Birch and Mark O’Rowe, Normal People is a 12-episode 30-minute drama series produced by Element Pictures for Hulu and BBC Three. Rooney and Abrahamson also serve as executive producers and Endeavour Content is the international distributor.

I spoke with Abrahamson — whose credits also include The Little Stranger, Frank, Garage, What Richard Did and Adam & Paul — about making the show, his workflows and his love of editing.

You’ve taken on quite a few book projects in the past. What was the appeal of this one?
It's always an instinctual thing — something chimes with me. Yeah, I've done a number of literary adaptations, and I wasn't really looking to do another. In fact, I was setting out not to do another one, but in this case the novel just struck me so much, with such resonance, and it's very hard not to do it when that happens. And it's an Irish project and I hadn't shot in Ireland for some seven years, and it was great to go back and do something that felt so fresh, so all of that was very attractive to me.

(Photo by Enda Bowe/Hulu)

Rooney co-wrote the script with Alice Birch, but translating any novel to a visual medium is always tricky, especially this book with all its inner psychological detail. As a director, how challenging was it to translate the alternating sections of the book while maintaining forward motion of the narrative?
It was pretty challenging. The writing is so direct and honest, yet deep, which is a rare combination. And Sally’s perspective is so fresh and insightful, and all that was a challenge I tried to take on and capture in the filming. How do you deal with something so interior? When you really care about the characters as I did, how do you do justice to them and their extraordinary relationship? But I relished the challenge.

Obviously, casting the right actors was crucial. What did Daisy Edgar-Jones and Paul Mescal bring to their roles and the project?
I feel very lucky to have found them. We actually found Paul first, very early on. He’d been making some waves in theater in Ireland, but he’d never been on screen in anything. What I saw in him was a combination of intelligence, which both characters had to have, and brilliant choices in playing Connell. He really captured that mix of masculinity and anxiety which is so hard to do. There is a sensitivity but also an inarticulateness, and he has great screen presence. Daisy came later, and it was harder in that you had to find someone who works well with Paul. She’s brilliant too, as she found a way of playing Marianne’s spikiness in a very un-clichéd and delicate way that allows you to see past it. They ended up working so well together and became good friends, too.

You co-directed with Hettie Macdonald (Doctor Who, Howards End), with you directing the first six episodes and Macdonald directing the final six. How did that work in terms of maintaining the same naturalistic tone and feel you set?
We spoke a lot at the beginning when she came on board. The whole idea was for her to bring her own sensibility to it. We'd already cast and shot the first half, and we knew a director of her caliber wasn't going to break that. We had two DPs: I had Suzie Lavelle and she had Kate McCullough. During the shooting I had the odd note, like, "It looks great," but I was more involved with her material during editing, which is natural as the EP. We had a great relationship.

Tell us about post and your approach.
We did it all — the editing, sound and VFX — at Outer Limits, which is on the coast about 30 minutes outside Dublin. It’s run by two guys who used to be at Screen Scene, where I posted my last five or six films. I followed them over there as I like them so much. It’s a lovely place, very quiet. The editor and I were based out there for the whole thing.

Our VFX supervisor was Andy Clarke, and it’s all pretty invisible stuff, like rain and all the fixes. I also did all the grading and picture finishing at Outer Limits with my regular colorist Gary Curran, who’s done nearly all my projects. He knows what I like, but also when to push me into bolder looks. I tend toward very low-contrast, desaturated looks, but over the years he’s nudged me into more saturated, vivid palettes, which I now really like. And we’ll be doing a 4K version.

I love post, as after all the stress of the shoot and all the instant decisions you have to make on the set, it’s like swimming ashore. You reach ground and can stand up and get all the water out of your lungs and just take your time to actually make the film. I love all the creative possibilities you get in post, particularly in editing.

You edited with your go-to editor Nathan Nugent. Was he on set?
No, we sent him dailies. On a film, he might be cutting next door if we’re in a studio, but not on this. He’s very fast and I’d see an assembly of stuff within 24 hours of shooting it. We like to throw everything up in the air again during the edit. Whatever we thought as we shot, it’s all up for grabs.

What were the main editing challenges?
I think choosing to work with short episodes was really good as it takes away some of the pressure to have lots of plot and story, and it allows you to look really closely at the shifts in their relationship. But there’s nowhere to hide, and you have to absolutely deeply care about the two of them. But if you do, then all the losses and gains, the highs and lows, become as big a story as any you could tell. That’s what gives it momentum. But if you don’t get that right, or you miscast it, then the danger is that you do lose that momentum.

So it’s a real balancing act… to feel that you’re spending time with them but also letting the story move forward in a subtle way. It’s the challenge of all editing — maintaining the tension and pace while letting an audience get a deep and close enough look at the characters.

Lenny Abrahamson

Can you talk about the importance of music and sound in the show?
I’ve had the same team ever since What Richard Did, including my supervising sound designer and editor Steve Fanagan and sound mixer Niall O’Sullivan. They’re so creative. Then I had composer Stephen Rennicks who’s also done all my projects. What was different this time was that we also licensed some tracks, as it just felt right. Our music supervisors Juliet Martin and Maggie Phillips were great with that.

So it was a core team of five, and I did what I always like to do — get all of them involved far earlier than you normally would. We don't just lock picture and hand it over, so this way you have sound constantly interacting with editorial, and they both develop organically at the same time.

What’s next?
Another collaboration with Sally on her first novel, “Conversations With Friends,” with the same team I had on this. But with the COVID-19 pandemic, who knows when we’ll be able to start shooting.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Sound Devices producing 30,000 face shields per day

During times of crisis, people and companies step up. One of those companies is Wisconsin-based pro audio equipment manufacturer Sound Devices, which is producing more than 30,000 face shields each day to help keep frontline workers safe in the fight against COVID-19. The company has pulled together a coalition of local manufacturers in the Reedsburg, Wisconsin, area to achieve this number, including Columbia Parcar, VARC, Cellox and Hankscraft AJS.

Sound Devices realized it could simultaneously play a direct role in helping protect health care workers and keep local area production-line workers employed. Around 100 people in the Reedsburg area are working daily to bring in material, assemble and ship the FS-1 face shields. Sound Devices sells the shields at a nonprofit price and has already shipped nearly a quarter million shields around Wisconsin and the rest of the US.

“The real heroes in this operation have been our line workers,” says Lisa Wiedenfeld, VP of finance and operations at Sound Devices. “They have been coming in day after day and cranking out these face shields while maintaining strict safety standards including wearing face masks, 10-foot distancing and extensive sanitation procedures. Under normal circumstances, ramping up manufacturing on a high volume of a new product is challenging enough, let alone avoiding a dangerous virus at the same time. My hat is off to all of our workers.”

“We started production of our FS-1 and FS-1NL face shields on March 24th, producing about 400 per day. As we’ve increased production to 30,000 per day, one of the most difficult aspects has been procuring enough parts to build consistently,” said Matt Anderson, CEO /president of Sound Devices. “Luckily, we have an extremely resourceful purchasing team. They have tapped our excellent network of Wisconsin-based suppliers. When our production levels outstripped what our suppliers here could do, our overseas suppliers pitched in to augment the supply of parts. But getting parts sent to us has been extremely difficult due to the reduced capacity of shippers. This whole experience has been very challenging but rewarding.”

Sound Devices now has FS-1 (original) and FS-1NL (latex-free) shields in stock. Face shields may be purchased by anyone in the US directly from store.sounddevices.com or by contacting sales@sounddevices.com.

Frame.io offers beta version of Transfer, updates app to v3.6

Frame.io has launched Frame.io v3.6 along with a beta version of a new application called Frame.io Transfer. Frame.io v3.6's new features are designed for the evolving needs of remote workflows, with a particular focus on speed and security. The expanded toolset addresses the need for fast project downloads with the Frame.io Transfer app, boosts security with features like Watermark ID for enterprise accounts and improves collaboration with new features like iOS Offline Mode and Folder Sharing.

Frame.io Transfer works on both Mac and Windows OS. Transfer lets users download large files and sophisticated folder structures — even entire projects — with one click. It supports EDL and XML formats so users can identify specific files, accelerating the process of relinking to original camera files for final conforms, color grading or sharing assets for VFX. Transfer allows users to monitor active downloads and to drag and drop to reprioritize their order. Finally, Transfer facilitates fast and secure downloads even over unstable internet connections; if there's a disruption mid-download, Transfer pauses and automatically resumes once reconnected.

“Transfer was originally slated for release later this year, but as a response to the profound shift in the way we’re working and the tools our customers need immediately, we’re releasing it in beta today,” says Emery Wells, cofounder/CEO of Frame.io. (Check out our video interview with him below.)

For secure sharing, enterprise users can now secure Presentation and Review links using login-only access, which means that only specified recipients can view Share Links. Recipients will see a list of everything that’s been shared with them in Frame.io’s new Inbox, which offers a clean and focused view.

Frame.io v3.6 also has Watermark ID, which gives customers the ultimate layer of visible security. When any viewer presses “Play,” Frame.io completes a realtime, on-demand transcode of the video with that viewer’s personal identifying information burned into every frame. A two-hour video starts playing back in less than two seconds.

Offline Mode

Users can now add folders to Review Links, allowing them to easily organize and share assets across teams or projects. Any changes made to folders after they are shared are dynamically updated in the Review Link. This is especially useful for teams that produce episodic content or programming that relies on a library of media. Frame.io also made it faster, easier and more intuitive to organize assets with an improved “Move-to” and “Copy-to” flow.

With this update, Frame.io users will now be able to see all the Presentations they’ve shared and access their settings from one organized list. The display shows folder sizes so users can see at a glance which are the heaviest projects, making it easier to optimize projects and storage.

Frame.io v3.6 consolidates notifications made on the same video within short periods of time, grouping them together into one notification. Users can filter by read or unread, see comment previews and scrub asset thumbnails to easily spot what needs to be reviewed or addressed.

Offline Mode for Frame.io's iOS apps allows users to work from anywhere. They can tap a file to make it available offline, then review and leave comments. As soon as the app comes back online, comments are automatically synced to the project.

COVID-19: How our industry is stepping up

We’ve been using this space to talk about how companies are discounting products, raising money and introducing technology to help with remote workflows, as well as highlighting how pros are personally pitching in.

Here are the latest updates, followed by what we’ve gathered to date:

Adobe
Adobe has made a $4.5 million commitment to trusted organizations that are providing vital assistance to those most in need.

• Adobe is joining forces with other tech leaders in the Bay Area to support the COVID-19 Coronavirus Regional Response Fund of the Silicon Valley Community Foundation, a trusted foundation that serves a network of local nonprofits. Adobe's $1 million donation will help provide low-income people in Santa Clara County with immediate financial assistance to help pay rent or meet other basic needs, through The Santa Clara County Homelessness Prevention System Financial Assistance Program. Additionally, Adobe is donating $250,000 to the Valley Medical Center Foundation to purchase life-saving ventilators for Bay Area hospitals.
• Adobe has donated $1 million to the COVID-19 Fund of the International Federation of Red Cross and Red Crescent Societies, the recognized global leader in providing rapid disaster relief and basic human and medical services. Adobe’s support will help aid vulnerable communities impacted by COVID-19 around the world. This is in addition to the $250,000 the company is donating to Direct Relief as a part of Adobe’s #HonorHeroes campaign.
• To support the community in India, Adobe is donating $1 million towards the American India Foundation (AIF) and the Akshaya Patra Foundation. The donation will help AIF source much-needed ventilators for hospitals, while the grant for Akshaya Patra will provide approximately 5 million meals to impacted families.

Harbor
Harbor is releasing Inspiration in Isolation, a new talk series that features filmmakers in candid conversation about their creative process during this unprecedented time and beyond. The web series aims to reveal the ideas and rituals that contribute to their creative process. The premiere episode features celebrated cinematographer Bradford Young and senior colorist Joe Gawler. The two, who are collaborators and friends, talk community, family, adapting to change and much more.

The full-length episodes will be released on Harbor’s new platform, HarborPresents, with additional content on Harbor’s social media (@HarborPictureCo).

HPA
The HPA has formed the HPA Industry Recovery Task Force, which will focus on sustainably resuming production and post services, with the aim of understanding how to enable content creation in an evolving world impacted by the pandemic.

The task force’s key objectives are:
• To serve as a forum for collaboration, communication and thought leadership regarding how to resume global production and post production in a sustainable fashion.
• To understand and influence evolving technical requirements, such as the impact of remote collaboration, work from home and other workflows that have been highlighted by the current crisis.
• To provide up-to-date information and access to emerging health and safety guidelines that will be issued by various governments, municipalities, unions, guilds, industry organizations and content creators.
• To provide collaborative support and guidance to those impacted by the crisis.

Genelec
Genelec is donating a percentage of every sale of its new Raw loudspeaker range to the Audio Engineering Society (AES) for the remainder of this year. Additionally, Genelec will fund 10 one-year AES memberships for those whose lives have been impacted by the COVID-19 crisis. A longtime sustaining member of AES, Genelec is making the donation to help sustain the society’s cash flow, which has been significantly affected by the coronavirus situation.

OWC
OWC has expanded its safety protocols as it continues to operate as an essential business in Illinois. The company has strengthened its already strong standard operating practice around cleanliness with additional surface disinfection, as well as gloves and masks for its warehouse and build teams. Even before recent events, manufacturing teams used gloves to prevent fingerprinting units during build, but those gloves have new importance now. In addition, OWC has both MERV air filters in place and a UV air purifier, which combined are considered to be 99.999% effective in killing/capturing all airborne bacteria and viruses.

Red

For a limited time, existing DSMC2 and Red Ranger Helium and Gemini customers can purchase a Red Extended Warranty at a discounted price. Existing customers who are in their second year of warranty can pay the standard pricing they would receive within their first year instead of the markup price. For example, instead of paying $1,740 (the 20% markup), a DSMC2 Gemini owner who is within the second year of warranty can purchase an Extended Warranty for $1,450.

This promotion has been extended to June 30. Adding the Red Extended Warranty not only increases the warranty coverage period but also provides benefits such as priority repair, expedited shipping and premium technical support directly from Red. Customers also have access to the Red Rapid Replacement Program. The Extended Warranty is also transferable to new owners upon completing a Transfer of Ownership with Red.

DejaSoft
DejaSoft has extended its offering of giving editors 50% off all their DejaEdit licenses — it now goes through the end of June. In addition, the company will help users implement DejaEdit in the best way possible to suit their workflow. DejaEdit allows editors to share media files and timelines automatically and securely with remote co-workers around the world, without having to be online continuously. It helps editors working on Avid Nexis, Media Composer and EditShare workflows across studios, production companies and post facilities ensure that media files, bins and timelines are kept up to date across multiple remote edit stations.

Assimilate
Assimilate is offering all of its products — including Scratch 9.2, Scratch VR 9.2, PlayPro 9.2, Scratch Web and the recently released Live Looks and Live Assist — for free through October 31. Users can register for free licenses. Online tutorials are here and free access to Lowepost online Scratch training is here.

B&H
B&H is partnering with suppliers to donate gear to the teams at Mount Sinai and other NYC hospitals to help health care professionals and first responders stay in touch with their loved ones. Some much-needed items are chargers, power sources, battery packs and mobile accessories. B&H is supporting the Mayor’s Fund to Advance New York City and Direct Relief.

FXhome
FXhome last month turned the attention of its "Pay What You Want" initiative to directing proceeds to the fight against COVID-19. This month, in an effort to teach the community new skills and inspire them with ideas to help them reinvent themselves, FXhome has launched a new, entirely free Master Class series designed to teach everything from basic editing to creating flashy title sequences, editing audio and, of course, basic VFX and compositing.

Nugen Audio 
Nugen Audio has a new “Staying Home, Staying Creative” initiative aimed at promoting collaboration and creativity in a time of social distancing. Included are a variety of videos, interviews and articles that will inspire new artistic approaches for post production workflows. The company is also providing temporary replacement licenses for any users who do not have access to their in-office workstations.

Already available on the Staying Creative web page is a special interview with audio post production specialist Keith Alexander. Building from his specialty in remote recording and sound design for broadcast, film and gaming, Alexander shares some helpful tips on how to work efficiently in a home-based setting and how to manage audio cleanup and broadcast-audio editing projects from home. There’s also an article focused on three ways to improve lo-fi drum recording in a less-than-ideal space.

Nugen is also offering temporary two-month licenses for current iLok customers, along with one additional Challenge Response license code authorization. The company has also reduced the prices of all products in its web store.

Tovusound 
Tovusound has extended its 20% discount until the end of the month and has added some new special offers.

The Spot Edward Ultimate Suite expansion, regularly $149, is now $79 with coupon. It adds the Spot creature footstep and movement instrument to the Edward footstep, cloth and props designer. Customers also get free WAV files with the purchase of all Edward instruments and expansions and with all Tovusound bundles. Anyone who purchased one of the applicable products after April 1 also has free access to the WAV files.

Tovusound will continue to donate an additional 10% of the sales price to the CleanOceanProject.org. Customers may claim their discounts by entering STAYHOME in the “apply coupon” field at checkout. All offers end on April 30.

 

Previous Updates

Object Matrix and Cinesys-Oceana
Object Matrix and Cinesys-Oceana are hosting a series of informal online Beer Roundtable events in the coming months. The series will discuss the various challenges of implementing hybrid technology for continuity, remote working and self-serve access to archive content. You can register for the next Beer Roundtable here. The sessions will be open, fun and relaxed. Participants are asked to grab themselves a drink and simply raise their glass when they wish to ask a question.

During the first session, Cinesys-Oceana CTO Brent Angle and Object Matrix CEO Jonathan Morgan will introduce what they believe to be the mandatory elements of the ultimate hybrid technology stack. This will be followed by a roundtable discussion hosted by Harry Skopas, director M&E solutions architecture and technical sales at Cinesys-Oceana, with guest appearances from the media and sports technology communities.

MZed
MZed, an online platform for master classes in filmmaking, photography and visual storytelling, is donating 20% of all sales to the Los Angeles Food Bank throughout April. For every new MZed Pro membership, $60 is donated, equating to 240 meals to feed hungry children, seniors and families. MZed serves the creative community, a large portion of which lives in the LA area and is being hit hard by the lockdown due to the coronavirus. MZed hopes to help play a role in keeping high-risk members of the community fed during a time of extreme uncertainty.

MZed has also launched a “Get One, Gift One” initiative. When someone purchases an MZed Pro membership, that person will not only be supporting the LA Food Bank but will instantly receive a Pro membership to give to someone else. MZed will email details upon purchase.

MZed offers hundreds of hours of training courses covering everything from photography and filmmaking to audio and lighting in courses like “The Art of Storytelling” with Alex Buono and Philip Bloom’s Cinematic Masterclass.

NAB Show
NAB Show’s new digital experience, NAB Show Express, will take place May 13-14. The platform is free and offers 24-hour access to three educational channels, on-demand content and a Solutions Marketplace featuring exhibitor product information, announcements and demos. Registration for the event will open on April 20 at NABShowExpress.com. Each channel will feature eight hours of content streamed daily and available on-demand to accommodate the global NAB Show audience. NAB Show Express will also offer NAB Show’s signature podcast, exploring relevant themes and featuring prominent speakers.

Additionally, NAB Show Express will feature three stand-alone training and executive leadership events for which separate registrations will be available soon. These include:
• Executive Leadership Summit (May 11), produced in partnership with Variety
• Cybersecurity & Content Protection Summit (May 12), produced in partnership with Content Delivery & Security Association (CDSA) and Media & Entertainment Services Alliance (MESA) – registration fees apply
• Post | Production World Online (May 17-19), produced in partnership with Future Media Conferences (FMC) – registration fees apply.

Atto 
Atto Technology is supporting content producers who face new workflow and performance challenges by making Atto Disk Benchmark for macOS more widely available and by updating Atto 360 tuning, monitoring and analytics software. Atto 360 for macOS and Linux have been updated for enhanced stability and include an additional tuning profile. The current Windows release already includes these updates. The software is free and can be downloaded directly from Atto.

Sigma
Sigma has launched a charitable giving initiative in partnership with authorized Sigma lens dealers nationwide. From now until June 30, 2020, 5% of all Sigma lens sales made through participating dealers will be donated to a charitable organization of the dealers’ choice. Donations will be made to organizations working on COVID-19 relief efforts to help ease the devastation many communities are feeling as a result of the global crisis. A full list of participating Sigma dealers and benefiting charities can be found here.

FXhome 
To support those who are putting their lives on the line to provide care and healing to those impacted by the global pandemic, FXhome is adding Partners In Health, Doctors Without Borders and the Center for Disaster Philanthropy as new beneficiaries of the FXhome “Pay What You Want” initiative.

Pay What You Want is a goodwill program inspired by the HitFilm Express community’s desire to contribute to the future development of HitFilm Express, the company’s free video editing and VFX software. Through the initiative, users can contribute financially, and those funds will be allocated for future development and improvements to HitFilm. Additionally, FXhome is contributing a percentage of the proceeds to organizations dedicated to global causes important to the company and its community. The larger the contribution from customers, the more FXhome will donate.

Besides adding the three new health-related beneficiaries, FXhome has extended its campaign to support each new cause from one month to three months, beginning in April and running through the end of June. A percentage of all proceeds of revenues generated during this time period will be donated to each cause.

Covid-19 Film and TV Emergency Relief Fund
Created by The Film and TV Charity in close partnership with the BFI, the new COVID-19 Film and TV Emergency Relief Fund provides support to the many thousands of active workers and freelancers who have been hit hardest by the closure of productions across the UK. The fund has received initial donations totaling £2.5 million from Netflix, the BFI, BBC Studios, BBC Content, WarnerMedia and several generous individuals.

It is being administered by The Film and TV Charity, with support from BFI staff. The Film and TV Charity and the BFI are covering all overheads, enabling donations to go directly to eligible workers and freelancers across film, TV and cinema. One-off grants of between £500 and £2,500 will be awarded based on need. Applications for the one-off grants can be made via The Film and TV Charity's website. The application process will remain open for two weeks.

The Film and TV Charity also has a new COVID-19 Film and TV Repayable Grants Scheme offering support for industry freelancers waiting for payments under the Government’s Self-employment Income Support Scheme. Interest-free grants of up to £2,000 will be offered to those eligible for Self-employment Income Support but who are struggling with the wait for payments in June. The Covid-19 Film and TV Repayable Grants Scheme opens April 15. Applicants will have one week to make a claim via The Film and TV Charity’s website.

Lenovo
Lenovo is offering a free 120-day license of Mechdyne’s TGX Remote Desktop software, which uses Nvidia Quadro GPUs and a built-in video encoder to compress and send information from the host workstation to the end-point device to decode. This eliminates lag on complex and detailed application files.

Teams can share powerful, high-end workstation resources across the business, easily dialing up performance and powerful GPUs from their standard workstation to collaborate remotely with coworkers around the world.

Users keep data and company IP secure on-site while reducing the risk of data breaches and remotely administering computer hardware assets from anywhere, anytime.

Users install the trial on their host workstations and install the receiver software on their local devices to access their applications and projects as if they were in the office.

Ambidio 
To help sound editors, mixers and other post pros who suddenly find themselves working from home, Ambidio is making its immersive sound technology, Ambidio Looking Glass, available for free. Sound professionals can apply for a free license through Ambidio’s website. Ambidio is also waiving its per-title releasing fee for home entertainment titles during the current cinema shutdown. It applies to new titles that haven’t previously been released through Blu-ray, DVD, digital download or streaming. The free offer is available through May 31.

Ambidio Looking Glass can be used as a monitoring tool for theatrical and television projects requiring immersive sound. Ambidio Looking Glass produces immersive sound that approximates what can be achieved on a studio mix stage, except it is playable through standard stereo speaker systems. Editors and mixers working from home studios can use it to check their work and share it with clients, who can also hear the results without immersive sound playback systems.

“The COVID-19 pandemic is forcing sound editors and mixers to work remotely,” says Ambidio founder Iris Wu. “Many need to finish projects that require immersive sound from home studios that lack complex speaker arrays. Ambidio Looking Glass provides a way for them to continue working with dimensional sound and meet deadlines, even if they can’t get to a mix stage.”

Qumulo
Through July 2020, Qumulo is offering its cloud-native file software for free to public and private-sector medical and health care research organizations that are working to minimize the spread and impact of the COVID-19 virus.

“Research and health care organizations across the world are working tirelessly to find answers and collaborate faster in their COVID-19 vaccine mission,” said Matt McIlwain, chairman of the board of trustees of the Fred Hutchinson Cancer Research Center and managing partner at Madrona Venture Group. “It will be through the work of these professionals, globally sharing and analyzing all available data in the cloud, that a cure for COVID-19 will be discovered.”

Qumulo’s cloud-native file and data services allow organizations to use the cloud to capture, process, analyze and share data with researchers distributed across geographies. Qumulo’s software works seamlessly with the applications medical and health care researchers have been using for decades, as well as with artificial intelligence and analytics services more recently developed in the cloud.

Medical organizations can register to use Qumulo’s file software in the cloud, which will be deployable through the Amazon Web Services and Google Cloud marketplaces.

Goldcrest Post
Goldcrest Post has established the capability to conduct most picture and sound post production work remotely. Colorists, conform editors and other staff are now able to work from home or a remote site and connect to the facility’s central storage and technical resources via remote collaboration software. Clients can monitor work through similar secure, fast and reliable desktop connections.

The service allows Goldcrest to ensure theatrical and television projects remain on track while allowing clients to oversee work in as normal a manner as possible under current circumstances.

Goldcrest has set up a temporary color grading facility at a remote site convenient for its staff colorists. The site includes a color grading control panel, two color-calibrated monitors and a high-speed connection to the main Goldcrest facility. The company has also installed desktop workstations and monitors in the homes of editors and other staff involved in picture conforming and deliverables. Sound mixing is still being conducted on-site, but sound editorial and ancillary sound work is being done from home. In taking these measures, the facility has reduced its on-site staff to a bare minimum while minimizing workflow disruption.

Ziva Dynamics
Ziva Dynamics is making Ziva VFX character simulation software free for students and educators. The same tools used on Game of Thrones, Hellboy and John Wick: Chapter 3 are now available for noncommercial projects, offering students the chance to learn physics-based character creation before they graduate. Ziva VFX Academic licenses are fully featured and receive the same access and support as other Ziva products.

In addition to the software, Ziva Academic users will now receive free access to Ziva Dynamics’ simulation-ready assets Zeke the Lion (previously $10,000) and Lila the Cheetah. Thanks to Ziva VFX’s Anatomy Transfer feature, the Zeke rig has helped make squirrels, cougars, dogs and more for films like John Wick 3, A Dog’s Way Home and Primal.

Ziva Dynamics will also be providing a free Ziva Academic floating lab license to universities so students can access the software in labs across campuses whenever they want. Ziva VFX Academic licenses are free and open to any fully accredited institution, student, professor or researcher (an $1,800 value). New licenses can be found in the Ziva store and are provided following a few eligibility questions. Academic users on the original paid plan can now increase their license count for free.

OpenDrives 
OpenDrives’ OpenDrives Anywhere is an in-place private cloud model that enables OpenDrives customers to work on the same project from multiple locations without compromising performance. With existing office infrastructure, teams already have an in-place private cloud and can extend its power to each of their remote professionals. No reinvestment in storage is needed.

Nothing changes from a workflow perspective except physical proximity. With simple adjustments, remote control of existing enterprise workstations can be extended via a secure connection. HP’s ZCentral Remote Boost (formerly RGS) software will facilitate remote access to your workstations over a secure connection, or Teradici can provide both dedicated external hardware and software solutions for this purpose, giving teams the ability to support collaborative workflows at low cost. OpenDrives can also get teams set up in under two hours on a corporate VPN and in under 24 hours without one.

Prime Focus Technologies 
Prime Focus Technologies (PFT), the technology arm of Prime Focus, has added new features and advanced security enhancements to Clear to help customers embrace the virtual work environment. In terms of security, Clear now has a new-generation HTML 5 player enabled with Hollywood-grade DRM encryption. There’s also support for just-in-time visual watermarking embedded within the stream for streaming through Clear as a secure alternative to generating watermarking on the client side.

Clear also has new features that make it easier to use, including faster direct downloads from S3 and Azure storage, easier partner onboarding and an enhanced admin module with condensed permissions for easily handling custom user roles. A host of new functionalities simplifies content acquisition processes and reduces dependencies as much as possible. Likewise, for easier content servicing, content localization is now automated, making it easier to perform and review tasks on Clear. For content distribution, PFT has enabled on-demand cloud distribution on Clear through the most commonly used cloud technologies.

Brady and Stephenie Betzel
Many of you know postPerspective contributor and online video editor Brady Betzel from his great reviews and tips pieces. During this crisis, he is helping his wife, Stephenie, make masks for her sister (a nurse) and colleagues working at St. John’s Regional Medical Center in Oxnard, California, in addition to anyone else who works on the “front lines.” She’s sewn over 300 masks so far and is not stopping. Creativity and sewing are not new to her; her day job is also a creative one. You can check out her work on Facebook and Instagram.

Object Matrix 
Object Matrix co-founder Nick Pearce has another LinkedIn dispatch, this time launching Good News Friday, where folks from around the globe check in with good news! You can also watch it on YouTube. Pearce and crew are also offering video tips for surviving working from home. The videos are hosted by Pearce and updated weekly. Check them out here.

Conductor
Conductor is waiving charges for orchestrating renders in the cloud. Updated pricing is reflected in the cost calculator on Conductor’s Pricing page. These changes will last at least through May 2020. To help expedite any transition needs, the Conductor team will be on call for virtual render wrangling of cloud submissions, from debugging scenes and scripts to optimizing settings for cost, turnaround time, etc. If you need this option, then email support@conductortech.com.

Conductor is working with partners to set up online training sessions to help studios quickly adopt cloud strategies and workflows. The company will send out further notifications as the sessions are formalized. Conductor staff is also available for one-on-one studio sessions as needed for those with specific pipeline considerations.

Conductor’s president and CEO Mac Moore said this: “The sudden onset of this pandemic has put a tremendous strain on our industry, completely changing the way studios need to operate virtually overnight. Given Conductor was built on the ‘work from anywhere’ premise, I felt it our responsibility to help studios to the greatest extent possible during this critical time.”

Symply
Symply is providing as many remote workers in the industry as possible with a free 90-day license to SymplyConveyor, its secure, high-speed transfer and sync software. Symply techs will be available to install SymplyConveyor remotely on any PC, Mac or Linux workstation pair or server and workstation.

The no-obligation offer is available at gosymply.com. Users sign up, and as long as they are in the industry and have a need, Symply techs will install the software. The number of free 90-day licenses is limited only by Symply’s ability to install them given its limited resources.

Foundry
Foundry has reset its trial database so that users can access a new 30-day trial for all products regardless of the date of their last trial. The company continues to offer unlimited non-commercial use of Nuke and Mari. On the educational side, students who are unable to access school facilities can get a year of free access to Nuke, Modo, Mari and Katana.

They have also announced virtual events, including:

• Foundry LiveStream – A series of talks around projects, pipelines and tools.
• Foundry Webinars – A 30- to 40-minute technical deep dive into Foundry products, workflows and third-party tools.
• Foundry Skill-Ups – A 30-minute guide to improving your skills as a compositor/lighter/texture artist to get to that next level in your career.
• Foundry Sessions – Special conversations with customers sharing insights, tips and tricks.
• Foundry Workflow Wednesdays – 10-minute weekly videos posted on social media showing tips and tricks with Nuke from Foundry experts.

Alibi Music Library
Alibi Music Library is offering free whitelisted licensing of its Alibi Music and Sound FX catalogs to freelancers, agencies and production companies needing to create or update their demo reels during this challenging time.

Those who would like to take advantage of this opportunity can choose Demo Reel 2020 Gratis from the shopping cart feature on Alibi’s website next to any desired track(s). For more info, click here.

2C Creative
Caleb & Calder Sloan’s Awesome Foundation, the charity of 2C Creative founders Chris Sloan and Carla Kaufman Sloan, is running a campaign that will match individual donations (up to $250 each) to charities supporting first responders, organizations and those affected by COVID-19. 2C is a creative agency and production company serving the TV/streaming business with promos, brand integrations, trailers, upfront presentations and other campaigns. So far, the organization’s “COVID-19 Has Met Its Match” campaign has raised more than $50,000. While the initial deadline for participation was April 6, it has now been extended to April 13. To participate, please visit ccawesomefoundation.org for a list of charities already vetted by the foundation or choose your own. Then simply email a copy of your donation receipt to cncawesomefoundation@gmail.com and the foundation will match it.

Red Giant 
For the filmmaking education community, Red Giant is offering Red Giant Complete — the full set of tools including Trapcode Suite, Magic Bullet Suite, Universe, VFX Suite and Shooter Suite — free for students or faculty members of a university, college or high school. Instead of buying separate suites or choosing which tools best suit one’s educational needs or budget, students and teachers can get every tool Red Giant makes completely free of charge. All that’s required is a simple verification.

How to get a free Red Giant Complete license if you are a student, teacher or faculty member:
1. Use school or organization ID or any proof of current employment or enrollment for verification. More information on academic verification is available here.
2. Send your academic verification to academic@redgiant.com.
3. Wait for approval via email before purchasing.
4. Once you get approval, go to the Red Giant Complete Product Page and “buy” your free version. You will only be able to buy the free version if you have been pre-approved.

The free education subscription will last 180 days. When that time period ends, users will need to reverify their academic status to renew their free subscription.

Flanders Scientific
Remote collaboration and review benefits greatly from having the same type of display calibrated the same way in both locations. To help facilitate such workflow consistency, FSI is launching a limited time buy one, get one for $1,000 off special on its most popular monitor, the DM240.

Nvidia
For those pros needing to power graphics workloads without local hardware, cloud providers, such as Amazon Web Services and Google Cloud, offer Nvidia Quadro Virtual Workstation instances to support remote, graphics-intensive work quickly without the need for any on-prem infrastructure. End-users only need a connected laptop or thin client, as the virtual workstations support the same Nvidia Quadro drivers and features as the physical Quadro GPUs used by pro artists and designers in local workstations.

Additionally, last week Nvidia expanded its free virtual GPU software evaluation to 500 licenses for 90 days to help companies support their remote workers with their existing GPU infrastructure. Nvidia vGPU software licenses — including Quadro Virtual Workstation — enable GPU-accelerated virtualization so that content creators, designers, engineers and others can continue their work. More details are available here. Nvidia has also posted a separate blog on virtual GPUs to help admins who are working to support remote employees.

Harman
Harman is offering a free e-learning program called Learning Sessions in conjunction with Harman Pro University.

The Learning Sessions and the Live Workshop Series provide a range of free on-demand and instructor-led webinars hosted by experts from around the world. The Industry Expert workshops feature tips and tricks from front of house engineers, lighting designers, technicians and other industry experts, while the Harman Expert workshops feature in-depth product and solution webinars by Harman product specialists.

• April 7—Lighting for Churches: Live and Video with Lucas Jameson and Chris Pyron
• April 9—Audio Challenges in Esports with Cameron O’Neill
• April 15—Special Martin Lighting Product Launch with Markus Klüesener
• April 16—Lighting Programming Workshop with Susan Rose
• April 23—Performance Manager: Beginner to Expert with Nowell Helms

Apple
Apple is offering free 90-day trials of Final Cut Pro X and Logic Pro X apps for all in order to help those working from home and looking for something new to master, as well as for students who are already using the tools in school but don’t have the apps on their home computers.

Avid
For its part, Avid is offering free temp licenses for remote users of the company’s creative tools. Commercial customers can get a free 90-day license for each registered user of Media Composer | Ultimate, Pro Tools, Pro Tools | Ultimate and Sibelius | Ultimate. For students whose school campuses are closed, any student of an Avid-based learning institution that uses Media Composer, Pro Tools or Sibelius can receive a free 90-day license for the same products.

Aris
Aris, a full-service production and post house based in Los Angeles, is partnering with ThinkLA to offer free online editing classes for those who want to sharpen their skills while staying close to home during this worldwide crisis. The series will be taught by Aris EP/founder Greg Bassenian, an award-winning writer and director who has edited numerous projects for clients including Coca-Cola, Chevy and Zappos.

mLogic
mLogic is offering a 15% discount on its mTape Thunderbolt 3 LTO-7 and LTO-8 solutions. The discount applies to orders placed on the mTape website through April 20th. Use discount code mLogicpostPerspective15%.

Xytech
Xytech has launched “Xytech After Dark,” a podcast focusing on trends in the media and broadcasting industries. The first two episodes are now available on iTunes, Spotify and all podcasting platforms.

Xytech’s Greg Dolan says the podcast “is not a forum to sell, but instead to talk about why we create the functionality in MediaPulse and the types of things happening in our industry.”

Hosted by Xytech’s Gregg Sandheinrich, the podcast will feature Xytech staff, along with special guests. The first two episodes cover topics including the recent HPA Tech Retreat (featuring HPA president Seth Hallen), as well as the cancellation of the NAB Show, the value of trade shows and the effects of COVID-19 on the industry.

Adobe
Adobe shared a guide to best practices for working from home. It’s meant to support creators and filmmakers who might be shifting to remote work and need to stay connected with their teams and continue to complete projects. You can find the guide here.

Adobe’s principal Creative Cloud evangelist, Jason Levine, hosted a live stream — Video Workflows With Team Projects — that focuses on remote workflows.

Additionally, Karl Soule, senior technical business development manager, hosted a stream focusing on remote video workflows and collaboration in the enterprise. If you sign up on this page, you can see his presentation.

Streambox
Streambox has introduced a pay-as-you-go software plan for video professionals who use its Chroma 4K, Chroma UHD, Chroma HD and Chroma X streaming encoder/decoder hardware. Since the software has been “decoupled” from the hardware platform, those who own the hardware can rent the software on a monthly basis, pause the subscription between projects and reinstate it as needed. By renting software for a fixed period, creatives can take on jobs without having to pay outright for technology that might otherwise have been impractical to purchase.

Frame.io 
Through the end of March, Frame.io is offering 2TB of free extra storage capacity for 90 days. Those who could use that additional storage to accommodate work-from-home workflows should email rapid-response@frame.io to get it set up.

Frame.io is also offering free Frame.io Enterprise plans for the next 90 days to support educational institutions, nonprofits and health care organizations that have been impacted. Please email rapid-response@frame.io to set up this account.

To help guide companies through this new reality of remote working, Frame.io is launching a new “Workflow From Home” series on YouTube, hosted by Michael Cioni, with the first episode launching Monday, March 23rd. Cioni will walk through everything artists need to keep post production humming as smoothly as possible. Subscribe to the Frame.io YouTube channel to get notified when it’s released.

EditShare
EditShare has made its web-based, remote production and collaboration tool, Flow Media Management, free through July 1st. Flow enables individuals as well as large creative workgroups to collaborate on story development with capabilities to perform extensive review approval from anywhere in the world. Those interested can complete this form and one of EditShare’s Flow experts will follow up.

Veritone 
Veritone will extend free access to its core applications — Veritone Essentials, Attribute and Digital Media Hub — for 60 days. Targeted to media and entertainment clients in radio, TV, film, sports and podcasting, Veritone Essentials, Attribute, and Digital Media Hub are designed to make data and content sharing easy, efficient and universal. The solutions give any workforce (whether in the office or remote) tools that accelerate workflows and facilitate collaboration. The solutions are fully cloud-based, which means that staff can access them from any home office in the world as long as there is internet access.

More information about the free access is here. Certain limitations apply. Offer is subject to change without notice.

SNS
In an effort to quickly help EVO users who are suddenly required to work on editing projects from home, SNS has released Nomad for on-the-go, work-from-anywhere, remote workflows. It is a simple utility that runs on any Mac or Windows system that’s connected to EVO.

Nomad helps users repurpose their existing ShareBrowser preview files into proxy files for offline editing. These proxy files are much smaller versions of the source media files, and therefore easier to use for remote work. They take up less space on the computer, take less time to copy and are easier to manage. Users can edit with these proxy files, and after they’re finished putting the final touches on the production, their NLE can export a master file using the full-quality, high-resolution source files.
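As a rough illustration of how a proxy step like this works (this is a generic sketch, not SNS Nomad’s actual implementation — the paths, proxy settings and helper function are hypothetical), a tool can transcode each full-resolution source clip into a small, low-bitrate stand-in while preserving its name and metadata so the NLE can relink to the original later:

```python
# Generic proxy-generation sketch (NOT Nomad itself): build an ffmpeg
# command that turns a source clip into a small offline-editing proxy.
# Keeping the clip's base name and metadata lets the NLE relink the
# finished cut back to the full-quality source at export time.
from pathlib import Path

def proxy_command(source: str, proxy_dir: str, height: int = 540) -> list[str]:
    """Return an ffmpeg argument list for a low-bitrate H.264 proxy."""
    src = Path(source)
    out = Path(proxy_dir) / f"{src.stem}_proxy.mp4"   # same stem as the source
    return [
        "ffmpeg", "-i", str(src),
        "-vf", f"scale=-2:{height}",       # downscale, preserve aspect ratio
        "-c:v", "libx264", "-crf", "28",   # small, edit-friendly file
        "-c:a", "aac", "-b:a", "128k",
        "-map_metadata", "0",              # carry over source metadata
        str(out),
    ]

print(proxy_command("/media/evo/shoot01/A001_C002.mov", "/local/proxies"))
```

The resulting proxies are a fraction of the size of camera originals, which is what makes copying them to a home machine practical.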

Nomad is available immediately and free to all EVO customers.

Ftrack
Remote creative collaboration tool ftrack Review is free for all until May 31. This date might extend as the global situation continues to unfold. ftrack Review is an out-of-the-box remote review and approval tool that enables creative teams to collaborate on, review and approve media via their desktop or mobile browser. Contextual comments and annotations eliminate confusion and reduce reliance on email threads. ftrack Review accepts many media formats as well as PDFs. Every ftrack Review workspace receives 250 GB of storage.

Cinedeck 
Cinedeck’s cineXtools allows editing and correcting your file deliveries from home. From now until April 3rd, pros can get a one-month license of cineXtools free of charge.
How VFX house Phosphene has been working remotely

By Randi Altman

In our ongoing coverage of how studios are working remotely, we reached out to New York City-based visual effects house Phosphene. Founded in 2010 by Vivian Connolly and John Bair, Phosphene specializes in photorealistic VFX for film and television and is particularly known for its detailed CG environments and set extensions.

This four-time Emmy-nominated studio (Mildred Pierce; Boardwalk Empire Seasons 3 and 5; Escape at Dannemora) counts The Plot Against America, Hunters, A Beautiful Day in the Neighborhood and Motherless Brooklyn among its more recent work.

The Plot Against America

Like many others, Phosphene was tasked with developing secure remote workflows, so we reached out to director of IT Jimmy Marrero and head of operations and strategy Beck Dunn to find out more.

How is Phosphene weathering this storm? Do you have most of your folks working remotely?
Beck Dunn: We were fortunate to be able to switch to remote work very quickly and are extremely grateful for our team who had been preparing for this major change. We are grateful we are in a position to support staff and productions who are able to continue working remotely.

Can you talk about what it took to get artists set up from their homes and walk us through that workflow?
Jimmy Marrero: Luckily, we’ve had experience with using PCOIP technology in the past and were in a good place to transition smoothly to remote work. We had a good number of workstations already set up with PCOIP remote workstation cards. We also leveraged AWS to create cloud workstations that are connected to our office via a VPC (virtual private cloud). This gives us the capability to securely increase our capacity for work way beyond any physical hardware limitations.
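To make the cloud side of this concrete — as a minimal sketch under stated assumptions, not Phosphene’s actual configuration — a studio might script the request parameters for an EC2 GPU instance placed in a private subnet of its VPC. All IDs below are hypothetical placeholders, and the instance type is just one common GPU-backed choice:

```python
# Hedged sketch of provisioning a "cloud workstation" in a VPC (not
# Phosphene's real setup). The function only builds the parameter dict
# one might pass to EC2's run_instances call via boto3; the AMI, subnet
# and security-group IDs are hypothetical placeholders.

def workstation_request(ami_id: str, subnet_id: str, sg_id: str) -> dict:
    """Build run_instances parameters for a remote-display GPU workstation."""
    return {
        "ImageId": ami_id,              # e.g. an image with a remote-display agent baked in
        "InstanceType": "g4dn.xlarge",  # an NVIDIA GPU instance class
        "MinCount": 1,
        "MaxCount": 1,
        "SubnetId": subnet_id,          # private subnet inside the studio's VPC
        "SecurityGroupIds": [sg_id],    # lock access down to the VPN address range
    }

params = workstation_request("ami-0abc...", "subnet-0def...", "sg-0123...")
print(params["InstanceType"])
```

Because the instance lives in a subnet bridged to the office network, it can mount the same storage and render tools as a physical workstation, which is the "capacity beyond physical hardware" point Marrero describes.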

What tools are you using to make sure these folks stay connected?
Marrero: We all communicate with each other via chat using an open-source tool called Rocket.Chat. Producers connect via BlueJeans video conference.

For anyone setting up a remote pipeline, I would also recommend taking advantage of cloud-based software like Slack for communication, Trello for organization, and AnyDesk to allow IT to help troubleshoot any issues that might occur during the setup process.

What about security and working remotely?
Marrero: Security was the driving force for us to investigate the advantages of PCOIP technology. Having remote workstation cards installed at the office allows us to stream encrypted screen information directly to the artists’ monitors and eliminates the need for any data to be hosted outside of Phosphene’s internal network.

Using PCOIP combined with only being able to access our network via VPN with two-factor authentication, we were able to address many security concerns from our clients, which was a key factor in our being able to work remotely.

PCOIP technology also allows us to easily use all the tools on our internal network, with no change in setup or compromise to security. Once logged in, artists are able to access Nuke, Hiero, 3ds Max, Houdini and Deadline as though they are in the office.

What types of work are you guys doing at the moment?
Dunn: We can’t talk about any of our current work, but one project we recently finished is HBO’s The Plot Against America, created by Ed Burns and David Simon. The show is based on Philip Roth’s 2004 novel depicting the lives of US citizens in an alternate history where Franklin D. Roosevelt loses the 1940 presidential election to Charles Lindbergh.

Phosphene worked with show-side VFX supervisor Jim Rider on a wide range of visual effects for the show, including creating period-accurate aerial views of 1940s Manhattan, exteriors of Newark Airport and a British Navy base, and extensive crowd duplication shots inside Madison Square Garden. In total, Phosphene delivered 274 shots for the limited series.

The Plot Against America

Any tips for those companies who are just starting to get set up remotely or even those who are currently working remotely?
Marrero: Be nice to your IT department. (Smiles) Working remotely has many moving parts that need to all work perfectly for things to go smoothly. Expect delays in the beginning as all the kinks are worked out.

What has helped staffers get settled into working from home?
Dunn: I’ll let them speak for themselves.

VFX producer Matthew Griffin: I found it really helpful to set up a dedicated mini-office rather than just working on a laptop from the couch. When I sit down at my workspace, I feel like I am still “going into” the office. Holding team meetings via video chat and maintaining rituals like having my morning coffee at the same time also helps me to stay in a familiar rhythm. We also have a dog, so walking him at the end of the day makes the workday feel complete. I close the laptop, walk the dog, and once I’m home, it’s like my commute is over and it’s time to relax.

VFX producer Steven Weigle: Producers are used to working remotely for short stints, so this hasn’t been an entirely foreign experience. I did recently add a KVM switch to my home setup, to use my full-sized keyboard, mouse and monitor to control my work laptop but be able to switch back to my personal machine with the click of a button. It’s a small, basic upgrade but it helps me maximize my desk space while still separating my “work brain” from my “home brain.”


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

The Embassy opens in Culver City with EP Kenny Solomon leading charge

Vancouver-based visual effects and production studio The Embassy is opening an LA office in Culver City. EP Kenny Solomon will head up the operation. The move follows the studio’s growth in film, advertising and streaming, and a successful 2019. The LA office will allow The Embassy to have a direct connection and point of contact with its growing US client base and provide front-end project support and creative development, while Vancouver — offering pipeline and technology infrastructure — remains the heart of operations.

New studio head Solomon has worked in the film, TV and streaming industries for the past 20 years, launching and operating a number of companies, the most recent of which was Big Block Media Holdings — an Emmy-, Cannes-, Webby-, Promax- and Clio-winning integrated media company he founded nine years ago.

“We have a beautiful studio in Culver City with infrastructure to quickly staff up to 15 artists across 2D, 3D and design, a screening room, conference room, edit bay and wonderful outdoor space for a late-night ping-pong match and a local Golden Road beer or two,” says Solomon. “Obviously, everyone is WFH right now, but at a moment’s notice we are able to scale accordingly. And Vancouver will always be our heartbeat and main production hub.”

“We have happily been here in Vancouver for the past 17 years plus,” says The Embassy president Winston Helgason. “I’ve seen the global industry go through its ups and downs, and yet we continue to thrive. The last few months have been a difficult period of uncertainty and business interruption and, while we are operating successfully out of current WFH restrictions, I can’t wait to open up to our full potential once the world is a little more back to normal.”

In 2020, The Embassy reunited with Area 23/FCB and RSA director Robert Stromberg (Maleficent) to craft a series of fantastical VFX environments for Emgality’s new campaign. The team has also been in full production for the past 16 months on all VFX work for Warrior Nun, an upcoming 10-episode series for Netflix. The Embassy was responsible for providing everything from concept art to pre-production, on-set supervision and almost 700 VFX shots for the show. The team in Vancouver is working both remotely and in the studio to deliver the full 10 episodes.

Solomon is excited to get to work, saying that he always respected The Embassy’s work, even while he was competing with them when he was at CafeFX/The Syndicate and Big Block.

As part of the expansion, The Embassy has also added a number of new reps to the team — Sarah Gitersonke joins for Midwest representation, and Kelly Flint and Sarah Lange join for East Coast.

Sony’s Venice and FX9 cameras get firmware updates

Sony is expanding the capabilities of its Venice digital motion picture camera and its FX9 full-frame camera to offer greater expression and usability for cinematographers and their collaborators in production and post. These firmware upgrades build on the two cameras’ image capture and color science. Venice gains more monitoring options and higher frame rates, while FX9 expands shooting and recording capabilities for content creators.

Version 6.0 of Venice firmware will allow the import of Advanced Rendering Transform (.art) files that improve monitoring picture quality and viewing options on set. These .art files can be generated by Sony’s Raw Viewer software from users’ own 3D LUT files. Additionally, Sony is collaborating with Technicolor color scientists to create a new “look library” for the Venice camera, which will be available online as a resource for creatives wishing to quickly access some of Technicolor’s established looks.

Venice

Another enhancement in Venice v6.0 firmware is the ability to shoot with a second user frame line. This enables DPs to more easily take advantage of Venice’s large sensor size to shoot for both horizontal and vertical distribution within the same composition.

Venice v6.0 features also include:
• Expansion of HFR capabilities – up to 72fps at 5.7K 16:9 and 110fps at 3.8K 16:9, which simplifies post production of slow-motion footage, especially for TV drama workflows with quick turnarounds. In addition, up to 72fps in the 4K 6:5 imager mode for anamorphic lens operation.
• Improved 3D LUT monitoring – 3D LUT look can be fed to camera viewfinder
• Gyro information in metadata – camera’s Tilt & Roll data can be referenced by VFX teams

FX9 was launched in 2019 to bring full-frame imaging to run-and-gun, documentary and independent productions. Employing the form factor, ergonomics and workflow of Sony's FS7 and FS7II cameras, FX9 brings color science from Venice and autofocus (AF) technology from Sony's interchangeable lens camera, Alpha, to creatives desiring a small camera footprint.

FX9

Version 2.0 of FX9 firmware supports 4K 60p/50p recording through oversampling from a 5K cropped area of its 6K full-frame sensor. Version 2.0 also enables output of a 4K 16-bit raw signal to an external recorder with the optional XDCA-FX9 accessory. This additional bit depth beyond the camera's internal 10-bit recording is ideal for projects requiring more intensive post production.
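The jump from 10-bit to 16-bit is larger than it may sound, since each extra bit doubles the number of tonal levels per channel. A quick back-of-the-envelope sketch (illustrative arithmetic only, not Sony documentation):

```python
# Illustrative arithmetic: tonal levels available per channel at each bit depth.
def code_values(bits: int) -> int:
    """Number of discrete levels a channel can represent at a given bit depth."""
    return 2 ** bits

internal = code_values(10)  # FX9 internal 10-bit recording: 1,024 levels
external = code_values(16)  # 16-bit raw to an external recorder: 65,536 levels

# 64x finer tonal granularity -- the kind of headroom that matters
# for heavy grading in post.
print(external // internal)  # 64
```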

Additionally, FX9 v2.0 firmware will expand the camera’s operability with Eye AF technology and touch-screen operation for focus control and menu setting on the viewfinder.

FX9 v2.0 also includes:
• 180 fps full-frame HD recording
• 4K (4096×2160) DCI recording
• Ability to load user 3D LUTs
• HDR shooting function recorded in Hybrid Log Gamma

Version 6.0 of Venice firmware is planned for release in November 2020, and Version 2.0 of FX9 firmware is planned for an October 2020 release.

Posting Everest VR: Journey to the Top of the World

While preparing to climb both Mount Everest and Mount Lhotse without the use of bottled oxygen, renowned climber Ueli Steck fell to his death in late April of 2017. VR director and alpine photographer Jonathan Griffith and mountain guide Tenji Sherpa, both friends of Steck, picked up the climber’s torch, and the result was the 8K 3D documentary Everest VR: Journey to the Top of the World, produced by Facebook’s Oculus.

Over the course of three years, Griffith shot footage following Tenji and some of the world's most accomplished climbers in some of the world's most extreme locations. The series also includes footage that lets viewers witness what it is like to be engulfed in a Himalayan avalanche, cross a crevasse and stare deep into its depths, take a huge rock-climbing fall, camp under the stars and soak in the view from the top of the world.

For the post part of the doc, Griffith called on veteran VR post pro Matthew DeJohn for editing and color correction, VR stitching expert Keith Kolod, and Brendan Hogan for sound design.

“It really was amazing how a small crew was able to get all of this done,” says Griffith. “The collaboration between myself as the cameraman and Matt and Keith was a huge part of being able to get this series done — and done at such a high quality.

“Matt and Keith would give suggestions on how to capture for VR, how camera wobbling impacted stitching, how to be aware of the nadir and zenith in each frame and to think about proximity issues. The efficient post process helped in letting us focus on what was needed, and I am incredibly happy with the end result.”

DeJohn was tasked with bringing together a huge amount of footage from a number of different high-end camera systems, including the Yi Halo and Z Cam V1 Pro.

DeJohn called on Blackmagic Resolve for this part of the project, saying that using one tool for all helped speed up the process. “A VR project usually has different teams of multiple people for editing, grading and stitching, but with Resolve, Keith and I handled everything,” he explains.

Within Resolve, DeJohn cut the series at 2Kx2K, relinked to the 8Kx8K source and then changed the timeline resolution to 8Kx8K for final color and rendering. He used the Fairlight audio editing tab to make fine adjustments, manage different narration takes with audio layers, and manage varied source files such as mono narration, stereo music and four-channel ambisonic spatial audio.

In terms of color grading, DeJohn says, “I colored the project from the very first edit so when it came to finalize the color it was just a process of touching things up.”

Fusion Studio was used for stereoscopic alignment fixes, motion graphics, rig removal, nadir patches, stabilization, stereo correction of the initial stitch, re-orienting 360 imagery, viewing the 360 scenes in a VR headset and controlling focal areas. More intense stitching work was done by Kolod using Fusion Studio.

Footage of such an extreme environment, as well as the closeness of climbers to the cameras, provided unique challenges for Kolod, who had to rebuild portions of images from individual cameras. He also had to manually ramp down the stereo at the images' north and south poles to ensure easy viewing, fix stereo misalignment and distance issues between the foreground and background, and calm excessive movement in images.

“A regular fix I had to make was adjusting incorrect vertical alignments, which create huge problems for viewing. Even if a camera is a little bit off, the viewer can tell,” says Kolod. “The project used a lot of locked-off tripod cameras, and you would think that the images coming from them would be completely steady. But a little bit of wind or slight movement in what is usually a calm frame makes a scene unwatchable in VR. So I used Fusion for stabilization on a lot of shots.”

“High-quality VR work should always be done with manual stitching with an artist making sure there are no rough areas. The reason why this series looks so amazing is that there was an artist involved in every part of the process — shooting, editing, grading and stitching,” concludes Kolod.

RuckSackNY: Branding, targeted videos and high-quality masks

By Randi Altman

Fred Ruckel got his start in post at New York's Post Perfect in the ’90s. From there he grew his skills and experience before opening his own shop, Stitch. While Ruckel spent his days as a Flame artist, he and his wife Natasha spent their spare time inventing something called the Ripple Rug. They've since moved to upstate New York, where they built an extensive post suite and studio under the name RuckSackNY.

Fred Ruckel at work.

What is the Ripple Rug, you ask? It’s essentially a cat playground in a rug, but their site describes it as “a multifunction pet enrichment system mainly geared toward house cats.”

Fred and Natasha (whose own career includes stints at creative agencies as well as Autodesk) felt strongly about manufacturing the Ripple Rug in the US, and they wanted to use recycled materials. After a bit of research, they found a factory in Georgia and used recycled plastic water bottles in the process. To date they have recycled over 3 million bottles.

To help promote the Ripple Rug, the Ruckels leveraged their creative capabilities from years of working in advertising and post to create a brand from scratch.

When the COVID-19 crisis hit, the Ruckels realized they were in a unique position — they could repurpose the Georgia factory to make masks and face shields for health workers and the general population. While reformatting the factory to this type of manufacturing is still ongoing, the Ruckels wanted to make sure that, in the meantime, people would have access to high-quality face masks. So they sourced masks via their textile production partners, had them tested in a US lab, and have already sold over 40,000 masks under their new brand, SnugglyMask.

Many have taken to making their own masks, so the factory will also be making filters to help beef up that protection, which will allow people to buy filter packs for their homemade masks. Check out their video showing people how to make their own masks. “We should have that part functional this week or next,” says Fred. “Our mask supplier is quickly trying to put together the production pipeline so we can make masks here, but those machines are automated and take a bit of engineering to make them work properly.”

These materials will be both sold to the general public and donated to those on the frontlines. The Ruckels have once again used their creative backgrounds to build a brand and tell a story. Let’s find out more from Fred…

With the recent COVID-19 crisis, you realized that your factory could be used to make masks — both for civilians and for medical professionals and those on the frontline. How did you come to that realization, and what were your first steps?
When the pandemic broke out, we immediately took action to help the cause. Our factory makes many textile products, and we knew we could set up an assembly line to make masks, shields and gowns, and with some funding, we could pretty much make anything. We have the know-how and ability, as well as 60,000 square feet of space, which we are cutting a chunk out of to make a clean room to handle the process in as sterile an environment as possible.

I reached out to New York Governor Andrew Cuomo’s office, our local congressman and Empire State Development. At the same time, I was communicating with Georgia (we are a registered business in both states) and worked with the Department of Economic Development and the National Association of Manufacturers. That led us to the Global Center for Medical Innovation.

Natasha Ruckel

So while that was happening, you decided to sell and donate masks?
Yes. While waiting for responses to help us retool our factory, we had to do something to be an immediate help. We did not want to wait on the sidelines for red tape to be cut; we had to come up with Plan B while waiting for government help.

Plan B meant using our resources to allow us to purchase masks without several levels of middlemen raising the prices. We still ended up with two levels of middlemen, but it's better than five! In manufacturing, it is all about pennies. This is a lesson I learned from a mentor early on with our Ripple Rug project. Middlemen make pennies; a nickel per unit becomes $50,000 in profit on 1 million units, so pennies add up, and middlemen capitalize on that. My goal is to remove middlemen and get directly sourced goods to people in need at the best price possible.
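Fred's point about pennies compounding at scale is simple arithmetic; a sketch using the figures he quotes (the helper function is mine, purely for illustration):

```python
# A nickel of margin per unit, multiplied across a million units.
def margin_at_scale(markup_per_unit: float, units: int) -> float:
    """Total profit captured from a fixed per-unit markup."""
    return markup_per_unit * units

profit = margin_at_scale(0.05, 1_000_000)  # a nickel on 1 million units
print(f"${profit:,.0f}")  # $50,000
```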

Can you describe both masks and the materials used?
In our PSA, we demonstrate the use of a cloth bandana versus a basic medical mask. We are looking to filter particulate matter down to the micron level, smaller than the human eye can see. For reference, the human eye can only see particles as small as 50 to 60 microns (think about a fleck of dust caught in sunlight). The particles we are looking to “arrest” are down to .3 microns, smaller than red blood cells.

The mechanical weaving of cloth masks makes them porous. This allows particulate matter to pass right through, as the holes are enormous in scale. The key component is the middle layer, called “melt-blown.” The outer layer is a polypropylene spun-bond fiber, and the inside layer is an acrylic spun-bond fiber. Sandwiched between them is the melt-blown layer, which is the fine particulate catcher. Each layer captures a different size particle. Think of it as a video production: it would be like adding multiple scrims to lights to block light, except we are blocking particles in this case.

You recently created a PSA detailing the differences in the masks people are using to protect themselves. What did you use to shoot and post?
The PSA was shot using a Canon EOS 5D Mark IV. We have some great Fiilex LED lighting kits with a ring light and a 7-foot white shooting tent. My intent wasn't to make a full-on video. I was shooting elements to make animated GIFs to show the testing process. When I loaded the footage into Adobe Premiere and made a selects reel, I realized we had the elements of a PSA … and so a spot was born.

Natasha looked at my selects and quickly switched into producer mode and pieced together a storyline. We then had to shoot more elements. Fortunately, our shooting studio is in our home, so there were no delays. I shot an element, loaded it, shot another and so on until we had the pieces to make it work.

Natasha created graphic elements in Adobe Illustrator while I worked on the edit in Premiere. We also took product pics in raw mode for the packaging and demos, which we developed in Camera Raw within Photoshop. We shot the video portion in 4K, which allowed us to punch in for closeups and pull back to reframe as if it were a multi-cam shoot.

We filmed on a stainless steel table to give it a clinical feel while blowing it out a little bit to feel ethereal and hazy. My favorite shot is the water dripping on the table; the lighting and table make it feel like mercury.

Why was it so important for you to turn your business into the mask business?
There are so many reasons that it is hard to pinpoint. I knew we had the capability, and our pipeline was efficient enough to pull it off from start to finish. As an inventor I’ve seen people take advantage of situations for financial gain — like knocking off products — and that means making fake masks, which cause more harm than good.

I saw an opportunity to protect everyone I know by supplying quality masks they can trust. On internet sites, fake masks can look identical. In fact, the pics might be of the real mask, but they ship you a cheap version that’s missing some key elements.

I do not cut corners. As a Flame artist, I continually dealt with clients saying, “It's good enough, let's move to the next shot.” Good enough is not what I do; I do not have a halfway button. I'd look like a bad Flame artist if I didn't go all the way.

Knowing that we can play an active part in protecting my friends and family and colleagues in the post community by taking on this single effort made me pull the trigger. With that, SnugglyMask.com was born.

Are you guys selling and donating masks? How is that working?
We are both selling and donating masks. One of our RuckSackNY clients is a philanthropist named Josh Malone. As his marketing agency, we created a mask donation program. The first hospitals we shipped to were Montefiore Medical Center in the Bronx and Westchester Medical Center. We will be donating to hospitals nationwide and also selling masks to hospitals and the public via our site, https://www.snugglymask.com/. This is a place people can go for a mask they can trust and that has been lab tested. We built a brand in just a week, and sales simply exploded due to our honest content and demand.

Why is it important for you to make sure your products are being made in the US?
We make the Ripple Rug in the US to provide jobs for US workers. There are more than 100 people working at 10 companies in five states for Ripple Rug. I order carpet 100,000 square feet at a time and cannot imagine shipping it from overseas with the demand we must meet. Shipping from China takes weeks, if not months.

Making it in the USA means continual production to meet demand while reinvesting to grow along the way. Sure, I could produce my products in China and make a lot more money, but I am proud to say American workers put food on the table and children go to school because we make our products in the USA. That alone makes it worth it to me.

Do you feel the videos you create help get more people to pay attention to the product?
We feel effective videos engage viewers and build intrigue about our product. We create a range of videos, not just the regular polished spots. Consumers appreciate the feeling of user-generated content, as it adds to the authenticity of the product. If every spot is beautiful, it feels staged.

We have a series called “Cats Gone Wild” in which all of the videos are made solely of user-generated content sourced from YouTube, Facebook and Instagram. I edit them to a stock music track and create a theme for each video. We add titles to call out the social media names to give credit to the person who posted the video and to give them a little spotlight on our show reel. This, in turn, creates engagement, as it encourages them to share the video on their social media channels.

I keep my edits to around a minute for this series to “get in and get out” before losing the viewer’s attention. The original content is cut to a whimsical track and is fun to watch — who doesn’t love cute cat videos? We share these on social media, and that helps grow our sales. Our customers love it, they get acknowledgement, our brand grows, and we are able to show our product in action.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Adding precise and realistic Foley to The Invisible Man

Foley artists normally produce sound effects by mimicking the action of characters on a screen, but for Universal Pictures’ new horror-thriller, The Invisible Man, the Foley team from New York’s Alchemy Post Sound faced the novel assignment of creating the patter of footsteps and swish of clothing for a character who cannot be seen.

Directed by Leigh Whannell, The Invisible Man centers on Cecilia Kass (Elisabeth Moss), a Bay Area architect who is terrorized by her former boyfriend, Adrian Griffin (Oliver Jackson-Cohen), a wealthy entrepreneur who develops a digital technology that makes him invisible. Adrian causes Cecilia to appear to be going insane by drugging her, tampering with her work and committing similarly fiendish acts while remaining hidden from sight.

The film’s sound team was led by the LA-based duo of sound designer/supervising sound editor P.K. Hooker and re-recording mixer Will Files. Files recalls that he and Hooker had extensive conversations with Whannell during pre-production about the unique role sound would play in telling the film’s story. “Leigh encouraged us to think at right angles to the way we normally think,” he recalls. “He told us to use all the tools at our disposal to keep the audience on the edge of their seats. He wanted us to be bold and create something very special.”

Hooker and Files asked Alchemy Post Sound to create a huge assortment of sound effects for the film. The Foley team produced footsteps, floor creaks and fist fights, but its most innovative work involved sounds that convey Adrian’s onscreen presence when he is wearing his high-tech invisibility suit. “Sound effects let the audience know Adrian is around when they can’t see him,” explains lead Foley artist Leslie Bloome. “The Invisible Man is a very quiet film and so the sounds we added for Adrian needed to be very precise and real. The details and textures had to be spot on.”

Alchemy’s Andrea Bloome, Ryan Collison and Leslie Bloome

Foley mixer Ryan Collison adds that getting the Foley sound just right was exceedingly tough because it needed to communicate Adrian’s presence, but in a hesitant, ephemeral manner. “He’s trying to be as quiet as possible because he doesn’t want to be heard,” Collison explains. “You want the audience to hear him, but they should strain just a bit to do so.”

Many of Adrian’s invisible scenes were shot with a stand-in wearing a green suit who interacted with other actors and was later digitally removed. Alchemy’s Foley team had access to the original footage and used it in recording matching footsteps and body motions. “We were lucky to be able to perform Foley to what was originally shot on the set, but unlike normal Foley work, we were given artistic license to enhance the performance,” notes Foley artist Joanna Fang. “We could make him walk faster or slower, seem creepier or step with more creakiness than what was originally there.”

Foley sound was also used to suggest the presence of Adrian’s suit, which is made from neoprene and covered in tiny optical devices. “Every time Adrian moves his hand or throws a punch, we created the sound of his suit rustling,” Fang explains. “We used glass beads from an old chandelier and light bulb filaments for the tinkle of the optics and a yoga mat for the material of the suit itself. The result sounds super high-tech and has a menacing quality.”

Special attention was applied to Adrian’s footsteps. “The Invisible Man’s feet needed a very signature sound so that when you hear it, you know it’s him,” says Files. “We asked the Foley team for different options.”

Ultimately, Alchemy’s solution involved something other than shoes. “Like his suit, Adrian’s shoes are made of neoprene,” explains Bloome, whose team used Neumann KMR 81 mics, an Avid C24 Pro Tools mixing console, a Millennia HV-3D eight-channel preamp, an Apogee Maestro control interface and Adam A77X speakers. “So they make a soft sound, but we didn’t want it to sound like he’s wearing sneakers, so I pulled large rubber gloves over my feet and did the footsteps that way.”

Invisible Adrian makes his first appearance in the film’s opening scene when he invades Cecilia’s home while she is asleep in bed. For that scene, the Foley team created sounds for both the unseen Adrian and for Cecilia as she moves about her house looking for the intruder. “P.K. Hooker told us to imagine that we were a kid who’s come home late and is trying to sneak about the house without waking his parents,” recalls Foley editor Nick Seaman. “When Cecilia is tiptoeing through the kitchen, she stumbles into a dog food can. We made that sound larger than life, so that it resonates through the whole place. It’s designed to make the audience jump.”

Will Files

“P.K. wanted the scene to have more detail than usual to create a feeling of heightened reality,” adds Foley editor Laura Heinzinger. “As Cecilia moves through her house, sound reverberates all around her, as if she were in a museum.”

The creepiness was enhanced by the way the effects were mixed. “We trick the audience into feeling safe by turning down the sound,” explains Files. “We dial it down in pieces. First, we removed the music, and then the waves, so you just hear her bare feet and breath. Then, out of nowhere, comes this really loud sound, the bowl banging and dog food scattering across the floor. The Foley team provided multiple layers that we panned throughout the theater. It feels like this huge disaster because of how shocking it is.”

At another point in the film, Cecilia meets Adrian as she is about to get into her car. It’s raining and the droplets of water reveal the contours of his otherwise invisible frame. To add to the eeriness of the moment, Alchemy’s Foley team recorded the patter of raindrops. “We recorded drops differently depending on whether they were landing on the hood of the car or its trunk,” says Fang. “The drops that land on Adrian make a tinkling sound. We created that by letting water roll off my finger. I also stood on a ladder and dropped water onto a chamois for the sound of droplets striking Adrian’s suit.”

 

The film climaxes with a scene in a psychiatric hospital where Cecilia and several guards engage in a desperate struggle with the invisible Adrian. “It’s a chaotic moment but the footsteps help the audience track Adrian as the fight unfolds,” says Foley mixer Connor Nagy. “The audience knows where Adrian is, but the guards don’t. They hear him as he comes around corners and moves in and out of the room. The guards, meanwhile, are shaking in disbelief.”

“The Foley had a lot of detail and texture,” adds Files. “It was also done with finesse. And we needed that, because Foley was featured in a way it normally isn’t in the mix.”

Alchemy often uses Foley sound to suggest the presence of characters who are off screen, but this was the first instance when they were asked to create sound for a character whose presence onscreen derives from sound alone. “It was a total group effort,” says Bloome. “It took a combination of Foley performance, editing and mixing to convince the audience that there is someone on the screen in front of them who they can’t see. It’s freaky.”

Light Illusion intros color management system for beta testing

Color management specialist Light Illusion has launched a new color management product called ColourSpace CMS for post facilities, studios and broadcasters. It is available for pre-order, which enables participation in the open beta program to provide feedback and input for the final release.

Running on a PC, ColourSpace CMS software connects to a wide variety of display calibration probes in order to accurately assess the color rendition of any display. It then creates color management data that can be used to perfect color accuracy by uploading it into a display that has color management features or, alternatively, applied as a correction to the signal feeding the display using associated hardware.

Applications include display calibration, color management and color workflows within the professional film, post production and broadcast industries and for display manufacturers and home cinema enthusiasts. ColourSpace CMS data is compatible with most color-related post platforms such as Resolve, Flame and Baselight.

ColourSpace CMS was designed to improve how color accuracy is measured and reported. Light Illusion CEO Steve Shaw says, “The most visually impressive aspect of ColourSpace CMS is the way it communicates color accuracy to the user. Full volumetric accuracy of any given display can be quickly and easily assessed using the new three-dimensional, fully interactive, resizable and color-coded graphs. Complex color data can be clearly analyzed with 3D CIE and normalized RGB Color Space graphs, including Error Tangent lines and color coded measure points.

“Color coding within the display graphs helps to quickly identify the accuracy of any given measured point. For example, green signifies that a measured color value is below 1dE, while orange shows a color as being between 1dE and 2.3dE, and red indicates a value above 2.3dE. Additional Tangent Lines are visual plots of any given point’s error, showing the recorded location of the measured color, compared to where the color should actually be located for any given color space.”
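The thresholds Shaw describes map cleanly to a simple classification. A minimal sketch (the function and its name are hypothetical, not part of the ColourSpace CMS API):

```python
def accuracy_color(delta_e: float) -> str:
    """Map a measured delta-E error to the traffic-light coding described above.

    Thresholds per the article: below 1 dE is green, 1 to 2.3 dE is orange,
    and above 2.3 dE is red. (Hypothetical helper for illustration only.)
    """
    if delta_e < 1.0:
        return "green"   # error too small to see
    if delta_e <= 2.3:
        return "orange"  # near the just-noticeable-difference threshold
    return "red"         # clearly visible error

print(accuracy_color(0.5))  # green
print(accuracy_color(1.8))  # orange
print(accuracy_color(3.2))  # red
```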

Display manufacturer Flanders Scientific has been involved in testing and feedback during the development of ColourSpace CMS. The open beta program will be available to anyone who has ordered ColourSpace CMS in advance.

Vegas Post upgrades for VFX, compositing and stills

Vegas Creative Software, in partnership with FXhome, has added new versions of Vegas Effects and Vegas Image to the Vegas Post suite of editing, VFX, compositing and imaging tools for video professionals, editors and VFX artists.

The Vegas Post workflow centers on Vegas Pro for editing and adds Vegas Effects and Vegas Image for VFX, compositing and still-image editing.

Vegas Effects is a full-featured visual effects and compositing tool that provides a variety of high-quality effects, presets and correction tools. With over 800 effects and filters to tweak, combine, pull apart and put back together, Vegas Effects gives users a powerful library that includes:
• Particle generators
• Text and titling
• Behavior effects
• 3D model rendering
• A unified 3D space
• Fire and lightning generators
• Greenscreen removal
• Muzzle flash generators
• Picture in picture
• Vertical video integration

Vegas Image is a non-destructive raw image compositor that enables video editors to work with still-image and graphical content and incorporate it directly into their final productions — all directly integrated with Vegas Post. This new version of Vegas Image contains feature updates including:
• Brush masks: A new mask type that allows the user to brush in/out effects or layers and includes basic brush settings like radius, opacity, softness, spacing and smoothing
• Multiple layer transform: Gives the ability to move, rotate and scale a selection of layers
• Multi-point gradient effect: An effect that enables users to create colored gradients using an unlimited amount of colored points
• Light rays effect: An effect that uses bright spots to cast light rays in scenes, e.g., light rays streaming through trees
• Raw denoise: Bespoke denoise step for raw images, which can remove defective pixels and large noise patterns
• Lens distortion effect: Can be used to perform lens-based adjustments, such as barrel/pincushion distortion or chromatic aberration
• Halftone effect: Produces a halftone look, like a newspaper print or pop art
• Configurable mask overlay color: Users can now pick what color is overlaid when the mask overlay render option is enabled

Vegas Post is available now for $999 or as a subscription starting at $21 per month.

Hecho Studios: Mobilizing talent and pipeline to keep working

By Ryan Curtis

When Hecho first learned of the possibility of a shutdown due to COVID-19, we started putting together a game plan to maintain the level of production quality and collaboration that we are all used to, but this time remotely. Working closely with our chief content officer Tom Dunlap, our post production workflow manager Nathan Fleming and senior editor Stevo Chang, we first identified the editors, animators, colorists, Flame artists, footage researchers and other post-related talent who work with us regularly. We then built a standing army of remote talent who were ready to embrace the new normal and get to work.

Ryan Curtis

It was a formidable challenge to get the remote editorial stations up and running. We had relatively short notice that we were going to have to finalize and enact a WFH game plan in LA. In order to keep productions running smoothly, we teamed with our equipment vendor, VFX Technologies, to give our IT team the ability to remote in and fully outfit each workstation with software. They also scheduled a driver to make contact-free drop-offs at the homes of our artists. We've deployed over 15 iMacs for editorial, animation and finishing needs. We can scale as needed and only need two to three days' notice to get a new artist fully set up at home with the appropriate tools. Our remote edit bay workstations are mainly iMac Pros running the Adobe suite of tools, Maxon Cinema 4D, Blackmagic DaVinci Resolve and Autodesk Flame.

We have outfitted each member of our team with Signiant, which allows for rapid transfers of larger files. If an artist's home internet is not up to snuff for their project, we have been boosting their internet speeds. To maintain file integrity, we are rolling out the same file structure you would find on our server, allowing us to archive projects back to the server remotely once they're delivered. We've also designated key people who can access the in-office stations and server virtually, retrieve assets and migrate them to remote teams to refresh existing campaigns.

The need to review during each phase of production has never been stronger. We tested a wide variety of review solutions, and have currently settled on the following:

• For Animation/Design-Based Projects:
Frankie – Export-based interactive reviews
• For Editorial Projects:
Evercast – Live plug and play sessions
Wiredrive (often paired with Google Hangouts or Zoom)
• For Finishing:
Vimeo Review – Export-based color reviews
Streambox – Live color collaboration (paired with Google Hangouts or Zoom)
Frankie – Export-based interactive reviews
Wiredrive for deliverables (often paired with Google Hangouts or Zoom)

Our collective of talent remains our contracted veteran Hecho crew, well over 50 people who know our shorthand and in-office workflows and can easily be onboarded to our new remote workflow. If needed to satisfy a specific creative challenge, we bring in new talent and quickly onboard them into the Hecho family.

In terms of how we deal with approvals, it depends on the team and the project. If you have a team dedicated to a project, it can be even more efficient than working in the office. Overcommunication is key, and transparency with feedback and workflows is paramount to a successful project. However, in many cases, efficiencies can be lost, and projects currently move about 20 percent slower than if we were in the office. To combat this, some teams are structured a little differently, as it can be hard to wrangle busy individuals with fast deadlines remotely. Having backup approvers on board has been immensely helpful in keeping projects moving along on time. And without clients in the bay, we lean even more on our post producers to funnel all questions and feedback from clients, ensuring clear back and forth with artists.

NFL #stayhomestaystrong

Challenges Solved
Aside from the lack of in-person interaction and the efficiencies of quick catch-ups in the hall or in the bay, the biggest challenge has been home internet speeds. This affects everything else that’s involved with a WFH setup. In some cases we actually had to upgrade current ISP contracts in order to reach an acceptable baseline for getting work done: streaming reviews, file sharing, etc.

The other challenge was quickly testing/evaluating new tools and then getting everybody up to speed on how to use them. Evercast was probably the trickiest new product because it involves live streaming from an editor’s machine (using Adobe Premiere) while multiple “reviewers” watch them work in real time. As you can imagine, there are many factors that can affect live streaming: the CPU of the streaming computer, the bitrate you’re streaming, etc. Luckily, once we had gone through a couple of setups and reviews (trial and error), things got much easier. Also, the team at Evercast (thanks Brad, Tyrel and Robert!) was great in helping us figure out some of the issues we ran into early on.

Our First WFH Projects
For our first COVID-19 response project, we worked with agency 72andSunny and the NFL to share the uplifting message #Stayhomestaystrong. Behind the scenes, our post team produced a complete offline to online workflow in record time and went from brief to live in six days while everyone transitioned to working entirely remotely. #Stayhomestaystrong also helped bring in $35 million in donations toward COVID relief groups. Credits include editors Amanda Tuttle and Andrew Leggett, assistant editors Max Pankow and Stephen Shirk, animator Lawrence Wyatt, Flame artists Rachel Moorer, Gurvand Tanneau and Paul Song, and post producer Song Cho.

Stay INspired

Another project we worked on with 72andSunny was a COVID-19 response ad for Pinterest, Stay INspired, involving heavy motion graphics and a large number of assets, which ranged from stock photos and raw video files from remote shoots to licensed UGC assets. The designers, motion graphics artists, writers and clients used a Google Slides deck to link thumbnail images directly to the stock photo or UGC asset. Notes were sent directly to their emails via tags in the comments section of the slides.

Our team shared storyboards, frequently jumped on video conference calls and even sent recorded hand gestures to indicate the kind of motion graphic movement they were looking for. Credits for this one include editor/motion designer: Stevo Chang, motion designer Sierra Hunkins, associate editor Josh Copeland and post producer Cho, once again.

What We Learned
WFH reinforced the need for the utmost transparency in team structures and the need for super-clear communication. Each and every member of our team has needed to embrace the change and take on new challenges and responsibilities. What worked before in the office doesn’t necessarily work in a remote situation.

The shutdown also forced us to discover new technologies, like Evercast, and to adopt Signiant sooner than we otherwise would have. Moving forward, these tools have both been great additions to what we can offer our clients. These new technologies also open up future opportunities for us to work with clients we didn’t have access to before (out of state and overseas). We can do live remote sessions without the client having to physically be in a bay, which is a game changer.


Ryan Curtis is head of post production at two-time Emmy-nominated Hecho Studios, part of MDC’s Constellation collective of companies.

Eizo intros HDR reference monitor with built-in calibration

Eizo is now shipping the ColorEdge Prominence CG3146, a 31.1-inch, DCI-4K (4096×2160) HDR reference monitor for post and color grading workflows. It is the successor model to Eizo’s flagship HDR reference monitor, the ColorEdge Prominence CG3145, and is the first to incorporate a built-in calibration sensor. Hardware calibration ensures the screen stays color-accurate over time and streamlines color management.

Like its predecessor, the ColorEdge Prominence CG3146 correctly shows both very bright and very dark areas on the screen without sacrificing the integrity of either. The monitor achieves 1,000 cd/m2 (typical) high brightness and a 1,000,000:1 contrast ratio for true HDR display.

The ColorEdge Prominence CG3146 supports HLG (hybrid log-gamma) and the PQ (perceptual quantization) curve for displaying and editing broadcast, film and other video content in HDR. The optimized gamma curves render images to appear truer to how the human eye perceives the real world compared to SDR.

The color and brightness of an LCD monitor can shift due to changes in ambient temperature and the temperature of the monitor itself. The ColorEdge Prominence CG3146 is equipped with a temperature sensor for accurately measuring the temperature inside the monitor and for estimating the temperature of the surrounding environment. With this temperature-sensing and estimation technology, the monitor adjusts in real time, so gradations, color, brightness and other characteristics continue to be displayed accurately.

Eizo uses artificial intelligence in the monitor’s estimation algorithm so it can distinguish between various temperature-changing patterns to calculate even more accurate correction. Eizo’s patented digital uniformity equalizer technology also counterbalances the influence that a fluctuating temperature might have on color temperature and brightness for stable image display across the screen.

Details
• Single-Link 12G/6G/3G/HD-SDI and Dual- or Quad-Link 3G/HD-SDI
• VPID support for SDI connections
• HDMI and DisplayPort inputs
• 99% reproduction of DCI-P3
• 3D LUT for individual color adjustment on an RGB cubic table
• 10-bit simultaneous display from a 24-bit LUT for smooth color gradations
• Quick adjustment of monitor settings via front bezel dial
• Light-shielding hood included
• Five-year manufacturer’s warranty


Editor Greg Mitchels joins Northern Lights

Bi-coastal post and creative company Northern Lights has added editor Greg Mitchels to its team. Mitchels’ experience includes cutting trailers, show launches, upfronts, music videos, behind-the-scenes content and promo campaigns. He comes to Northern Lights after 13 years at Attitude in New York City. He will be working on Avid Media Composer and Adobe Premiere.

Mitchels has concepted and cut Promax-, Emmy- and Clio-nominated work for A&E, Comedy Central, Amazon, History and Lifetime. He’s also edited PSAs for the Vietnam Veterans Memorial Fund and World Down Syndrome Day.

He just wrapped editing A+E Networks’ annual upfront presentation and is currently working on image spots for Lifetime and Amazon Prime Video.

Behind the Title: Kaboom’s Doug Werby

After starting his career as an editor, Doug Werby transitioned to director as well. He loves when a project combines both of his talents.

Name: Doug Werby

Company: Kaboom

Can you describe what Kaboom does?
Kaboom is a full-service creative production company providing production and post.

Doug Werby on set

What’s your job title?
Director and editor

What does that entail?
Whatever it takes to pull off and execute a project to the highest degree of my ability. It means having a concrete vision from beginning to end and always collaborating with the team, whether that’s directing a voiceover session in LA from a remote island off the coast of Croatia or editing at 2am for an East Coast 6am delivery. Whatever it takes. I’m an all-in person by nature.

What would surprise people the most about what falls under that title?
That everything I’ve learned in editing I apply to each moment of directing. I started my career as an editor, and it’s about seeing collaborations from different angles and using that to produce creative work efficiently. I believe my strength is in making other people’s ideas better: shaping the narrative with all the tools and talent available.

What’s your favorite part of the job?
Editing: Cracking open a fresh bin of uncut dailies in my editing studio when everything is quiet.
Directing: First shot of the first day of shooting.

What’s your least favorite?
Editing: Editing screen captures for app interfaces.
Directing: Late, late-night shoots.

What is your most productive time of day?
After my first cup of coffee at 7:30am until around 11:30am, and then from 8pm to 11pm after a dessert espresso.

If you didn’t have this job, what would you be doing instead?
Origami or pottery. Basically the same thing I already do, shaping things with my hands, but paid commissions are more rarefied.

Why did you choose this profession?
It was really the only possible option that made my heart beat faster.

How early did you know this would be your path?
When I was 22 years old. After four years at a liberal arts college, not knowing what the heck to study but always loving film and radio, I made that my focus and, ultimately, my career.

Can you name some recent projects you have worked on?
On the editing front for Kaboom: Campaigns for American Express promoting Wimbledon. This was a “social” project we cut in NYC. It was great fun bringing together celebrity, humor, music, stylized art direction and motion graphics for the small screen.

Wimbledon

The Oakland Airport TV edit for Kaboom. This was a throwback to the days of cutting deadpan mockumentary humor. I love this format and working closely with the creatives, we got the most out of the footage. Plus, I love Oakland Airport.

My two personal short films: For the past few years I’ve been parlaying all my skills from the commercial world and applying them to the scripted drama genre. I’ve come up with a series of real-life stories adapted for film that I’m packaging up to present as a whole. The idea would be to create a series of 10 half-hour programs all dealing with kindness. Individually the films have been honored at multiple film festivals.

The first is called No Tricks, based on a gritty, real-life experience of Julio Diaz that unveils a mugging gone good. Two men from different worlds bring change and some unexpected wisdom.

Motorbike Thief tells a real-life incident that happened to Michael Coffin when he discovered a stranger with his stolen bike. So enraged at the sight, he confronts the assailant in no uncertain terms and just when the situation is about to get out of hand, the anger turns empathetic and an unlikely friendship develops.

Do you put on a different hat when cutting a specific genre?
Completely. When editing spots and promotions, I’m trying to tell the most entertaining story in the shortest amount of time while staying focused on a clear message. When editing scripted material, I’m focused on story beats, character development and performance. Performance trumps all.

Oakland Airport

What is the project you are most proud of?
The work I did as a director with Kaboom for Bank of America via Hill Holliday a few years back for the Special Olympics. Making stars out of unsung heroes and shining a light on how brave these individuals are was a great honor. The films really put things into perspective and make you think about what we take for granted.

What do you edit on?
Adobe Premiere Pro is my current weapon of choice, but I would edit on an iPhone, Amiga 500 or a Moviola if need be.

Favorite plugin?
That would be Dope Transitions for Premiere.

Name three pieces of technology you can’t live without?
iPhone, iMac, airplane.

What do you do to destress from it all?
I bike the hills and valleys around the San Francisco Bay Area. I work out as much as possible, and I help my wonderful partner cook and entertain our friends and family. And travel!

Atomos Ninja V to record 5.9K raw from Panasonic S1H

Atomos and Panasonic are making updates to the Ninja V HDR monitor-recorder and Panasonic Lumix S1H mirrorless digital camera that will make it possible to record 5.9K Apple ProRes raw files directly from the camera’s sensor. The free updates will be available May 25.

The Ninja V captures highly detailed 12-bit raw files from the S1H over HDMI at up to 5.9K/29.97p in full frame or 4K/59.94p in Super35. These clean, unprocessed files preserve the maximum dynamic range, color accuracy and detail from the S1H. The resulting ProRes raw files offer perfect skin tones and easily matched colors ideal for both HDR and SDR (Rec. 709) workflows.

With the new 3.5K Super35 Anamorphic 4:3 raw mode, the Ninja V and S1H combination caters to cinematographers who shoot with anamorphic lenses. The Ninja V and S1H can now be used as an A camera or a smaller B camera on an anamorphic raw production.

Each frame recorded in ProRes raw has metadata supplied by the S1H. Apple’s Final Cut Pro X and other NLEs will automatically recognize ProRes raw files as coming from the S1H and set them up for editing and display in either SDR or HDR projects. Additional information will also allow other software to perform extensive parameter adjustments.

EditShare’s new EFSv for virtual editing and storage

EditShare has a new virtualized video editing and storage platform called EFSv. Initially running on AWS infrastructure, the open EFSv platform supports industry-standard third-party creative tools for editing, audio mixing and grading with security capabilities such as file auditing to propel secure, end-to-end editorial workflows in the cloud.

According to EditShare, EFSv native drivers eliminate traditional IT bottlenecks and offer higher performance in virtual environments. And, by using the EditShare RESTful API, users and technology partners can easily automate advanced storage management workflows. Everything within EFSv is virtualized, including project sharing, editing and bin locking.
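To illustrate what automating a storage task over a RESTful API might look like, here is a minimal sketch in Python. The endpoint path, host, payload fields and auth scheme are all hypothetical illustrations, not EditShare’s actual API; consult the real EFSv API documentation for the true interface. The sketch only builds the request object, so nothing is sent over the network.

```python
# Hypothetical sketch: constructing a REST call that would lock an edit bin.
# All endpoint names and fields here are assumptions for illustration only.
import json
from urllib import request

API_BASE = "https://efsv.example.com/api/v1"  # placeholder host


def build_bin_lock_request(project_id: str, bin_id: str, user: str) -> request.Request:
    """Construct (but do not send) a POST request that would lock a bin."""
    url = f"{API_BASE}/projects/{project_id}/bins/{bin_id}/lock"
    payload = json.dumps({"locked_by": user}).encode("utf-8")
    req = request.Request(url, data=payload, method="POST")
    req.add_header("Content-Type", "application/json")
    return req


req = build_bin_lock_request("promo-2020", "selects", "editor01")
print(req.full_url)
```

In practice a script like this would wrap several such calls (lock, move between storage tiers, release) behind a scheduler, which is the kind of automation the RESTful API is described as enabling.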

“Only the cloud can bring the depth of flexibility that’s essential for today’s unusual and disruptive circumstances. Overnight, the advantages offered by the cloud have changed from being ‘nice to have’ to necessary,” says Sunil Mudholkar, VP of product management, EditShare.

EFSv packages include the workstation and GPU resources required to support teams of all sizes. The EFSv packages also include EditShare’s Flow media management and remote production workflow tools. Flow adds a control layer to virtualized storage pools, with tools to scan, log, search and organize media; assemble story packages; and move content between object and block tiers of storage and between cloud and on-premises tiers. Flow’s automation capabilities help users to set redundant tasks and complex workflows.

The EditShare pricing structure offers options to purchase the EFSv subscription alone or with cloud services. EditShare’s customer service team is consulting with users to help them move client workflows to the cloud. This includes cloud configuration, data migration, workflow design and system automation.

Dolores McGinley heads Goldcrest London’s VFX division

London’s Goldcrest Post, a picture and audio post studio, has launched a visual effects division at its Lexington Street location. It will be led by VFX vet Dolores McGinley, whose first task is to assemble a team of artists that will provide services for both new and existing clients.

During the COVID-19 crisis, all Goldcrest staff is working from home except the colorists, who are coming in as needed and working alone in the grading suites. McGinley and her team will move into the Goldcrest facility when lockdown has ended.

“Having been immersed in such a diverse range of projects over the past five years, we identified the need to expand into VFX some time ago,” explains Goldcrest MD Patrick Malone. “We know how essential an integrated VFX service is to our continued success as a leading supplier of creative post solutions to the film and broadcast community.

“As a successful VFX artist in her own right, Dolores is positioned to interpret the client’s brief and offer constructive creative input throughout the production process. She will also draw upon her considerable experience working with colorists to streamline the inclusion of VFX into the grade and guarantee we are able to meet the specific creative requirements of our clients.”

With over two decades of creative experience, McGinley joins Goldcrest having held various senior roles within the London VFX community. Recent examples of her work include The Crown, Giri/Haji and Good Omens.

Colorist Chat: I Am Not Okay With This’ Toby Tomkins

Colorist Toby Tomkins, co-founder of London color grading and finishing boutique Cheat, collaborated once again with The End of the F***ing World director Jonathan Entwistle on another Charles Forsman novel, I Am Not Okay With This. The now-streaming Netflix show is produced by Entwistle alongside Stranger Things EPs Shawn Levy and Dan Cohen, as well as Josh Barry. The director also once again called on DP Justin Brown.

Toby Tomkins

Adapted from Forsman’s graphic novel of the same name, the series follows a teenage girl named Sydney as she navigates school, crushes, her sexuality and sudden-onset superpowers. You know, the typical teenage experience.

Here, Tomkins talks about collaborating with the director and DP as well as his workflow.

How early did you get involved on I Am Not Okay With This?
Jon Entwistle had reached out to DP Justin Brown about his interest in adapting this graphic novel after working on The End of the F***ing World. When the series then got commissioned and Justin was on board, he and Jon convinced production company 21 Laps that they could do the grade in London with Cheat. There were some discussions about grading in LA, but we managed to convince them that it could be a quick and easy process back here, and that’s how I got involved.

I was on board quite early on in the production, getting involved with camera tests and reviewing all the material with Justin. We worked together to evaluate the material, and after Justin chose the camera and lenses, we built a color pipeline that informed how the material was shot and how the show would be captured and pass through the color pipeline. From then, we started building off the work we did on The End of the F***ing World. (Check out our coverage of The End of the F***ing World, which includes an interview with Tomkins.)

What kind of look did Jon and Justin want, and how did they express that look to you? Film or show references? Pictures?
There were quite a few visual references, which I already knew from previously working with Jon and Justin. They both gravitate toward a timeless American cinema look — something photochemical but also natural. I knew it would be similar to The End of the F***ing World, but we were obviously using different locations and a slightly different light, so there was a little bit of playing around at the beginning.

We’re all fans of American cinema, especially the look of old film stock. We wanted the look of the show to feel a little bit rough around the edges — like when things used to be shot on film and you had limited control on how to make any changes. Films weren’t corrected to a perfect level and we wanted to keep those imperfections for this show, making it feel authentic and not overly polished. Although it was produced by the same people that did Stranger Things, we wanted to stray away from that style slightly, making it feel a bit different.

We were really aiming for a timeless American look, with a vintage aesthetic that played into a world that was slightly out of place and not really part of reality. During the grade, Justin liked to put a line through it, keeping it all very much in the same space, with perhaps a little pop on the reds and key “American” colors.

Personally, I wanted to evoke the style of some teen films from the late 20th century — slightly independent-looking and minimally processed. Films like 10 Things I Hate About You and She’s All That certainly influenced me.

You have all worked together in the past. How did that help on this show? Was there a kind of shorthand?
We learned a lot doing The End of the F***ing World together and Justin and I definitely developed a shorthand. It’s like having a head start because we are all on the same page from the get-go. Especially as I was grading remotely with Justin and Jon just trusted us to know exactly what he wanted.

Tomkins works on Resolve

At the end of the first day, we shared our work with Jon in LA and he’d watch and add his notes. There were only three notes of feedback from him, which is always nice! They were notes on richness in some scenes and a question on matching between two shots. As we’d already tested the cameras and had conversations about it before, we were always on the same page with feedback and I never disagreed with a single note. And Jon only had to watch the work through once, which meant he was always looking at it with clean, fresh eyes.

What was the show shot on, and what did you use for color grading?
It was shot ARRI Alexa, and I used DaVinci Resolve Studio.

Any particular challenges on this one for you?
It was actually quite smooth for me! Because Justin and I have worked together for so long, and because we did the initial testing around cameras and LUTs, we were very prepared. Justin had a couple of challenges due to unpredictable weather in Pittsburgh, but he likes to do as much as possible in-camera. So once it got to me, we were already aligned and prepared.

How did you find your way to being a colorist?
I started off in the art department on big studio features but wanted to learn more about filmmaking in general, so I went to film school in Bournemouth, back when it was called the Arts University College Bournemouth. I quickly realized my passion was post and gleaned what I could from an exceptional VFX tutor there called Jon Turner. I started specializing in editing and then VFX.

I loved the wizardry and limitless possibilities of VFX but missed the more direct relationship with storytelling, so when I found out about color grading — which seemed like the perfect balance of both — I fell in love. Once I started grading, I didn’t stop. I even bribed the cleaners to get access to the university grading suite at night.

My first paid gig was for N-Dubz, and after I graduated and they became famous, they kept me on. And that gave me the opportunity to work on bigger music videos with other artists. I set up a suite at home (way before anyone else was really doing this) and convinced clients to come 30 minutes out of London to my parents’ house in a little village called Kings Langley.

I then got asked to set up a color department for a sound studio called Tate Post, where I completed lots of commercials, a few feature films — notably Ill Manors — and some shorts. These included one for Jon called Human Beings, which is where our relationship began! After that, I went it alone again and eventually set up Cheat. The rest is history.

What, in addition to the color, do you provide on projects? Small VFX, etc.?
For I Am Not Okay With This, we did some minor work, plus online and delivery, in-house at Cheat. I just do color, however. I think it’s best to leave each department to do its own work and trust the knowledge and experience of experts in the field. We worked with LA-based VFX company Crafty Apes for the show; they were really fantastic.

Where do you get inspiration? Photographs, museums?
Mostly from films — both old and new — and definitely photography and the work of other colorists.

Finally, any advice you’d give your younger self about working as a colorist?
Keep at it! Experience is everything.

Digital Nirvana updated Trance 3.0 for captions, transcriptions

Digital Nirvana has released Version 3.0 of its Trance cloud-based application for closed captioning and transcription, which combines speech-to-text (STT) technology and other AI-driven processes with cloud-based architecture. By implementing cloud-based metadata generation and closed captioning as part of their existing operations, media companies can reduce the time and cost of delivering accurate, compliant content worldwide. Users can enrich and classify content, which enables more effective repurposing of media libraries and facilitates more intelligent targeting of advertising spots.

“Trance 3.0 includes a new transcript correction window, a text translation engine that simplifies and speeds captioning in additional languages, and automated caption conformance to accelerate delivery of content to new platforms and geographic regions,” says Russell Wise, SVP at Digital Nirvana. “Even now, with the widespread need to work from home, Trance 3.0 users can maintain their productivity in prepping content for distribution on platforms such as Quibi, Netflix, Hulu, HBO Max, and others.”

A new transcript correction window simplifies the process of reviewing and correcting the transcript used to generate closed captions. The user interface shows time-synced video and captions side by side in a window along with tools for editing text and adding visual cues, music tags and speaker tags. Dictionaries, scripts, rosters and other text resources ingested into Trance help to boost the accuracy of a transcript and, ultimately, the closed captions applied to video. Source text can be automatically translated into one or more additional languages, with the resulting text displayed in a dual-pane window for review and correction.

New caption conformance and quality assurance capabilities within Trance 3.0 allow users to configure captions according to style guidelines of each distribution platform, ensuring streaming services do not reject content just because captioning doesn’t match up with their internal caption style guide. Users configure and apply presets for target platforms, and Trance 3.0 automates caption formatting — number of characters, number of lines and caption placement — in accordance with policies defined in the appropriate preset. The resulting captions are displayed in a captioner window for final comparison to video. Once captions have been reviewed and approved, the file is used to generate the multiple output formats required for distribution.
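The core of preset-driven caption conformance can be sketched in a few lines: pick a preset, re-wrap the text to its line-length limit and flag captions that exceed the line count. This is a minimal illustration of the idea, not Trance’s implementation; the preset names and limit values here are invented for the example, not any platform’s real style guide.

```python
# Minimal sketch of preset-driven caption conformance.
# Preset names and limits are illustrative assumptions only.
import textwrap

PRESETS = {
    "platform_a": {"max_chars": 32, "max_lines": 2},
    "platform_b": {"max_chars": 42, "max_lines": 3},
}


def conform_caption(text: str, preset_name: str) -> list:
    """Re-wrap caption text to a preset's line-length and line-count limits."""
    preset = PRESETS[preset_name]
    lines = textwrap.wrap(text, width=preset["max_chars"])
    if len(lines) > preset["max_lines"]:
        # A real system would re-time or split the caption instead.
        raise ValueError("caption exceeds line limit for this preset")
    return lines


print(conform_caption("Stay home and stay strong, everyone.", "platform_a"))
```

A production tool would add the placement rules the article mentions (top/bottom positioning, speaker tags) and write out the platform-specific caption file formats, but the preset-lookup-then-format pattern is the same.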

Trance 3.0 completes the closed-captioning process from end to end, providing a project management layer that centralizes tasks and minimizes the need for manual intervention. The project manager can configure roles and priorities for different users and then set up individual projects by identifying necessary tasks, outputs and deadlines. Trance automatically handles the movement and processing of content, transcription, translation and captioning and tracks the productivity, workload and availability of different staff members. It also identifies the most appropriate person to task with a particular job and delivers notifications and alerts as needed to drive each project through to completion.

Working From Home: VFX house The Molecule

By Randi Altman

With the COVID-19 crisis affecting all aspects of our industry, we’ve been talking to companies that have set up remote workflows to meet their clients’ needs. One of those studios is The Molecule, which is based in New York and has a location in LA as well. The Molecule has focused on creating visual effects for episodics and films since its inception in 2005.

Blaine Cone 

The Molecule artists are currently working on series such as Dickinson and Little Voice (AppleTV+), Billions (Showtime), Genius: Aretha (NatGeo), Schooled and For Life (ABC) and The Stranger (Quibi). And on the feature side, there is Stillwater (Focus Features) and Bliss (Amazon). Other notable projects include The Plot Against America (HBO), Fosse/Verdon (FX) and The Sinner (USA).

In order to keep these high-profile projects flowing, head of production Blaine Cone and IT manager Kevin Hopper worked together to create the studio’s work-from-home setup.

Let’s find out more…

In the weeks leading up to the shutdown, what were you doing to prepare?
Blaine Cone: We had already been investigating and testing various remote workflows in an attempt to find a secure solution we could extend to artists who weren’t readily available to join us in house. Once we realized this would be a necessity for everyone in the company, we accelerated our plans. In the weeks before the lockdown, we had increasingly larger groups of artists work from home to gradually stress-test the system.

How difficult was it to get that set up?
Cone: We were fortunate to have a head start on our remote secure platform. Because we decided to tie into AWS, as well as into our own servers and farm (custom software running on a custom-built hypervisor server on Dell machines), it took a little while, but once we saw the need to fast-track it we were able to refine our solution pretty quickly. We’re still optimizing and improving behind the scenes, but the artists have been able to work uninterrupted since the beginning.

Kevin Hopper

What was your process in choosing the right tools to make this work?
Kevin Hopper: We have been dedicated to nailing down TPN-compliant remote work practices for the better part of a year now. We knew that there was a larger market of artists available for us to tap into if we could get a remote work solution configured properly from a security standpoint. We looked through a few companies offering full remote working suites via Teradici PCOIP setups and ultimately decided to configure our own images and administer them to our users ourselves. This route gives us the most flexibility and allows us to accurately and effectively mirror our required security standards.

Did employees bring home their workstations/monitors? How is that working?
Cone: In the majority of cases, employees are using their home workstations and monitors to tap into their dedicated AWS instance. In fact, the home setup could be relatively modest because they were tapping into a very strong machine on the cloud. In a few cases, we sent home 4K monitors with individuals so they could better look at their work.

Can you describe your set up and what tools you are using?
Cone: We are using Teradici to give artists access to dedicated, powerful and secure AWS machines to work off of files on our server. This is set up for Nuke, Maya, Houdini, Mocha, Syntheyes, Krita, Resolve, Mari and Substance Painter. We spin up the AWS instances in the morning and then down again after the workday is over. It allows us to scale as necessary, and it limits the amount of technical troubleshooting and support we might have to do otherwise. We have our own internal tools built into the workflow, just as we did when artists were at our office. It’s been relatively seamless.
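The daily spin-up/spin-down policy boils down to a simple schedule check. Here is a toy sketch of that decision logic; the work-hours window is an assumption, and a real setup would feed this check into an AWS SDK (e.g. boto3) start/stop call and handle time zones and holidays.

```python
# Toy sketch of a "spin up in the morning, down after hours" policy.
# The 7am-8pm window is an assumed workday, not The Molecule's actual schedule.
from datetime import time

WORK_START = time(7, 0)   # assumed start of the artists' day
WORK_END = time(20, 0)    # assumed end of the workday


def instance_should_run(now: time) -> bool:
    """True if a dedicated artist instance should be up at local time `now`."""
    return WORK_START <= now <= WORK_END


print(instance_should_run(time(10, 30)))  # mid-morning
print(instance_should_run(time(23, 0)))   # late night
```

Running the check on a fixed schedule (a cron job or cloud scheduler) and starting or stopping instances accordingly is what keeps cloud costs bounded while the machines still feel always-available to artists.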

Fosse/Verdon

How are you dealing with the issues of security while artists are working remotely?
Cone: Teradici gives us the security we need to ensure that the data exists only on our servers. It limits the artists from web traffic as well.

How is this allowing you to continue creating visual effects for shows?
Cone: It’s really not dissimilar to how we normally work. The most challenging change has been the lack of in-person interaction. Shotgun, which we use to manage our shots, still serves as our creative hub, but Slack has become an even more integral aspect of our communication workflow as we’ve gone remote. We’ve also set up regular team calls, video chats and more to make up for the lack of interpersonal interaction inherent in a remote scenario.

Can you talk about review and approval on shots?
Cone: Our supervisors are all set up with Teradici to review shots securely. They also have 4K monitors. In some cases, artists are doing Region of Interest to review their work. We’ve continued our regular methods of delivery to our clients so that they can review and approve as necessary.

How many artists do you have working remotely right now?
Cone: Between supervisors, producers, artists and support staff in NY and LA, we have about 50 remote users working on a daily basis. Our Zoom chats are a lot of fun. In a strange way, this has brought us all closer together than ever before.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years.