

Directing bookend sequences for Portals, a horror anthology film

By Hasraf “HaZ” Dulull

Portals is a genre-bending feature film anthology focusing on a series of worldwide blackouts — after which millions of mysterious objects appear everywhere across the planet. While many flee from the sentient objects, some people are drawn toward and into them with horrifying consequences.

Portals

The film was in the final stages of post when writer/director Liam O’Donnell (Beyond Skyline and the upcoming Skylines film) called to see if I would like to get involved and direct some bookend sequences to add more scope and setup, which the producers felt was very much needed. I loved the premise and the world of the anthology, so I said yes. I pitched an idea for an ending that quickly evolved into an extra segment at the end of the film, which I directed. That’s why there are officially four directors on the show, with me getting executive producer and “end-segment created by” credits.

Two of the other sequences are around 20 to 25 minutes each, and O’Donnell’s sequence was around 35 minutes. The film is 85 minutes long. Eduardo Sanchez and Gregg Hale (The Blair Witch Project) co-directed their segments. So the anthology feature film is really three long segments plus my bookend sequences. The only connections among all the stories are the objects that appear, the event itself and the actual “portal,” but everything else was unique to each segment’s story. In terms of production, the only consistencies throughout the anthology were the camera language — that slight hand-held feel — and, of course, the music and sound.

I had to watch the latest cut of the entire anthology film to get my head into that world, but I was given freedom to bring my own style to my sequences. That is exactly the point of an anthology — for each director to bring his or her own sensibilities to the individual segments. Besides Liam, the main producers I worked closely with on this project were Alyssa Devine and Griffin Devine from Pigrat Productions. They are fans of my first feature film, The Beyond, so they really encouraged the grounded tone I had demonstrated in that film.

The portal in Portals.

I’ve been a huge advocate of Blackmagic cameras and technology for a long time. Additionally, I knew I had a lot to shoot in a very short time frame (two days!), so I needed a camera that was light and flexible yet able to shoot 4K. I brought on cinematographer Colin Emerson, who shoots in a very loose way but always makes his stuff look cinematic. We watched the cut of the film and noticed the consistent loose nature of the cinematography on all the segments. Colin uses the Fig Rig a lot, and I love the way that rig works; the BMD Pocket Cinema 4K fits nicely on it, along with the DSLR lenses he likes to use. The other reason was to be able to use Blackmagic’s new BRaw format.

We also shot the segment using a skeleton crew, which comprised me as director/producer; VFX supervisor/1st AD John Sellings, who also did some focus pulling; James De Taranto (sound recording); DP/camera op Colin Emerson; FX makeup artists Kate Griffith and Jay James; and our two actors, Georgina Blackledge and Dare Emmanuel. I worked with both of them on my feature film The Beyond.

The Post
One thing I wanted to make sure of was that the post team at The Institution in LA would be able to take my Resolve files and literally work from them for the picture post. One of the things I did during prep of the project (before we even cast) was to shoot some tests to show what I had in mind in terms of look and feel. We also tested the BRaw and color workflow between my setup in London and the LA team; Colin and I did this during the location recce. This proved extremely useful in ensuring we set our camera to the exact specs the post house wanted. So we shot at 23.98fps, 4K (4096×1716) 2.39:1 cropped, in Blackmagic Design’s log color space.

HaZ’s segments were captured with the Blackmagic Pocket Cinema Camera.

During the test, I did some quick color tests to show the producers in LA the tone and mood I was going for and to make sure everyone was on board before I shot it. The look was very post-apocalyptic, as it’s set after the main events have happened. I wanted the locations to contrast with each other, one interior and one exterior with greens.

Colin is used to shooting most of his stuff on the Panasonic GH series, but he had the Pocket Cinema Camera and was looking for the right project to use it on. He found he could use all of his usual lenses because the Pocket Cinema Camera has the same mount. Lenses used were the Sigma 18-35mm f/1.8 with a Metabones Speedbooster, the Olympus 12mm f/2 and the Lumix 35-100mm f/2.8.

Colin used the onboard monitor screen on the Pocket Cinema Camera, while I used a tethered external monitor — the Ikan DH5e — for directing. We used a 1TB Samsung external SSD securely attached to the rig cage along with a 64GB CFast card. The resolution we shot in was determined by the tests we did. We set up the rushes for post after each of the two days of the shoot, so during the day we would swap out drives and back things up. At the end of the day, we would bring in all the picture and sound rushes and use the amazing autosync feature in Blackmagic DaVinci Resolve to set it all up. This way, when I headed back home I could start editing right away inside Resolve.

Resolve

I have to admit, we were hesitant at first because I was shooting and capturing log in QuickTime ProRes 4:4:4:4, and I had always avoided DNG raw because of the huge file sizes and data transfer. But the team at Blackmagic supported us right up to the end of the shoot, and after testing BRaw I was impressed. We had so much control, as all that information is accessible within Resolve. I was able to set the temp look during editing, and the colorist worked from there. Skin tones were of utmost importance; because of the intimate nature of the drama, I wanted a natural look to the skin tones. I am really happy with the way they came out in the end.

The team in LA couldn’t believe how cinematic the footage was when we told them we shot using the Pocket Cinema Camera, since the other segments were shot on cameras like Red. We delivered the same 4K deliverables spec as the other segments in the film.

HaZ on set, second from right.

I used the AMD Radeon RX Vega 56 version of the Blackmagic eGPU. I chose it because I wanted to edit on my MacBook Pro (late 2017) and needed the power to run 4K in realtime. I was so impressed with how much power it provided; it was like having a new MacBook Pro without having to buy one. The eGPU had all the connectivity (two Thunderbolt and four USB 3 ports) I needed, which the MacBook Pro alone lacks.

The beauty of keeping everything native was that there wasn’t much work to do when porting the project, as it’s just plug and play. Resolve detects the eGPU, which you can then set as the default. The BRaw format makes it all manageable to preview and play back in real time. Also, since it’s native, Resolve doesn’t need to do any transcoding in the background. I have always been a huge fan of the tracking in Resolve, and I was able to do eye effects very easily without them being budgeted or done as VFX shots. I was able to get the VFX render assets from the visual effects artist (Justin Martinez) in LA and do quick slap comps during editing. I love that I can set looks and store them as memories, which I can then recall very quickly to apply to a bunch of shots. This allowed me to have a slick-looking preview rough cut of the film.

Portals

I sent a hard drive containing all the organized rushes to the team in LA while I was doing the final tweaks to the edit. Once the edit was signed off, or if any last-minute notes came in, I would address them and email the team my Resolve file. It was super simple, and the colorist (Oliver Ojeil) and post team (Chad Van Horn and Danny Barone) in LA appreciated the simple workflow because there really wasn’t any conforming for them to do apart from a one-click relink of media locations; they would just take my Resolve file and start working away with it.

We used practical effects to keep the horror as real and grounded as possible and used VFX to augment them further. We were fortunate to be able to get special effects makeup artist Kate Griffiths. Given the tight schedule, she was able to create a terrifying effect, which I won’t give away. You need to watch the film to see it! We had to shoot those makeup-FX-heavy shots at the end of the day, which meant we had to be smart about how we scheduled the shoot given the hours-long makeup process. Kate was also on hand to provide effects like the liquid coming out of the eyes, sweat and so on — every detail of which the camera picked up for us so we could bring it out in the grade.

The Skype-style shots at the start of the film (phone and computer monitor shots) had their VFX screen elements placed as a separate layer so the post team in LA could grade them separately and control the filters applied on them. For some of the wide shots showing our characters entering and leaving the portal, we keyframed some movement of the 4K shot along with motion blur to give the effect of in-camera movement. I also used the camera shake within Resolve, which comes with so many options to create bespoke movement on static frames.

Portals is now available on iTunes and other VOD platforms.


HaZ Dulull is known for his sci-fi feature films The Beyond and 2036 Origin Unknown, as well as in television for his pilot and episodes of Disney’s Fast Layne. He is currently busy on projects at various stages of development and production at his production company, hazfilm.com.

Review: HP’s ZBook G6 mobile workstation

By Brady Betzel

In a year that’s seen AMD reveal an affordable 64-core processor with its Threadripper 3, it appears as though we are picking up steam toward next-level computing.

Apple finally released its much-anticipated Mac Pro (which comes with a hefty price tag for the 1.5TB upgrade), and custom-build workstation companies — like Boxx and Puget Systems — can customize good-looking systems to fit any need you can imagine. Additionally, over the past few months, I have seen mobile workstations leveling the playing field with their desktop counterparts.

HP is well-known in the M&E community for its powerhouse workstations. Since I started my career, I have worked on either a Mac Pro or an HP. Both have their strong points. However, for workstation users who must be able to travel with their systems, there have always been some technical capabilities you had to give up in exchange for a smaller footprint. That is, until now.

The newly released HP ZBook 15 G6 has become the rising tide that will float all the boats in the mobile workstation market. I know I’ve said it before, but the classification of “workstation” is technically much more than just a term companies throw around. The systems with workstation-level classification (at least from HP) are meant to be powered on and run at high levels 24 hours a day, seven days a week, 365 days a year.

They are built with high-quality, enterprise-level components, such as ECC (error correcting code) memory. ECC memory will self-correct errors that it sees, preventing things like blue screens of death and other screen freezes. ECC memory comes at a cost, and that is why these workstations are priced a little higher than a standard computer system. In addition, the warranties are a little more inclusive — the HP ZBook 15 G6 comes with a standard three-year/on-site service warranty.

Beyond the “workstation” classification, the ZBook 15 G6 is amazingly powerful, brutally strong and incredibly colorful and bright. But what really matters is under the hood. I was sent the HP ZBook 15 G6 that retails for $4,096 and contains the following specs:
– Intel Xeon E-2286M (eight cores/16 threads — 2.4GHz base/5GHz Turbo)
– Nvidia Quadro RTX 3000 (6GB VRAM)
– 15.6-inch UHD HP DreamColor display, anti-glare, WLED backlit, 600 nits, 100% DCI-P3
– 64GB DDR4 2667MHz
– 1TB PCIe Gen 3 x4 NVMe SSD TLC
– FHD webcam 1080p plus IR camera
– HP collaboration keyboard with dual point stick
– Fingerprint sensor
– Smart Card reader
– Intel Wi-Fi 6 AX 200, 802.11ac 2×2 +BT 4.2 combo adapter (vPro)
– HP long-life battery four-cell 90 Wh
– Three-year limited warranty

The ZBook 15 G6 is a high-end mobile workstation with a price that reflects it. However, as I said earlier, true workstations are built to withstand constant use and, in this case, abuse. The ZBook 15 G6 has been designed to pass up to 21 extensive MIL-STD 810G tests, which is essentially worst-case scenario testing. For instance, there is drop testing from around four feet, sand and dust testing, radiation testing (the sun beating down on the laptop for an extended period) and much more.

The exterior of the G6 is made of aluminum and built to withstand abuse. The latest G6 is a little bulky/boxy, in my opinion, but I can see why it would hold up to some bumps and bruises, all while working at blazingly fast speeds, so bulk isn’t a huge issue for me. Because of that bulk, you can imagine that this isn’t the lightest laptop either. It weighs in at 5.79 pounds for the lowest end and measures 1 inch by 14.8 inches by 10.4 inches.

On the bottom of the workstation is an easy-to-access panel for performing repairs and upgrades yourself. I really like the bottom compartment. I opened it and noticed I could throw in an additional NVMe drive and an SSD if needed. You can also access memory here. I love this because not only can you perform easy repairs yourself, but you can perform upgrades or part replacements without voiding your warranty on the original equipment. I’m glad to see that HP kept this in mind.

The keyboard is smaller than a full-size version but has a number keypad, which I love using when typing in timecodes. It is such a time-saver for me. (I credit entering repair order numbers when I fixed computers at Best Buy as a teenager.) On the top of the keyboard are some handy shortcuts if you do web conferences or calls on your computer, including answering and ending calls. The Bang & Olufsen speakers are some of the best laptop speakers I’ve heard. While they aren’t quite monitor-quality, they do have some nice sound on the low end that I was able to fine-tune in the Bang & Olufsen audio control app.

Software Tests
All right, enough of the technical specs. Let’s get on to what people really want to know — how the HP ZBook 15 G6 performs while using apps like Blackmagic’s DaVinci Resolve and Adobe Premiere Pro. I used sample Red and Blackmagic Raw footage that I use a lot in testing. You can grab the Red footage here and the BRaw footage here. Keep in mind that you will need to download the BRaw software to edit with BRaw inside of Adobe products, which you can find here.

Performance monitor while exporting in Resolve with VFX.

For testing in Resolve and Premiere, I strung out one minute of 4K, 6K and 8K Red media in one sequence and the 4608×2592 4K and 6K BRaw media in another. During the middle of my testing, Resolve received a giant Red API upgrade that allows for better realtime playback of Red raw files if you have an Nvidia CUDA-based GPU.

First up is Resolve 16.1.1 and then Resolve 16.1.2. Both sequences are set to UHD (3840×2160) resolution. One sequence of each codec contains just color correction, while another of each codec contains effects and color correction. The Premiere sequence with color and effects contains basic Lumetri color correction, noise reduction (50) and a Gaussian blur with settings of 0.4. In Resolve, the only difference in the color and effects sequence is that the noise reduction is spatial and set to Enhanced, Medium and 25/25.

In Resolve, the 4K Red media would play in realtime, while the 6K (RedCode 3:1) would jump down to about 14fps to 15fps, and the 8K (RedCode 7:1) would play at 10fps at full resolution with just color correction. With effects, the 4K media would play at 20fps, 6K at 3fps and 8K at 10fps. The Blackmagic Raw video would play in realtime with just color correction and at around 3fps to 4fps with effects.

This is where I talk about just how loud the fans in the ZBook 15 G6 can get. When running exports and benchmarks, the fans are noticeable and a little distracting. Obviously, we are running some high-end testing with processor- and GPU-intensive tests but still, the fans were noticeable. However, the bottom of the mobile workstation was not terribly hot, unlike the MacBook Pros I’ve tested before. So my lap was not on fire.

In my export testing, I used those same sequences as before, exporting from Adobe Premiere Pro 2020. I exported UHD files using Adobe Media Encoder in different containers and codecs: H.264 (MOV), H.265 (MOV), ProResHQ, DPX, DCP and MXF OP1a (XDCAM). The MXF OP1a was a 1920x1080p export.
Here are my results:

Red (4K, 6K, 8K)
– Color Only: H.264 – 5:27; H.265 – 4:45; ProResHQ – 4:29; DPX – 3:37; DCP – 10:38; MXF OP1a – 2:31
– Color, Noise Reduction (50), Gaussian Blur 0.4: H.264 – 4:56; H.265 – 4:56; ProResHQ – 4:36; DPX – 4:02; DCP – 8:20; MXF OP1a – 2:41

Blackmagic Raw
– Color Only: H.264 – 2:05; H.265 – 2:19; ProResHQ – 2:04; DPX – 3:33; DCP – 4:05; MXF OP1a – 1:38
– Color, Noise Reduction (50), Gaussian Blur 0.4: H.264 – 1:59; H.265 – 2:22; ProResHQ – 2:07; DPX – 3:49; DCP – 3:45; MXF OP1a – 1:51

What is surprising is that when adding effects like noise reduction and a Gaussian blur in Premiere, the export times stayed similar. While using the ZBook 15 G6, I noticed my export times improved when I upgraded driver versions, so I re-did my tests with the latest Nvidia drivers to make sure I was consistent. The drivers also solved an issue in which Resolve wasn’t reading BRaw properly, so remember to always research drivers.

The Nvidia Quadro RTX 3000 really pulled its weight when editing and exporting in both Premiere and Resolve. In fact, in previous versions of Premiere, I noticed that the GPU was not really being used as well as it should have been. With the Premiere Pro 2020 upgrade it seems like Adobe really upped its GPU usage game — at some points I saw 100% GPU usage.

In Resolve, I performed similar tests, but instead of ProResHQ I exported a DNxHR QuickTime file, and instead of a DCP, an IMF package. For the most part, these are stock exports in the Deliver page of Resolve, except I forced Video Levels, forced the debayer quality and set Resizing to Highest Quality. Here are my results from Resolve versions 16.1.1 and 16.1.2 (16.1.2 results are in parentheses).

Red (4K, 6K, 8K)
– Color Only: H.264 – 2:17 (2:31); H.265 – 2:23 (2:37); DNxHR – 2:59 (3:06); IMF – 6:37 (6:40); DPX – 2:48 (2:45); MXF OP1a – 2:45 (2:33)
– Color, Noise Reduction (Spatial, Enhanced, Medium, 25/25), Gaussian Blur 0.4: H.264 – 5:00 (5:15); H.265 – 5:18 (5:21); DNxHR – 5:25 (5:02); IMF – 5:28 (5:11); DPX – 5:23 (5:02); MXF OP1a – 5:20 (4:54)

Blackmagic Raw
– Color Only: H.264 – 0:26 (0:25); H.265 – 0:31 (0:30); DNxHR – 0:50 (0:50); IMF – 3:51 (3:36); DPX – 0:46 (0:46); MXF OP1a – 0:23 (0:22)
– Color, Noise Reduction (Spatial, Enhanced, Medium, 25/25), Gaussian Blur 0.4: H.264 – 7:51 (7:53); H.265 – 7:45 (8:01); DNxHR – 7:53 (8:00); IMF – 8:13 (7:56); DPX – 7:54 (8:18); MXF OP1a – 7:58 (7:57)

Interesting to note: Exporting Red footage with color correction only was significantly faster from Resolve, but for Red footage with effects applied, export times were similar between Resolve and Premiere. With the CUDA Red SDK update to Resolve in 16.1.2, I thought I would see a large improvement, but I didn’t. I saw an approximate 10% increase in playback but no improvement in export times.

Puget

Puget Systems has some great benchmarking tools, so I reached out to Matt Bach, Puget Systems’ senior labs technician, about my findings. He suggested that the mobile Xeon could possibly still be the bottleneck for Resolve. In his testing he saw a larger increase in speed with AMD Threadripper 3 and Intel i9-based systems. Regardless, I am kind of going deep on realtime playback of 8K Red Raw media on a mobile workstation — what a time we are in. Nonetheless, Blackmagic Raw footage was insanely fast when exporting out of Resolve, while export time for the Blackmagic Raw footage with effects was higher than I expected. There was a consistent use of the GPU and CPU in Resolve much like in the new version of Premiere 2020, which is a trend that’s nice to see.

In addition to Premiere and Resolve testing, I ran some common benchmarks that provide a good 30,000-foot view of the HP ZBook 15 G6 when comparing it to other systems. I decided to use the Puget Systems benchmarking tools. Unfortunately, at the time of this review, the tools were only working properly with Premiere and After Effects 2019, so I ran the After Effects benchmark using the 2019 version. The ZBook 15 G6 received an overall score of 802, render score of 79, preview score of 75.2 and tracking score of 86.4. These are solid numbers that beat out some desktop systems I have tested.

Corona

To test some 3D applications, I ran the Cinebench R20, which gave a CPU score of 3243, CPU (single core) score of 470 and an M/P ratio of 6.90x. I recently began running the Gooseberry benchmark scene in Blender to get a better sense of 3D rendering performance, and it took 29:56 to export. Using the Corona benchmark, it took 2:33 to render 16 passes, 3,216,368 rays/s. Using Octane Bench the ZBook 15 G6 received a score of 139.79. In the Vray benchmark for CPU, it received 9833 Ksamples, and in the Vray GPU testing, 228 mpaths. I’m not going to lie; I really don’t know a lot about what these benchmarks are trying to tell me, but they might help you decide whether this is the mobile workstation for your work.

Cinebench

One benchmark I thought was interesting between driver updates for the Nvidia Quadro RTX 3000 was the Neat Bench from Neat Video — the noise reduction plugin for video. It measures whether your system should use the CPU, GPU or a combination thereof to run Neat Video. Initially, the best combination result was to use the CPU only (seven cores) at 11.5fps.

After updating to the latest Nvidia drivers, the best combination result was to use the CPU (seven cores) and GPU (Quadro RTX 3000) at 24.2fps. A pretty incredible jump just from a driver update. Moral of the story: Make sure you have the correct drivers always!

Summing Up
Overall, the HP ZBook 15 G6 is a powerful mobile workstation that will work well across the board. From 3D to color correction apps, the Xeon processor in combination with the Quadro RTX 3000 will get you running 4K video without a problem. With the HP DreamColor anti-glare display offering up to 600 nits of brightness and covering 100% of the DCI-P3 color space, coupled with the HDR option, you can rely on the attached display for color accuracy if you don’t have your output monitor attached. And with features like two USB Type-C ports (Thunderbolt 3 plus DP 1.4 plus USB 3.1 Gen 2), you can connect external monitors for a larger view of your work.

The HP Fast Charge will get you out of a dead battery fiasco with the ability to go from 0% to 50% charge in 45 minutes. All of this for around $4,000 seems to be a pretty low price to pay, especially because it includes a three-year on-site warranty and because the device is certified to work seamlessly with many apps that pros use with HP’s independent software vendor verifications.

If you are looking for a mobile workstation upgrade, are moving from desktop to mobile or want an alternative to a MacBook Pro, you should price a system out online.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


The Mill opens boutique studio in Berlin

Technicolor’s The Mill has officially launched in Berlin. This new boutique studio is located in the heart of Berlin, situated in the creative hub of Mitte, near many of Germany’s agencies, production companies and brands.

The Mill has been working with German clients for years. Recent projects include Mercedes’ Bertha Benz spot with director Sebastian Strasser; Netto’s The Easter Surprise, directed in-house by The Mill; and BMW’s The 8 with director Daniel Wolfe. The new studio will bring The Mill’s full range of creative services, from color to experiential and interactive, as well as visual effects and design.

The Mill Berlin crew

Creative director Greg Spencer will lead the creative team. He is a multi-award-winning creative, having won several VES, Cannes Lions and British Arrow awards. His recent projects include Carlsberg’s The Lake, PlayStation’s This Could Be You and Eve Cuddly Toy. Spencer also played a role in some of Mill Film’s major titles. He was the 2D supervisor for Les Misérables and also worked on the Lord of the Rings trilogy. His resume also includes campaigns for brands such as Nike and Samsung.

Executive producer Justin Stiebel moves from The Mill London, where he has been since early 2014, to manage client relationships and new business. Since joining the company, Stiebel has produced spots such as Audi’s Next Level and Mini’s “The Faith of a Few” campaign. He has also collaborated with directors such as Sebastian Strasser, Markus Walter and Daniel Wolfe while working on brands like Mercedes, Audi and BMW.

Sean Costelloe is managing director of The Mill London and The Mill Berlin.

Main Image Caption: (L-R) Justin Stiebel and Greg Spencer


Colorfront’s Express Dailies 2020 for Mac Pro, new rental model

Coinciding with Apple’s launch of the latest Mac Pro workstation, Colorfront announced a new, annual rental model for Colorfront Express Dailies.

Launching in Q1 2020, Colorfront’s subscription service allows users to rent Express Dailies 2020 for an annual fee of $5,000, including maintenance support, updates and upgrades. Additionally, the availability of Apple’s brand-new Pro Display XDR, designed for use with the new Mac Pro, makes on-set HDR monitoring, enabled by Colorfront systems, more cost effective.

Express Dailies 2020 supports 6K HDR/SDR workflow along with the very latest camera and editorial formats, including Apple ProRes and Apple ProRes RAW, ARRI MXF-wrapped ProRes, ARRI Alexa LF and Alexa Mini LF ARRIRAW, Sony Venice 5.0, Blackmagic RAW 1.5, and Codex HDE (High Density Encoding).

Express Dailies 2020 is optimized for 6K HDR/SDR dailies processing on the new Mac Pro running macOS Catalina, leveraging the performance of the Mac Pro’s 28-core Intel Xeon CPU and multi-GPU rendering.

“With the launch of the new Mac Pro and Apple Pro Display XDR, we identified a new opportunity to empower top-end DITs and dailies facilities to adopt HDR workflows on a wide range of high-end TV and motion picture productions,” says Aron Jaszberenyi, managing director of Colorfront. “When combined with the new Mac Pro and Pro Display XDR, the Express Dailies 2020 subscription model gives new and cost-effective options for filmmakers wanting to take full advantage of 6K HDR/SDR workflows and HDR on-set.”

 


Company 3 ups Jill Bogdanowicz to co-creative head, feature post  

Company 3 senior colorist Jill Bogdanowicz will now share the title of creative head, feature post with senior colorist Stephen Nakamura. In this new role she will collaborate with Nakamura working to foster communication among artists, operations and management in designing and implementing workflows to meet the ever-changing needs of feature post clients.

“Company 3 has been and will always be guided by artists,” says senior colorist/president Stefan Sonnenfeld. “As we continue to grow, we have been formalizing our intra-company communication to ensure that our artists communicate among themselves and with the company as a whole. I’m excited that Jill will be joining Stephen as a representative of our feature colorists. Her years of excellent work and her deep understanding of color science make her a perfect choice for this position.”

Among the kinds of issues Bogdanowicz and Nakamura will address: mentorship within the company, artist recruitment and training, and adapting to emerging workflows and client expectations.

Says Bogdanowicz, “As the company continues to expand, both in size and workload, I think it’s more important than ever to have Stephen and me in a position to provide guidance to help the features department grow efficiently while also maintaining the level of quality our clients expect. I intend to listen closely to clients and the other artists to make sure that their ideas and concerns are heard.”

Bogdanowicz has been a leading feature film colorist since the early 2000s. Recent work includes Joker, Spider-Man: Far From Home and Doctor Sleep, to name a few.


Storage for Color and Post

By Karen Moltenbrey

At nearly every phase of the content creation process, storage is at the center. Here we look at two post facilities whose projects continually push boundaries in terms of data, but through it all, their storage solution remains fast and reliable. One, Light Iron, juggles an average of 20 to 40 data-intensive projects at a time and must have a robust storage solution to handle its ever-growing work. Another, Final Frame, recently took on a project whose storage requirements were literally out of this world.

Amazon’s The Marvelous Mrs. Maisel

Light Iron
Light Iron provides a wide range of services, from dailies to post on feature films, indies and episodic shows, to color/conform/beauty work on commercials and short-form projects. The facility’s clients include Netflix, Amazon Studios, Apple TV+, ABC Studios, HBO, Fox, FX, Paramount and many more. Light Iron has been committed to evolving digital filmmaking techniques over the past 10 years and understands the importance of data availability throughout the pipeline. Having a storage solution that is reliable, fast and scalable is paramount to successfully servicing data-centric projects with an ever-growing footprint.

More than 100 full-time employees located at Light Iron’s Los Angeles and New York locations regularly access the company’s shared storage solutions. Both facilities are equipped for dailies and finishing, giving clients an option between its offices based on proximity. In New York, where space is at a premium, the company also offers offline editorial suites.

The central storage solution used at both locations is a Quantum StorNext file system along with a combination of network-attached and direct-attached storage. On the archive end, both sites use LTO-7 tapes for backing up before moving the data off the spinning-disk storage.

Lance Hayes, senior post production systems engineer, explains that the facility segments its storage into three tiers. “We structured our storage environment in a three-tiered model, with redundancy, flexibility and security in mind. We have our fast disks (tier one), which are fast volumes used primarily for playbacks in the rooms. Then there are deliverable volumes (tier two), where the focus is on the density of the storage. These are usually the destination for rendered files. And then, our nearline network-attached storage (tier three) is more for the deep storage, a holding pool before output to tape,” he explains.

Light Iron has been using Quantum as its de facto standard for the past several years. Founded in 2009, Light Iron has been on an aggressive growth trajectory and has evolved its storage strategy in response to client needs and technological advancement. Before installing its StorNext system, it managed with JBOD (“just a bunch of disks”) direct-attached storage on a very limited number of systems to service its staff of then-30-some employees, says Keenan Mock, senior media archivist at Light Iron. Light Iron, though, grew quickly, “and we realized we needed to invest in a full infrastructure,” he adds.

Lance Hayes

At Light Iron, work often starts with dailies, so the workflow teams interact with production to determine the cameras being used, the codecs being shot, the number of shoot days, the expected shooting ratio and so forth. Based on that information, the group determines which generation of LTO stock makes the most sense for the project (LTO-6 or LTO-7, with LTO-8 soon to be an option at the facility). “The industry standard, and our recommendation as well, is to create two LTO tapes per shoot day,” says Mock. Then, those tapes are geographically separated for safety.
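As a rough, back-of-the-envelope illustration of that planning step (not Light Iron’s actual tooling), the sketch below estimates how many tapes a shoot day consumes from an assumed camera data rate, using the published native capacities of recent LTO generations; the data rate, shoot hours and two-copy policy are the only inputs and are hypothetical.

```python
import math

# Native (uncompressed) capacities in terabytes for common LTO generations.
LTO_NATIVE_TB = {"LTO-6": 2.5, "LTO-7": 6.0, "LTO-8": 12.0}

def tapes_per_shoot_day(data_rate_gb_per_hour, shoot_hours, generation, copies=2):
    """Estimate how many LTO tapes a single shoot day consumes.

    data_rate_gb_per_hour and shoot_hours are rough production estimates;
    copies=2 reflects the two-tapes-per-day practice described above.
    """
    day_tb = (data_rate_gb_per_hour * shoot_hours) / 1000.0
    per_copy = math.ceil(day_tb / LTO_NATIVE_TB[generation])
    return per_copy * copies

# Hypothetical example: ~250 GB/hour of camera originals over a 10-hour day.
print(tapes_per_shoot_day(250, 10, "LTO-7"))  # -> 2 (one tape per copy)
print(tapes_per_shoot_day(250, 10, "LTO-6"))  # -> 2 (2.5 TB/day just fits on one tape per copy)
```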

In terms of working materials, the group generally restores only what is needed for each individual show from LTO tape, as opposed to keeping the entire show on spinning disk. “This allows us to use those really fast disks in a cost-effective way,” Hayes says.

Following the editorial process, Light Iron restores only the needed shots plus handles from tape directly to the StorNext SAN, so online editors can have immediate access. The material stays on the system while the conform and DI occur, followed by the creation of final deliverables, which are sent to the tier two and tier three spinning disk storage. If the project needs to be archived to tape, Mock’s department takes care of that; if it needs to be uploaded, that usually happens from the spinning discs.
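The “shots plus handles” restore can be pictured with a minimal sketch like the one below, using hypothetical frame numbers and a generic merge of overlapping ranges rather than Light Iron’s actual conform tooling.

```python
def restore_ranges(events, handles=24):
    """Turn used edit events into tape-restore frame ranges with handles.

    events: list of (source_in_frame, source_out_frame) pairs actually used
    in the cut; handles pads each side so the conform has trim room.
    Overlapping or touching ranges are merged so frames restore only once.
    """
    padded = sorted((max(0, a - handles), b + handles) for a, b in events)
    merged = []
    for start, end in padded:
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

# Hypothetical example: three events pulled from the same source clip.
print(restore_ranges([(100, 180), (200, 260), (900, 960)], handles=24))
# -> [(76, 284), (876, 984)]
```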

Light Iron’s FilmLight Baselight systems have local storage, which is used mainly as cache volumes to ensure sustained playback in the color suite. In addition, Blackmagic Resolve color correctors play back content directly from the SAN using tier-two storage.

Keenan Mock

Light Iron continually analyzes its storage infrastructure and reviews its options in terms of the latest technologies. Currently, the company considers its existing storage solution to be highly functional, though it is reviewing options for the latest versions of flash solutions from Quantum in 2020.

Based on the facility’s storage workflow, there’s minimal danger of maxing out the storage space anytime soon.

While Light Iron is religious about creating a duplicate set of tapes for backup, “it’s a very rare occurrence [for the duplicate to be needed],” notes Mock. “But it can happen, and in that circumstance, Light Iron is prepared.”

As for the shared storage, the datasets used in post, compared to other industries, are very large, “and without shared storage and a clustered file system, we wouldn’t be able to do the jobs we are currently doing,” Hayes notes.

Final Frame
With offices in New York City and London, Final Frame is a full-featured post facility offering a range of services, including DI of every flavor, 8mm to 70mm film scanning and restoration, offline editing, VFX, sound editing (theatrical and home Dolby Atmos) and mastering. Its work spans feature films, documentaries and television. The facility’s recent work on the documentary film Apollo 11, though, tested its infrastructure like no other, including the amount of storage space it required.

Will Cox

“A long time ago, we decided that for the backbone of all our storage needs, we were going to rely on fiber. We have a total of 55 edit rooms, five projection theaters and five audio mixing rooms, and we have fiber connectivity between all of those,” says Will Cox, CEO/supervising colorist. So, for the past 20 years, ever since 1Gb fiber became available, Final Frame has relied on this setup, though every five years or so, the shop has upgraded to the next level of fiber and is currently using 16Gb fiber.

“Storage requirements have increased because image data has increased and audio data has increased with Atmos. So, we’ve needed more storage and faster storage,” Cox says.

While the core of the system is fiber, the facility uses a variety of storage arrays, the bulk of which are 16Gb 4000 Series SAN offerings from Infortrend, totaling approximately 2PB of space. In addition, the studio uses 8Gb Promise Technology VTrak arrays, also totaling about 1PB. Additionally installed at the facility are some JetStor 8Gb offerings. For SAN management, Final Frame uses Tiger Technology’s Tiger Store.

Foremost in Cox’s mind when looking for a storage solution is interoperability, since Final Frame uses Linux, Mac and Windows platforms; reliability and fault tolerance are important as well. “We run RAID-6 and RAID-60 for pretty much everything,” he adds. “We also focus on how good the remote management is. We’ve brought online so much storage, we need the storage vendors to provide good interfaces so that our engineers and IT people can manage and get realtime feedback about the performance of the arrays and any faults that are creeping in, whether it’s due to failed drives or drives that are performing less than we had anticipated.”

Final Frame has also brought on a good deal more SSD storage. “We manage projects a bit differently now than we used to, where we have more tiered storage,” Cox adds. “We still do a lot of spinning discs, but SSD is moving in, and that is changing our workflows somewhat in that we don’t have to render as many files and as many versions when we have really fast storage. As a result, there’s some cost-savings on personnel at the workflow level when you have extremely fast storage.”

When working with clients who are doing offline editing, Final Frame will build an isolated SAN for them, and when it comes time to finish the project, whether it’s a picture or audio, the studio will connect its online and mixing rooms to that SAN. This setup is beneficial to security, Cox contends, as it accelerates the workflow since there’s no copying of data. However, aside from that work, everyone generally has parallel access to the storage infrastructure and can access it at any time.

More recently, in addition to other projects, Final Frame began working on Apollo 11, a film directed by Todd Douglas Miller. Miller wanted to rescan all the original negatives and all the original elements available from the Apollo 11 moon landing for a documentary film using audio and footage (16mm and 35mm) from NASA during that extraordinary feat. “He asked if we could make a movie just with the archival elements of what existed,” says Cox.

While ramping up and determining a plan of attack — Final Frame was going to scan the data at 4K resolution — NASA and NARA (National Archives and Records Administration) discovered a lost cache of archives containing 65mm and 70mm film.

“At that point, we decided that existing scanning technology wasn’t sufficient, and we’d need a film scanner to scan all this footage at 16K,” Cox adds, noting the company had to design and build an entirely new 16K film scanner and then build a pipeline that could handle all that data. “If you can imagine how tough 4K is to deal with, then think about 16K, with its insanely high data rates. And 8K is four times larger than 4K, and 16K is four times larger than 8K, so you’re talking about orders-of-magnitude increases in data.”
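Cox’s quadrupling math is easy to verify with a quick, purely illustrative calculation, assuming uncompressed RGB at 16 bits per channel and 24fps (not necessarily the format Final Frame scanned to):

```python
# Rough, illustrative numbers only: uncompressed RGB at an assumed 16 bits
# per channel and 24fps. Real scan formats, bit depths and compression differ.
RESOLUTIONS = {"4K": (4096, 2160), "8K": (8192, 4320), "16K": (16384, 8640)}
BYTES_PER_PIXEL = 3 * 2  # three channels x 2 bytes (16-bit)

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    frame_mb = pixels * BYTES_PER_PIXEL / 1e6
    rate_gb_s = frame_mb * 24 / 1000
    print(f"{name}: {pixels/1e6:6.1f} MP  {frame_mb:7.1f} MB/frame  {rate_gb_s:5.2f} GB/s @24fps")

# 4K : ~8.8 MP,   ~53 MB/frame,  ~1.3 GB/s
# 8K : ~35.4 MP (4x 4K),  ~212 MB/frame,  ~5.1 GB/s
# 16K: ~141.6 MP (4x 8K), ~849 MB/frame, ~20.4 GB/s
```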

Adding to the complexity, the facility had no idea how much footage it would be using. Ultimately, Final Frame considered its storage structure and the costs needed to take it to the next level for 16K scanning and determined that that amount of data was just too much to move and too much to store. “As it was, we filled up a little over a petabyte of storage just scanning the 8K material. We were looking at 4PB, quadrupling the amount of storage infrastructure needed. Then we would have had to run backups of everything, which would have increased it by another 4PB.”

Considering these factors, Final Frame changed its game plan and decided to scan at 8K. “So instead of 2PB to 2.5PB, we would have been looking at 8PB to 10PB of storage if we continued with our earlier plan, and that was really beyond what the production could tolerate,” says Cox.

Even scanning at 8K, the group had to have the data held in the central repository. “We were scanning in, doing what were essentially dailies, restoration and editorial, all from the same core set of media. Then, as editorial was still going on, we were beginning to conform and finish the film so we could make the Sundance deadline,” recalls Cox.

In terms of scans, copies and so forth, Final Frame stored about 2.5PB of data for that project. But in terms of data created and then destroyed, the amount of data was between 12PB and 15PB. To handle this load, the facility needed storage that could perform quickly, be very redundant and large. This led the company to bring on an additional 1PB of Fibre Channel SAN storage to add to the 1.5PB already in place — dedicated to just the Apollo 11 project. “We almost had to double the amount of storage infrastructure in the whole facility just to run this one project,” Cox points out. The additional storage was added in half-petabyte array increments, all connected to the SAN, all at 16Gb fiber.

While storage is important to any project, it was especially true for the Apollo 11 project due to the aggressive deadlines and excessively large storage needs. “Apollo 11 was a unique project. We were producing imagery that was being returned to the National Archives to be part of the historic record. Because of the significance of what we were scanning, we had to be very attentive to the longevity and accuracy of the media,” says Cox. “So, how it was being stored and where it was being stored were important factors on this project, more so than maybe any other project we’ve ever done.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.


Storage Roundtable

By Randi Altman

Every year in our special Storage Edition, we poll those who use storage and those who make storage. This year is no different. The users we’ve assembled for our latest offering weigh in on how they purchase gear and how they employ storage and cloud-based solutions. Storage makers talk about what’s to come from them, how AI and ML are affecting their tools, NVMe growth and more.

Enjoy…

Periscope Post & Audio, GM, Ben Benedetti

Periscope Post & Audio is a full-service post company with facilities in Hollywood and Chicago’s Cinespace. Both facilities provide a range of sound and picture finishing services for TV, film, spots, video games and other media.

Ben Benedetti

What types of storage are you using for your workflows?
For our video department, we have a large, high-speed Quantum media array supporting three color bays, two online edit suites, a dailies operation, two VFX suites and a data I/O department. The 15 systems in the video department are connected via 16Gb fiber.

For our sound department, we are using an Avid Nexis System via 6e Ethernet supporting three Atmos mix stages, two sound design suites, an ADR room and numerous sound-edit bays. All the CPUs in the facility are securely located in two isolated machine rooms (one for video on our second floor and one for audio on the first). All CPUs in the facility are tied via an IHSE KVM system, giving us incredible flexibility to move and deliver assets however our creatives and clients need them. We aren’t interested in being the biggest. We just want to provide the best and most reliable services possible.

Cloud versus on-prem – what are the pros and cons?
We are blessed with a robust pipe into our facility in Hollywood and are actively discussing potential cloud-based storage solutions with our engineering staff for the future. We are already using some cloud-based solutions for our building’s security system and CCTV systems, as well as the management of our firewall. But the concept of placing client intellectual property in the cloud sparks some interesting conversations. We always need immediate access to the raw footage and sound recordings of our client productions, so I sincerely doubt we will ever completely rely on a cloud-based solution for the storage of our clients’ original footage. We have many redundancy systems in place to avoid slowdowns in production workflows. This is so critical. Any potential interruption in connectivity that is beyond our control gives me great pause.

How often are you adding or upgrading your storage?
Obviously, we need to be as proactive as we can so that we are never caught unready to take on projects of any size. It involves continually ensuring that our archive system is optimized correctly and requires our data management team to constantly analyze available space and resources.

How do you feel about the use of ML/AI for managing assets?
Any AI or ML automated process that helps us monitor our facility is vital. Technology advancements over the past decade have allowed us to achieve amazing efficiencies. As a result, we can give the creative executives and storytellers we service the time they need to realize their visions.

What role might the different tiers of cloud storage play in the lifecycle of an asset?
As we have facilities in both Chicago and Hollywood, our ability to take advantage of Google cloud-based services for administration has been a real godsend. It’s not glamorous, but it’s extremely important to keeping our facilities running at peak performance.

The level of coordination we have achieved in that regard has been tremendous. Those low-tiered storage systems provide simple and direct solutions to our administrative and accounting needs, but when it comes to the high-performance requirements of our facility’s color bays and audio rooms, we still rely on the high-speed on-premises storage solutions.

For simple archiving purposes, a cloud-based solution might work very well, but for active work currently in production … we are just not ready to make that leap … yet. Of course, given Moore’s Law and the exponential advancement of technology, our position could change rapidly. The important thing is to remain open and willing to embrace change as long as it makes practical sense and never puts your client’s property at risk.

Panasas, Storage Systems Engineer, RW Hawkins

RW Hawkins

Panasas offers a scalable high-performance storage solution. Its PanFS parallel file system, delivered on the ActiveStor appliance, accelerates data access for VFX feature production, Linux-based image processing, VR/AR and game development, and multi-petabyte sized active media archives.

What kind of storage are you offering, and will that be changing in the coming year?
We just announced that we are now shipping the next generation of the PanFS parallel file system on the ActiveStor Ultra turnkey appliance, which is already in early deployment with five customers.

This new system offers unlimited performance scaling in 4GB/s building blocks. It uses multi-tier intelligent data placement to maximize storage performance by placing metadata on low-latency NVMe SSDs, small files on high IOPS SSDs and large files on high-bandwidth HDDs. The system’s balanced-node architecture optimizes networking, CPU, memory and storage capacity to prevent hot spots and bottlenecks, ensuring high performance regardless of workload. This new architecture will allow us to adapt PanFS to the ever-changing variety of workloads our customers will face over the next several years.
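The placement policy described above can be sketched conceptually as a simple routing decision. This is only an illustration of the idea, not Panasas’ actual PanFS logic; the size cutoff and tier names are assumptions.

```python
# Conceptual only -- not Panasas' actual placement logic. Illustrates the idea
# of routing data to media by latency/size profile, as described above.
def place(kind, size_bytes, small_file_cutoff=64 * 1024):
    """Pick a storage tier for a write, by data kind and size."""
    if kind == "metadata":
        return "NVMe SSD"          # low latency for directory/attribute ops
    if size_bytes <= small_file_cutoff:
        return "high-IOPS SSD"     # lots of small random reads/writes
    return "high-bandwidth HDD"    # large sequential media files

print(place("metadata", 512))        # NVMe SSD
print(place("file", 16 * 1024))      # high-IOPS SSD
print(place("file", 8 * 1024**3))    # high-bandwidth HDD
```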

Are certain storage tiers more suitable for different asset types, workflows, etc.?
Absolutely. However, too many tiers can lead to frustration around complexity, loss of productivity and poor reliability. We take a hybrid approach, whereby each server has multiple types of storage media internal to one server. Using intelligent data placement, we put data on the most appropriate tier automatically. Using this approach, we can often replace a performance tier and a tier two active archive with one cost-effective appliance. Our standard file-based client makes it easy to gateway to an archive tier such as tape or an object store like S3.

What do you see are the big technology trends that can help storage for M&E? ML? AI?
AI/ML is so widespread, it seems to be all encompassing. Media tools will benefit greatly because many of the mundane production tasks will be optimized, allowing for more creative freedom. From a storage perspective, machine learning is really pushing performance in new directions; low latency and metadata performance are becoming more important. Large amounts of unstructured data with rich metadata are the norm, and today’s file systems need to adapt to meet these requirements.

How has NVMe advanced over the past year?
Everyone is taking notice of NVMe; it is easier than ever to build a fast array and connect it to a server. However, there is much more to making a performant storage appliance than just throwing hardware at the problem. My customers are telling me they are excited about this new technology but frustrated by the lack of scalability, the immaturity of the software and the general lack of stability. The proven way to scale is to build a file system on top of these fast boxes and connect them into one large namespace. We will continue to augment our architecture with these new technologies, all the while keeping an eye on maintaining our stability and ease of management.

Do you see NAS overtaking SAN for larger work groups? How about cloud taking on some of what NAS used to do?
Today’s modern NAS can take on all the tasks that historically could only be done with SAN. The main thing holding back traditional NAS has been the client access protocol. With network-attached parallel clients, like Panasas’ DirectFlow, customers get advanced client caching, full POSIX semantics and massive parallelism over standard ethernet.

Regarding cloud, my customers tell me they want all the benefits of cloud (data center consolidation, inexpensive power and cooling, ease of scaling) without the vendor lock-in and metered data access of the “big three” cloud providers. A scalable parallel file system forms the core of a private cloud model that yields the benefits without the drawbacks. File-based access to the namespace will continue to be required for most non-web-based applications.

Goldcrest Post, New York, Technical Director, Ahmed Barbary

Goldcrest Post is an independent post facility, providing solutions for features, episodic TV, docs, and other projects. The company provides editorial offices, on-set dailies, picture finishing, sound editorial, ADR and mixing, and related services.

Ahmed Barbary

What types of storage are you using for your workflows?
Storage performance in the post stage is tremendously demanding. We are using multiple SAN systems in office locations that provide centralized storage and easy access to disk arrays, servers, and other dedicated playout applications to meet storage needs throughout all stages of the workflow.

While backup refers to duplicating the content for peace of mind, short-term retention, and recovery, archival signifies transferring the content from the primary storage location to long-term storage to be preserved for weeks, months, and even years to come. Archival storage needs to offer scalability, flexible and sustainable pricing, as well as accessibility for individual users and asset management solutions for future projects.

LTO has been a popular choice for archival storage for decades because of its affordable, high-capacity solutions with low write/high read workloads that are optimal for cold storage workflows. The increased need for instant access to archived content today, coupled with the slow roll-out of LTO-8, has made tape a less favorable option.

Cloud versus on-prem – what are the pros and cons?
The fact is each option has its positives and negatives, and understanding that and determining how both cloud and on-premises software fit into your organization are vital. So, it’s best to be prepared and create a point-by-point comparison of both choices.

When looking at the pros and cons of cloud vs. on-premises solutions, everything starts with an understanding of how these two models differ. With a cloud deployment, the vendor hosts your information and offers access through a web portal. This enables more mobility and flexibility of use for cloud-based software options. When looking at an on-prem solution, you are committing to local ownership of your data, hardware, and software. Everything is run on machines in your facility with no third-party access.

How often are you adding or upgrading your storage?
We keep track of new technologies and continuously upgrade our systems, but when it comes to storage, it’s a huge expense. When deploying a new system, we do our best to future-proof and ensure that it can be expanded.

How do you feel about the use of ML/AI for managing assets?
For most M&E enterprises, the biggest potential of AI lies in automatic content recognition, which can drive several path-breaking business benefits. For instance, most content owners have thousands of video assets.

Cataloging, managing, processing, and re-purposing this content typically requires extensive manual effort. Advancements in AI and ML algorithms have now made it possible to drastically cut down the time taken to perform many of these tasks. But there is still a lot of work to be done — especially as ML algorithms need to be trained, using the right kind of data and solutions, to achieve accurate results.

What role might the different tiers of cloud storage play in the lifecycle of an asset?
Data sets have unique lifecycles. Early in the lifecycle, people access some data often, but the need for access drops drastically as the data ages. Some data stays idle in the cloud and is rarely accessed once stored. Some data expires days or months after creation, while other data sets are actively read and modified throughout their lifetimes.
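One generic way to express that kind of lifecycle in public cloud object storage is a lifecycle rule that steps data down through colder tiers as it ages. The sketch below uses AWS S3 lifecycle configuration via boto3 purely as an example; the bucket, prefixes and schedule are hypothetical and are not tied to Goldcrest’s setup.

```python
import boto3

# Hypothetical bucket/prefixes; the storage classes and rule shape are standard S3,
# but the tiering schedule here is only an example, not a recommendation.
s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="example-post-archive",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "age-out-conformed-masters",
                "Filter": {"Prefix": "masters/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},    # cooling off
                    {"Days": 180, "StorageClass": "GLACIER"},       # rarely read
                    {"Days": 365, "StorageClass": "DEEP_ARCHIVE"},  # long-term hold
                ],
            },
            {
                "ID": "expire-temp-renders",
                "Filter": {"Prefix": "temp/"},
                "Status": "Enabled",
                "Expiration": {"Days": 14},  # short-lived working files
            },
        ]
    },
)
```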

Rohde & Schwarz, Product Manager, Storage Solutions, Dirk Thometzek

Rohde & Schwarz offers broadcast and media solutions to help companies grow in media production, management and delivery in the IP and wireless age.

Dirk Thometzek

What kind of storage are you offering, and will that be changing in the coming year?
The industry is constantly changing, so we monitor market developments and key demands closely. We will be adding new features to the R&S SpycerNode in the next few months that will enable our customers to get their creative work done without focusing on complex technologies. The R&S SpycerNode will be extended with JBODs, which will allow seamless integration with our erasure coding technology, guaranteeing complete resilience and performance.

Are certain storage tiers more suitable for different asset types, workflows, etc.?
Each workflow is different; consequently, almost no two systems are alike. The real artistry is to tailor storage systems according to real requirements without over-provisioning hardware or over-stressing budgets. Using different tiers can be very helpful to build effective systems, but they might introduce additional difficulties to the workflows if the system isn’t properly designed.

Rohde & Schwarz has developed R&S SpycerNode in a way that its performance is linear and predictable. Different tiers are aggregated under a single namespace, and our tools allow seamless workflows while complexity remains transparent to the users.

What do you see are the big technology trends that can help storage for M&E? ML? AI?
Machine learning and artificial intelligence can be helpful to automate certain tasks, but they will not replace human intervention in the short term. It might not be helpful to enrich media with too much data because doing so could result in imprecise queries that return far too much content.

However, clearly defined changes in sequences or reoccurring objects — such as bugs and logos — can be used as a trigger to initiate certain automated workflows. Certainly, we will see many interesting advances in the future.

How has NVMe advanced over the past year?
NVMe has very interesting aspects. Data rates and reduced latencies are admittedly quite impressive and are garnering a lot of interest. Unfortunately, we do see a trend inside our industry to be blinded by pure performance figures and exaggerated promises without considering hardware quality, life expectancy or proper implementation. Additionally, if well-designed and proven solutions exist that are efficient enough, then it doesn’t make sense to embrace a technology just because it is available.

R&S is dedicated to bringing high-end devices to the M&E market. We think that reliability and performance build the foundation for user-friendly products. Next year, we will update the market on how NVMe can be used in the most efficient way within our products.

Do you see NAS overtaking SAN for larger work groups? How about cloud taking on some of what NAS used to do?
We definitely see a trend away from classic Fibre Channel to Ethernet infrastructures for various reasons. For many years, NAS systems have been replacing central storage systems based on SAN technology for a lot of workflows. Unfortunately, standard NAS technologies will not support all necessary workflows and applications in our industry. Public and private cloud storage systems play an important role in overall concepts, but they can’t fulfill all media production requirements or ease workflows by default. Plus, when it comes to subscription models, [sometimes there could be unexpected fees]. In fact, we do see quite a few customers returning to their previous services, including on-premises storage systems such as archives.

When it comes to the very high data rates necessary for high-end media productions, NAS will relatively quickly reach its technical limits. Only block-level access can deliver the reliable performance necessary for uncompressed productions at high frame rates.

That does not necessarily mean Fibre Channel is the only solution. The R&S SpycerNode, for example, features a unified 100Gb/s Ethernet backbone, wherein clients and the redundant storage nodes are attached to the same network. This allows the clients to access the storage over industry-leading NAS technology or native block level while enabling true flexibility using state-of-the-art technology.

MTI Film, CEO, Larry Chernoff

Hollywood’s MTI Film is a full-service post facility, providing dailies, editorial, visual effects, color correction, and assembly for film, television, and commercials.

Larry Chernoff

What types of storage are you using for your workflows?
MTI uses a mix of spinning disks and SSDs. Our volumes range from 700TB to 1,000TB and are assigned to projects depending on the volume of expected camera files. The SSD volumes are substantially smaller and are used to play back ultra-large-resolution files when several users need access to the same file.

Cloud versus on-prem — what are the pros and cons?
MTI only uses on-prem storage at the moment due to the real-time, full-resolution nature of our playback requirements. There is certainly a place for cloud-based storage but, as a finishing house, it does not apply to most of our workflows.

How often are you adding or upgrading your storage?
We are constantly adding storage to our facility and have added or replaced storage every year for the last five years. We now have more than 8PB, with plans for more in the future.

How do you feel about the use of ML/AI for managing assets?
Sounds like fun!

What role might the different tiers of cloud storage play in the lifecycle of an asset?
For a post house like MTI, we consider cloud storage to be used only for “deep storage” since our bandwidth needs are very high. The amount of Internet connectivity we would require to replicate the workflows we currently have using on-prem storage would be prohibitively expensive for a facility such as MTI. Speed and ease of access is critical to being able to fulfill our customers’ demanding schedules.

OWC, Founder/CEO, Larry O’Connor

Larry O’Connor

OWC offers storage, connectivity, software, and expansion solutions designed to enhance, accelerate, and extend the capabilities of Mac- and PC-based technology. Their products range from the home desktop to the enterprise rack to the audio recording studio to the motion picture set and beyond.

What kind of storage are you offering, and will that be changing in the coming year?
OWC will be expanding our Jupiter line of NAS storage products in 2020 with an all-new external flash-based array. We will also be launching the OWC ThunderBay Flex 8, a three-in-one Thunderbolt 3 storage, docking, and PCIe expansion solution for digital imaging, VFX, video production, and video editing.

Are certain storage tiers more suitable for different asset types, workflows, etc.?
Yes. SSD and NVMe are better for on-set storage and editing. Once you are finished and looking to archive, HDDs are a better solution for long-term storage.

What do you see as the big technology trends that can help storage for M&E? ML? AI?
We see U.2 SSDs as a trend that can help storage in this space, along with solutions that allow external docking of U.2 drives across different workflow needs.

How has NVMe advanced over the past year?
We have seen NVMe technology become higher in capacity, higher in performance and substantially lower in power draw. Yet even with all the improving performance, costs are lower today than they were 12 months ago.

Do you see NAS overtaking SAN for larger work groups? How about cloud taking on some of what NAS used to do?
I see both still having their place — I can’t speak to whether one will overtake the other. SANs provide other services that typically go hand in hand with M&E needs.

As for cloud, I can see some more cloud coming in, but for M&E on-site needs, it just doesn’t compete anywhere near with what the data rate demand is for editing, etc. Everything independently has its place.

EditShare, VP of Product Management, Sunil Mudholkar

EditShare offers a range of media management solutions, from ingest to archive with a focus on media and entertainment.

Sunil Mudholkar

What kind of storage are you offering and will that be changing in the coming year?
EditShare currently offers RAID and SSD, along with our nearline SATA HDD-based storage. We are on track to deliver NVMe- and cloud-based solutions in the first half of 2020. The latest major upgrade of our file system and management console, EFS2020, enables us to migrate to emerging technologies, including cloud deployment and using NVMe hardware.

EFS can manage and use multiple storage pools, enabling clients to use the most cost-effective tiered storage for their production, all while keeping that single namespace.

Are certain storage tiers more suitable for different asset types, workflows, etc.?
Absolutely. It’s clearly financially advantageous to have varying performance tiers of storage that are in line with the workflows the business requires. This also extends to the cloud, where we are seeing public cloud-based solutions augment or replace both high-performance and long-term storage needs. Tiered storage enables clients to be at their most cost-effective by including parking storage and cloud storage for DR, while keeping SSD and NVMe storage ready and primed for their high-end production.

What do you see as the big technology trends that can help storage for M&E? ML? AI?
AI and ML can benefit storage through things like algorithms designed to automatically move content between storage tiers to optimize costs. This has been commonplace on the distribution side of the ecosystem for a long time with CDNs. ML and AI can also greatly impact the opex side of asset management and metadata by helping to automate very manual, repetitive data-entry tasks through audio and image recognition, as an example.

AI can also assist by removing mundane human-centric repetitive tasks, such as logging incoming content. AI can assist with the growing issue of unstructured and unmanaged storage pools, enabling the automatic scanning and indexing of every piece of content located on a storage pool.

How has NVMe advanced over the past year?
As with any other storage medium, when it’s first introduced there are limited use cases that make sense financially, and only a certain few can afford to deploy it. As the technology scales and changes in form factor, and pricing becomes more competitive and in line with other storage options, it can become more mainstream. This is what we are starting to see with NVMe.

Do you see NAS overtaking SAN for larger work groups? How about cloud taking on some of what NAS used to do?
Yes, NAS has overtaken SAN. It’s easier technology to deal with — this is fairly well acknowledged. It’s also easier to find people/talent with experience in NAS. Cloud will start to replace more NAS workflows in 2020, as we are already seeing today. For example, our ACL media spaces project options within our management console were designed for SAN clients migrating to NAS. They liked the granular detail that SAN offered, but wanted to migrate to NAS. EditShare’s ACL enables them to work like a SAN but in a NAS environment.

Zoic Studios, CTO, Saker Klippsten

Zoic Studios is an Emmy-winning VFX company based in Culver City, California, with sister offices in Vancouver and NYC. It creates computer-generated special effects for commercials, films, television and video games.

Saker Klippsten

What types of projects are you working on?
We work on a range of projects for series, film, commercial and interactive games (VR/AR). Most of the live-action projects are mixed with CG/VFX and some full-CG animated shots. In addition, there is typically some form of particle or fluid effects simulation going on, such as clouds, water, fire, destruction or other surreal effects.

What types of storage are you using for those workflows?
Cryogen – Off-the-shelf tape/disk/chip. Access time: > 1 day. Mostly tape-based and completely offline, which requires human intervention to load tapes or restore from drives.
Freezing – Tape robot library. Access time: < 0.5 day. Tape-based and in the robot; does not require human intervention.
Cold – Spinning disk. Access time: slow (online). Disaster recovery and long-term archiving.
Warm – Spinning disk. Access time: medium (online). Data that still needs to be accessed promptly and transferred quickly (asset depot).
Hot – Chip-based. Access time: fast (online). SSD generic active production storage.
Blazing – Chip-based. Access time: uber fast (online). NVMe dedicated storage for 4K and 8K playback, databases and specific simulation workflows.
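To make the tiering idea concrete, here is a minimal sketch in Python of how a policy like the one Klippsten describes might be encoded as a simple lookup plus an age-based rule. It is purely illustrative — the tier names, descriptions and thresholds below are assumptions for the example, not Zoic’s actual tooling.

# Purely illustrative: encode the tier taxonomy above as data, then pick a
# tier with a toy age-based rule. Names and thresholds are assumptions.
TIERS = {
    "cryogen": "offline tape/disk/chip, access > 1 day (deep archive)",
    "freezing": "tape robot library, access < 0.5 day (no human intervention)",
    "cold": "spinning disk, slow online access (DR, long-term archive)",
    "warm": "spinning disk, medium online access (asset depot)",
    "hot": "SSD, fast online access (active production)",
    "blazing": "NVMe, dedicated 4K/8K playback, databases, simulation",
}

def pick_tier(days_since_last_access: float) -> str:
    # Toy demotion rule: the older the asset, the colder the tier. A real
    # policy would also weigh cost, project status and delivery deadlines.
    if days_since_last_access < 7:
        return "hot"
    if days_since_last_access < 30:
        return "warm"
    if days_since_last_access < 180:
        return "cold"
    return "freezing"

print(pick_tier(3), "->", TIERS[pick_tier(3)])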

Cloud versus on-prem – what are the pros and cons?
The great debate! I tend not to look at it as pro vs. con, but rather where you are as a company. Many factors are involved, and despite what many are led to believe, there is no one size that fits all; neither cloud nor on-prem alone can solve all your workflow and business challenges.

Cinemax’s Warrior (Credit: HBO/David Bloomer)

There are workflows that are well suited to the cloud and others that are potentially cost-prohibitive for a number of reasons, such as the size of the data sets being generated. Dynamic cache simulations are a good example; they can quickly generate tens or sometimes hundreds of TBs. If the workflow requires you to transfer this data on premises for review, it could take a very long time. Other workflows, such as 3D CG-generated data, can take better advantage of the cloud. They typically have small source file payloads that need to be uploaded and then only require final frames to be downloaded, which is much more manageable. Depending on the size of your company and the level of technical people on hand, the cloud can be a problem.

What triggers buying more storage in your shop?
Storage tends to be one of the largest and most significant purchases at many companies. End users do not have a clear concept of what happens at the other end of the wire from their workstation.

All they know is that there is never enough storage and it’s never fast enough. Not investing in the right storage can not only be detrimental to the delivery and production of a show, but also to the mental focus and health of the end users. If artists are constantly having to stop and clean up/delete, it takes them out of their creative rhythm and slows down task completion.

If the storage is not performing properly and is slow, this will not only have an impact on delivery, but the end user might also be afraid of being perceived as slow. So what goes into buying more storage? What type of impact will buying more storage have on the various workflows and pipelines? Remember, if you are a mature company, you are buying 2TB of storage for every 1TB required so that, for DR purposes, you have a complete, up-to-the-hour backup.

Do you see ML/AI as important to your content strategy?
We have been using various layers of ML and heuristics sprinkled throughout our content workflows and pipelines. As an example, we look at the storage platforms we use to understand what’s on our storage, how and when it’s being used, what it’s being used for and how it’s being accessed. We look at the content to see what it contains and its characteristics. What are the overall costs to create that content? What insights can we learn from it for similarly created content? How can we reuse assets to be more efficient?

Dell Technologies, CTO, Media & Entertainment, Thomas Burns

Thomas Burns

Dell offers technologies across workstations, displays, servers, storage, networking and VMware, and partnerships with key media software vendors to provide media professionals the tools to deliver powerful stories, faster.

What kind of storage are you offering, and will that be changing in the coming year?
Dell Technologies offers a complete range of storage solutions from Isilon all-flash and disk-based scale-out NAS to our object storage, ECS, which is available as an appliance or a software-defined solution on commodity hardware. We have also developed and open-sourced Pravega, a new storage type for streaming data (e.g. IoT and other edge workloads), and continue to innovate in file, object and streaming solutions with software-defined and flexible consumption models.

Are certain storage tiers more suitable for different asset types, workflows, etc.?
Intelligent tiering is crucial to building a post and VFX pipeline. Today’s global pipelines must include software that distinguishes between hot data on the fastest tier and cold or versioned data on less performant tiers, especially in globally distributed workflows. Bringing applications to the media rather than unnecessarily moving media into a processing silo is the key to an efficient production.

What do you see as the big technology trends that can help storage for M&E? ML? AI?
New developments in storage class memory (SCM) — including the use of carbon nanotubes to create a nonvolatile, standalone memory product with speeds rivaling DRAM without needing battery backup — have the potential to speed up media workflows and eliminate AI/ML bottlenecks. New protocols such as NVMe allow much deeper I/O queues, overcoming today’s bus bandwidth limits.

GPUDirect enables direct paths between GPUs and network storage, bypassing the CPU for lower latency access to GPU compute — desirable for both M&E and AI/ML applications. Ethernet mesh, a.k.a. Leaf/Spine topologies, allow storage networks to scale more flexibly than ever before.

How has NVMe advanced over the past year?
Advances in I/O virtualization make NVMe useful in hyper-converged infrastructure, by allowing different virtual machines (VMs) to share a single PCIe hardware interface. Taking advantage of multi-stream writes, along with vGPUs and vNICs, allows talent to operate more flexibly as creative workstations start to become virtualized.

Do you see NAS overtaking SAN for larger work groups? How about cloud taking on some of what NAS used to do?
IP networks scale much better than any other protocol, so NAS allows on-premises workloads to be managed more efficiently than SAN. Object stores (the basic storage type for cloud services) support elastic workloads extremely well and will continue to be an integral part of public, hybrid and private cloud media workflows.

ATTO, Manager, Products Group, Peter Donnelly

ATTO network and storage connectivity products are purpose-made to support all phases of media production, from ingest to final archiving. ATTO offers an ecosystem of high-performance connectivity adapters, network interface cards and proprietary software.

Peter Donnelly

What kind of storage are you offering, and will that be changing in the coming year?
ATTO designs and manufactures storage connectivity products, and although we don’t manufacture storage, we are a critical part of the storage ecosystem. We regularly work with our customers to find the best solutions to their storage workflow and performance challenges.

ATTO designs products that use a wide variety of storage protocols. SAS, SATA, Fibre Channel, Ethernet and Thunderbolt are all part of our core technology portfolio. We’re starting to see more interest in NVMe solutions. While NVMe has already seen some solid growth as an “inside-the-box” storage solution, scalability, cost and limited management capabilities continue to limit its adoption as an external storage solution.

Data protection is still an important criterion in every data center. We are seeing a shift from traditional hardware RAID and parity RAID to software RAID and parity code implementations. Disk capacity has grown so quickly that it can take days to rebuild a RAID group with hardware controllers. Instead, we see our customers taking advantage of rapidly dropping storage prices and using faster, reliable software RAID implementations with basic HBA hardware.

How has NVMe advanced over the past year?
For inside-the-box storage needs, we have absolutely seen adoption skyrocket. It’s hard to beat the price-to-performance ratio of NVMe drives for system boot, application caching and similar use cases.

ATTO is working independently and with our ecosystem partners to bring those same benefits to shared, networked storage systems. Protocols such as NVMe-oF and FC-NVMe are enabling technologies that are starting to mature, and we see these getting further attention in the coming year.

Do you see NAS overtaking SAN for larger work groups? How about cloud taking on some of what NAS used to do?
We see customers looking for ways to more effectively share storage resources. Acquisition and ongoing support costs, as well as the ability to leverage existing technical skills, seem to be important factors pulling people toward Ethernet-based solutions.
However, there is no free lunch, and these same customers aren’t able to compromise on performance and latency concerns, which are important reasons why they used SANs in the first place. So there’s a lot of uncertainty in the market today. Since we design and market products in both the NAS and SAN spaces, we spend a lot of time talking with our customers about their priorities so that we can help them pick the solutions that best fit their needs.

Masstech, CTO, Mike Palmer

Masstech creates intelligent storage and asset lifecycle management solutions for the media and entertainment industry, focusing on broadcast and video content storage management with IT technologies.

Mike Palmer

What kind of storage are you offering, and will that be changing in the coming year?
Masstech products are used to manage any combination of storage types and tiers, on-premises and in the cloud. Masstech allows content to move without friction across and through all of these technologies, most often using automated workflows and unified interfaces that hide the complexity otherwise required to directly manage content across so many different types of storage.

Are certain storage tiers more suitable for different asset types, workflows, etc.?
One of the benefits of having such a wide range of storage technologies to choose from is that we have the flexibility to match application requirements with the optimum performance characteristics of different storage technologies in each step of the lifecycle. Users now expect that content will automatically move to storage with the optimal combination of speed and price as it progresses through workflow.

In the past, HSM was designed to handle this task for on-prem storage. The challenge is much wider now with the addition of a plethora of storage technologies and services. Rather than moving between just two or three tiers of on-prem storage, content now often needs to flow through a hybrid environment of on-prem and cloud storage, often involving multiple cloud services, each with three or four sub-tiers. Making that happen in a seamless way, both to users and to integrated MAMs and PAMs, is what we do.

What do you see as the big technology trends that can help storage for M&E?
Cloud storage pricing continues to drop, along with advances in storage density in both spinning disk and solid state. All of these are interrelated and have the general effect of lowering costs for the end user. For those with specific business requirements that drive on-prem storage, the availability of higher-density tape and optical disks is enabling petabytes of very efficient cold storage in less than a single rack’s worth of space.

How has NVMe advanced over the past year?
In addition to the obvious application of making media available more quickly, the greatest value of NVMe within M&E may be found in enabling faster search of both structured and unstructured metadata associated with media. Yes, we need faster access to media, but in many cases we must first find the media before it can be accessed. NVMe can make that search experience, particularly for large libraries, federated data sets and media lakes, lightning quick.

Do you see NAS overtaking SAN for larger workgroups? How about cloud taking on some of what NAS used to do?
Just as AWS, Azure and Wasabi, among other large players, have replaced many instances of on-prem NAS, so have Box, Dropbox, Google Drive and iCloud replaced many (but not all) of the USB drives gathering dust in the bottom of desk drawers. As NAS is built on top of faster and faster performing technologies, it is also beginning to put additional pressure on SAN – particularly for users who are sensitive to price and the amount of administration required.

Backblaze, Director of Product Marketing, M&E, Skip Levens

Backblaze offers easy-to-use cloud backup, archive and storage services. With over 12 years of experience and more than 800 petabytes of customer data under management, Backblaze provides cloud storage to anyone looking to create, distribute and preserve their content forever.

What kind of storage are you offering and will that be changing in the coming year?
At Backblaze, we offer a single class, or tier, of storage where everything’s active and immediately available wherever you need it, and it’s protected better than it would be on spinning disk or RAID systems.

Skip Levens

Are certain storage tiers more suitable for different asset types, workflows, etc.?
Absolutely. For example, animators need different storage than a team of editors all editing a 4K project at the same time. And keeping your entire content library on your shared storage could get expensive indeed.

We’ve found that users can give up all that unneeded complexity and cost that gets in the way of creating content in two steps:
– Step one is getting off of the “shared storage expansion treadmill” and buying just enough on-site shared storage that fits your team. If you’re delivering a TV show every week and need a SAN, make it just large enough for your work in process and no larger.

– Step two is to get all of your content into active cloud storage. This not only frees up space on your shared storage, but makes all of your content highly protected and highly available at the same time. Since most of your team probably use MAM to find and discover content, the storage that assets actually live on is completely transparent.

Now life gets very simple for creative support teams managing that workflow: your shared storage stays fast and lean, and you can stop paying for storage that doesn’t fit that model. This could include getting rid of LTO, big JBODs or anything with a limited warranty and a maintenance contract.

What do you see as the big technology trends that can help storage for M&E?
For shooters and on-set data wranglers, the new class of ultra-fast flash drives dramatically speeds up collecting massive files with extremely high resolution. Of course, raw content isn’t safe until it’s ingested, so even after moving shots to two sets of external drives or a RAID cart, we’re seeing cloud archive on ingest. Uploading files from a remote location, before you get all the way back to the editing suite, unlocks a lot of speed and collaboration advantages — the content is protected faster, and your ingest tools can start making proxy versions that everyone can start working on, such as grading, commenting, even rough cuts.
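As a rough illustration of the “protect it as soon as it’s offloaded” idea described above, the sketch below checksums each file on a camera card and hands it to an uploader. Everything here is a hypothetical example — upload_to_cloud() is a stand-in for whatever object-storage client a facility actually uses, and the card path is made up; this is not Backblaze’s own tooling.

# Illustrative only: verify-and-archive on ingest. upload_to_cloud() is a
# hypothetical placeholder for a real object-storage client.
import hashlib
from pathlib import Path

def sha256(path: Path, chunk: int = 8 * 1024 * 1024) -> str:
    # Hash in chunks so large camera files never need to fit in memory.
    h = hashlib.sha256()
    with path.open("rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def upload_to_cloud(path: Path, checksum: str) -> None:
    # Stand-in for a real uploader; a production tool would retry and
    # verify the checksum against what the storage service reports back.
    print(f"uploading {path.name} (sha256 {checksum[:12]}...)")

def archive_on_ingest(card_dir: str) -> None:
    card = Path(card_dir)
    if not card.is_dir():
        print(f"no card mounted at {card}")
        return
    for clip in sorted(p for p in card.rglob("*") if p.is_file()):
        upload_to_cloud(clip, sha256(clip))

archive_on_ingest("/Volumes/CARD_A001")  # hypothetical offload path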

We’re also seeing cloud-delivered workflow applications. The days of buying and maintaining a server and storage in your shop to run an application may seem old-fashioned, especially when that entire experience can now be delivered from the cloud, on demand.

Iconik, for example, is a complete, personalized deployment of a project collaboration, asset review and management tool – but it lives entirely in the cloud. When you log in, your app springs to life instantly in the cloud, so you only pay for the application when you actually use it. Users just want to get their creative work done and can’t tell it isn’t a traditional asset manager.

How has NVMe advanced over the past year?
NVMe means flash storage can completely ditch legacy storage controllers like the ones on traditional SATA hard drives. When you can fit 2TB of storage on a stick that’s only 22 millimeters by 80 millimeters — not much larger than a stick of gum — and it’s 20 times faster than an external spinning hard drive while drawing only a few watts, that’s a game changer for data wrangling and camera cart offload right now.

And that’s on PCIe 3. The PCI Express standard is evolving faster and faster too. PCIe 4 motherboards are starting to come online now, PCIe 5 was finalized in May, and PCIe 6 is already in development. When every generation doubles the available bandwidth that can feed that NVMe storage, the future is very, very bright for NVMe.

Do you see NAS overtaking SAN for larger workgroups? How about cloud taking on some of what NAS used to do?
For users who work in widely distributed teams, the cloud is absolutely eating NAS. When the solution driving your team’s projects and collaboration is the dashboard and focus of the team — and active cloud storage seamlessly supports all of the content underneath — it no longer needs to be on a NAS.

But for large teams that do fast-paced editing and creation, the answer to “what is the best shared storage for our team” is still usually a SAN, or tightly-coupled, high-performance NAS.

Either way, by moving content and project archives to the cloud, you can keep SAN and NAS costs in check and have a more productive workflow, and more opportunities to use all that content for new projects.


Quick Chat: The Rebel Fleet’s Michael Urban talks on-set workflows

When shooting major motion pictures and episodic television with multiple crews in multiple locations, production teams need a workflow that gives them fast access and complete control of the footage across the entire production, from the first day of the shoot to the last day of post. This is Wellington, New Zealand-based The Rebel Fleet’s reason for being.

What exactly do they do? Well, we reached out to managing director Michael Urban to find out.

Can you talk more about what you do and what types of workflows you supply?
The Rebel Fleet supplies complete workflow solutions, from on-set Qtake video assist and DIT to dailies, QC, archive and delivery to post. By managing the entire workflow, we can provide consistency and certainty around the color pipeline, monitor calibration, crew expertise and communication, and production can rely on one team to take care of that part of the workflow.

We have worked closely with Moxion many times and use its Immediates workflow, which enables automated uploads direct from video assist into its secure dailies platform. Anyone with access to the project can view rushes and metadata from set moments after the video is shot. This also enables different shooting units to automatically and securely share media. Two units shooting in different countries can see what each other has shot, including all camera and scene/take metadata. This is then available and catalogued directly into the video assist system. We have a lot of experience working alongside camera and VFX on-set as well as delivering to post, making sure we are delivering exactly what’s needed in the right formats.

You recently worked on a film that was shot in New Zealand and China, and you sent crews to China. Can you talk about that workflow a bit and name the film?
I can’t name the film yet, but I can tell you that it’s in the adventure genre and is coming out in the second half of 2020. The main pieces of software are Colorfront On-Set Dailies for processing all the media and Yoyotta for downloading and verifying media. We also use Avid for some edit prep before handing over to editorial.

How did you work with the DP and director? Can you talk about those relationships on this particular film?
On this shoot the DP and director had rushes screenings each night to go over the main unit and second unit rushes and make sure the dailies grade was exactly what they wanted. This was the last finesse before handing over dailies to editorial, so it had to be right. As rushes were being signed off, we would send them off to the background render engine, which would create four different outputs in multiple resolutions and framing. This meant that moments after the last camera mag was signed off, the media was ready for Avid prep and delivery. Our data team worked hard to automate as many processes as possible so there would be no long nights sorting reports and sheets. That work happened as we went throughout the day instead of leaving a multitude of tasks for the end of the day.

How do your workflows vary from project to project?
Every shoot is approached with a clean slate, and we work with the producers, DP and post to make sure we create a workflow that suits the logistical, budgetary and technical needs of that shoot. We have a tool kit that we rely on and use it to select the correct components required. We are always looking for ways to innovate and provide more value for the bottom line.

You mentioned using Colorfront tools — what do they offer you? And what about storage? It seems like working on location means you need a solid way to back up.
Colorfront On-Set Dailies takes care of QC, grade, sound sync and metadata. All of our shared storage is built around Quantum Xcellis, plus the Quantum QXS hybrid storage systems for online and nearline. We create the right SAN for the job depending on the amount of storage and clients required for that shoot.

Can you name projects you’ve worked on in the past as well as some recent work?
Warner Bros.’ The Meg, DreamWorks’ Ghost in the Shell, Sonar’s The Shannara Chronicles, STX Entertainment’s Adrift, Netflix’s The New Legends of Monkey and The Letter for the King and Blumhouse’s Fantasy Island.


Deluxe NY adds color to Mister Rogers biopic

A Beautiful Day in the Neighborhood stars Tom Hanks as children’s television icon Fred Rogers in a story about kindness triumphing over cynicism. Inspired by the article “Can You Say…Hero?” by journalist Tom Junod, the film is directed by Marielle Heller. The cinematographer Jody Lee Lipes worked on the color finishing with Deluxe New York’s Sam Daley.

Together, Heller and Lipes worked to replicate the feature’s late-1990s film aesthetic through in-camera techniques. After testing various film and digital camera options, production opted to shoot the majority of the footage with ARRI Alexa cameras in Super 16 mode. To more accurately represent the look of Mister Rogers’ Neighborhood, Lipes’ team scoured the globe for working versions of the same Ikegami video cameras that were used to tape the show. In a similar quest for authenticity, Daley brushed up on the look of Mister Rogers’ Neighborhood by watching old episodes and even visiting a Pittsburgh museum that housed the show’s original set. He also researched film styles typical of the time period to help inform the overall look of the feature.

“Incorporating Ikegami video footage into the pipeline was the most challenging aspect of the color on this film, and we did considerable testing to make sure that the quality of the video recordings would hold up in a theatrical environment,” Daley explained. “Jody and I have been working together for more than 10 years; we’re aesthetically in-sync and we both like to take what some might consider risks visually, and this film is no different.”

Throughout the color finishing process, Daley helped unify and polish the final footage, which included PAL and NTSC video in addition to the Alexa-acquired digital material. He paid careful attention to integrating the different video standards and frame rates while also shaping two distinct looks to reflect the narrative. To contrast with the optimistic Rogers and his colorful world, Daley incorporated a cool, moody feel around the pessimistic Junod, named “Lloyd Vogel” in the film and played by Matthew Rhys.

2019 HPA Award winners announced

The industry came together on November 21 in Los Angeles to celebrate its own at the 14th annual HPA Awards. Awards were given to individuals and teams working in 12 creative craft categories, recognizing outstanding contributions to color grading, sound, editing and visual effects for commercials, television and feature film.

Rob Legato receiving Lifetime Achievement Award from presenter Mike Kanfer. (Photo by Ryan Miller/Capture Imaging)

As was previously announced, renowned visual effects supervisor and creative Robert Legato, ASC, was honored with this year’s HPA Lifetime Achievement Award; Peter Jackson’s They Shall Not Grow Old was presented with the HPA Judges Award for Creativity and Innovation; acclaimed journalist Peter Caranicas was the recipient of the very first HPA Legacy Award; and special awards were presented for Engineering Excellence.

The winners of the 2019 HPA Awards are:

Outstanding Color Grading – Theatrical Feature

WINNER: “Spider-Man: Into the Spider-Verse”
Natasha Leonnet // Efilm

“First Man”
Natasha Leonnet // Efilm

“Roma”
Steven J. Scott // Technicolor

Natasha Leonnet (Photo by Ryan Miller/Capture Imaging)

“Green Book”
Walter Volpatto // FotoKem

“The Nutcracker and the Four Realms”
Tom Poole // Company 3

“Us”
Michael Hatzer // Technicolor

 

Outstanding Color Grading – Episodic or Non-theatrical Feature

WINNER: “Game of Thrones – Winterfell”
Joe Finley // Sim, Los Angeles

 “The Handmaid’s Tale – Liars”
Bill Ferwerda // Deluxe Toronto

“The Marvelous Mrs. Maisel – Vote for Kennedy, Vote for Kennedy”
Steven Bodner // Light Iron

“I Am the Night – Pilot”
Stefan Sonnenfeld // Company 3

“Gotham – Legend of the Dark Knight: The Trial of Jim Gordon”
Paul Westerbeck // Picture Shop

“The Man in The High Castle – Jahr Null”
Roy Vasich // Technicolor

 

Outstanding Color Grading – Commercial  

WINNER: Hennessy X.O. – “The Seven Worlds”
Stephen Nakamura // Company 3

Zara – “Woman Campaign Spring Summer 2019”
Tim Masick // Company 3

Tiffany & Co. – “Believe in Dreams: A Tiffany Holiday”
James Tillett // Moving Picture Company

Palms Casino – “Unstatus Quo”
Ricky Gausis // Moving Picture Company

Audi – “Cashew”
Tom Poole // Company 3

 

Outstanding Editing – Theatrical Feature

Once Upon a Time… in Hollywood

WINNER: “Once Upon a Time… in Hollywood”
Fred Raskin, ACE

“Green Book”
Patrick J. Don Vito, ACE

“Rolling Thunder Revue: A Bob Dylan Story by Martin Scorsese”
David Tedeschi, Damian Rodriguez

“The Other Side of the Wind”
Orson Welles, Bob Murawski, ACE

“A Star Is Born”
Jay Cassidy, ACE

 

Outstanding Editing – Episodic or Non-theatrical Feature (30 Minutes and Under)

VEEP

WINNER: “Veep – Pledge”
Roger Nygard, ACE

“Russian Doll – The Way Out”
Todd Downing

“Homecoming – Redwood”
Rosanne Tan, ACE

“Withorwithout”
Jake Shaver, Shannon Albrink // Therapy Studios

“Russian Doll – Ariadne”
Laura Weinberg

 

Outstanding Editing – Episodic or Non-theatrical Feature (Over 30 Minutes)

WINNER: “Stranger Things – Chapter Eight: The Battle of Starcourt”
Dean Zimmerman, ACE, Katheryn Naranjo

“Chernobyl – Vichnaya Pamyat”
Simon Smith, Jinx Godfrey // Sister Pictures

“Game of Thrones – The Iron Throne”
Katie Weiland, ACE

“Game of Thrones – The Long Night”
Tim Porter, ACE

“The Bodyguard – Episode One”
Steve Singleton

 

Outstanding Sound – Theatrical Feature

WINNER: “Godzilla: King of the Monsters”
Tim LeBlanc, Tom Ozanich, MPSE // Warner Bros.
Erik Aadahl, MPSE, Nancy Nugent, MPSE, Jason W. Jennings // E Squared

“Shazam!”
Michael Keller, Kevin O’Connell // Warner Bros.
Bill R. Dean, MPSE, Erick Ocampo, Kelly Oxford, MPSE // Technicolor

“Smallfoot”
Michael Babcock, David E. Fluhr, CAS, Jeff Sawyer, Chris Diebold, Harrison Meyle // Warner Bros.

“Roma”
Skip Lievsay, Sergio Diaz, Craig Henighan, Carlos Honc, Ruy Garcia, MPSE, Caleb Townsend

“Aquaman”
Tim LeBlanc // Warner Bros.
Peter Brown, Joe Dzuban, Stephen P. Robinson, MPSE, Eliot Connors, MPSE // Formosa Group

 

Outstanding Sound – Episodic or Non-theatrical Feature

WINNER: “The Haunting of Hill House – Two Storms”
Trevor Gates, MPSE, Jason Dotts, Jonathan Wales, Paul Knox, Walter Spencer // Formosa Group

“Chernobyl – 1:23:45”
Stefan Henrix, Stuart Hilliker, Joe Beal, Michael Maroussas, Harry Barnes // Boom Post

“Deadwood: The Movie”
John W. Cook II, Bill Freesh, Mandell Winter, MPSE, Daniel Colman, MPSE, Ben Cook, MPSE, Micha Liberman // NBC Universal

“Game of Thrones – The Bells”
Tim Kimmel, MPSE, Onnalee Blank, CAS, Mathew Waters, CAS, Paula Fairfield, David Klotz

“Homecoming – Protocol”
John W. Cook II, Bill Freesh, Kevin Buchholz, Jeff A. Pitts, Ben Zales, Polly McKinnon // NBC Universal

 

Outstanding Sound – Commercial 

WINNER: John Lewis & Partners – “Bohemian Rhapsody”
Mark Hills, Anthony Moore // Factory

Audi – “Life”
Doobie White // Therapy Studios

Leonard Cheshire Disability – “Together Unstoppable”
Mark Hills // Factory

New York Times – “The Truth Is Worth It: Fearlessness”
Aaron Reynolds // Wave Studios NY

John Lewis & Partners – “The Boy and the Piano”
Anthony Moore // Factory

 

Outstanding Visual Effects – Theatrical Feature

WINNER: “The Lion King”
Robert Legato
Andrew R. Jones
Adam Valdez, Elliot Newman, Audrey Ferrara // MPC Film
Tom Peitzman // T&C Productions

“Avengers: Endgame”
Matt Aitken, Marvyn Young, Sidney Kombo-Kintombo, Sean Walker, David Conley // Weta Digital

“Spider-Man: Far From Home”
Alexis Wajsbrot, Sylvain Degrotte, Nathan McConnel, Stephen Kennedy, Jonathan Opgenhaffen // Framestore

“Alita: Battle Angel”
Eric Saindon, Michael Cozens, Dejan Momcilovic, Mark Haenga, Kevin Sherwood // Weta Digital

“Pokémon Detective Pikachu”
Jonathan Fawkner, Carlos Monzon, Gavin Mckenzie, Fabio Zangla, Dale Newton // Framestore

 

Outstanding Visual Effects – Episodic (Under 13 Episodes) or Non-theatrical Feature

Game of Thrones

WINNER: “Game of Thrones – The Bells”
Steve Kullback, Joe Bauer, Ted Rae
Mohsen Mousavi // Scanline
Thomas Schelesny // Image Engine

“Game of Thrones – The Long Night”
Martin Hill, Nicky Muir, Mike Perry, Mark Richardson, Darren Christie // Weta Digital

“The Umbrella Academy – The White Violin”
Everett Burrell, Misato Shinohara, Chris White, Jeff Campbell, Sebastien Bergeron

“The Man in the High Castle – Jahr Null”
Lawson Deming, Cory Jamieson, Casi Blume, Nick Chamberlain, William Parker, Saber Jlassi, Chris Parks // Barnstorm VFX

“Chernobyl – 1:23:45”
Lindsay McFarlane
Max Dennison, Clare Cheetham, Steven Godfrey, Luke Letkey // DNEG

 

Outstanding Visual Effects – Episodic (Over 13 Episodes)

Team from The Orville – Outstanding VFX, Episodic, Over 13 Episodes (Photo by Ryan Miller/Capture Imaging)

WINNER: “The Orville – Identity: Part II”
Tommy Tran, Kevin Lingenfelser, Joseph Vincent Pike // FuseFX
Brandon Fayette, Brooke Noska // Twentieth Century FOX TV

“Hawaii Five-O – Ke iho mai nei ko luna”
Thomas Connors, Anthony Davis, Chad Schott, Gary Lopez, Adam Avitabile // Picture Shop

“9-1-1 – 7.1”
Jon Massey, Tony Pirzadeh, Brigitte Bourque, Gavin Whelan, Kwon Choi // FuseFX

“Star Trek: Discovery – Such Sweet Sorrow Part 2”
Jason Zimmerman, Ante Dekovic, Aleksandra Kochoska, Charles Collyer, Alexander Wood // CBS Television Studios

“The Flash – King Shark vs. Gorilla Grodd”
Armen V. Kevorkian, Joshua Spivack, Andranik Taranyan, Shirak Agresta, Jason Shulman // Encore VFX

The 2019 HPA Engineering Excellence Awards were presented to:

Adobe – Content-Aware Fill for Video in Adobe After Effects

Epic Games — Unreal Engine 4

Pixelworks — TrueCut Motion

Portrait Displays and LG Electronics — CalMan LUT based Auto-Calibration Integration with LG OLED TVs

Honorable Mentions were awarded to Ambidio for Ambidio Looking Glass; Grass Valley for Creative Grading; and Netflix for Photon.

IDC goes bicoastal, adds Hollywood post facility 


New York’s International Digital Centre (IDC) has opened a new 6,800-square-foot digital post facility in Hollywood, with Rosanna Marino serving as COO. She will manage the day-to-day operations of the West Coast post house. IDC LA will focus on serving the entertainment, content creation, distribution and streaming industries.

Rosanna Marino

Marino will manage sales, marketing, engineering and the day-to-day operations for the Hollywood location, while IDC founder/CEO Marcy Gilbert will lead the company’s overall activities and the New York headquarters.

IDC will provide finishing, color grading and editorial in Dolby Vision 4K HDR and UHD, as well as global QC. IDC LA features 11 bays and a DI theater, which includes Dolby 7.1 Atmos audio mixing, dubbing and audio description. The facility also provides subtitle and closed-caption timed-text creation and localization, ABS scripting and translations in over 40 languages.

To complete the end-to-end chain, they provide IMF and DCP creation, supplemental and all media fulfillment processing, including audio and timed text conforms for distribution. IDC is an existing Netflix Partner Program member — NP3 in New York and NPFP for the Americas and Canada.

IDC LA occupies the top two floors and rooftop deck of a vintage 1930s brick building on Santa Monica Boulevard.

Abu Dhabi’s twofour54 is now Dolby Vision certified

Abu Dhabi’s twofour54 has become Dolby Vision certified in an effort to meet the demand for color grading and mastering Dolby Vision HDR content. twofour54 is the first certified Dolby Vision facility in the UAE, providing work in both Arabic and English.

“The way we consume content has been transformed by connectivity and digitalization, with consumers able to choose not only what they watch but where, when and how,” says Katrina Anderson, director of commercial services at twofour54. “This means it is essential that content creators have access to technology such as Dolby Vision in order to ensure their content reaches as wide an audience as possible around the world.”

With Netflix, Amazon Prime and others now competing with existing broadcasters, there is a big demand around the world for high-quality production facilities. According to twofour54, Netflix’s expenditure on content creation soared from $4.6 billion in 2015 to $12 billion last year, while other platforms — such as Amazon Prime, Apple TV and YouTube — are also seeking to create more unique content. Consequently, the global demand for production facilities such as those offered by twofour54 is outstripping supply.

“We have seen increased interest in Dolby Vision for home entertainment due to the growing popularity of digital streaming services in the Middle East, and we are now able to support studios and content creators with the leading-edge tools deployed at twofour54’s world-class post facility,” explains Pankaj Kedia, managing director of emerging markets for Dolby Laboratories. “Dolby Vision is the preferred HDR mastering workflow for leading studios and a growing number of content creators, and this latest offering demonstrates twofour54’s commitment to making Abu Dhabi a preferred location for film and TV production.”

Why is this important? For color grading of movies and episodic content, Dolby has created a workflow that generates shot-by-shot dynamic metadata that allows filmmakers to see how their content will look on consumer devices. The colorist can then add “trims” to adjust how the mapping looks and to deliver a better-looking SDR version for content providers serving early Ultra HD (UHD) televisions that are capable only of SDR reproduction.

The colorists at twofour54 use both Blackmagic DaVinci Resolve and FilmLight Baselight systems.

Main Image: Engineer Noura Al Ali

Harbor crafts color and sound for The Lighthouse

By Jennifer Walden

Director Robert Eggers’ The Lighthouse tells the tale of two lighthouse keepers, Thomas Wake (Willem Dafoe) and Ephraim Winslow (Robert Pattinson), who lose their minds while isolated on a small rocky island, battered by storms, plagued by seagulls and haunted by supernatural forces/delusion-inducing conditions. It’s an A24 film that hit theaters in late October.

Much like his first feature-length film The Witch (winner of the 2015 Sundance Film Festival Directing Award for a dramatic film and the 2017 Independent Spirit Award for Best First Feature), The Lighthouse is a tense and haunting slow descent into madness.

But “unlike most films where the crazy ramps up, reaching a fever pitch and then subsiding or resolving, in The Lighthouse the crazy ramps up to a fever pitch and then stays there for the next hour,” explains Emmy-winning supervising sound editor/re-recording mixer Damian Volpe. “It’s like you’re stuck with them, they’re stuck with each other and we’re all stuck on this rock in the middle of the ocean with no escape.”

Volpe, who’s worked with director Eggers on two short films — The Tell-Tale Heart and Brothers — thought he had a good idea of just how intense the film and post sound process would be going into The Lighthouse, but it ended up exceeding his expectations. “It was definitely the most difficult job I’ve done in over two decades of working in post sound for sure. It was really intense and amazing,” he says.

Eggers chose Harbor’s New York City location for both sound and final color. This was colorist Joe Gawler’s first time working with Eggers, but it couldn’t have been a more fitting film. The Lighthouse was shot on 35mm black & white (Double-X 5222) film with a 1.19:1 aspect ratio, and as it happens, Gawler is well versed in the world of black & white. He has remastered a tremendous number of classic movie titles for The Criterion Collection, such as Breathless, Seven Samurai and several Fellini films, like 8½. “To take that experience from my Criterion title work and apply that to giving authenticity to a contemporary film that feels really old, I think it was really helpful,” Gawler says.

Joe Gawler

The advantage of shooting on film versus shooting digitally is that film negatives can be rescanned as technology advances, making it possible to take a film from the ‘60s and remaster it into 4K resolution. “When you shoot something digitally, you’re stuck in the state-of-the-moment technology. If you were shooting digitally 10 years ago and want to create a new deliverable of your film and reimagine it with today’s display technologies, you are compromised in some ways. You’re having to up-res that material. But if you take a 35mm film negative shot 100 years ago, the resolution is still inside that negative. You can rescan it with a new scanner and it’s going to look amazing,” explains Gawler.

While most of The Lighthouse was shot on black & white film (with Baltar lenses designed in the 1930s for that extra dose of authenticity), there were a few stock footage shots of the ocean with big storm waves and some digitally rendered elements, such as the smoke, that had to be color corrected and processed to match the rich, grainy quality of the film. “Those stock footage shots we had to beat up to make them feel more aged. We added a whole bunch of grain into those and the digital elements so they felt seamless with the rest of the film,” says Gawler.

The digitally rendered elements were separate VFX pieces composited into the black & white film image using Blackmagic’s DaVinci Resolve. “Conforming the movie in Resolve gave us the flexibility to have multiple layers and allowed us to punch through one layer to see more or less of another layer,” says Gawler. For example, to get just that right amount of smoke, “we layered the VFX smoke element on top of the smokestack in the film and reduced the opacity of the VFX layer until we found the level that Rob and DP Jarin Blaschke were happy with.”
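For readers unfamiliar with the math behind an opacity slider, the short sketch below shows the standard straight-alpha “over” blend with a global opacity applied to the top layer. This is a generic compositing formula used for illustration only, not Resolve’s internal implementation, and the pixel values are made up.

# Generic straight-alpha "over" blend with an opacity control on the top layer.
# Values are floats in [0, 1]; illustrative only, not Resolve's own math.
import numpy as np

def over_with_opacity(base, layer, layer_alpha, opacity):
    a = layer_alpha * opacity           # the opacity slider scales the layer's alpha
    return layer * a + base * (1.0 - a)

base = np.array([0.20, 0.20, 0.20])     # hypothetical dark smokestack plate pixel
smoke = np.array([0.85, 0.85, 0.85])    # hypothetical bright VFX smoke pixel
print(over_with_opacity(base, smoke, layer_alpha=1.0, opacity=0.4))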

In terms of color, Gawler notes The Lighthouse was all about exposure and contrast. The spectrum of gray rarely goes to true white and the blacks are as inky as they can be. “Jarin didn’t want to maintain texture in the blackest areas, so we really crushed those blacks down. We took a look at the scopes and made sure we were bottoming out so that the blacks were pure black.”

From production to post, Eggers’ goal was to create a film that felt like it could have been pulled from a 1930s film archive. “It feels authentically antique, and that goes for the performances, the production design and all the period-specific elements — the lights they used and the camera, and all the great care we took in our digital finish of the film to make it feel as photochemical as possible,” says Gawler.

The Sound
This holds true for post sound, too. So much so that Eggers and Volpe kicked around the idea of making the soundtrack mono. “When I heard the first piece of score from composer Mark Korven, the whole mono idea went out the door,” explains Volpe. “His score was so wide and so rich in terms of tonality that we never would’ve been able to make this difficult dialogue work if we had to shove it all down one speaker’s mouth.”

The dialogue was difficult on many levels. First, Volpe describes the language as “old-timey, maritime,” delivered in two different accents — Dafoe has an Irish-tinged seasoned sailor accent and Pattinson has an up-east Maine accent. Additionally, the production location made it difficult to record the dialogue, with wind, rain and dripping water sullying the tracks. Re-recording mixer Rob Fernandez, who handled the dialogue and music, notes that when it’s raining in the film, the lighthouse is leaking. You see the water in the shots because they shot it that way. “So the water sound is married to the dialogue. We wanted to have control over the water, so the dialogue had to be looped. Rob [Eggers] wanted to save as much of the amazing on-set performances as possible, so we tried to go to ADR for specific syllables and words,” says Fernandez.

Rob Fernandez

That wasn’t easy to do, especially toward the end of the film during Dafoe’s monologue. “That was very challenging because at one point all of the water and surrounding sounds disappear. It’s just his voice,” says Fernandez. “We had to do a very slow transition into that so the audience doesn’t notice. It’s really focusing you in on what he is saying. Then you’re snapped out of it and back into reality with full surround.”

Another challenging dialogue moment was a scene in which Pattinson is leaning on Dafoe’s lap, and their mics are picking up each other’s lines. Plus, there’s water dripping. Again, Eggers wanted to use as much production as possible so Fernandez tried a combination of dialogue tools to help achieve a seamless match between production and ADR. “I used a lot of Synchro Arts’ Revoice Pro to help with pitch matching and rhythm matching. I also used every tool iZotope offers that I had at my disposal. For EQ, I like FabFilter. Then I used reverb to make the locations work together,” he says.

Volpe reveals, “Production sound mixer Alexander Rosborough did a wonderful job, but the extraneous noises required us to replace at least 60% of the dialogue. We spent several months on ADR. Luckily, we had two extremely talented and willing actors. We had an extremely talented mixer, Rob Fernandez. My dialogue editor William Sweeney was amazing too. Between the directing, the acting, the editing and the mixing they managed to get it done. I don’t think you can ever tell that so much of the dialogue has been replaced.”

The third main character in the film is the lighthouse itself, which lives and breathes with a heartbeat and lungs. The mechanism of the Fresnel lens at the top of the lighthouse has a deep, bassy gear-like heartbeat and rasping lungs that Volpe created from wrought iron bars drawn together. Then he added reverb to make the metal sound breathier. In the bowels of the lighthouse there is a steam engine that drives the gears to turn the light. Ephraim (Pattinson) is always looking up toward Thomas (Dafoe), who is in the mysterious room at the top of the lighthouse. “A lot of the scenes revolve around clockwork, which is just another rhythmic element. So Ephraim starts to hear that and also the sound of the light that composer Korven created, this singing glass sound. It goes over and over and drives him insane,” Volpe explains.

Damian Volpe

Mermaids make a brief appearance in the film. To create their vocals, Volpe and his wife did a recording session in which they made strange sea creature call-and-response sounds to each other. “I took those recordings and beat them up in Pro Tools until I got what I wanted. It was quite a challenge and I had to throw everything I had at it. This was more of a hammer-and-saw job than a fancy plug-in job,” Volpe says.

He captured other recordings too, like the sound of footsteps on the stairs inside a lighthouse on Cape Cod, marine steam engines at an industrial steam museum in northern Connecticut and more at Mystic Seaport… seagulls and waves. “We recorded so much. We dug a grave. We found an 80-year-old lobster pot that we smashed about. I recorded the inside of conch shells to get drones. Eighty percent of the sound in the film is material that I and Filipe Messeder (assistant and Foley editor) recorded, or that I recorded with my wife,” says Volpe.

But one of the trickiest sounds to create was a foghorn that Eggers originally liked from a lighthouse in Wales. Volpe tracked down the keeper there but the foghorn was no longer operational. He then managed to locate a functioning steam-powered diaphone foghorn in Shetland, Scotland. He contacted the lighthouse keeper Brian Hecker and arranged for a local documentarian to capture it. “The sound of the Sumburgh Lighthouse is a major element in the film. I did a fair amount of additional work on the recordings to make them sound more like the original one Rob [Eggers] liked, because the Sumburgh foghorn had a much deeper, bassier, whale-like quality.”

The final voice in The Lighthouse’s soundtrack is composer Korven’s score. Since Volpe wanted to blur the line between sound design and score, he created sounds that would complement Korven’s. Volpe says, “Mark Korven has these really great sounds that he generated with a ball on a cymbal. It created this weird, moaning whale sound. Then I created these metal creaky whale sounds and those two things sing to each other.”

In terms of the mix, nearly all the dialogue plays from the center channel, helping it stick to the characters within the small frame of this antiquated aspect ratio. The Foley, too, comes from the center and isn’t panned. “I’ve had some people ask me (bizarrely) why I decided to do the sound in mono. There might be a psychological factor at work where you’re looking at this little black & white square and somehow the sound glues itself to that square and gives you this idea that it’s vintage or that it’s been processed or is narrower than it actually is.

“As a matter of fact, this mix is the farthest thing from mono. The sound design, effects, atmospheres and music are all very wide — more so than I would do in a regular film as I tend to be a bit conservative with panning. But on this film, we really went for it. It was certainly an experimental film, and we embraced that,” says Volpe.

The idea of having the sonic equivalent of this 1930’s film style persisted. Since mono wasn’t feasible, other avenues were explored. Volpe suggested recording the production dialogue onto a NAGRA to “get some of that analog goodness, but it just turned out to be one thing too many for them in the midst of all the chaos of shooting on Cape Forchu in Nova Scotia,” says Volpe. “We did try tape emulator software, but that didn’t yield interesting results. We played around with the idea of laying it off to a 24-track or shooting in optical. But in the end, those all seemed like they’d be expensive and we’d have no control whatsoever. We might not even like what we got. We were struggling to come up with a solution.”

Then a suggestion from Harbor’s Joel Scheuneman (who’s experienced in the world of music recording/producing) saved the day. He recommended the outboard Rupert Neve Designs 542 Tape Emulator.

The Mix
The film was final mixed in 5.1 surround on a Euphonix S5 console. Each channel was sent through an RND 542 module and then into the speakers. The units’ magnetic heads added saturation, grain and a bit of distortion to the tracks. “That is how we mixed the film. We had all of these imperfections in the track that we had to account for while we were mixing,” explains Fernandez.

“You couldn’t really ride it or automate it in any way; you had to find the setting that seemed good and then just let it rip. That meant in some places it wasn’t hitting as hard as we’d like and in other places it was hitting harder than we wanted. But it’s all part of Rob Eggers’s style of filmmaking — leaving room for discovery in the process,” adds Volpe.

“There’s a bit of chaos factor because you don’t know what you’re going to get. Rob is great about being specific but also embracing the unknown or the unexpected,” he concludes.


Jennifer Walden is a New Jersey-based audio engineer and writer. Follow her on Twitter @audiojeney.

Color Chat: Light Iron’s Corinne Bogdanowicz

Corinne Bogdanowicz, a colorist at Light Iron, joined the post house in 2010 after working as a colorist and digital compositor for Post Logic/Prime Focus, Pacific Title and DreamWorks Animation.

Bogdanowicz, who comes from a family of colorists/color scientists (sister and father), has an impressive credit list, including the features 42, Flight, Hell or High Water, Allied and Wonder. On the episodic side, she has colored all five seasons of Amazon’s Emmy-winning series Transparent, as well as many other shows, including FX’s Baskets and Boomerang for BET. Her most recent work includes Netflix’s Dolemite is My Name and HBO’s Mrs. Fletcher.

HBO’s Mrs. Fletcher

We reached out to find out more…

NAME: Corinne Bogdanowicz

COMPANY: Light Iron

CAN YOU DESCRIBE YOUR COMPANY?
Light Iron is a post production company owned by Panavision. We have studios in New York and Los Angeles.

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I think that most people would be surprised that we are the last stop for all visuals on a project. We are where all of the final VFX come together, and we also manage the different color spaces for final distribution.

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Yes, I am very often doing work that crosses over into visual effects. Beauty work, paint outs and VFX integration are all commonplace in the DI suite these days.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The collaboration between myself and the creatives on a project is my favorite aspect of color correction. There is always a moment when we start color where I get “the look,” and everyone is excited that their vision is coming to fruition.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Maybe farming? (laughs) I’m not sure. I love being outdoors and working with animals.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I have an art background, and when I moved to Los Angeles years ago I worked in VFX. I quickly was introduced to the world of color and found it was a great fit. I love the combination of art and technology, as well as constantly being introduced to new ideas by industry creatives.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Where’d You Go, Bernadette?, Sextuplets, Truth Be Told, Transparent, Mrs. Fletcher and Dolemite is My Name.

Transparent

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
This is a hard question because I feel like I leave a little piece of myself in everything that I work on.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My phone, the coffee maker and FilmLight Baselight.

WHAT DO YOU DO TO DE-STRESS FROM THE PRESSURES OF THE JOB?
I have two small children at home, so I think I de-stress when I get to work (laughs)!

Bonfire adds Jason Mayo as managing director/partner

Jason Mayo has joined digital production company Bonfire in New York as managing director and partner. Industry veteran Mayo will be working with Bonfire’s new leadership lineup, which includes founder/Flame artist Brendan O’Neil, CD Aron Baxter, executive producer Dave Dimeola and partner Peter Corbett. Bonfire’s offerings include VFX, design, CG, animation, color, finishing and live action.

Mayo comes to Bonfire after several years building Postal, the digital arm of the production company Humble. Prior to that he spent 14 years at Click 3X, where he worked closely with Corbett as his partner. While there he also worked with Dimeola, who cut his teeth at Click as a young designer/compositor. Dimeola later went on to create The Brigade, where he developed the network and technology that now forms the remote, cloud-based backbone referred to as the Bonfire Platform.

Mayo says a number of factors convinced him that Bonfire was the right fit for him. “This really was what I’d been looking for,” he says. “The chance to be part of a creative and innovative operation like Bonfire in an ownership role gets me excited, as it allows me to make a real difference and genuinely effect change. And when you’re working closely with a tight group of people who are focused on a single vision, it’s much easier for that vision to be fully aligned. That’s harder to do in a larger company.”

O’Neil says that having Mayo join as partner/MD is a major move for the company. “Jason’s arrival is the missing link for us at Bonfire,” he says. “While each of us has specific areas to focus on, we needed someone who could handle the day-to-day of running the company while keeping an eye on our brand and our mission and introducing our model to new opportunities. And that’s exactly his strong suit.”

For the most part, Mayo’s familiarity with his new partners means he’s arriving with a head start. Indeed, his connection to Dimeola, who built the Bonfire Platform — the company’s proprietary remote talent network, nicknamed the “secret sauce” — continued as Mayo tapped Dimeola’s network for overflow and outsourced work while at Postal. Their relationship, he says, was founded on trust.

“Dave came from the artist side, so I knew the work I’d be getting would be top quality and done right,” Mayo explains. “I never actually questioned how it was done, but now that he’s pulled back the curtain, I was blown away by the capabilities of the Platform and how it dramatically differentiates us.

“What separates our system is that we can go to top-level people around the world but have them working on the Bonfire Platform, which gives us total control over the process,” he continues. “They work on our cloud servers with our licenses and use our cloud rendering. The Platform lets us know everything they’re doing, so it’s much easier to track costs and make sure you’re only paying for the work you actually need. More importantly, it’s a way for us to feel connected – it’s like they’re working in a suite down the hall, except they could be anywhere in the world.”

Mayo stresses that while the cloud-based Platform is a huge advantage for Bonfire, it’s just one part of its profile. “We’re not a company riding on the backs of freelancers,” he points out. “We have great, proven talent in our core team who work directly with clients. What I’ve been telling my longtime client contacts is that Bonfire represents a huge step forward in terms of the services and level of work I can offer them.”

Corbett believes he and Mayo will continue to explore new ways of working now that he’s at Bonfire. “In the 14 years Jason and I built Click 3X, we were constantly innovating across both video and digital, integrating live action, post production, VFX and digital engagements in unique ways,” he observes. “I’m greatly looking forward to continuing on that path with him here.”

Technicolor Post opens in Wales 

Technicolor has opened a new facility in Cardiff, Wales, within Wolf Studios. This expansion of the company’s post production footprint in the UK is a result of the growing demand for more high-quality content across streaming platforms and the need to post these projects, as well as the growth of production in Wales.

The facility is connected to all of Technicolor’s locations worldwide through the Technicolor Production Network, giving creatives easy access to their projects no matter where they are shooting or posting.

The facility, an extension of Technicolor’s London operations, supports all Welsh productions and features a multi-purpose, state-of-the-art suite as well as space for VFX and front-end services including dailies. Technicolor Wales is working on Bad Wolf Production’s upcoming fantasy epic His Dark Materials, providing picture and sound services for the BBC/HBO show. Technicolor London’s recent credits include The Two Popes, The Souvenir, Chernobyl, Black Mirror, Gentleman Jack and The Spanish Princess.

Within this new Cardiff facility, Technicolor is offering 2K digital cinema projection, FilmLight Baselight color grading, realtime 4K HDR remote review, 4K OLED video monitoring, 5.1/7.1 sound, ADR recording/source connect, Avid Pro Tools sound mixing, dailies processing and Pulse cloud storage.

Bad Wolf Studios in Cardiff offers 125,000 square feet of stage space with five stages. There is flexible office space, as well as auxiliary rooms and costume and props storage.

VFX house Blacksmith now offering color grading, adds Mikey Pehanich

New York-based visual effects studio Blacksmith has added colorist Mikey Pehanich to its team. With this new addition, Blacksmith expands its capabilities to now offer color grading in addition to VFX.

Pehanich has worked on projects for high-profile brands including Amazon, Samsung, Prada, Nike, New Balance, Marriott and Carhartt. Most recently, Pehanich worked on Smirnoff’s global “Infamous Since 1864” campaign directed by Rupert Sanders, Volkswagen’s Look Down in Awe spot from Garth Davis, Fisher-Price’s “Let’s Be Kids” campaign and Miller Lite’s newly launched Followers spot, both directed by Ringan Ledwidge.

Prior to joining Blacksmith, Pehanich spent six years as a colorist at The Mill in Chicago. Pehanich was the first local hire when The Mill opened its Chicago studio in 2013. Initially cutting his teeth as a color assistant, he quickly worked his way up to becoming a full-fledged colorist, lending his talent to campaigns that include Michelob’s 2019 Super Bowl spot featuring Zoe Kravitz and directed by Emma Westenberg, as well as music videos, including Regina Spektor’s Black and White.

In addition to commercial work, Pehanich’s diverse portfolio encompasses several feature films, short films and music videos. His recent longform work includes Shabier Kirchner’s short film Dadli about an Antiguan boy and his community, and Andre Muir’s short film 4 Corners, which tackles Chicago’s problem with gun violence.

“New York has always been a creative hub for all industries — the energy and vibe that is forever present in the air here has always been a draw for me. When the opportunity presented itself to join the incredible team over at Blacksmith, there was no way I could pass it up,” says Pehanich, who will be working on Blackmagic’s DaVinci Resolve.


Color grading Empire State Building’s immersive exhibits

As immersive and experiential projects are being mounted in more and more settings — and as display technology allows for larger and more high-resolution screens to be integrated into these installations — colorists are being called on to grade video and film content that’s meant to be viewed in vastly different settings than in the past. No longer are they grading for content that will live on a 50-inch flat screen TV or a 9-inch tablet — they’re grading for wall-sized screens that dominate museum exhibits or public spaces.

James Tillett

A recent example is when the Manhattan office of Squint/Opera, a London-based digital design studio, tapped Moving Picture Company colorist James Tillett to grade content that has taken over floor-to-ceiling screens in the new Second Floor Experience in the iconic Empire State Building. Comprising nine interactive and immersive galleries that recreate everything from the building’s construction to its encounter with its most famous visitor and unofficial mascot, King Kong, the 10,000-square-foot space is part of the building’s multimillion-dollar renovation.

Here, Tillett discusses what went into grading for a large-scale experiential project such as this.

How did this project come about?
Alvin Cruz, one of our creative directors here in New York, has a designer colleague who put us in contact with the Squint/Opera team. We met with them and they quickly realized they’d be able to do everything on this project except the color grade. That’s where we came in.

How did this project differ from the more traditional color grading work you usually do?
You have to work in a different color space if the final product will be shown in a theater versus, say, broadcast TV or online. The same thinking goes here, but as every experiential project is different, you have to evaluate based on the design of the space and the type of screen or projection system being used, and then make an educated guess on how the footage will respond.

What were the steps you took to tackle this kind of project?
The first thing we did when we got the footage from Squint/Opera was to bring it into the suite and view it in that environment. Then my executive producer, Ed Koenig, and I jumped on the Q train and went into the space at the Empire State Building to see how the same footage looked in the various gallery settings. This helped us get a feel for how it would ultimately be seen. I also wanted to see how those spaces differed visually from our grading suite. That informed my process going forward.

What sections of the Experience required extra consideration?
The “Construction Area” gallery, which documents the construction of the building, has very large screens. This meant paying close attention to the visual details within each of the films. For example, zooming in close to certain parts of the image and keeping an eye on noise and grain structure.

The “Site Survey” gallery gives the visitor a sense of what it would be like on the ground as the building surveyors are taking their measurements. Visitors are able to look through various replica surveying devices and see different scenes unfolding. During the grade (I use FilmLight Baselight), we had a prototype device in the suite that Squint/Opera created with a 3D printer. This allowed us to preview the grade through the same type of special mirrored screen that’s used in the actual replica surveying devices in the exhibit. In fact, we actually ended up setting the calibration of these screens as part of the grading process and then transferred those settings over to the actual units at the ESB.

In the “King Kong” gallery, even though the video content is in black and white, it was important that the image on the screens was consistent with the model of King Kong’s hand that reaches into the physical space, which has a slightly reddish tone to it. We started off just trying to make the footage feel more like a vintage black and white film print, but realized we needed to introduce some color to make it sit better in the space. This meant experimenting with different levels of red/sepia tint to the black and white and exporting different versions, with a final decision then made on-site.
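
As a simple illustration of what layering a red/sepia tint over black-and-white footage amounts to mathematically (a hedged sketch, not Tillett’s actual Baselight setup), the Python/NumPy snippet below blends a desaturated image toward a warm tone; the tint color and strength are arbitrary stand-ins for the versions that were exported and judged on-site.

```python
import numpy as np

def tint_bw(rgb, tint=(1.0, 0.86, 0.72), strength=0.3):
    """Blend a black-and-white version of an image toward a warm red/sepia tone.

    rgb      : float image, shape (H, W, 3), values in [0, 1]
    tint     : arbitrary warm tint color (illustration only)
    strength : 0.0 = pure B&W, 1.0 = fully tinted
    """
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])      # Rec.709 luma weights
    neutral = np.repeat(luma[..., None], 3, axis=-1)      # plain B&W version
    tinted = luma[..., None] * np.asarray(tint)           # fully tinted version
    return np.clip(neutral + strength * (tinted - neutral), 0.0, 1.0)

frame = np.random.rand(4, 4, 3)                           # stand-in for a frame
print(tint_bw(frame, strength=0.25).shape)                # -> (4, 4, 3)
```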

Were you able to replicate what the viewing conditions would be for these films while working in the color suite? And did this influence the grade?
What’s important about grading for experiential projects like this is that, while you can’t replicate the exact conditions, you still have to give the footage a grade that supports the theme or focus of the film’s content. You also have to fully understand and appreciate where it’s going to be seen and keep that top of mind throughout the entire process.


Nice Shoes Toronto adds colorist Yulia Bulashenko

Creative studio Nice Shoes has added colorist Yulia Bulashenko to its Toronto location. She brings over seven years of experience as a freelance colorist, working worldwide on projects for top global clients such as Nike, Volkswagen, MTV, Toyota, Diesel, Uniqlo, Uber, Adidas and Zara, among numerous others.

Bulashenko’s resume includes work across commercials, music videos, fashion and feature films. Notable projects include Sia and Diplo’s (LSD) music video for “Audio”; “Sound and Vision,” a tribute to the late singer David Bowie directed by Canada, for whom she has been a colorist of choice for the past five years; and the feature films The Girl From The Song and Gold.

Toronto-based Bulashenko is available immediately and can also work remotely via Nice Shoes’ New York, Boston, Chicago and Minneapolis spaces.

Bulashenko began her career as a fashion photographer before transitioning into creating fashion films. Through handling all of the post on her own film projects, she discovered a love for color grading. After building relationships with a number of collaborators, she began taking on projects as a freelancer, working with clients in Spain and the UK on a wide range of projects throughout Europe, Mexico, Qatar and India.

Managing director Justin Pandolfino notes, “We’re excited to announce Yulia as the first of a number of new signings as we enter our fourth year in the Toronto market. Bringing her onboard is part of our ongoing efforts to unite the best talent from around the world to deliver stunning design, animation, VFX, VR/AR, editorial, color grading and finishing for our clients.”

Colorist Chat: Scott Ostrowsky on Amazon’s Sneaky Pete

By Randi Altman

Scott Ostrowsky, senior colorist at Deluxe’s Level 3 in Los Angeles, has worked on all three seasons of Amazon’s Sneaky Pete, produced by Bryan Cranston and David Shore and starring Giovanni Ribisi. Season 3 is the show’s last.

For those of you unfamiliar with the series, it follows a con man named Marius (Ribisi), who takes the place of his former cell-mate Pete and endears himself to Pete’s seemingly idyllic family while continuing to con his way through life. Over time he comes to love the family, which is nowhere near as innocent as it seems.

Scott Ostrowsky

We reached out to this veteran colorist to learn more about how the look of the series developed over the seasons and how he worked with the showrunners and DPs.

You’ve been on Sneaky Pete since the start. Can you describe how the look has changed over the years?
I worked on Seasons 1 through 3. The DP for Season 1 was Rene Ohashi, and that season had somewhat of a softer feel. It was shot on a Sony F55. It mostly centered around the relationship of Bryan Cranston’s character and Giovanni Ribisi’s newly adopted fake family and his brother.

Season 2 was shot by DPs Frank DeMarco and William Rexer on a Red Dragon, and it had a more stylized, harsher look in some ways. The looks were different because the storylines and the locations had changed. So, even though we had some beautiful, resplendent looks in Season 2, we also created some harsher environments, and we did that through color correction. Going into Season 2, the storyline changed, and it became more defined in the sense that we used the environments to create an atmosphere that matched the storyline and the performances.

An example of this would be the warehouse where they all came together to create the scam/heist that they were going to pull off. Another example of this would be the beautiful environment in the casino that was filled with rich lighting and ornate colors. But there are many examples of this through the show — both DPs used shadow and light to create a very emotional mood or a very stark mood and everything in between.

Season 3 was shot by Arthur Albert and his son, Nick Albert, on a Red Gemini, and it had a beautiful, resplendent, rich look that matched the different environments when it moved from the cooler look of New York to the warmer, more colorful look in California.

So you gave different looks based on locale? 
Yes, we did. Many times, the looks would depend on the time of day and the environment that they were in. An example of this might be the harsh fluorescent green in the gas station bathroom where Giovanni’s character is trying to figure out a way to help his brother and avoid his captors.

How did you work with the Alberts on the most recent season?
I work at Level 3 Post, which is a Deluxe company. I did Seasons 1 and 2 at the facility on the Sony lot. Season 3 was posted at Level 3. Arthur and Nick Albert came into my color suite with the camera tests shot on the Red Gemini and also the Helium. We set up a workflow based on the Red cameras and proceeded to grade the various setups.

Once Arthur and Nick decided to use the Gemini, we set up our game plan for the season. When I received my first conform, I proceeded to grade it based on our conversations. I was very sensitive to the way they used their setups, lighting and exposures. Once I finished my first primary grade, Arthur would come in and sit with me to watch the show and make any changes. After Arthur approved the grade, the producers and showrunner would come in for their viewing. They could make any additional changes at that time. (Read our interview with Arthur Albert here.)

How do you prefer to work with directors/DPs?
The first thing is to have a conversation with them about their approach and how they view color as being part of the story they want to tell. I always like to get a feel for how the cinematographer will shoot the show and what, if any, LUTs they’re using so I can emulate that look as a starting point for my color grading.

It is really important to me to find out how a director envisions the image he or she would like to portray on the screen. An example of this would be facial expressions. Do we want to see everything, or do they mind if the shadow side remains dark and the light falls off?

A lot of times, it’s about how the actors emote and how they work in tandem with each other to create tension, comedy or other emotions — and what the director is looking for in these scenes.

Any tips for getting the most out of a project from a color perspective?
Communication. Communication. Communication. Having an open dialogue with the cinematographer, showrunners and directors is extremely important. If the colorist is able to get the first pass very close, you spend more time on the nuances rather than balancing or trying to find a look. That is why it is so important to have an understanding of the essence of what a director, cinematographer and showrunner are looking for.

How do you prefer the DP or director to describe their desired look?
However they’re comfortable enlightening me about their styles or needs for the show is fine. Usually, we can discuss this when we have a camera test before principal photography starts. There’s no one way that you can work with everybody — you just adapt to how they work. And as a colorist, it’s your job to make that image sing or shine the way that they intended it to.

You used Resolve on this. Is there a particular tool that came in handy for this show?
All the tools in Resolve are useful for a drama series. You would not buy the large crayon box and throw out colors you didn’t like because, at some point, you might need them. I use all the tools, from keys, windows and log corrections to custom curves, to create the looks that were needed.

You have been working in TV for many years. How has color grading changed during that time?
Color correction has become way more sophisticated over the years, and it is continually growing and expanding into a blend of not only color grading but also helping to create the environments needed to express the look of a show. We no longer just have simple color correctors with simple secondaries; the toolbox continues to grow with added filters, added grain and sometimes even the ability to help create visual effects, which most color correctors offer today.

Where do you find inspiration? Art? Photography?
I’ve always loved photography and B&W movies. There’s a certain charm or subtlety that you find in B&W, whether it’s a film noir, the harshness of film grain, or just the use of shadow and light. I’ve always enjoyed going to museums and looking at different artists and how they view the world and what inspires them.

To me, it’s trying to portray an image and have that image make a statement. In daily life, you can see multiple examples as you go through your day, and I try and keep the most interesting ones that I can remember in my lexicon of images.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Review: PixelTools V.1 PowerGrade presets for Resolve

By Brady Betzel

Color correction and color grading can be tricky (especially for those of us who don’t work as dedicated colorists), and being good at one doesn’t necessarily mean you will be good at the other. After watching hundreds of hours of tutorials, I’ve found that the only answer to getting better at color correction and color grading is to practice. As trite and cliche as it sounds, it’s the truth. There is also the problem of a creative block. I can sometimes get around a creative block when color correcting or editing by trying out-of-the-box ideas, like adding a solid color on top of footage and changing blend modes to spark some ideas.

An easier way to get a bunch of quick looks on your footage is with LUTs (lookup tables) and preset color grades. LUTs can sometimes get your footage into an acceptable spot color correction-wise or, technically, into the correct color space (the old technical vs. creative LUTs discussion). They often need to be (or at least should be) tweaked to fit the footage you are using.
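
For readers curious about what a LUT actually does under the hood, here is a minimal Python/NumPy sketch of applying a 1D LUT to normalized RGB values. The 33-entry S-curve, the linear interpolation and the helper names are illustrative assumptions; real grading tools typically use 3D LUTs with more sophisticated interpolation, which is exactly why a canned LUT usually still needs tweaking.

```python
import numpy as np

# Hypothetical 33-entry 1D LUT: a gentle S-curve mapping input 0..1 to output 0..1.
lut_size = 33
positions = np.linspace(0.0, 1.0, lut_size)
lut = 0.5 + 0.5 * np.tanh(4.0 * (positions - 0.5)) / np.tanh(2.0)

def apply_1d_lut(image, lut, positions):
    """Apply a 1D LUT to every channel of a float image in [0, 1].

    Linear interpolation between LUT entries; this only shows the basic idea,
    not how any particular grading tool implements it.
    """
    return np.interp(image, positions, lut)

frame = np.random.rand(2, 2, 3)                 # stand-in for a small frame
graded = apply_1d_lut(frame, lut, positions)
print(graded.round(3))
```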

Dawn

This is where PixelTools’ PowerGrade presets for Blackmagic’s DaVinci Resolve come into play. PixelTools’ presets give you that instant wow of a color grade, sharpening and even grain, but with the flexibility to tweak and adjust to your own taste.

PixelTools’ PowerGrade V.1 is a set of Blackmagic’s DaVinci Resolve PowerGrades (essentially pre-built color grades, sometimes containing noise reduction, glows or film grain) that retails for $79.99. Once purchased, the PowerGrade presets can be downloaded immediately. If you aren’t sure about the full commitment to purchase for $79.99, you can download eight sample PowerGrade presets to play with by signing up for PixelTools’ newsletter.

While it doesn’t typically matter what version of Resolve you are using with the PixelTools PowerGrades, you will probably want to make sure you are using Resolve Studio 15 (or higher) or you may miss out on some of the noise reduction or film grain. I’m running Resolve 16 Studio.

What are PowerGrades? In Resolve, you can save and access pre-built color correction node trees across all projects in a single database. This way, if you have an amazing orange-and-teal, bleach bypass or desaturated look with a vignette and noise reduction that you don’t want to rebuild inside every project, you can save them in the PowerGrades folder on the color correction tab. Easy! Just go into the Color Correction tab > Gallery (in the upper left corner) > click the little split-window icon > right-click and “Add PowerGrade Album.”

Golden

Installing the PixelTools presets is pretty easy, but there are a few steps you are going to want to follow if you’ve never made a PowerGrades folder before. Luckily, there is a video just for that. Once you’ve added the presets to your database, you can access over 110 grades in both Log and Rec 709 color spaces. In addition, there is a folder of “Utilities,” which offers some helpful tools like Scanlines (Mild-Intense), various Vignettes, Sky Debanding, preset Noise Reductions, two- and three-way Grain Nodes and much more. Some of the color grading presets can fit on one node, but some have five or six nodes, like the “2-Strip Holiday.” They will sometimes be applied as a Compound Node for organization’s sake but can be decomposed to see all the goodness inside.

The best part of PixelTools, other than the great looks, is the ability to decompose or view the Compound Node structure and see what’s under the hood. Not only does it make you appreciate all of the painstaking work that is already done for you, but you can study it, tweak it and learn from it. I know a lot of companies don’t like to reveal how things are done, but with PixelTools you can break the grades apart. It follows my favorite motto: “A rising tide lifts all boats.”

From the understated “2-Strip Holiday” look to the crunchy “Bleach Duotone 2” with the handy “Saturation Adjust” node on the end of the tree, PixelTools is a prime example of pre-built looks that can be as easy as dragging and dropping onto a clip or as intricate as adjusting each node to the way you like it. One of my favorite looks is a good old Bleach Bypass — use two layer nodes (one desaturated and one colored), layer mix with a composite mode set to Overlay and adjust saturation to taste. The Bleach Bypass setup is not a tightly guarded secret, but PixelTools gets you right to the Bleach Bypass look with the Bleach Duotone 2 and also adds a nice orange and teal treatment on top.
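
To make that recipe concrete, here is a minimal Python/NumPy sketch of the same two-layer idea, assuming display-referred RGB in a 0–1 range. It mirrors the description above (a desaturated copy composited over the original in Overlay mode, then saturation pulled back) rather than PixelTools’ actual node tree, and the mix and saturation amounts are arbitrary.

```python
import numpy as np

LUMA = np.array([0.2126, 0.7152, 0.0722])        # Rec.709 luma weights

def overlay(base, blend):
    # Standard Overlay blend: multiply in the shadows, screen in the highlights.
    return np.where(base < 0.5,
                    2.0 * base * blend,
                    1.0 - 2.0 * (1.0 - base) * (1.0 - blend))

def saturation(rgb, amount):
    # Push each pixel toward (or away from) its luma to change saturation.
    luma = (rgb @ LUMA)[..., None]
    return luma + amount * (rgb - luma)

def bleach_bypass(rgb, mix=1.0, final_sat=0.8):
    """Rough bleach-bypass look: overlay a desaturated copy onto the original."""
    gray = saturation(rgb, 0.0)                  # fully desaturated layer
    blended = overlay(rgb, gray)                 # composite with Overlay
    out = rgb + mix * (blended - rgb)            # dial the effect in or out
    return np.clip(saturation(out, final_sat), 0.0, 1.0)

frame = np.random.rand(4, 4, 3)                  # stand-in for a float RGB frame
print(bleach_bypass(frame, mix=0.8).shape)       # -> (4, 4, 3)
```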

2-Strip Holiday

Now I know what you are thinking — “Orange and teal! Come on, what are we, Michael Bay making Transformers 30?!” Well, the answer is, obviously, yes. But to really dial the look to taste on my test footage, I brought down the Saturation node at the end of the node tree to around 13%, and it looks fantastic! The moral of the story is: always dial in your looks, especially with presets. Just a little customization can take your preset look to a personalized look quickly. Plus, you won’t be the person who just throws on a preset and walks away.

Will these looks work with my footage? If you shot in a Log-ish style like SLog or BMD Film, Red Log Film or even GoPro Flat, you can use the Log presets and dial them to taste. If you shot footage in Rec. 709 with your Canon 5D Mark II, you can just use the standard looks. And if you want to create your own base grade on Log footage, just add the PixelTools PowerGrade nodes after!

Much like my favorite drag-and-drop tools from Rampant Design, PixelTools will give you a jump on your color grading quickly and, if nothing else, can maybe shake loose some of that colorist creative block that creeps in. Throw on that “Fuji 1” or “Fuji 2” look, add a serial node at the beginning and crank up the red highlights… who knows, it may give you the creative jumpstart you are looking for. Know the rules to break the rules, but also break the rules to get those creative juices flowing.

Saturate-Glow-Shadows

Summing Up
In the end, PixelTools is not just a set of PowerGrades for DaVinci Resolve; they can also be creative jumpstarts. If you think your footage is mediocre, you will be surprised at what a good color grade will do. It can save your shoot. But don’t forget about the rendering when you are finished. Rendering speed will still be dependent on your CPU and GPU setup. In fact, using an Asus ConceptD 7 laptop with an Nvidia RTX 2080 GPU, I exported a one-minute-long Blackmagic Raw sequence (containing six clips) with only color correction to 10-bit DPX files in 46 seconds; with a random PixelTools PowerGrade applied to each clip, it took 40 seconds! In this case, the Nvidia RTX 2080 really aided in the fast export, but your mileage may vary.

Check out pixeltoolspost.com and make sure to at least download their sample pack. From one of five Kodak looks and two Fuji looks to Tobacco Newspaper and Old Worn VHS 2 (with a hint of chromatic aberration), you are sure to find something that fits your footage.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Colorist Joanne Rourke grades Netflix horror film In the Tall Grass

Colorists are often called on to help enhance a particular mood or item for a film, show or spot. For Netflix’s In the Tall Grass — based on a story from horror writers Stephen King and Joe Hill — director Vincenzo Natali and DP Craig Wrobleski called on Deluxe Toronto’s Joanne Rourke to finesse the film’s final look using color to give the grass, which plays such a large part in the film, personality.

In fact, most of the film takes place in a dense Kansas field. It all begins when a brother and his pregnant sister hear a boy’s cries coming from a field of tall grass and go to find him. Soon they realize they can’t escape.

Joanne Rourke

“I worked with Vincenzo more than 20 years ago when I did the video mastering for his film Cube, so it was wonderful to reconnect with him and a privilege to work with Craig. The color process on this project was highly collaborative and we experimented a lot. It was decided to keep the day exteriors natural and sunny with subtle chromatic variations between. While this approach is atypical for horror flicks, it really lends itself to a more unsettling and ominous feeling when things begin to go awry,” explains Rourke.

In the Tall Grass was principally shot using the ARRI Alexa LF camera system, which helped give the footage a more immersive feeling when the characters are trapped in the grass. The grass itself was a mix of practical and CG grass, and Rourke adjusted its color depending on the time of day and where in the field the story was taking place. For the night scenes, she focused on giving the footage a silvery look while keeping the overall image as dark as possible with enough detail visible. She was also mindful to keep the mysterious rock dark and shadowed.

Rourke completed the film’s first color pass in HDR, then used that version to create an SDR trim pass. She found the biggest challenge of working in HDR on this film to be reining in unwanted specular highlights in night scenes. To adjust for this, she would often window specific areas of the shot, an approach that leveraged the benefits of HDR without pushing the look to the extreme. She used Blackmagic Resolve 15 along with the occasional Boris FX Sapphire plugins.
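
As a loose illustration of what reining in a stray specular inside a window amounts to numerically (a hedged sketch, not Rourke’s actual Resolve grade), the snippet below applies a soft highlight roll-off only where a window matte is active; the knee and ceiling values are arbitrary.

```python
import numpy as np

def rolloff_highlights(rgb, matte, knee=0.8, ceiling=1.0):
    """Compress values above `knee` toward `ceiling`, only where `matte` is active.

    rgb   : float image, shape (H, W, 3)
    matte : float window/matte, shape (H, W), 1.0 where the fix applies
    """
    over = np.clip(rgb - knee, 0.0, None)
    span = max(ceiling - knee, 1e-6)
    compressed = knee + span * (1.0 - np.exp(-over / span))   # soft shoulder
    fixed = np.where(rgb > knee, compressed, rgb)
    return rgb + matte[..., None] * (fixed - rgb)

frame = np.random.rand(4, 4, 3) * 1.5            # stand-in for HDR-ish values
window = np.ones((4, 4))                         # hypothetical window matte
print(rolloff_highlights(frame, window).max())   # stays just under the ceiling
```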

“Everyone involved on this project had a keen attention to detail and was so invested in the final look of the project, which made for such a great experience,” says Rourke. “I have many favorite shots, but I love how the visual of the dead crow on the ground perfectly captures the silver feel. Craig and Vincenzo created such stunning imagery, and I was just happy to be along for the ride. Also, I had no idea that head squishing could be so gleeful and fun.”

In the Tall Grass is now streaming on Netflix.

Harbor adds talent to its London, LA studios

Harbor has added to its London- and LA-based studios. Marcus Alexander joins as VP of picture post, West Coast, and Darren Rae joins as senior colorist; Rae will be supervising all dailies in the UK.

Marcus Alexander started his film career in London almost 20 years ago as an assistant editor before joining Framestore as a VFX editor. He helped Framestore launch its digital intermediate division, producing multiple finishes on a host of tent-pole and independent titles, before joining Deluxe to set up its London DI facility. Alexander then relocated to New York to head up Deluxe New York DI. With the growth in 3D movies, he returned to the UK to supervise stereo post conversions for multiple studios before his segue into VFX supervising.

“I remember watching It Came from Outer Space at a very young age and deciding there and then to work in movies,” says Alexander. “Having always been fascinated with photography and moving images, I take great pride in thorough involvement in my capacity from either a production or creative standpoint. Joining Harbor allows me to use my skills from a post-finishing background along with my production experience in creating both 2D and 3D images to work alongside the best talent in the industry and deliver content we can be extremely proud of.”

Rae began his film career in the UK in 1995 as a sound sync operator at Mike Fraser Neg Cutters. He moved into the telecine department in 1997 as a trainee. By 1998 he was a dailies colorist working with 16mm and 35mm film. From 2001, Rae spent three years with The Machine Room in London as a telecine operator and then joined Todd AO’s London lab as a colorist, working on drama and commercials on 35mm and 16mm film, as well as 8mm projects for music videos. In 2006, Rae moved into grading dailies at Todd AO parent company Deluxe in Soho, London, moving to Company 3 London in 2007 as senior dailies colorist. In 2009, he was promoted to supervising colorist.

Prior to joining Harbor, Rae was senior colorist for Pinewood Digital, supervising multiple shows and overseeing a team of four, eventually becoming head of grading. Projects include Pokemon Detective Pikachu, Dumbo, Solo: A Star Wars Story, The Mummy, Rogue One, Doctor Strange and Star Wars Episode VII — The Force Awakens.

“My main goal is to make the director of photography feel comfortable. I can work on a big feature film from three months to a year, and the trust the DP has in you is paramount. They need to know that wherever they are shooting in the world, I’m supporting them. I like to get under the skin of the DP right from the start to get a feel for their wants and needs and to provide my own input throughout the entire creative process. You need to interpret their instructions and really understand their vision. As a company, Harbor understands and respects the filmmaker’s process and vision, so it’s the ideal new home for me.”

Harbor has also announced that colorists Elodie Ichter and Katie Jordan are now available to work with clients on both the East and West Coasts in North America as well as the UK. Some of the team’s work includes Once Upon a Time in Hollywood, The Irishman, The Hunger Games, The Maze Runner, Maleficent, The Wolf of Wall Street, Anna, Snow White and the Huntsman and Rise of the Planet of the Apes.

Charlieuniformtango names company vets as new partners

Charlieuniformtango principal/CEO Lola Lott has named three of the full-service studio’s most veteran artists as new partners — editors Deedle LaCour and James Rayburn, and Flame artist Joey Waldrip. This is the first time in the company’s almost 25-year history that the partnership has expanded. All three will continue in their current jobs but have received expanded titles: senior editor/partner for LaCour and Rayburn, and senior Flame artist/partner for Waldrip. Lott, who retains majority ownership of Charlieuniformtango, will remain principal/CEO, and Jack Waldrip will remain senior editor/co-owner.

“Deedle, Joey and James came to me and Jack with a solid business plan about buying into the company with their futures in mind,” explains Lott. “All have been with Charlieuniformtango almost from the beginning: Deedle for 20 years, Joey for 19 years and James for 18. Jack and I were very impressed and touched that they were interested and willing to come to us with funding and plans for continuing and growing their futures with us.”

So why now after all these years? “Now is the right time because, while Jack and I still have a passion for this business, we also have employees/talent — who have been with us for over 18 years — who also have a passion to be a partner in this company,” says Lott. “While still young, they have invested and built their careers within the Tango culture and have the client bonds, maturity and understanding of the business to be able to take Tango to a greater level for the next 20 years. That was mine and Jack’s dream, and they came to us at the perfect time.”

Charlieuniformtango is a full-service creative studio that produces, directs, shoots, edits, mixes, animates and provides motion graphics, color grading, visual effects and finishing for commercials, short films, full-length feature films, documentaries, music videos and digital content.

Main Image: (L-R) Joey Waldrip, James Rayburn, Jack Waldrip, Lola Lott and Deedle LaCour

Colorist Chat: Lucky Post’s Neil Anderson

After joining Lucky Post in Dallas in 2013 right out of film school, Neil Anderson was officially promoted to colorist in 2017. He has worked on a variety of projects during his time at the studio, including projects for Canada Dry, Costa, TGI Fridays, The Salvation Army and YETI. He also contributed to Augustine Frizzell’s feature comedy, Never Goin’ Back, which premiered at Sundance and was distributed by A24.

YETI

We checked in with Anderson to find out how he works, some favorite projects and what inspires him.

What do you enjoy most about your work?
That’s a really hard question because there are a lot of things I really enjoy about color grading. If I had to choose, I think it comes back to the fact that it’s rewarding to both left- and right-brained people. It truly is both an art and a science.

The satisfaction I get when I first watch a newly graded spot is also very special. A cohesive and mindful color grade absolutely transforms the piece into something greater, and it’s a great feeling to be able to make such a powerful impact.

What’s the most misunderstood aspect of color artistry?
I’m not sure many people stop and think about how amazing it is that we can fine-tune our engineering to something as wild as our eyesight. Our vision is very fluid and organic, constantly changing under different constraints and environments, filled with optical illusions and imperfect guesses. There are immensely strange phenomena that drastically change our perception of what we see. Yet we need to make camera systems and displays work with this deeply non-uniform perception. It’s an absolutely massive area of study that we take for granted; I’m thankful for those color scientists out there.

Where do you find your creative inspiration?
I definitely like to glean new ideas and ways of approaching new projects from seeing other great colorists. Sometimes certain commercials come on TV that catch my eye and I’ll excitedly say to my partner Odelie, “That is damn good color!” Depending on the situation, I might get an eye-roll or two from her.

Tell us about some recent projects, and what made them stand out to you creatively?
Baylor Scott & White Health: I just loved how moody we took these in the end. They are very inspiring stories that we wanted to make feel even more impactful. I think the contrast and color really turned out beautiful.

Is This All There Is?

Is This All There Is? by Welcome Center: This is a recent music video that we filmed in a stunningly dilapidated house. The grit and grain we added in color really brings out the “worst” of it.

Hurdle: This was a documentary feature I worked on that I really enjoyed. The film was shot over a six-month window in the West Bank in Israel, so wrangling it in while also giving it a distinctly unique look was both difficult and fun.

Light From Light: Also a feature film that I finished a few months ago. I really enjoyed the process of developing the look with its wonderful DP Greta Zozula. We specifically wanted to capture the feeling of paintings by Andrew Wyeth, Thomas Eakins and Johannes Vermeer.

Current bingeable episodics and must see films?
Exhibit A, Mindhunter, Midsommar and The Cold Blue.

When you are not at Lucky Post, where do you like to spend time?
I’m an avid moviegoer so definitely a lot of my time (and money) is spent at the theater. I’m also a huge sports fan; you’ll find me anywhere that carries my team’s games! (Go Pack Go)

Favorite podcast?
The Daily (“The New York Times”)

Current Book?
“Parting the Waters: America in the King Years 1954-1963”

Dumbest thing you laughed at today?
https://bit.ly/2MYs0V1

Song you can’t stop listening to?
John Frusciante — 909 Day

Color grading IT Chapter Two’s terrifying return

In IT Chapter Two, the kids of the Losers’ Club are all grown up and find themselves lured back to their hometown of Derry. Still haunted both by the trauma that monstrous clown Pennywise let loose on the community and by each one’s own unique insecurities, the group (James McAvoy, Jessica Chastain, Bill Hader) find themselves up against even more terrifying forces than they faced in the first film, IT.

Stephen Nakamura

IT Chapter Two director Andy Muschietti called on cinematographer Checco Varese and colorist Stephen Nakamura of Company 3. Nakamura returned to the franchise, performing the final color grade at Efilm in Hollywood. “I felt the first one was going to be a big hit when we were working on it, because these kids’ stories were so compelling and the performances were so strong. It was more than just a regular horror movie. This second one, in my opinion, is just as powerful in terms of telling these characters’ stories. And, not surprisingly, it also takes the scary parts even further.”

According to Nakamura, Muschietti “is a very visually oriented director. When we were coloring both of the films, he was very aware of the kinds of things we can do in the DI to enhance the imagery and make things even more scary. He pushed me to take some scenes in Chapter Two in directions I’ve never gone with color. I think it’s always important, whether you’re a colorist or a chef or a doctor, to always push yourself and explore new aspects of your work. Andy’s enthusiasm encouraged me to try new approaches to working in DaVinci Resolve. I think the results are very effective.”

For one thing, the technique he used to bring up just the light level in the eyes of the shapeshifting clown Pennywise got even more use here because there were more frightening characters to use it on. In many cases, the companies that created the visual effects also provided mattes that let Nakamura easily isolate and adjust the luminance of each individual eye in Resolve. When such mattes weren’t available, he used Resolve to track each eyeball a frame at a time.

“Resolve has excellent tracking capabilities, but we were looking to isolate just the tiny whites of the characters’ eyes,” Nakamura explains, “and there just wasn’t enough information to track.” It was meticulous work, he recalls, “but it’s very effective. The audience doesn’t consciously know we’re doing anything, but it makes the eyes brighter in a very strange way, kind of like a cat’s eyes when they catch the light. It really enhances the eerie feeling.”
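
As a rough picture of what that kind of matte-driven adjustment boils down to (an illustrative sketch only, not Nakamura’s actual Resolve setup), the snippet below lifts brightness solely inside a supplied matte, with a crude softening pass so the boost blends in; the gain and softness values are arbitrary.

```python
import numpy as np

def lift_inside_matte(rgb, matte, gain=1.4, soft_px=2):
    """Brighten only the pixels covered by a matte, with a softened edge.

    rgb     : float image, shape (H, W, 3), values in [0, 1]
    matte   : float matte, shape (H, W), 1.0 where the boost applies
    gain    : brightness multiplier inside the matte
    soft_px : radius of a crude box blur so the adjustment doesn't pop
    """
    if soft_px > 0:
        pad = np.pad(matte, soft_px, mode="edge")
        windows = [pad[i:i + matte.shape[0], j:j + matte.shape[1]]
                   for i in range(2 * soft_px + 1)
                   for j in range(2 * soft_px + 1)]
        soft = np.mean(windows, axis=0)
    else:
        soft = matte
    boosted = np.clip(rgb * gain, 0.0, 1.0)
    return rgb + soft[..., None] * (boosted - rgb)

frame = np.random.rand(6, 8, 3)
eye_matte = np.zeros((6, 8))
eye_matte[2:4, 3:5] = 1.0                        # hypothetical tracked matte
print(lift_inside_matte(frame, eye_matte).shape) # -> (6, 8, 3)
```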

In addition, Nakamura and the filmmakers made use of Resolve’s Flicker tool in the OpenFX panel to enhance the flickering effect in a scene involving flashing lights, taking the throbbing light effects further than they did on set. Not long ago, this type of enhancement would have been a more involved process in which the shots would likely be sent to a visual effects house. “We were able to do it as part of the grading, and we all thought it looked completely realistic. They definitely appreciated the ability to make little enhancements like that in the final grade, when everyone can see the scenes with the grade in context and on a big screen.”

Portions of the film involve scenes of the Losers’ Club as children, which were composed of newly shot material (not cut in from the production of the first IT). Nakamura applied a very subtle amount of Resolve’s mid-tone detail tool over them, primarily to help immediately and subliminally orient the audience in time.

But the most elaborate use of the color corrector involved one short sequence in which Hader’s character, walking in a local park on a pleasant, sunny day, has a sudden, terrifying interaction with a very frightening character. The shots involved a significant amount of CGI and compositing work, which was completed at several effects houses. Muschietti was pleased with the effects work, but he wanted Nakamura to bring in an overall quality to the look of the scene that made it feel a bit more otherworldly.

Says Nakamura, “Andy described something that reminded me of the old-school, two-strip color process, where essentially anything red would get pushed into being a kind of magenta, and something blue or green would become a kind of cyan.”

Nakamura, who colored Martin Scorsese’s The Aviator (shot by Robert Richardson, ASC), had designed something at that point to create more of a three-strip look, but this process was more challenging, as it involved constraining the color palette to an even greater degree — without, of course, losing definition in the imagery.

With a bit of trial and error, Nakamura came up with the notion of using the splitter/combiner node and recombining some of the nodes at the output, forcing the information from the green channel into the red and blue channels. He then used a second splitter/combiner node to control the output. “It’s almost like painting a scene with just two colors,” he explains. “Green grass and blue sky both become shades of cyan, while skin and anything with red in it goes into the magenta area.”
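
One way to picture what constraining a frame to “just two colors” can mean, purely as an illustrative approximation and not the node math Nakamura actually built, is the sketch below: each pixel keeps its luminance, but its color is rebuilt from only a magenta-ish and a cyan-ish hue, chosen by how red versus green/blue the original pixel was.

```python
import numpy as np

LUMA = np.array([0.2126, 0.7152, 0.0722])         # Rec.709 luma weights

def two_tone(rgb, warm=(0.8, 0.25, 0.55), cool=(0.2, 0.75, 0.75)):
    """Constrain an RGB image to a two-color (magenta-ish/cyan-ish) palette.

    Illustrative only: each pixel keeps its luminance, but its hue is rebuilt
    from just the `warm` and `cool` colors based on how red it originally was.
    """
    warm, cool = np.asarray(warm), np.asarray(cool)
    luma = rgb @ LUMA
    # Warmth: red content versus the average of green and blue, remapped to 0..1.
    warmth = np.clip(rgb[..., 0] - 0.5 * (rgb[..., 1] + rgb[..., 2]) + 0.5, 0.0, 1.0)
    palette = warmth[..., None] * warm + (1.0 - warmth[..., None]) * cool
    # Re-impose the original luminance so only the color is constrained.
    scale = luma / np.maximum(palette @ LUMA, 1e-6)
    return np.clip(palette * scale[..., None], 0.0, 1.0)

frame = np.random.rand(4, 4, 3)
print(two_tone(frame).shape)                      # -> (4, 4, 3)
```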

The work became even more complex because the red-haired Pennywise also makes an appearance; it was important for him to retain his color, despite the rest of the scene going two-tone. Nakamura treated this element as a complex chroma key, using a second splitter/combiner node and significantly boosting the saturation just to isolate Pennywise while preventing the two-tone correction from affecting him.

When it came time to complete the pass for HDR Dolby Cinema — designed for specialty projectors capable essentially of displaying brighter whites and darker blacks than normal cinema projectors — Muschietti was particularly interested in the format’s treatment of dark areas of the frame.

“Just like in the first one,” Nakamura explains, “we were able to make use of Dolby Cinema to enhance suspense. People usually talk about how bright the highlights can be in HDR. But, when you push more light through the picture than you do for the P3 version, we also have the ability to make shadowy areas of the image appear even darker while keeping the details in those really dark areas very clear. This can be very effective in a movie like this, where you have scary characters lurking in the shadows.

“The color grade always plays some kind of role in a movie’s storytelling,” Nakamura sums up, “but this was a fun example of how work we did in the color grade really helped scare the audience.”

You can check out our Q&A with Nakamura about his work on the original IT.

Wildlife DP Steve Lumpkin on the road and looking for speed

For more than a decade, Steve Lumpkin has been traveling to the Republic of Botswana to capture and celebrate the country’s diverse and protected wildlife population. As a cinematographer and still photographer, Under Prairies Skies Photography‘s Lumpkin will spend a total of 65 days this year filming in the bush for his current project, Endless Treasures of Botswana.

Steve Lumpkin

It’s a labor of love that comes through in his stunning photographs, whether they depict a proud and healthy lioness washed with early-morning sunlight, an indolent leopard draped over a tree branch or a herd of elephants traversing a brilliant green meadow. The big cats hold a special place in Lumpkin’s heart, and documenting Botswana’s largest pride of lions is central to the project’s mission.

“Our team stands witness to the greatest conservation of the natural world on the planet. Botswana has the will and the courage to protect all things wild,” he explains. “I wanted to fund a not-for-profit effort to create both still images and films that would showcase The Republic of Botswana’s success in protecting these vulnerable species. In return, the government granted me a two-year filming permit to bring back emotional, true tales from the bush.”

Lumpkin recently graduated to shooting 4K video in the bush in Apple ProRes Raw, using a Sony FS5 camera and an Atomos Inferno recorder. He brings the raw footage back to his US studio for post, working in Apple Final Cut Pro on an iMac 5K and employing a variety of tools, including Color Grading Central and Neat Video.

Leopard

Until recently, Lumpkin was hitting a performance snag when transferring files from his QNAP TBS 882T NAS storage system to his iMac Pro. “I was only getting read speeds of about 100MB/sec from Thunderbolt, so editing 4K footage was painful,” he says. “At the time, I was transitioning to ProRes RAW, and I knew I needed a big performance kick.”

On the recommendation of Bob Zelin, video engineering consultant and owner of Rescue 1, Lumpkin installed Sonnet’s Solo10G Thunderbolt 3 adapter. The Solo10G uses the 10GbE standard to connect computers via Ethernet cables to high-speed infrastructure and storage systems. “Instantly, I jumped to a transfer rate of more than 880MB per second, a nearly tenfold throughput increase,” he says. “The system just screams now – the Solo10G has accelerated every piece of my workflow, from ingest to 4K editing to rendering and output.”
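
To put that jump in perspective, here is a quick back-of-the-envelope comparison; the 500GB offload used below is a hypothetical figure for illustration, not one of Lumpkin’s actual file sizes.

```python
# Rough transfer-time comparison for a hypothetical 500GB offload.
size_gb = 500
rates_mb_per_s = {"Thunderbolt before (approx.)": 100,
                  "Solo10G / 10GbE (approx.)": 880}

for label, rate in rates_mb_per_s.items():
    minutes = size_gb * 1000 / rate / 60
    print(f"{label}: about {minutes:.0f} minutes")
# About 83 minutes at 100MB/sec versus about 9 minutes at 880MB/sec,
# which lines up with the "nearly tenfold" figure quoted above.
```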

“So many colleagues I know are struggling with this exact problem — they need to work with huge files and they’ve got these big storage arrays, but their Thunderbolt 2 or 3 connections alone just aren’t cutting it.”

With Lumpkin, everything comes down to the wildlife. He appreciates any tools that help streamline his ability to tell the story of the country and its tremendous success in protecting threatened species. “The work we’re doing on behalf of Botswana is really what it’s all about — in 10 or 15 years, that country might be the only place on the planet where some of these animals still exist.

“Botswana has the largest herd of elephants in Africa and the largest group of wild dogs, of which there are only about 6,000 left,” says Lumpkin. “Products like Sonnet’s Solo10G, Final Cut, the Sony FS5 camera and Atomos Inferno, among others, help our team celebrate Botswana’s recognition as the conservation leader of Africa.”

FotoKem expands post services to Santa Monica

FotoKem is now offering its video post services in Santa Monica. This provides an accessible location for those working on the west side of LA, as well as access to the talent from its Burbank and Hollywood studios.

Designed to support an entire pipeline of services, the FotoKem Santa Monica facility is housed just off the 10 freeway, above FotoKem’s mixing and recording studio Margarita Mix. For many projects, color grading, sound mixing and visual effects reviews often take place in multiple locations around town. This facility offers showrunners and filmmakers a new west side post production option. Additionally, the secure fiber network connecting all FotoKem-owned locations ensures feature film and episodic finishing work can take place in realtime among sites.

FotoKem Santa Monica features a DI color grading theater, episodic and commercial color suite, editorial conform bay and a visual effects team — all tied to the comprehensive offerings at FotoKem’s main Burbank campus, Keep Me Posted’s episodic finishing facility and Margarita Mix Hollywood’s episodic grading suites. FotoKem’s entire roster of colorists are available to collaborate with filmmakers to ensure their vision is supported throughout the process. Recent projects include Shazam!, Vice, Aquaman, The Dirt, Little and Good Trouble.

Colorist Jimmy Hsu joins Encore Vancouver

Seasoned colorist Jimmy Hsu has joined Encore Vancouver, bringing with him experience in content creation and color science. He comes to Encore Vancouver from Side Street Post Production, where he began as an online editor in 2012 before focusing on color grading.

Hsu’s work spans live action and animated projects across genres, including features, video game cinematics and commercials for clients such as Universal Studios, Disney and Lifetime.

Upon graduating from British Columbia’s Simon Fraser University with a bachelor’s in interactive arts and film production, Hsu held various roles in production and post production, including as a creative editor and motion graphics artist. Having edited more than a hundred movie trailers, Hsu is well-versed in project deliverables and specs, which helps inform his color process. He also draws from his artistic background, leveraging the latest capabilities of Blackmagic DaVinci Resolve to incorporate significant compositing and visual effects work into his projects.


Senior colorist Maria Carretero joins Nice Shoes

NYC-based post studio Nice Shoes has hired senior colorist Maria Carretero, who comes to Nice Shoes with nearly two decades of global experience in color grading under her belt. Her portfolio includes a wide range of feature films, short films, music videos and commercials for brands like Apple, Jeep, Porsche, Michael Kors, Disney and Marriott, among many others. She will be based at Nice Shoes’ NYC studio, also working across the company’s Boston, Chicago, Toronto and Minneapolis spaces and through its network of remote partnerships globally.

She comes to Nice Shoes from Framestore in Chicago, where she spent nearly two years establishing relationships with agencies such as BBDO, FCB, DDB, Leo Burnett Chicago and Media Arts Lab LA.

Carretero is originally from Spain, where she received an education in fine arts. She soon discovered the creative possibilities in digital color grading, quickly establishing a career for herself as an international artist. Her background in painting, coupled with her natural eye for nuanced visuals, helps her maximize her clients’ creative visions. Carretero’s ability to convey a brand story through her work has earned her a long list of awards, including Cannes Lions and a Clio.

Carretero’s recent work includes Jeep’s Recalculating, Disney’s You Can Fly and Bella Notte, Porsche’s The Fix and Avocados From Mexico’s Top Dog spot for Super Bowl 2019.

“Nice Shoes brings together the expertise backed by 20 years of experience with a personal approach that really celebrates female talent and collaboration,” adds Carretero. “I’m thrilled to be joining a team that truly supports the creative exploration process that color takes in storytelling. I’ve always wanted to live in New York. Throughout my whole life, I visited this city again and again and was fascinated by the diversity, the culture, and incredible energy that you breathe in as you walk the city’s streets.”

AJA adds HDR Image Analyzer 12G and more at IBC

AJA will soon offer the new HDR Image Analyzer 12G, bringing 12G-SDI connectivity to its realtime HDR monitoring and analysis platform developed in partnership with Colorfront. The new product streamlines 4K/Ultra HD HDR monitoring and analysis workflows by supporting the latest high-bandwidth 12G-SDI connectivity. The HDR Image Analyzer 12G will be available this fall for $19,995.

HDR Image Analyzer 12G offers waveform, histogram and vectorscope monitoring and analysis of 4K/Ultra HD/2K/HD, HDR and WCG content for broadcast and OTT production, post, QC and mastering. It also features HDR-capable monitor outputs that go beyond HD resolution, offer color accuracy and make it possible to configure layouts that place the preferred tool where needed.

“Since its release, HDR Image Analyzer has powered HDR monitoring and analysis for a number of feature and episodic projects around the world. In listening to our customers and the industry, it became clear that a 12G version would streamline that work, so we developed the HDR Image Analyzer 12G,” says Nick Rashby, president of AJA.

AJA’s video I/O technology integrates with HDR analysis tools from Colorfront in a compact 1-RU chassis to bring HDR Image Analyzer 12G users a comprehensive toolset to monitor and analyze HDR formats, including PQ (Perceptual Quantizer) and hybrid log gamma (HLG). Additional feature highlights include:

● Up to 4K/Ultra HD 60p over 12G-SDI inputs, with loop-through outputs
● Ultra HD UI for native resolution picture display over DisplayPort
● Remote configuration, updates, logging and screenshot transfers via an integrated web UI
● Remote Desktop support
● Support for display-referred SDR (Rec.709), HDR ST 2084/PQ and HLG analysis
● Support for scene-referred ARRI, Canon, Panasonic, Red and Sony camera color spaces
● Display and color processing lookup table (LUT) support
● Nit levels and phase metering
● False color mode to easily spot pixels out of gamut or brightness
● Advanced out-of-gamut and out-of-brightness detection with error tolerance
● Data analyzer with pixel picker
● Line mode to focus a region of interest onto a single horizontal or vertical line
● File-based error logging with timecode
● Reference still store

At IBC 2019, AJA also showed new products and updates designed to advance broadcast, production, post and pro AV workflows. On the stand were the Kumo 6464-12G for routing and the newly shipping Corvid 44 12G developer I/O models. AJA has also introduced the FS-Mini utility frame sync Mini-Converter and three new OpenGear-compatible cards: OG-FS-Mini, OG-ROI-DVI and OG-ROI-HDMI. Additionally, the company previewed Desktop Software updates for Kona, Io and T-Tap; Ultra HD support for IPR Mini-Converter receivers; and FS4 frame synchronizer enhancements.

IBC 2019 in Amsterdam: Big heads in the cloud

By David Cox

IBC 2019 kicked off with an intriguing announcement from Avid. The company entered into a strategic alliance with Microsoft and Disney’s Studio Lab to enable remote editorial workflows in the cloud.

The interesting part for me is how this affects the perception of post producing in the cloud, rather than the actual technology of it. It has been technically possible to edit remotely in the cloud for some time — either by navigating the Wild West interfaces of the principal cloud providers to “spin up” a remote computer, connect some storage and content, and then run an edit app, or by using a product such as Blackbird that takes care of all that. No doubt, the collaboration with Disney will produce products and services within an ecosystem that makes the technical use of the cloud invisible.

Avid press conference

However, what interests me is that arguably, the perception of post producing in the cloud is instantly changed. The greatest fear of post providers relates to the security of their clients’ intellectual property. Should a leak ever occur, to retain the client (or indeed avoid a catastrophic lawsuit), the post facility would have to make a convincing argument that security protocols were appropriate. Prior to the Disney/Avid/Microsoft Azure announcement, the part of that argument where the post houses say “…then we sent your valuable intellectual property to the cloud” caused a sticky moment. However, following this announcement, there has been an inherent endorsement by the owner of one of the most valuable IP catalogs (Disney) that post producing in the cloud is safe — or at least will be.

Cloudy Horizons
At the press conference where Avid made its Disney announcement, I asked whether the proposed cloud service would be a closed, Avid-only environment or an open platform to include other vendors. I pointed out that many post producers also use non-Avid products for various aspects, from color grading to visual effects. Despite my impertinence in mentioning competitors (even though Avid had kindly provided lunch), CEO Jeff Rosica provided a well-reasoned and practical response. To paraphrase, while he did not explicitly say the proposed ecosystem would be closed, he suggested that from a commercial viewpoint, other vendors would more likely want to make their own cloud offerings.

Rosica’s comments suggest that post houses can expect many clouds on their horizons from various application developers. The issue will then be how these connect to make coherent and streamlined workflows. This is not a new puzzle for post people to solve — we have been trying to make local systems from different manufacturers talk to each other for years, with varying degrees of success. Making manufacturers’ various clouds work together would be an extension of that endeavor. Hopefully, manufacturers will use their own migrations to the cloud to further open their systems, rather than see it as an opportunity to play defensively, locking down bespoke file systems and making cross-platform collaboration unnecessarily awkward. Too optimistic, perhaps!

Or One Big Cloud?
Separately from the above, just prior to IBC, MovieLabs introduced a white paper that discussed a direction of travel for movie production toward the year 2030. IBC hosted a MovieLabs panel on the Sunday of the show, moderated by postPerspective’s own Randi Altman and featuring tech chiefs from the major studios. The proposal deserves proper consideration, given that it’s backed by Disney, Sony, Paramount, Warner Bros. and Universal.

MovieLabs panel

To summarize, the proposition is that the digital assets that will be manipulated to make content stay in one centralized cloud. Apps that manipulate those assets, such as editorial and visual effects apps, delivery processes and so on, will operate in the same cloud space. The talent that drives those apps will do so via the cloud. Or to put it slightly differently, the content assets don’t move — rather, the production apps and talent move to the assets. Currently, we do the opposite: the assets are transferred to where the post services are provided.

There are many advantages to this idea. Multiple transfers of digital assets to many post facilities would end. Files would be secured on a policy basis, enabling only the relevant operators to have access for the appropriate duration. Centralized content libraries would be produced, helping to enable on-the-fly localization, instant distribution and multi-use derivatives, such as marketing materials and games.

Of course, there are many questions. How do the various post application manufacturers maintain their product values if they all work as in-cloud applications on someone else’s hardware? What happens to traditional post production facilities if they don’t need any equipment and their artists log in from wherever? How would a facility protect itself from payment disputes if it does not have control over the assets it produces?

Personally, I have moved on from the idea of brick-and-mortar facilities. Cloud post permits nearly unlimited resources and access to a global pool of talent, not just those who reside within a commutable distance from the office. I say, bring it on… within reason. Of course, this initiative relates only to the production of content for those key studios. There’s a whole world of content production beyond that scope.

Blackmagic

Knowing Your Customer
Another area of interest for me at IBC 2019 was how offerings to colorists have become quite polarized. On one hand there is the seemingly all-conquering Resolve from Blackmagic Design. Inexpensive, easy to access and ubiquitous. On the other hand there is Baselight from FilmLight — a premium brand with a price tag and associated entry barrier to match. The fact that these two products are both successful in the same market but with very different strategies is testament to a fundamental business rule: “Know your customer.” If you know who your customer is going to be, you can design and communicate the ideal product for them and sell it at the right price.

A chat with FilmLight’s joint founder, Wolfgang Lempp, and development director Martin Tlaskal was very informative. Lempp explained that the demand placed on FilmLight’s customers is similarly polarized. On one hand, clients — including major studios and Netflix — mandate fastidious adherence to advanced and ever-improving technical standards, as well as image pipelines that are certified at every step. On the other hand, different clients place deadline or budget as the prevalent concern. Tlaskal said FilmLight sets out to support those color specialists who aim for top-of-the-industry excellence. Having that template for the target customer defines and drives the features FilmLight develops for its Baselight product.

FilmLight

At IBC 2019, FilmLight hosted guest speaker-led demonstrations (“Colour on Stage”) to inspire creative grading and to present its latest features and improvements including better hue-angle keying, tracking and dealing with lens distortions.

Blackmagic is no less focused on knowing its customer, which explains its success in recent years. DaVinci Resolve once shared the “premium” space occupied by FilmLight but went through a transition to aim itself squarely at a democratized post production landscape. This shift meant a recognition that there would be millions of content producers and thousands of small post houses rather than a handful of large post facilities. That transition required a great deal more than merely slashing the price. The software product would have to work on myriad hardware combinations, not just the turnkey approved setup, and would need to have features and documentation aimed at those who hadn’t spent the past three years training in a post facility. By knowing exactly who the customer would be, Blackmagic built Resolve into an extremely successful, cross-discipline, post production powerhouse. Blackmagic was demonstrating the latest Resolve at IBC 2019, although all new features had been previously announced because, as director of software engineering Rohit Gupta explained, Blackmagic does not time its feature releases to IBC.

SGO

Aiming between the extremities established by FilmLight and Blackmagic Design, SGO promoted a new positioning of its flagship product, Mistika, via the Boutique subproduct. This is essentially a software-only Mistika that runs on PC or Mac. Subscription prices range from 99 euros per month to 299 euros per month, depending on features, although there have been several discounted promotions. The more expensive options include SGO’s highly regarded stereo 3D tools and camera stitching features for producing wrap-around movies.

Another IBC — done!


David Cox is a VFX compositor and colorist with more than 20 years of experience. He started his career with MPC and The Mill before forming his own London-based post facility. Cox specializes in unusual projects, such as those using very high resolutions and interactive immersive experiences featuring realtime render engines and augmented reality.

Colorist Chat: Technicolor’s Doug Delaney

Industry veteran Doug Delaney started his career in VFX before the days of digital, learning his craft from the top film timers and color scientists as well as effects supervisors.

Today he is a leading colorist and finisher at Technicolor, working on major movies including the recent Captain Marvel. We spoke to him to find out more about how he works.

NAME: Doug Delaney

TITLE: Senior Colorist

IN ADDITION TO CAPTAIN MARVEL, CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We have just wrapped on Showtime’s The Loudest Voice, which documented Fox News’ Roger Ailes and starred Russell Crowe, Naomi Watts and Sienna Miller.

I also just had the immense pleasure of working with DP Cameron Duncan on Nat Geo’s thriller The Hot Zone. For that show we actually worked together early on to establish two looks — one for laboratory scenes taking place in Washington, DC, and another for scenes in central Africa. These looks were then exported as LUTs for dailies so that the creative intent was established from the beginning of shooting and carried through to finishing.
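As a rough illustration of that handoff, the sketch below shows how a show LUT exported as a .cube file might be applied to dailies frames. This is a minimal, assumed example rather than Technicolor’s actual pipeline: the file name and frame data are hypothetical, a Resolve-style 3D .cube with the default 0-1 domain is assumed, and a real dailies tool would interpolate rather than use nearest-neighbor lookup.

import numpy as np

def load_cube_lut(path):
    """Parse a 3D .cube file into an (N, N, N, 3) array indexed [b, g, r]."""
    size, rows = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.upper().startswith("LUT_3D_SIZE"):
                size = int(line.split()[-1])
            elif line[0].isdigit() or line[0] == "-":
                rows.append([float(v) for v in line.split()[:3]])
    table = np.asarray(rows, dtype=np.float32)
    # .cube data is ordered with the red index changing fastest
    return table.reshape(size, size, size, 3)

def apply_lut(frame, lut):
    """Nearest-neighbor lookup; frame is float RGB in [0, 1], shape (H, W, 3)."""
    n = lut.shape[0]
    idx = np.clip(np.rint(frame * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 2], idx[..., 1], idx[..., 0]]

lut = load_cube_lut("show_lut.cube")                        # hypothetical path
frame = np.random.rand(1080, 1920, 3).astype(np.float32)    # stand-in for a dailies frame
graded = apply_lut(frame, lut)
print(graded.shape, float(graded.min()), float(graded.max()))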

And earlier this year I worked on Love, Death & Robots, which just received two Emmy nominations, so big congrats to that team!

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Yes, these days I tend to think of “colorists” as finishing artists — meaning that our suites are typically the last stop for a project and where everything comes together.

The technology we have access to in our suites continues to develop, and therefore our capabilities have expanded — there is more we can do in our suites that previously would have needed to be handled by others. A perfect example is visual effects. Sometimes we get certain shots in from VFX vendors that are well-executed but need to be a bit more nuanced — say it’s a driving scene against a greenscreen, and the lighting outside the car feels off for the time of day it’s supposed to be in the scene. Whereas we used to have to kick it back to VFX to fix, I can now go in and use the alpha channels and mattes to color-correct that imbalance.

And what’s important about this new ability is that in today’s demanding schedules and deadlines, it allows us to work collaboratively in real time with the creative rather than in an iterative workflow that takes time we often don’t have.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The look development. That aspect can involve different conversations depending on the project. Sometimes it’s talking with filmmakers in preproduction, sometimes only when it gets to post, but ultimately, being part of the creative journey and helping deliver the best-looking show is what I love.

That and when the final playback happens in our room, when the filmmakers see for the first time all of the pieces of the puzzle come together with sound … it’s awesome.

ANY SUGGESTIONS FOR GETTING THE MOST OUT OF A PROJECT FROM A COLOR PERSPECTIVE?
Understanding that each project has a different relationship with the filmmaker, there needs to be transparency and agreement on the process among the director, DP, execs, etc. Whether a clear vision is established early on or they are open to further developing the look, a willingness to engage in an open dialogue is key.

Personally, I love when I’m able to help develop the color pipeline in preproduction, as I find it often makes the post experience more seamless. For example, what aired on Strange Angel Season 2 was not far removed from dailies because we had established a LUT in advance and had worked with wardrobe, make-up and others to carry the look through. It doesn’t need to be complicated, but open communication and planning really can go a long way in creating a stunning visual identity and a seamless experience.

HOW DO YOU PREFER THE DP OR DIRECTOR TO DESCRIBE THE LOOK THEY WANT? PHYSICAL EXAMPLES, FILMS TO EMULATE, ETC.?
Physical examples — photo books, style sheets with examples of tones they like and things like that. But ultimately my role is to correctly interpret what it is that they like in what they are showing me and to discern if what they are looking for is a literal representation, or more of an inspiration to start from and massage. Again, the open communication and ability to develop strong working relationships — in which I’m able to discern when there is a direct ask versus a need versus an opportunity to do more and push the boundaries — is key to a successful project.

WHAT SYSTEM DO YOU WORK ON?
Baselight. I love the flexibility of the system and the support that the FilmLight team provides us, as we are constantly pushing the capabilities of the platform, and they continue to deliver.

 

Boris FX beefs up film VFX arsenal, buys SilhouetteFX, Digital Film Tools

Boris FX, a provider of integrated VFX and workflow solutions for video and film, has bought SilhouetteFX (SFX) and Digital Film Tools (DFT). The two companies have a long history of developing tools used on Hollywood blockbusters and experience collaborating with top VFX studios, including Weta Digital, Framestore, Technicolor and Deluxe.

This is the third acquisition by Boris FX in recent years, following Imagineer Systems (2014) and GenArts (2016), and it builds upon the company’s editing, visual effects and motion graphics solutions used by post pros working in film and television. Silhouette and Digital Film Tools join Boris FX’s tools Sapphire, Continuum and Mocha Pro.

Silhouette’s groundbreaking non-destructive paint and advanced rotoscoping technology was recognized earlier this year by the Academy of Motion Picture Arts and Sciences with a Technical Achievement Award. It first gained prominence after Weta Digital used the rotoscoping tools on King Kong (2005). Now the full-fledged, GPU-accelerated, node-based compositing app features over 100 VFX nodes and integrated Boris FX Mocha planar tracking. Over the last 15 years, feature film artists have used Silhouette on films including Avatar (2009), The Hobbit (2012), Wonder Woman (2017), Avengers: Endgame (2019) and Fast & Furious Presents: Hobbs & Shaw (2019).

Avengers: Endgame courtesy of Marvel

Digital Film Tools (DFT) emerged as an offshoot of an LA-based motion picture visual effects facility whose work included hundreds of feature films, commercials and television shows.

The Digital Film Tools portfolio includes standalone applications as well as professional plug-in collections for filmmakers, editors, colorists and photographers. The products offer hundreds of realistic filters for optical camera simulation, specialized lenses, film stocks and grain, lens flares, optical lab processes, color correction, keying and compositing, as well as natural light and photographic effects. DFT plug-ins support Adobe’s Photoshop, Lightroom, After Effects and Premiere Pro; Apple’s Final Cut Pro X and Motion; Avid’s Media Composer; and OFX hosts, including Foundry Nuke and Blackmagic DaVinci Resolve.

“This acquisition is a natural next step to our continued growth strategy and singular focus on delivering the most powerful VFX tools and plug-ins to the content creation market,” says Boris Yamnitsky, CEO/founder of Boris FX. “Silhouette fits perfectly into our product line with superior paint and advanced roto tools that highly complement Mocha’s core strength in planar tracking and object removal. Rotoscoping, paint, digital makeup and stereo conversion are some of the most time-consuming, labor-intensive aspects of feature film post. Sharing technology and tools across all our products will make Silhouette even stronger as the leader in these tasks. Furthermore, we are very excited to be working with such an accomplished team [at DFT] and look forward to collaborating on new product offerings for photography, film and video.”

Silhouette founders Marco Paolini, Paul Miller and Peter Moyer will continue in their current leadership roles and partner with the Mocha product development team to collaborate on delivering next-generation tools. “By joining forces with Boris FX, we are not only dramatically expanding our team’s capabilities, but we are also joining a group of like-minded film industry pros to provide the best solutions and support to our customers,” says Paolini, product designer. “The Mocha planar tracking option we currently license is extremely popular with Silhouette paint and roto artists, and more recently, through OFX, we’ve added support for Sapphire plug-ins. Working together under the Boris FX umbrella is our next logical step, and we are excited to add new features and continue advancing Silhouette for our user base.”

Both Silhouette and the Digital Film Tools plug-ins will continue to be developed and sold under the Boris FX brand. Silhouette will adopt the Boris FX commitment to agile development with annual releases, annual support and subscription options.

Main Image: Silhouette

SGO Mistika Boutique at IBC with Dolby Vision, color workflows

At IBC, SGO will be showing enhancements and upgrades of its subscription-based finishing solution, Mistika Boutique. The company will demo color management solutions as well as HDR content delivery workflows with recently integrated Dolby Vision support.

This professional color grading toolset combined with the finishing functionality of Mistika Boutique will be showcased running on a Mac Pro workstation with Tangent Arc control panels and output to a Canon 4K HDR reference display through Blackmagic Design DeckLink I/O.

Mistika Boutique is hardware-agnostic and runs on both Windows and macOS.

SGO is offering a variety of sessions highlighting the trending topics for the content creation industry that feature Mistika Boutique as well as Mistika Workflows and Mistika VR at their stand.

While at the show, SGO is offering a special IBC promotion for Mistika Boutique. Anyone who subscribes by September 30, 2019 will get the Professional Immersive Edition for €99/month or €990/year (or whatever your bank’s conversion rate is), which represents a saving of over 65% from the normal price. The special IBC promotional price will be maintained as long as the subscription is not canceled and remains active.

Company 3 buys Sixteen19, offering full-service post in NYC

Company 3 has acquired Sixteen19, a creative editorial, production and post company based in New York City. The deal includes Sixteen19’s visual effects wing, PowerHouse VFX, and a mobile dailies operation with international reach.

The acquisition helps Company 3 further serve NYC’s booming post market for feature film and episodic TV. As part of the acquisition, industry veterans and Sixteen19 co-founders Jonathan Hoffman and Pete Conlin, along with their longtime collaborator, EVP of business development and strategy Alastair Binks, will join Company 3’s leadership team.

“With Sixteen19 under the Company 3 umbrella, we significantly expand what we bring to the production community, addressing a real unmet need in the industry,” says Company 3 president Stefan Sonnenfeld. “This infusion of talent and infrastructure will allow us to provide a complete suite of services for clients, from the start of production through the creative editing process to visual effects, final color, finishing and mastering. We’ve worked in tandem with Sixteen19 many times over the years, so we know that they have always provided strong client relationships, a best-in-class team and a deeply creative environment. We’re excited to bring that company’s vision into the fold at Company 3.”

Sonnenfeld will continue to serve as president of Company 3 and will oversee operations of Sixteen19. As a subsidiary of Deluxe, Company 3 is part of a broad portfolio of post services. Bringing together the complementary services and geographic reach of Company 3, Sixteen19 and PowerHouse VFX will expand Company 3’s overall portfolio of post offerings and reach new markets in the US and internationally.

Sixteen19’s New York location includes 60 large editorial suites; two 4K digital cinema grading theaters; and a number of comfortable spaces, open environments and many common areas. Sixteen19’s mobile dailies services will add a perfect companion to Company 3’s existing offerings in that arena. PowerHouse VFX includes dedicated teams of experienced supervisors, producers and artists in 2D and 3D visual effects and compositing.

“The New York film community initially recognized the potential for a Company 3 and Sixteen19 partnership,” says Sixteen19’s Hoffman. “It’s not just the fact that a significant majority of the projects we work on are finished at Company 3, it’s more that our fundamental vision about post has always been aligned with Stefan’s. We value innovation; we’ve built terrific creative teams; and above all else, we both put clients first, always.”

Sixteen19 and PowerHouse VFX will retain their company names.

Behind the Title: Mission’s head of digital imaging, Pablo Garcia Soriano

NAME: Pablo Garcia Soriano (@pablo.garcia.soriano)

COMPANY: UK-based Mission (@missiondigital)

CAN YOU DESCRIBE YOUR COMPANY?
Mission is a provider of DIT and digital lab services based in London, with additional offices in Cardiff, Rome, Prague and Madrid. We process and manage media and metadata, producing rich deliverables with as much captured metadata as possible — delivering consistency and creating efficiencies in VFX and post production.

WHAT’S YOUR JOB TITLE?
Head of Digital Imaging

WHAT DOES THAT ENTAIL?
I work with cinematographers to preserve their vision from the point of capture until the final deliverable. This means supporting productions through camera tests, pre-production and look design. I also work with manufacturers, which often means I get an early look at new products.

Mission

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
It sounds like a very technical job, but it’s so much more than engineering — it’s creative engineering. It’s problem solving and making technical complexities seem easy to a creative person.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I love working with cinematographers to help them achieve their vision and make sure it is preserved through post. I also enjoy being able to experiment with the latest technology and have an influence on products. Recently, I’ve been involved with growing Mission’s international presence with our Madrid office, which is particularly close to my heart.

WHAT’S YOUR LEAST FAVORITE?
Sometimes I get to spend hours in a dark room with a probe calibrating monitors. It’s dull but necessary!

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
In the early to mid-morning after two coffees. Also at the end of the day when the office is quieter.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Gardening… or motor racing.

WHY DID YOU CHOOSE THIS PROFESSION?
I feel like it chose me. I’m an architect by training, but I was a working musician until around the age of 28, when I stepped down from the stage and started as a freelancer doing music promos. I was doing a bit of everything on those: directing, editing, finishing, etc. Then I was asked to be the assistant editor on two films by a colleague with whom I was sharing an office.

After this experience (and due to the changes the music industry was going through), I decided to focus fully on editing, cutting several documentaries and short films. I then ended up on a weekly TV show where I was in charge of the final assembly. This is where I started paying attention to continuity and the overall look. I was using Apple Final Cut and Apple Color, which I loved. All of this happened in a very organic way and I was always self-taught.

I didn’t take studying seriously until I met the DP Rafa Roche, AEC, on our first film together, around the age of 31. Rafa mentored me, teaching me all about cameras, lenses and filters, and filling my brain with curiosity about all the technical stuff (signal, codecs, workflows). From there to now, it has all been a bit of a rollercoaster, with some moments of real vertigo caused by how fast it has all developed.

Downton Abbey

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We work on a lot of features and television in the UK and Europe — recent projects include Cats, Downton Abbey, Cursed and Criminal.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
In 2018, I was the HDR image supervisor for the World Cup in Moscow. Knowing the popularity of football and working on a project that would be seen by so many people around the world was truly an honor, despite the pressure!

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
A good reference monitor, a good set of speakers and Spotify.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
Yes, music is a huge part of my life. I have very varied taste. For example, I enjoy Wilco, REM and Black Sabbath.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I like to walk by the River Thames in Hammersmith, London, near where I live.

FilmLight sets speakers for free Color On Stage seminar at IBC

At this year’s IBC, FilmLight will host a free two-day seminar series, Color On Stage, on September 14 and 15. The event features live presentations and discussions with colorists and other creative professionals. The event will cover topics ranging from the colorist today to understanding color management and next-generation grading tools.

“Color on Stage offers a good platform to hear about real-world interaction between colorists, directors and cinematographers,” explains Alex Gascoigne, colorist at Technicolor and one of this year’s presenters. “Particularly when it comes to large studio productions, a project can take place over several months and involve a large creative team and complex collaborative workflows. This is a chance to find out about the challenges involved with big shows and demystify some of the more mysterious areas in the post process.”

This year’s IBC program includes colorists from broadcast, film and commercials, as well as DITs, editors, VFX artists and post supervisors.

Program highlights include:
•    Creating the unique look for Mindhunter Season 2
Colorist Eric Weidt will talk about his collaboration with director David Fincher — from defining the workflow to creating the look and feel of Mindhunter. He will break down scenes and run through color grading details of the masterful crime thriller.

•    Realtime collaboration on the world’s longest running continuing drama, ITV Studios’ Coronation Street
The session will address improving production processes and enhancing pictures with efficient renderless workflows, with colorist Stephen Edwards, finishing editor Tom Chittenden and head of post David Williams.

•    Looking to the future: Creating color for the TV series Black Mirror
Colorist Alex Gascoigne of Technicolor will explain the process behind grading Black Mirror, including the interactive episode Bandersnatch and the latest Season 5.

•    Bollywood: A World of Color
This session will delve into the Indian film industry with CV Rao, technical general manager at Annapurna Studios in Hyderabad. In this talk, CV will discuss grading and color as exemplified by the hit film Baahubali 2: The Conclusion.

•    Joining forces: Strengthening VFX and finishing with the BLG workflow
Mathieu Leclercq, head of post at Mikros Image in Paris, will be joined by colorist Sebastian Mingam and VFX supervisor Franck Lambertz to showcase their collaboration on recent projects.

•    Maintaining the DP’s creative looks from set to post
Meet with French DIT Karine Feuillard, ADIT — who worked on the latest Luc Besson film Anna as well as the TV series The Marvelous Mrs Maisel — and FilmLight workflow specialist Matthieu Straub.

•    New color management and creative tools to make multi-delivery easier
The latest and upcoming Baselight developments, including a host of features aimed to simplify delivery for emerging technologies such as HDR. With FilmLight’s Martin Tlaskal, Daniele Siragusano and Andy Minuth.

Color On Stage will take place in Room D201 on the second floor of the Elicium Centre (Entrance D), close to Hall 13. The event is free to attend, but spaces are limited; advance registration is available.

Harbor expands to LA and London, grows in NY

New York-based Harbor has expanded into Los Angeles and London and has added staff and locations in New York. Industry veteran Russ Robertson joins Harbor’s new Los Angeles operation as EVP of sales, features and episodic after a 20-year career with Deluxe and Panavision. Commercial director James Corless and operations director Thom Berryman will spearhead Harbor’s new UK presence following careers with Pinewood Studios, where they supported clients such as Disney, Netflix, Paramount, Sony, Marvel and Lucasfilm.

Harbor’s LA-based talent pool includes color grading from Yvan Lucas, Elodie Ichter, Katie Jordan and Billy Hobson. Some of the team’s projects include Once Upon a Time … in Hollywood, The Irishman, The Hunger Games, The Maze Runner, Maleficent, The Wolf of Wall Street, Snow White and the Huntsman and Rise of the Planet of the Apes.

Paul O’Shea, formerly of MPC Los Angeles, heads the visual effects teams, tapping lead CG artist Yuichiro Yamashita for 3D out of Harbor’s Santa Monica facility and 2D creative director Q Choi out of Harbor’s New York office. The VFX artists have worked with brands such as Nike, McDonald’s, Coke, Adidas and Samsung.

Harbor’s Los Angeles studio supports five grading theaters for feature film, episodic and commercial productions, offering private connectivity to Harbor NY and Harbor UK, with realtime color-grading sessions, VFX reviews and options to conform and final-deliver in any location.

The new UK operation, based out of London and Windsor, will offer in-lab and near-set dailies services along with automated VFX pulls and delivery through Harbor’s Anchor system. The UK locations will draw from Harbor’s US talent pool.

Meanwhile, the New York operation has grown its talent roster and Soho footprint to six locations, with a recently expanded offering for creative advertising. Veteran artists on the commercial team include editors Bruce Ashley and Paul Kelly, VFX supervisor Andrew Granelli, colorist Adrian Seery, and sound mixers Mark Turrigiano and Steve Perski.

Harbor’s feature and episodic offering continues to expand, with NYC-based artists available in Los Angeles and London.

GLOW’s DP and colorist adapt look of new season for Vegas setting

By Adrian Pennington

Netflix’s Gorgeous Ladies of Wrestling (GLOW) are back in the ring for a third round of the dramatic comedy, but this time the girls are in Las Vegas. The glitz and glamour of Sin City seems tailor-made for the 1980s-set GLOW and provided the main creative challenge for Season 3 cinematographer Chris Teague (Russian Doll, Broad City).

DP Chris Teague

“Early on, I met with Christian Sprenger, who shot the first season and designed the initial look,” says Teague, who was recently nominated for an Emmy for his work on Russian Doll. “We still want GLOW to feel like GLOW, but the story and character arc of Season 3 and the new setting led us to build on the look and evolve elements like lighting and dynamic range.”

The GLOW team is headlining the Fan-Tan Hotel & Casino, one of two main sets (along with a hotel) built for the series and featuring the distinctive Vegas skyline as a backdrop.

“We discussed compositing actors against greenscreen, but that would have turned every shot into a VFX shot and would have been too costly, not to mention time-intensive on a TV schedule like ours,” he says. “Plus, working with a backdrop just felt aesthetically right.”

In that vein, production designer Todd Fjelsted built a skyline using miniatures, a creative decision in keeping with the handcrafted look of the show. That decision, though, required extensive testing of lenses, lighting and look prior to shooting. This testing was done in partnership with post house Light Iron.

“There was no overall shift in the look of the show, but together with Light Iron, we felt the baseline LUT needed to be built on, particularly in terms of how we lit the sets,” explains Teague.

“Chris was clear early on that he wanted to build upon the look of the first two seasons,” says Light Iron colorist Ian Vertovec. “We adjusted the LUT to hold a little more color in the highlights than in past seasons. Originally, the LUT was based on a film emulation and adjusted for HDR. In Season 1, we created a period film look and transformed it for HDR to get a hybrid film emulation LUT. For Season 3, for HDR and standard viewing, we made tweaks to the LUT so that some of the colors would pop more.”

The show was also finished in Dolby Vision HDR. “There was some initial concern about working with backdrops and stages in HDR,” Teague says. “We are used to the way film treats color over its exposure range — it tends to desaturate as it gets more overexposed — whereas HDR holds a lot more color information in overexposure. However, Ian showed how it can be a creative tool.”
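The behavior Teague describes can be sketched with a toy transform. The snippet below is not Light Iron’s LUT math, just an assumed illustration: a film-style curve blends bright pixels toward neutral as they approach clip, while an HDR-oriented path can leave the color in place.

import numpy as np

def film_style_highlight(rgb):
    """Toy film emulation: desaturate progressively as luminance nears clip."""
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])    # Rec.709 luma weights
    t = np.clip(luma, 0.0, 1.0) ** 4                   # desaturation ramps up near 1.0
    return rgb * (1 - t)[..., None] + luma[..., None] * t[..., None]

warm_highlight = np.array([[1.0, 0.85, 0.60]])         # bright, saturated highlight
print("film-style:", film_style_highlight(warm_highlight).round(3))  # pulled toward white
print("hdr-style: ", warm_highlight)                   # saturation retained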

Colorist Ian Vertovec

“The goal was to get the 1980s buildings in the background and out the hotel windows to look real — emulating marquees with flashing lights,” adds Vertovec. “We also needed it to be a believable Nevada sky and skyline. Skies and clouds look different in HDR. So, when dialing this in, we discussed how they wanted it to look. Did it feel real? Is the sky in this scene too blue? Information from testing informed production, so everything was geared toward these looks.”

“Ian has been on the first two seasons, so he knows the look inside and out and has a great eye,” Teague continues. “It’s nice to come into a room and have his point of view. Sometimes when you are staring at images all day, it’s easy to lose your objectivity, so I relied on Ian’s insight.” Vertovec grades the show on FilmLight’s Baselight.

As with Season 2, GLOW Season 3 was a Red Helium shoot using Red’s IPP2 color pipeline in conjunction with Vertovec’s custom LUTs all the way to post. Teague shot full 8K resolution to accommodate his choice of Cooke anamorphic lenses, desqueezed and finished in a 2:1 ratio.

“For dailies I used an iPad with Moxion, which is perhaps the best dailies viewing platform I’ve ever worked with. I feel like the color is more accurate than other platforms, which is extremely useful for checking out contrast and shadow level. Too many times with dailies you get blacks washed out and highlights blown and you can’t judge anything critical.”

Teague sat in on the grade of the first three of the 10 episodes and then used the app to pull stills and make notes remotely. “With Ian I felt like we were both on the same page. We also had a great DIT [Peter Brunet] who was doing on-set grading for reference and was able to dial in things at a much higher level than I’ve been able to do in the past.”

The most challenging but also rewarding work was shooting the wrestling performances. “We wanted to do something that felt a little bigger, more polished, more theatrical,” Teague says. “The performance space had tiered seating, which gave us challenges and options in terms of moving the cameras. For example, we could use telescoping crane work to reach across the room and draw characters in as they enter the wrestling ring.”

He commends gaffer Eric Sagot for inspiring lighting cues and building them into the performance. “The wrestling scenes were the hardest to shoot but they’re exciting to watch — dynamic, cinematic and deliberately a little hokey in true ‘80s Vegas style.”


Adrian Pennington is a UK-based journalist, editor and commentator in the film and TV production space. He has co-written a book on stereoscopic 3D and edited several publications.

Digital Arts expands team, adds Nutmeg Creative talent

Digital Arts, an independently owned New York-based post house, has added several former Nutmeg Creative talent and production staff members to its roster — senior producer Lauren Boyle, sound designer/mixers Brian Beatrice and Frank Verderosa, colorist Gary Scarpulla, finishing editor/technical engineer Mark Spano and director of production Brian Donnelly.

“Growth of talent, technology, and services has always been part of the long-term strategy for Digital Arts, and we’re fortunate to welcome some extraordinary new talent to our staff,” says Digital Arts owner Axel Ericson. “Whether it’s long-form content for film and television, or working with today’s leading agencies and brands creating dynamic content, we have the talent and technology to make all of our clients’ work engaging, and our enhanced services bring their creative vision to fruition.”

Brian Donnelly, Lauren Boyle and Mark Spano.

As part of this expansion, Digital Arts will unveil additional infrastructure featuring an ADR stage/mix room. The current facility boasts several state-of-the-art audio suites, a 4K finishing theater/mixing dubstage, four color/finishing suites and expansive editorial and production space, which is spread over four floors.

The former Nutmeg team has hit the ground running, working with their long-time ad agency, network, animation and film studio clients. Gary Scarpulla worked on color for HBO’s Veep and Los Espookys, while Frank Verderosa has been working with agency Ogilvy on several Ikea campaigns. Beatrice mixed spots for Tom Ford’s cosmetics line.

In addition, Digital Arts’ in-house theater/mixing stage has proven to be a valuable resource for some of the most popular TV productions, including recent commentary recording sessions for the legendary HBO series Game of Thrones and the final season of Veep.

Especially noteworthy is colorist Ericson’s and finishing editor Mark Spano’s collaboration with Oscar-winning directors Karim Amer and Jehane Noujaim to bring to fruition the Netflix documentary The Great Hack.

Digital Arts also recently expanded its offerings to include production services. The company has already delivered projects for agencies Area 23, FCB Health and TCA.

“Digital Arts’ existing infrastructure was ideally suited to leverage itself into end-to-end production,” Donnelly says. “Now we can deliver from shoot to post.”

Tools employed across post include Avid Pro Tools, D-Control ES and S3 for audio post, and Avid Media Composer, Adobe Premiere and Blackmagic Resolve for editing. Color grading is via Resolve.

Main Image: (L-R) Frank Verderosa, Brian Beatrice and Gary Scarpulla

 

Blackmagic: Resolve 16.1 in public beta, updates Pocket Cinema Camera

Blackmagic Design has announced DaVinci Resolve 16.1, an updated version of its edit, color, visual effects and audio post software that features updates to the new cut page, further speeding up the editing process.

With Resolve 16, introduced at NAB 2019, now in final release, the Resolve 16.1 public beta is available for download from the Blackmagic Design website. This new public beta will help Blackmagic continue to develop new ideas while collaborating with users to ensure those ideas are refined for real-world workflows.

The Resolve 16.1 public beta features changes to the bin that now make it possible to place media in various folders and isolate clips from being used when viewing them in the source tape, sync bin or sync window. Clips will appear in all folders below the current level, and as users navigate around the levels in the bin, the source tape will reconfigure in real time. There’s even a menu for directly selecting folders in a user’s project.

Also new in this public beta is the smart indicator. The new cut page in DaVinci Resolve 16 introduced multiple new smart features, which work by estimating where the editor wants to add an edit or transition and then applying it without the editor having to waste time placing exact in and out points. The software guesses what the editor wants to do and just does it — it adds the insert edit or transition to the edit closest to where the editor has placed the CTI.

But a problem can arise in complex edits, where it is hard to know what the software would do and which edit it would place the effect or clip into. That’s the reason for the beta version’s new smart indicator. The smart indicator provides a small marker in the timeline so users get constant feedback and always know where DaVinci Resolve 16.1 will place edits and transitions. The new smart indicator constantly live-updates as the editor moves around the timeline.

One of the most common items requested by users was a faster way to cut clips in the timeline, so now DaVinci Resolve 16.1 includes a “cut clip” icon in the user interface. Clicking on it will slice the clips in the timeline at the CTI point.

Multiple changes have also been made to the new DaVinci Resolve Editor Keyboard, including a new adaptive scroll feature on the search dial, which will automatically slow down a job when editors are hunting for an in point. The live trimming buttons have been renamed to the same labels as the functions in the edit page, and they have been changed to trim in, trim out, transition duration, slip in and slip out. The function keys along the top of the keyboard are now being used for various editing functions.

There are additional edit modes on the function keys, allowing users to access more types of editing directly from dedicated keys on the keyboard. There’s also a new transition window that uses the F4 key, and pressing and rotating the search dial allows instant selection from all the transition types in DaVinci Resolve. Users who need quick picture-in-picture effects can use F5 and apply them instantly.

Sometimes when editing projects with tight deadlines, there is little time to keep replaying the edit to see where it drags. DaVinci Resolve 16.1 features something called a Boring Detector that highlights the timeline where any shot is too long and might be boring for viewers. The Boring Detector can also show jump cuts, where shots are too short. This tool allows editors to reconsider their edits and make changes. The Boring Detector is helpful when using the source tape. In that case, editors can perform many edits without playing the timeline, so the Boring Detector serves as an alternative live source of feedback.
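The underlying idea is simple enough to sketch. The toy example below is an assumption about the principle, not Blackmagic’s code, and the duration thresholds are arbitrary: it walks a timeline and flags shots that run past an upper limit or fall under a lower one, roughly what the Boring Detector surfaces for overly long shots and jump cuts.

from dataclasses import dataclass

@dataclass
class TimelineClip:
    name: str
    duration_s: float   # cut length in seconds

def flag_edits(timeline, too_long_s=12.0, too_short_s=0.5):
    """Return (clip name, reason) pairs for cuts outside the duration thresholds."""
    flags = []
    for clip in timeline:
        if clip.duration_s > too_long_s:
            flags.append((clip.name, "possibly boring: shot runs long"))
        elif clip.duration_s < too_short_s:
            flags.append((clip.name, "possible jump cut: shot is very short"))
    return flags

timeline = [
    TimelineClip("A010_interview", 18.0),   # hypothetical clip names and durations
    TimelineClip("B004_cutaway", 0.3),
    TimelineClip("A011_reaction", 3.5),
]
for name, reason in flag_edits(timeline):
    print(name, "->", reason)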

Another one of the most requested features of DaVinci Resolve 16.1 is the new sync bin. The sync bin is a digital assistant editor that constantly sorts through thousands of clips to find only what the editor needs and then displays them synced to the point in the timeline the editor is on. The sync bin will show the clips from all cameras on a shoot stacked by camera number. Also, the viewer transforms into a multi-viewer so users can see their options for clips that sync to the shot in the timeline. The sync bin uses date and timecode to find and sync clips, and by using metadata and locking cameras to time of day, users can save time in the edit.

According to Blackmagic, the sync bin changes how multi-camera editing can be completed. Editors can scroll off the end of the timeline and keep adding shots. When using the DaVinci Resolve Editor Keyboard, editors can hold the camera number and rotate the search dial to “live overwrite” the clip into the timeline, making editing faster.

The closeup edit feature has been enhanced in DaVinci Resolve 16.1. It now does face detection and analysis and will zoom the shot based on face positioning to ensure the person is nicely framed.

If pros are using shots from cameras without timecode, the new sync window lets them sort and sync clips from multiple cameras. The sync window supports sync by timecode and can also detect audio and sync clips by sound. These clips will display a sync icon in the media pool so editors can tell which clips are synced and ready for use. Manually syncing clips using the new sync window allows workflows such as multiple action cameras to use new features such as source overwrite editing and the new sync bin.
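Mechanically, timecode-based syncing comes down to converting each clip’s start timecode to an absolute frame count and checking which clips cover the current playhead position. The sketch below shows only that principle; it is not Blackmagic’s implementation, the clip names are hypothetical, and non-drop-frame timecode at a fixed frame rate is assumed.

from dataclasses import dataclass

def tc_to_frames(tc, fps):
    """Convert HH:MM:SS:FF non-drop-frame timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

@dataclass
class Clip:
    name: str
    camera: int
    start_tc: str
    frames: int          # duration in frames

def clips_at(playhead_tc, clips, fps=24):
    """Return the clips that cover the playhead, keyed by camera number."""
    pos = tc_to_frames(playhead_tc, fps)
    hits = {}
    for c in clips:
        start = tc_to_frames(c.start_tc, fps)
        if start <= pos < start + c.frames:
            hits.setdefault(c.camera, []).append(c.name)
    return hits

clips = [
    Clip("A001_C003", 1, "14:32:10:00", 2400),   # hypothetical clips
    Clip("B002_C001", 2, "14:32:08:12", 3000),
]
print(clips_at("14:32:20:00", clips))            # {1: ['A001_C003'], 2: ['B002_C001']}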

Blackmagic Pocket Cinema Camera
Besides releasing the DaVinci Resolve 16.1 public beta, Blackmagic also updated the Blackmagic Pocket Cinema Camera. Blackmagic not only upgraded the camera from 4K to 6K resolution, but it also changed the mount to the much-used Canon EF style. Previous iterations of the Pocket Cinema Camera used a Micro Four Thirds mount, but many users chose to purchase a Micro Four Thirds-to-Canon EF adapter, which easily runs over $500 new. Because of the mount change in the Pocket Cinema Camera 6K, users can avoid buying the adapter and — if they shoot with Canon EF — can use the same lenses.

London’s Cheat expands with color and finishing suites

London-based color and finishing house Cheat has expanded, adding three new grading and finishing suites, a production studio and a client lounge/bar space. Cheat now has four large broadcast color suites and services two other color suites at Jam VFX and No.8 in Fitzrovia and Soho, respectively. Cheat has a creative partnership with these studios.

Located in the Arthaus building in Hackney, all four of Cheat’s color suites have calibrated projection or broadcast monitoring and are equipped with cutting-edge hardware for HDR and working with 8K. Cheat was the first color company to complete a TV series in 8K, grading Netflix’s The End of the F***ing World in 2017. Having invested in improved storage and network infrastructure during this period, the facility is well-equipped to take on 8K and HDR projects.

Cheat uses Autodesk Flame for finishing and Blackmagic DaVinci Resolve for color grading.

The new HDR grading suite offers HDR mastering above 2,000 nits with a Flanders Scientific XM310K reference monitor that can master up to 3,000 nits. Cheat is also now a full-fledged Dolby Vision-certified mastering facility.

“Improving client experience was, of course, a key consideration in shaping the design of the renovation,” says Toby Tomkins, founder of Cheat. “The new color suite is our largest yet and comfortably seats up to 10 people. We designed it from the ground up with a raised client platform and a custom-built bias wall. This allows everyone to look at the same single monitor while grading and maintaining the spacious and relaxed feel of our other suites. The new lounge and bar area also offer a relaxing area for clients to feel at home.”

Dick Wolf’s television empire: his production and post brain trust

By Iain Blair

The TV landscape is full of scripted police procedurals and true crime dramas these days, but the indisputable and legendary king of that crowded landscape is Emmy-winning creator/producer Dick Wolf, whose name has become synonymous with high-quality drama.

Arthur Forney

Since it burst onto the scene back in 1990, his Law & Order show has spawned six dramas and four international spinoffs, while his “Chicago” franchise gave birth to another four series — the hugely popular Chicago Med, Chicago Fire and Chicago P.D. His Chicago Justice was cancelled after one season.

Then there are his “FBI” shows, as well as the more documentary-style Cold Justice. If you’ve seen Cold Justice — and you should — you know that this is the real deal, focusing on real crimes. It’s all the more fascinating and addictive because of it.

Produced by Wolf and Magical Elves, the real-life crime series follows veteran prosecutor Kelly Siegler, who gets help from seasoned detectives as they dig into small-town murder cases that have lingered for years without answers or justice for the victims. Together with local law enforcement from across the country, the Cold Justice team has successfully helped bring about 45 arrests and 20 convictions. No case is too cold for Siegler, as the new season delves into new unsolved homicides while also bringing updates to previous cases. No wonder Wolf calls it “doing God’s work.” Cold Justice airs on true crime network Oxygen.

I recently spoke with Emmy-winning Arthur Forney, executive producer of all Wolf Entertainment’s scripted series (he’s also directed many episodes), about posting those shows. I also spoke with Cold Justice showrunner Liz Cook and EP/head of post Scott Patch.

Chicago Fire

Dick Wolf has said that, as head of post, you are “one of the irreplaceable pieces of the Wolf Films hierarchy.” How many shows do you oversee?
Arthur Forney: I oversee all of Wolf Entertainment’s scripted series, including Law & Order: Special Victims Unit, Chicago Fire, Chicago P.D., Chicago Med, FBI and FBI: Most Wanted.

Where is all the post done?
Forney: We do it all at NBCUniversal StudioPost in LA.

How involved is Dick Wolf?
Forney: Very involved, and we talk all the time.

How does the post pipeline work?
Forney: All film is shot on location and then sent back to the editing room and streamed into the lab. From there we do all our color corrections, and then it’s downloaded into Avid Media Composer.

What are the biggest challenges of the post process on the shows?
Forney: Delivering high-quality programming with a shortened post schedule.

Chicago Med

What are the editing challenges involved?
Forney: Trying to find the right way of telling the story, finding the right performances, shaping the show and creating intensity that results in high-quality television.

What about VFX? Who does them?
Forney: All of our visual effects are done by Spy Post in Santa Monica. All of the action is enhanced and done by them.

Where do you do the color grading?
Forney: Coloring/grading is all done at NBCUniversal StudioPost.

Now let’s talk to Cook and Patch about Cold Justice:

Liz and Scott, I recently saw the finale to Season 5 of Cold Justice. That was a long season.
Liz Cook: Yes, we did 26 episodes, so it was a lot of very long days and hard work.

It seems that there’s more focus than ever on drug-related cases now.
Cook: I don’t think that was the intention going in, but as we’ve gone on, you can’t help but recognize the huge drug problem in America now. Meth and opioids pop up in a lot of cases, and it’s obviously a crisis, and even if they aren’t the driving force in many cases, they’re definitely part of many.

L-R: Kelly Siegler, Dick Wolf, Scott Patch and Liz Cook. Photo by Evans Vestal Ward

How do you go about finding cases for the show?
Cook: We have a case-finding team, and they get the cases various ways, including cold-calling. We have a team dedicated to that, calling every day, and we get most of them that way. A lot come through agencies and sheriff’s departments that have worked with us before and want to help us again. And we get some from family members and some from hits on the Facebook page we have.

I assume you need to work very closely with local law agencies as you need access to their files?
Cook: Exactly. That’s the first part of the whole puzzle. They have to invite us in. The second part is getting the family involved. I don’t think we’d ever take on a case that the family didn’t want us to do.

What’s involved for you, and do you like being a showrunner?
Cook: It’s a tough job and pretty demanding, but I love it. We go through a lot of steps and stuff to get a case approved, and to get the police and family on board, and then we get the case read by one of our legal readers to evaluate it and see if there’s a possibility that we can solve it. At that point we pitch it to the network, and once they approve it and everyone’s on board, then if there are certain things like DNA and evidence that might need testing, we get all that going, along with ballistics that need researching, and stuff like phone records and so on. And it actually moves really fast – we usually get all these people on board within three weeks.

How long does it take to shoot each show?
Cook: It varies, as each show is different, but around seven or eight days, sometimes longer. We have a case coming up with cadaver dogs, and that stuff will happen before we even get to the location, so it all depends. And some cases will have 40 witnesses, while others might have over 100. So it’s flexible.

Cold Justice

Where do you post, and what’s the schedule like?
Scott Patch: We do it all at the Magical Elves offices here in Hollywood — the editing, sound and color correction. The online editor and colorist is Pepe Serventi. We have it all on one floor, and it’s really convenient to have all the post in-house. The schedule is roughly two months from the raw footage to getting it all locked and ready to air, which is quite a long time.

Dailies come back to us, and the story team and editors do the first pass, whittling all the footage down. It takes us a couple of weeks just to look at all the footage, as we usually have about 180 hours of it, and it takes a while to turn all that into something the editors can deal with. Then it goes through about three network passes with notes.

What about dealing with all the legal aspects?
Patch: That makes it a different kind of show from most of the others, so we have legal people making sure all the content is fine, and then sometimes we’ll also get notes from local law agencies, as well as internal notes from our own producers. That’s why it takes two months from start to finish.

Cook: We vet it through local law, and they see the cuts before it airs to make sure there are no problems. The biggest priority for us is that we don’t hurt the case at all with our show, so we always check it all with the local D.A. and police. And we don’t sensationalize anything.

Cold Justice

Patch: That’s another big part of editing and post – making sure we keep it authentic. That can be a challenge, but these are real cases with real people being accused of murder.

Cook: Our instinct is to make it dramatic, but you can’t do that. You have to protect the case, which might go to trial.

Talk about editing. You have several editors, I assume because of the time factor. How does that work?
Patch: Some of these cases have been cold for 25 or 30 years, so when the field team gets there, they really stand back and let the cops talk about the case, and we end up with a ton of stuff that you couldn’t fit into the time slot however hard you tried. So we have to decide what needs to be in, what doesn’t.

Cook: On day one, our “war room” day, we meet with the local law and everyone involved in the case, and that’s eight hours of footage right there.

Patch: And that gets cut down to just four or five minutes. We have a pretty small but tight team, with 10 editors who split up the episodes. Once in a while they’ll cross over, but we like to have each team and the producers stay with an episode as long as they can, as it’s so complicated. When you see the finished show, it doesn’t seem that complicated, but there are so many ways you could handle the footage that it really helps for each team to take ownership of its particular episode.

How involved is Dick Wolf in post?
Cook: He loves the whole post process, and he watches all the cuts and has input.

Patch: He’s very supportive and obviously so experienced, and if we’re having a problem with something, he’ll give notes. And for the most part, the network gives us a lot of flexibility to make the show.

What about VFX on the show?
Patch: We have some, but nothing too fancy, and we use an outside VFX/graphics company, LOM Design. We have a lot of legal documents on the show, and that stuff gets animated, and we’ll also have some 3D crime scene VFX. The only other outside vendor is our composer, Robert ToTeras.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Shipping + Handling adds Jerry Spivack, Mike Pethel, Matthew Schwab

VFX creative director Jerry Spivack and colorists Michael Pethel and Matthew Schwab have joined LA’s Shipping + Handling, the VFX, color grading, animation and finishing arm and sister company of Spot Welders.

Alongside executive producer Scott Friske and current creative director Casey Price, Spivack will help lead the company’s creative team. As the creative director/co-founder at Ring of Fire, Spivack was responsible for crafting and spearheading VFX on commercials for brands including FedEx, Nike and Jaguar; episodic work for series television including Netflix’s Wormwood and 12 seasons of FX’s It’s Always Sunny in Philadelphia; promos for NBC’s The Voice and The Titan Games; and feature films such as Sony Pictures’ Spider-Man 2, Bold Films’ Drive and Warner Bros.’ The Bucket List.

Colorist Pethel was a founding partner of Company 3 and for the past five years has served clients and directors under his BeachHouse Color brand, which he will continue to maintain. Pethel’s body of work includes campaigns for Carl’s Jr., Chase, Coke, Comcast/Xfinity, Hyundai, Jeep, Netflix and Southwest Airlines.

Commenting on the move, Pethel says, “I’m thrilled to be joining such a fantastic group of highly regarded and skilled professionals at Shipping + Handling. There is so much creativity here; the people are awesome to work with and the technology they are able to offer clientele at the facility is top-notch.”

Schwab formally joins the Shipping + Handling roster after working closely with the company over the past two years on multiple campaigns for Apple, Acura, QuickBooks and many others. Aside from his role at Shipping + Handling, Schwab will also continue his work through Roving Picture Company. Having worked with a number of internationally recognized brands, Schwab has collaborated on projects for Amazon, Honda, Mercedes-Benz, National Geographic, Netflix, Nike, PlayStation and Smirnoff.

“It’s exciting to be part of a team that approaches every project with such energy. This partnership represents a shared commitment to always deliver outstanding color and technical results for our clients,” says Schwab.

“Pethel is easily amongst the best colorists in our industry. As a longtime client of his, I have a real understanding of the professionalism he brings to every session. He is a delight in the room and wickedly talented. Schwab’s talent has just been realized in the last few years, and we are pleased to offer his skill to our clients. If our experience working with him over the last couple of years is any indication, we’re going to make a lot of clients happy he’s on our roster,” adds Friske.

Spivack, Pethel and Schwab will operate out of Shipping + Handling’s West Coast office on the creative campus it shares with its sister company, editorial post house Spot Welders.

Image: (L-R) Mike Pethel, Matthew Schwab, Jerry Spivack

Point.360 adds senior colorist Patrick Woodard

Senior colorist Patrick Woodard has joined the creative team at Point.360 in Burbank. He was most recently at Hollywood’s DigitalFilm Tree, where he colored dozens of television shows, including ABC’s American Housewife, CBS’ NCIS: Los Angeles, NBC’s Great News and TBS’ Angie Tribeca. Over the years, he also worked on Weeds, Everybody Hates Chris, Cougar Town and Sarah Silverman: We Are Miracles.

Woodard joins Point.360 senior colorist Charlie Tucker, whose recent credits include the final season of Netflix’s Orange Is the New Black, CW’s Legacies and Roswell, New Mexico, YouTube’s Cobra Kai, as well as the Netflix comedy Medical Police.

“Patrick is an exceptional artist with an extensive background in photography,” says Point.360’s SVP of episodic Jason Kavner. “His ability to combine his vast depth of technical expertise and his creative vision to quickly create a highly developed aesthetic has won the loyalty of many DPs and creatives alike.”

Point.360 has four color suites at its Burbank facility. “Although we have the feel of a boutique episodic facility, we are able to offer a robust end-to-end pipeline thanks to our long history as a premier mastering company,” reports Kavner. “We are currently servicing 4K Dolby Vision projects for Netflix, such as the upcoming Jenji Kohan series currently being called Untitled Vigilante Project, as well as the UHD SDR Sony-produced YouTube series Cobra Kai. We also continue to offer the same end-to-end service to our traditional studio and network clients on series such as Legacies for the CW, Fresh Off The Boat, Family Guy and American Dad for 20th Century Fox, and Drunk History and Robbie for Comedy Central.”

Woodard, who will be working on Resolve at Point.360, was also a recent subject of our Behind the Title series. You can read that here.

Brittany Howard music video sets mood with color and VFX

The latest collaboration between Framestore and director Kim Gehrig is Stay High, Brittany Howard’s debut solo music video, which features a color grade and subtle VFX by the studio. A tribute to the Alabama Shakes lead singer’s late father, the stylized music video stars actor Terry Crews (Brooklyn Nine-Nine, The Expendables) as a man finishing a day’s work and returning home to his family.

Produced by production company Somesuch, the aim of Stay High is to present a natural and emotionally driven story that honors the singer’s father, K.J. Howard. Shot in her hometown of Nashville, the music video features Howard’s family and friends while the singer pops up in several scenes throughout the video as different characters.

The video begins with Howard’s father getting off work at his factory job. The camera follows him on his drive home, all the while he’s singing “Stay High.” As he drives, we see images of the people and locations where Howard grew up. The video ends when her dad pulls into his driveway and is met by his daughters and wife.

“Kim wanted to really highlight the innocence of the video’s story, something I kept in mind while grading the film,” says Simon Bourne, Framestore’s head of creative color, who has graded several films for the director. “The focus always needed to be on Terry, with nothing in his surroundings distracting from that, and the grade needed to reflect that idea.”

Framestore’s creative director Ben Cronin, who was also a compositor on the project along with Nuke compositor Christian Baker, adds, “From a VFX point of view, our job was all about invisible effects that highlighted the beautiful job that Ryley Brown, the film’s DP, did and to complement Kim’s unique vision.”

“We’ve worked with Kim on several commercials and music video projects, and we love collaborating with her because her films are always visually interesting, and she knows we’ll always help achieve the groundbreaking and effortlessly cool work that she does.”

Review: FXhome’s HitFilm Pro 12 for editing, compositing, VFX

By Brady Betzel

If you have ever worked in Adobe Premiere Pro, Apple FCP X or Avid Media Composer and wished you could just flip a tab and be inside After Effects, with access to 3D objects directly in your timeline, you are going to want to take a look at FXhome’s HitFilm Pro 12.

Similar to how Blackmagic brought Fusion inside of its most recent versions of DaVinci Resolve, HitFilm Pro offers a nonlinear editor, a composite/VFX suite and a finishing suite combined into one piece of software. Haven’t heard about HitFilm yet? Let me help fill in some blanks.

Editing and 3D model Import

What is HitFilm Pro 12?
Technically, HitFilm Pro 12 is a non-subscription-based nonlinear editor, compositor and VFX suite that costs $299. Not only does that price include 12 months of updates and tech support, but one license can be used on up to three computers simultaneously. In my eyes, HitFilm Pro is a great tool set for independent filmmakers, social media content generators and any editor who goes beyond editing and dives into topics like 3D modeling, tracking, keying, etc. without having to necessarily fork over money for a bunch of expensive third-party plugins. That doesn’t mean you won’t want to buy third-party plugins, but you are less likely to need them with HitFilm’s expansive list of native features and tools.

At my day job, I use Premiere, After Effects, Media Composer and Resolve. I often come home and want to work in something that has everything inside, and that is where HitFilm Pro 12 lives. Not only does it have the professional functionality that I am used to, such as trimming, color scopes and more, but it also has BorisFX’s Mocha planar tracking plugin built in for no extra cost. This is something I use constantly and love.

One of the most interesting and recent updates to HitFilm Pro 12 is the ability to use After Effects plugins. Not all plugins will work since there are so many, but in a video released after NAB 2019, HitFilm said plugins like Andrew Kramer’s Video CoPilot Element3D and ones from Red Giant are on the horizon. If you are within your support window, or you continue to purchase HitFilm, FXhome will work with you to get your favorite After Effects plugins working directly inside of HitFilm.

Timeline and 3D model editor

Some additional updates to HitFilm Pro 12 include a completely redesigned user interface that resembles Premiere Pro… kind of; threaded rendering, which means Windows users with Intel and Nvidia hardware will see increased GPU speeds; the ability to add titles directly in the editor; and more.

The Review
So how does HitFilm Pro 12 compare to today’s modern software packages? That is an interesting question. I have become more and more of a Resolve convert over the past two years, so I am constantly comparing everything to that. In addition, having been an Avid user for over 15 years, I am used to a rock-solid NLE with only a few hiccups here and there. In my opinion, HitFilm Pro 12 lands right where Premiere and FCP X live.

It feels prosumer-y, in a YouTuber or content-generator capacity. Would it stand up to 10 hours of abuse with content over 45 minutes? It probably would, but much like with Premiere, I would probably split my edits into scenes or acts to avoid slowdowns, especially when importing things like OBJ files or composites.

The nonlinear editor portion feels like Premiere and FCP X had a baby, but left out FCP X’s Magnetic Timeline feature. The trimming in the timeline feels smooth, and after about 20 minutes of getting comfortable with it I felt like it was what I am generally used to. Cutting in footage feels good using three-point edits or simply dragging and dropping. Using effects feels very similar to the Adobe world, where you can stack them on top of clips and they each affect each other from the top down.
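For readers coming from other tools, here is a minimal, generic sketch of what that “top down” behavior means. This is not HitFilm’s API or anything scriptable inside the app; the effect functions are invented stand-ins. The point is simply that an effect stack is ordered processing, so reordering effects changes the result.

```python
# Generic illustration of a top-down effect stack (not HitFilm's API):
# the topmost effect processes the clip first, and each effect below it
# receives the previous effect's output, so order matters.

def exposure(frame, stops=1.0):
    # Brighten: multiply pixel values by 2^stops, clipped to the 0-1 range.
    return [min(1.0, px * (2 ** stops)) for px in frame]

def gamma(frame, g=2.2):
    # Apply a gamma curve; a stand-in for any non-linear correction.
    return [px ** (1 / g) for px in frame]

def apply_stack(frame, effects):
    # Apply effects from the top of the stack downward.
    for effect in effects:
        frame = effect(frame)
    return frame

clip = [0.05, 0.1, 0.2, 0.4, 0.8]            # a toy five-pixel "frame"
print(apply_stack(clip, [exposure, gamma]))  # exposure first, then gamma
print(apply_stack(clip, [gamma, exposure]))  # reordering changes the result
```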

Mocha within HitFilm Pro

Where HitFilm Pro 12 shines is in the inclusion of typically third-party plugins directly in the timeline. From the ability to create a scene with 3D cameras and particle generators to being able to track using BorisFX’s Mocha, HitFilm Pro 12 has many features that will help take your project to the next level. With HitFilm Pro 12’s true 3D cameras, you can take flat text and enhance it with raytraced lighting, shadows and even textures. You can use the included BorisFX Continuum 3D Objects to make great titles relatively easily. To take it a step further, you can track and animate them.

Color Tools
By day, I am an online editor/colorist who deals with the finishing aspect of media creation. Throughout the process, from color correction to exporting files, I need tools that are not only efficient but accurate. When I started to dig into the color correction side of HitFilm Pro 12, things slowed down for me. The color correction tools are very close to what you’ll find in other NLEs, like Premiere and FCP X, but they don’t quite rise to the level of Resolve. HitFilm Pro 12 does operate inside of a 32-bit color pipeline, which really helps avoid banding and other errors when color correcting. However, I didn’t feel that the toolset was making me more efficient; in fact, it was the opposite. I felt like I had to learn FXhome’s way of doing it. It wasn’t that it totally slowed me down, but I felt it could be better.
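The banding point is easy to demonstrate outside any particular NLE. The sketch below is plain NumPy, not HitFilm code, and the grade() function is an invented stand-in for a color correction. It counts how many distinct output levels survive in a subtle gradient when the image is rounded to 8-bit before grading versus kept in floating point until the final output; fewer levels across the same range is what shows up on screen as banding.

```python
# Illustrative sketch (not HitFilm code): why a 32-bit float pipeline
# reduces banding compared with rounding to 8-bit integers mid-pipeline.
import numpy as np

# A smooth, dark gradient, like a sky or a wall in a graded shot.
gradient = np.linspace(0.0, 0.25, 4096)

def grade(x):
    # An invented stand-in for a color correction: a small lift plus gamma.
    return np.clip((x + 0.02) ** 0.6, 0.0, 1.0)

# 8-bit pipeline: quantize to 256 levels before the correction and again after.
eight_bit = np.round(grade(np.round(gradient * 255) / 255) * 255) / 255

# Float pipeline: keep full precision through the correction, quantize once at output.
float_pipe = np.round(grade(gradient) * 255) / 255

# Fewer distinct levels over the same range shows up on screen as banding.
print("8-bit pipeline levels: ", np.unique(eight_bit).size)
print("float pipeline levels:", np.unique(float_pipe).size)
```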

Color

Summing Up
In the end, HitFilm Pro 12 will fill a lot of holes for individual content creators. If you love learning new things (like I do), then HitFilm Pro 12 will be a good investment of your time. In fact, FXhome posts tons of video tutorials on all sorts of good and topical stuff, like how to create a Stranger Things intro title.

If you are a little more inclined to work with a layer-based workflow, like in After Effects, then HitFilm Pro 12 is the app you’ll want to learn. Check out HitFilm Pro 12 on FXhome’s website and definitely watch some of the company’s informative tutorials.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

a52 Color adds colorist Gregory Reese

Colorist Gregory Reese has joined LA-based grading and finishing studio a52 Color, which is led by executive producer Thatcher Peterson and includes colorists Paul Yacono and Daniel de Vue.

Reese comes to a52 Color after eight years at The Mill. While there he colored a spectrum of commercials for athletic brands, including Nike and Reebok, as well as campaigns for Audi, Apple, Covergirl, GMC, Progressive and Samsung. He worked with such directors as AG Rojas, Matt Lambert and Harold Einstein while developing the ability to grade for any style.

Reese contributed to several projects for Apple, including the History of Sound spot, which sonically chronicles the decades from the late 1800s to 2015. The spot earned Reese an HPA Award nomination for Outstanding Color Grading in a Commercial.

“Color is at the center of how audiences engage with a picture in motion,” explains Reese. “Some of its technical components may not always be instantly recognized by the audience, but when it’s done right, it can make for an emotional experience.”

Merging his love for music with the passion for his craft, Reese has collaborated with artists like Jack Ü, Major Lazer, Arctic Monkeys, Run The Jewels, Jack White, Pharrell Williams and many more. Peterson and Reese previously worked together at The Mill in LA. “Having had the fortunate experience of working with Gregory at The Mill, I knew he was the real deal when it came to a seasoned colorist,” says Peterson.

The all-new facility was yet another perk that sealed the deal for Reese, as he explains: “One of the biggest barriers to entry in color is not having access to theaters. a52 Color solves that problem by offering the ability to grade both broadcast and theatrical formats, as well as giving us a high level of creative freedom. I was immediately impressed by how invested they are in making it the absolute best place to go for color grading.”

He will be working on FilmLight Baselight.

Colorist Chat: Refinery’s Kyle Stroebel

This Cape Town, South Africa-based artist says that “working creatively with a director and DP to create art is a privilege.”

NAME: Colorist Kyle Stroebel

COMPANY: Refinery in Cape Town, South Africa

CAN YOU DESCRIBE YOUR COMPANY?
We are a full-service post company in the heart of Cape Town. We specialize in front-end dailies and data solutions, and have a full finishing department with a VFX arm and audio division.

Our work varies from long-form feature and television programming to commercials and music video content. We are a relatively young team that loves what we do.

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
We are by far the most important members of the team and the creative success of a movie is largely based around our skills! Okay, honestly? I have a shot on my timeline that is currently on version 54, and my client still needs an additional eyelash painted out.

I think the surprising thing to the uninformed is the minute elements that we focus on in detail. It’s not all large brush strokes and emotional gesturing; the images you see have more often than not gone through painstaking hours of crafting and creative processing. For us the beauty is in the detail.

Flatland

WHAT SYSTEM DO YOU WORK ON?
FilmLight’s Baselight

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
We are a small team handling multiple projects simultaneously, and our Baselight suites perform multiple functions as a result. My fellow colorist David Grant and I will get involved in our respective projects early on. We handle conform, VFX pulls and versioning and follow the pipe through until the film or project has cleared QC.

With Baselight’s enhanced toolset and paint functionality, we are now saving our clients both time and money by handling a variety of cleanups and corrections without farming the shots out to VFX or Flame.

Plus, the DI is pretty much the last element in the production process. We’re counselors, confidants and financial advisors. People skills come in really handy. (And a Spotify playlist for most tastes and moods is a prerequisite.)

WHAT’S YOUR FAVORITE PART OF THE JOB?
Making something amazing happen with a client’s footage, when they didn’t realize their own footage could look the way the final product does… and sharing in that excitement when it happens.

WHAT’S YOUR LEAST FAVORITE?
Insane deadlines. As our tools have improved, the expectation for lightning-fast turnarounds has increased. I’m a perfectionist with my work and would love to spend days molding certain shots and trying new things. Walking away from a grade and coming back to it is often very fruitful, because looking at a complex shot with fresh eyes frequently produces new outlooks and better results. But with hard delivery dates, that luxury is seldom afforded.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Scuba diving with manta rays in Bali; it’s a testament to how much I love what I do that I’m not doing that every day of my life.

WHY DID YOU CHOOSE THIS PROFESSION?
I sometimes wonder that myself when it’s 3am and I’m in a room with no windows for the 17th consecutive hour. Truthfully, I chose it because changing something from the banal to the magnificent gives me joy. Working creatively with a director and DP to create art is a privilege, and the fact that they must sweat and literally bleed to capture the images while I fiddle with the aircon in my catered suite doesn’t hurt.

I was in my third year of film school and brought one of my 16mm projects in to grade with a colorist in telecine. I couldn’t believe what I was seeing. I knew I wanted to do that.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
There have been a load of amazing projects recently. Our local industry has been very busy, and we have benefited greatly from that. I recently finished a remake of the cult classic Critters for Warner Bros.

Flatland

Before that I completed a movie called Flatland that premiered at Berlinale and then went to Cannes. There are a few other movies that I can’t chat too much about right now. I also did a short piece by one of South Africa’s biggest directors, Kim Geldenhuys, for the largest blue diamond found in recent history.

Changing of the seasons has also meant a couple of amazing fashion pieces for different fashion houses’ new collections.

HOW DO YOU PREFER TO WORK WITH THE DP/DIRECTOR?
Depends on the project. Depends on the director and DP too, actually. With long-form work, I love to spend a day or two with them in the beginning, and then I take a day or two to go over and play with a couple of scenes on my own. By then we should have reached a pretty cohesive vision as to what the director wants and how I see the footage. Once that vision is aligned, I like to work on my own while listening to loud music and giving everything a more concrete look. Then, ideally, the director returns for a few days at the end, and we get stuck into the minutiae.

With commercials, I like working with the director from early in the morning so that we know where we want to go before the agency has input and makes alterations! It’s a fine balancing act.

ANY SUGGESTIONS FOR GETTING THE MOST OUT OF A PROJECT FROM A COLOR PERSPECTIVE?
Have the colorist involved early on. When you begin shooting, have the colorist and DP develop a relationship so that the common vision develops during principal photography. That way, when the edit is locked, you have already experimented with ideas and the DP is shooting for a more precise look.

CAN YOU TALK ABOUT YOUR WORK ON THE WARNER BROS. FILM? EXPLAIN YOUR PROCESS ON THAT? ANY PARTICULARLY CHALLENGING SCENES?
Critters is a cult horror franchise from the late ’80s and early ‘90s. The challenge was to be really dark and moody but still stay true to the original and work on modern viewing devices without drastic loss of detail. It centers on a lot of practical on-set special effects, something in increasing decline with advancements in CGI. Keeping the puppets lifelike and believable came with quite a few challenges.

HOW DO YOU PREFER THE DP OR DIRECTOR TO DESCRIBE THE LOOK THEY WANT? PHYSICAL EXAMPLES, FILMS TO EMULATE, ETC.?
Practical examples or references are very helpful. Matching something is easy; developing beyond that to give it a unique quality is what keeps it interesting. Certain directors find it easier to work with non-specifics and let me interpret the vibe and mood from more emotional explanations rather than technical jargon. While sometimes harder to interpret initially, that approach has benefits because it’s a bit more open-ended.

Red Bull

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I love and hate most of the things I work on for a variety of reasons. It’s hard to pick one. Gun to my head? Probably a short film for Red Bull Music by Petite Noir. It was shot by Deon Van Zyl in the Namib desert and had just the most exquisite visuals from the outset. I still watch it when I’m feeling down.

WHERE DO YOU FIND INSPIRATION? ART? PHOTOGRAPHY?
At the risk of sounding like a typical millennial, I use Instagram a heck of a lot. I get to see what the biggest and best colorists are doing around the world. Before Instagram, you would only see pieces of critical acclaim. Now, through Instagram and Vimeo, I get to see so many passion projects in which people are trying new things and pushing boundaries beyond what clients, brands and studios want. I can spend days in galleries and bask in the glory of Caravaggio and Vermeer, but I can also scroll quickly through very contemporary looks, innovations and trends.

Red Bull

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My phone. I hate it, but my life happens largely through that porthole. My NutriBullet. My Baselight. I’ve never loved an inanimate object like I love my Baselight.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Instagram as mentioned. I love the work of Joseph Bicknell, Kath Raisch, Sofie Borup, Craig Simonetti, Matt Osborne and then anything that comes from The Mill channel. Also, a wide range of directors and the associated Vimeo links. I can honestly get lost on an obscure Korean channel with magnificent images and languages I don’t understand.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I run. Even if I’m breaking 90-hour weeks, I always make sure I run three or four times a week. And I love cooking. It’s expressive. I get to make meals for my partner Katherine, who tends to be very receptive.