Tag Archives: Blackmagic Resolve

Color grading IT Chapter Two’s terrifying return

In IT Chapter Two, the kids of the Losers’ Club are all grown up and find themselves lured back to their hometown of Derry. Still haunted both by the trauma that monstrous clown Pennywise let loose on the community and by their own insecurities, the group (James McAvoy, Jessica Chastain, Bill Hader) finds itself up against even more terrifying forces than in the first film, IT.

Stephen Nakamura

IT Chapter Two director Andy Muschietti called on cinematographer Checco Varese and colorist Stephen Nakamura of Company 3. Nakamura returned to the franchise, performing the final color grade at Efilm in Hollywood. “I felt the first one was going to be a big hit when we were working on it, because these kids’ stories were so compelling and the performances were so strong. It was more than just a regular horror movie. This second one, in my opinion, is just as powerful in terms of telling these characters’ stories. And, not surprisingly, it also takes the scary parts even further.”

According to Nakamura, Muschietti “is a very visually oriented director. When we were coloring both of the films, he was very aware of the kinds of things we can do in the DI to enhance the imagery and make things even more scary. He pushed me to take some scenes in Chapter Two in directions I’ve never gone with color. I think it’s always important, whether you’re a colorist or a chef or a doctor, to always push yourself and explore new aspects of your work. Andy’s enthusiasm encouraged me to try new approaches to working in DaVinci Resolve. I think the results are very effective.”

For one thing, the technique he used in the first film to bring up the light level in just the eyes of the shapeshifting clown Pennywise got even more use here, because there were more frightening characters to apply it to. In many cases, the companies that created the visual effects also provided mattes that let Nakamura easily isolate and adjust the luminance of each individual eye in Resolve. When such mattes weren’t available, he used Resolve to track each eyeball a frame at a time.

“Resolve has excellent tracking capabilities, but we were looking to isolate just the tiny whites of the characters’ eyes,” Nakamura explains, “and there just wasn’t enough information to track.” It was meticulous work, he recalls, “but it’s very effective. The audience doesn’t consciously know we’re doing anything, but it makes the eyes brighter in a very strange way, kind of like a cat’s eyes when they catch the light. It really enhances the eerie feeling.”
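
To make the operation concrete, here is a minimal numpy sketch of a matte-driven brightness lift, the basic idea behind using a VFX matte to raise only the eye whites. This is a conceptual illustration, not Nakamura’s actual node setup; the function name and gain value are invented for the example.

```python
import numpy as np

def brighten_eyes(frame, matte, gain=1.4):
    """Lift brightness only where the matte is white.

    frame: float32 RGB image in [0, 1], shape (H, W, 3)
    matte: float32 matte in [0, 1], shape (H, W); white over the eye
           whites (delivered by the VFX vendor or tracked by hand)
    gain:  multiplier applied fully inside the matte, fading to 1.0
           (no change) where the matte is black
    """
    scale = 1.0 + (gain - 1.0) * matte            # per-pixel gain map
    return np.clip(frame * scale[..., None], 0.0, 1.0)
```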

In addition, Nakamura and the filmmakers made use of Resolve’s Flicker tool in the OpenFX panel to enhance the flickering effect in a scene involving flashing lights, taking the throbbing light effects further than they did on set. Not long ago, this type of enhancement would have been a more involved process in which the shots would likely be sent to a visual effects house. “We were able to do it as part of the grading, and we all thought it looked completely realistic. They definitely appreciated the ability to make little enhancements like that in the final grade, when everyone can see the scenes with the grade in context and on a big screen.”
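
The underlying idea of a flicker enhancement is simple temporal gain modulation. As a hedged sketch (this implies nothing about how the ResolveFX plugin is actually implemented), randomly varying a per-frame multiplier produces a throbbing-light feel:

```python
import numpy as np

rng = np.random.default_rng(seed=7)  # fixed seed for repeatability

def add_flicker(frames, depth=0.3):
    """Simulate a strobing practical light by varying per-frame gain.

    frames: iterable of float32 RGB frames in [0, 1]
    depth:  flicker depth; 0.3 lets the gain swing between 0.7 and 1.3
    """
    for frame in frames:
        gain = 1.0 + depth * (2.0 * rng.random() - 1.0)
        yield np.clip(frame * gain, 0.0, 1.0)
```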

Portions of the film involve scenes of the Losers’ Club as children, which consisted of newly shot material (not footage cut in from the production of the first IT). Nakamura applied a subtle amount of Resolve’s midtone detail tool to those scenes, primarily to help immediately and subliminally orient the audience in time.

But the most elaborate use of the color corrector involved one short sequence in which Hader’s character, walking in a local park on a pleasant, sunny day, has a sudden, terrifying interaction with a very frightening character. The shots involved a significant amount of CGI and compositing work, which was completed at several effects houses. Muschietti was pleased with the effects work, but he wanted Nakamura to give the scene an overall quality that made it feel a bit more otherworldly.

Says Nakamura, “Andy described something that reminded me of the old-school, two-strip color process, where essentially anything red would get pushed into being a kind of magenta, and something blue or green would become a kind of cyan.”

Nakamura, who colored Martin Scorsese’s The Aviator (shot by Robert Richardson, ASC), had designed a process on that film to create more of a three-strip look. This two-strip effect was more challenging, as it involved constraining the color palette to an even greater degree — without, of course, losing definition in the imagery.

With a bit of trial and error, Nakamura came up with the idea of using Resolve’s splitter and combiner nodes, rerouting the channels so that information from the green channel was forced into the red and blue channels before the image was recombined. He then used a second splitter/combiner pair to control the output. “It’s almost like painting a scene with just two colors,” he explains. “Green grass and blue sky both become shades of cyan, while skin and anything with red in it goes into the magenta area.”
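
One common way to emulate that collapsed, two-color palette, assuming nothing about Nakamura’s actual node math, is to keep the red record and merge green and blue into a single cyan record:

```python
import numpy as np

def two_strip(frame, mix=1.0):
    """Collapse an RGB image onto a red/cyan axis, a rough stand-in
    for the two-strip look described above.

    frame: float32 RGB image in [0, 1], shape (H, W, 3)
    mix:   1.0 = full effect, 0.0 = original image
    """
    red = frame[..., 0]
    cyan = 0.5 * (frame[..., 1] + frame[..., 2])  # merged green/blue record
    two_tone = np.stack([red, cyan, cyan], axis=-1)
    # Grass and sky land on cyan; skin and anything red-heavy drifts
    # toward magenta, matching the palette Nakamura describes
    return (1.0 - mix) * frame + mix * two_tone
```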

The work became even more complex because the red-haired Pennywise also makes an appearance; it was important for him to retain his color, despite the rest of the scene going two-tone. Nakamura treated this element as a complex chroma key, using a second splitter/combiner node and significantly boosting the saturation just to isolate Pennywise while preventing the two-tone correction from affecting him.
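
Again as a sketch rather than the actual grade, a saturation-and-hue key of this kind amounts to building a matte from the original image and blending the ungraded pixels back over the two-tone result; all thresholds below are invented for illustration:

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def protect_reds(original, graded, hue_center=0.02, hue_width=0.08,
                 min_sat=0.4):
    """Blend the original back over the graded image wherever the
    original pixel is strongly saturated and close to red in hue.

    original, graded: float32 RGB images in [0, 1], same shape
    """
    hsv = rgb_to_hsv(original)
    hue, sat = hsv[..., 0], hsv[..., 1]
    # Hue is cyclic, so measure distance to the red center with wraparound
    dist = np.minimum(np.abs(hue - hue_center),
                      1.0 - np.abs(hue - hue_center))
    matte = ((dist < hue_width) & (sat > min_sat)).astype(np.float32)
    return matte[..., None] * original + (1.0 - matte[..., None]) * graded
```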

When it came time to complete the pass for HDR Dolby Cinema — designed for specialty projectors essentially capable of displaying brighter whites and darker blacks than normal cinema projectors — Muschietti was particularly interested in the format’s treatment of dark areas of the frame.

“Just like in the first one,” Nakamura explains, “we were able to make use of Dolby Cinema to enhance suspense. People usually talk about how bright the highlights can be in HDR. But, when you push more light through the picture than you do for the P3 version, we also have the ability to make shadowy areas of the image appear even darker while keeping the details in those really dark areas very clear. This can be very effective in a movie like this, where you have scary characters lurking in the shadows.

“The color grade always plays some kind of role in a movie’s storytelling,” Nakamura sums up, “but this was a fun example of how work we did in the color grade really helped scare the audience.”

You can check out our Q&A with Nakamura about his work on the original IT.

Blackmagic’s Resolve 16: speedy cut page, Resolve Editor Keyboard, more

Blackmagic was at NAB with Resolve 16, which in addition to dozens of new features includes a new editing page focused on speed. While Resolve still has its usual robust editing offerings, the new cut page is designed for those working on short-form projects and on tight deadlines. Think of having a client behind you watching you cut something together, or maybe showing your director a rough cut. You get in, you edit and you go — it’s speedy, like editing triage.

For those who don’t want to edit this way, no worries, you don’t have to use the new page. Just ignore it and move on. It’s an option, and only an option. That’s another theme with Resolve 16: if you don’t want to see the Fairlight page, turn it off. If you want to see something in a different way, turn it on.

Blackmagic also introduced the DaVinci Resolve Editor Keyboard, a new premium keyboard for Resolve that helps improve the speed of editing. It allows editors to use both hands, so transport control and clip selection can happen at the same time as edits. The Resolve Editor Keyboard will be available in August for $995.

The keyboard combined with the new cut page is designed to further speed up editing. This alternate edit page lets users import, edit, trim, add transitions and titles, automatically match color, mix audio and more. Whether you’re delivering for broadcast or for YouTube, the cut page lets editors do everything in one place. Plus, the regular edit page is still available, so customers can switch between the edit and cut pages to change editing styles right in the middle of a job.

“The new cut page in DaVinci Resolve 16 helps television commercial and other high-end editors meet super tight deadlines on fast turn-around projects,” says Grant Petty, Blackmagic CEO. “We’ve designed a whole new high-performance, nonlinear workflow. The cut page is all about power and speed. Plus, editors that need to work on more complex projects can still use the regular edit page. DaVinci Resolve 16 gives different editors the choice to work the way they want.”

The cut page is reminiscent of how editors used to work in the days of tape, where finding a clip was easy because customers could just spool up and down the tape to see their media and select shots. Today, finding the right clip in a bin with hundreds of files can be slow. With source tape, users no longer have to hunt through bins to find the clip they need. They can click on the source tape button and all of the clips in their bin appear in the viewer as a single long “tape.” This makes it easy to scrub through all of the shots, find the parts they want and quickly edit them to the timeline. Blackmagic calls it an “old-fashioned” concept that’s been modernized to help editors find the shots they need fast.
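
Conceptually, a source tape is just index arithmetic over the clips in a bin. This toy Python sketch (which implies nothing about Resolve’s internals) maps a global scrub position back to a clip and a frame within it:

```python
import bisect

class SourceTape:
    """Treat a bin of clips as one long virtual tape."""

    def __init__(self, clip_lengths):
        # Running totals: boundaries[i] is the global frame where clip i ends
        self.boundaries = []
        total = 0
        for length in clip_lengths:
            total += length
            self.boundaries.append(total)

    def locate(self, global_frame):
        """Return (clip index, frame within that clip) for a scrub position."""
        clip = bisect.bisect_right(self.boundaries, global_frame)
        start = self.boundaries[clip - 1] if clip > 0 else 0
        return clip, global_frame - start

tape = SourceTape([120, 48, 300])   # three clips, lengths in frames
print(tape.locate(130))             # (1, 10): frame 10 of the second clip
```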

The new cut page features a dual timeline so editors don’t have to zoom in or out. The upper timeline shows users the entire program, while the lower timeline shows the current work area. Both timelines are fully functional, allowing editors to move and trim clips in whichever timeline is most convenient.

Also new is the DaVinci Neural Engine, which uses deep neural networks and machine learning to power new features such as speed warp motion estimation for retiming, super scale for up-scaling footage, auto color and color matching, facial recognition and more. The DaVinci Neural Engine is entirely cross-platform and uses the latest GPU innovations for AI and deep learning. The Neural Engine provides simple tools to solve complex, repetitive and time-consuming problems. For example, it enables facial recognition to automatically sort and organize clips into bins based on the people in each shot.
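
Blackmagic hasn’t published how the Neural Engine does this, but the general shape of face-based bin sorting can be sketched with the open-source face_recognition package: encode a face from each clip, compare it against faces already seen, and either file the clip in an existing bin or open a new one. Extracting a representative frame from each video is left out for brevity.

```python
import face_recognition  # open-source library, not Blackmagic's engine
import numpy as np

def assign_bin(image_path, known_faces, tolerance=0.6):
    """Return a bin index for the first face found in the image,
    opening a new bin when nobody already known matches."""
    image = face_recognition.load_image_file(image_path)
    encodings = face_recognition.face_encodings(image)
    if not encodings:
        return None                      # no face found: leave unsorted
    face = encodings[0]
    if known_faces:
        distances = face_recognition.face_distance(known_faces, face)
        best = int(np.argmin(distances))
        if distances[best] < tolerance:  # close enough to a known person
            return best
    known_faces.append(face)             # first sighting: open a new bin
    return len(known_faces) - 1
```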

DaVinci Resolve 16 also features new adjustment clips that let users apply effects and grades to clips on the timeline below; quick export, which can be used to upload projects to YouTube, Vimeo and Frame.io from anywhere in the application; and new GPU-accelerated scopes providing more technical monitoring options than before. So sharing your work on social channels, or for collaboration via Frame.io, is simple because it’s integrated into Resolve 16 Studio.

DaVinci Resolve 16 Studio features improvements to existing ResolveFX, along with several new plugins that editors and colorists will like. There are new ResolveFX plugins for adding vignettes, drop shadows, removing objects, adding analog noise and damage, chromatic aberration, stylizing video and more. There are also improvements to the scanline, beauty, face refinement, blanking fill, warper, dead pixel fixer and colorspace transformation plugins. Plus, users can now view and edit ResolveFX keyframes from the timeline curve editor on the edit page or from the keyframe panel on the color page.

Here are all the updates within Resolve 16:

• DaVinci Neural Engine for AI and deep learning features
• Dual timeline to edit and trim without zooming and scrolling
• Source tape to review all clips as if they were a single tape
• Trim interface to view both sides of an edit and trim
• Intelligent edit modes to auto-sync clips and edit
• Timeline review playback speed based on clip length
• Built-in tools for retime, stabilization and transform
• Render and upload directly to YouTube and Vimeo
• Direct media import via buttons
• Scalable interface for working on laptop screens
• Create projects with different frame rates and resolutions
• Apply effects to multiple clips at the same time
• DaVinci Neural Engine detects faces and auto-creates bins
• Frame rate conversions and motion estimation
• Cut and edit page image stabilization
• Curve editor ease in and out controls
• Tape-style audio scrubbing with pitch correction
• Re-encode only changed files for faster rendering
• Collaborate remotely with Frame.io integration
• Improved GPU performance for Fusion 3D operations
• Cross-platform GPU-accelerated tools
• Accelerated mask operations including B-Spline and bitmap
• Improved planar and tracker performance
• Faster user and smart cache
• GPU-accelerated scopes with advanced technical monitoring
• Custom and HSL curves now feature histogram overlay
• DaVinci Neural Engine auto color and shot match
• Synchronize SDI output to viewer zoom
• Mix and master immersive 3D audio
• Elastic wave audio alignment and retiming
• Bus tracks with automation on timeline
• New FairlightFX plugins, including Foley sampler, frequency analyzer and dialog processor
• 500 royalty-free Foley sound effects
• Share markers and notes in collaboration workflows
• Individual user cache for collaborative projects
• View and edit ResolveFX keyframes from the timeline curve editor or keyframe panel

Roy H. Wagner, ASC, to speak at first Blackmagic Collective event

By Randi Altman

The newly formed Blackmagic Collective, a group founded by filmmakers for filmmakers, is holding the first of its free monthly meetings on Saturday, January 12 at the Gnomon School of Visual Effects in Hollywood.

The group, headed up by executive director Brett Harrison, says it is dedicated to education and to sharing information on the art of filmmaking. “With Blackmagic Design’s support, the group will feature ‘TED Talk’-like presentations from media experts, panels covering post and production topics and film festivals, as well as networking opportunities.”

In addition, Blackmagic Design is offering free Resolve training attached to the meetings. While Blackmagic is a sponsor, this is not a Blackmagic-run group. According to Harrison, “The Blackmagic Collective is an independent group created to support the art of filmmaking as a whole. We are also a 501(c)(3) charity, with plans to find ways to give back to the community.” Membership is free, with no application process. Members can simply sign up on the site. Despite the name, Harrison insists that the group, while inspired by Blackmagic’s filmmaking tools, is focused on filmmaking as a whole. “You do not need to use BMD tools to be a member,” adds Harrison.

On creating the Collective, Harrison says, “After producing the Blackmagic Design Conference + Expo in LA early in 2018, I realized that a monthly group in Hollywood for filmmakers to learn from other professionals and share with and inspire each other would be well-received and vital, particularly for Blackmagic users in the industry. BMD allows for an end-to-end workflow that encompasses the spectrum of production and post, with endless topics for our group to focus on, though we will be speaking on a range of topics and not strictly BMD gear and software.”

At their first meeting, esteemed film and television cinematographer Roy H. Wagner, ASC, will be interviewed by Christian Sebaldt, ASC, with a focus on Roy’s new feature film Stand!. There will be a panel discussing the art and experiences of young colorists from Efilm, Apache and Company 3. Also, the Blackmagic Collective will be announcing a film festival that will start in April and end in November with a final competition. Filmmakers can submit films each month. Selected films will be streamed on the group website, with a select few shown at the monthly meetings starting in April. Members will have the opportunity to vote for the best each month, with a final competition for the top five films at the November event.

In case you were wondering, and we know you are, the current plan for the film festival is this:
“Our film festival submissions must use BMD technology to be eligible to enter the contest. That may include cameras, software or both, depending on the category,” explains Harrison.

The Collective will also be hosting job fairs at every other meeting.

“We are thrilled to be supporting the Blackmagic Collective,” says Blackmagic president Dan May. “Our company shares a passion with filmmakers by creating hardware and software that make their craft easier and more cost effective. We feel the Collective will provide the added resource of bringing a focus to the art form of filmmaking, as well as helping share new ideas and technology among creatives at all skill levels, from student to professional.”

You can sign up for the Resolve editing class or the event (or both) at the website.

Review: Blackmagic’s eGPU and Intel i9 MacBook Pro 2018

By Brady Betzel

Blackmagic’s eGPU is worth the $699 price tag. You can buy it from Apple’s website, where it is being sold exclusively for the time being. Wait? What? You wanted some actual evidence as to why you should buy the BMD eGPU?

Ok, here you go…

MacBook Pro With Intel i9
First, I want to go over the latest Apple MacBook Pro, which was released (or really just updated) this past July. With some controversial fanfare, the 2018 MacBook Pro can now be purchased with the blazingly fast Intel i9, 2.6GHz (Turbo Boost up to 4.3GHz) six-core processor. In addition, you can add up to 32GB of 2400MHz DDR4 onboard memory, a Radeon Pro 560x GPU with 4GB of GDDR5 memory and even a 4TB SSD storage drive. It has four Thunderbolt 3 ports and, for some reason, a headphone jack. Apple is also touting its improved butterfly keyboard switches as well as its True Tone display technology. If you want to read more about that glossy info, head over to Apple’s site.

The 2018 MacBook Pro is a beast. I am a big advocate for the ability to upgrade and repair computers, so Apple’s venture to create what is essentially a leased computer ecosystem that needs to be upgraded every year or two usually leaves a bad taste in my mouth.

However, the latest MacBook Pros are really amazing… and really expensive. The top-of-the-line MacBook Pro I was provided with for this review would cost $6,699! Yikes! If I were buying one myself, I would purchase everything but the $2,000 upgrade from the 2TB SSD to the 4TB, and it would still cost $4,699. But I suppose that’s not a terrible price for such an intense processor (albeit not technically workstation-class).

Overall, the MacBook Pro is a workhorse that I put through its video editing and color correcting paces using three of the top four professional nonlinear editors: Adobe Premiere, Apple FCP X and Blackmagic’s Resolve 15 (the official release). More on those results in a bit, but for now, I’ll just say a few things: I love how light and thin it is. I don’t like how hot it can get. I love how fast it charges. I don’t like how fast it loses charge when doing things like transcoding or exporting clips. A 15-minute export can drain the battery over 40%, while playing Spotify for eight hours will hardly drain the battery at all (maybe 20%).

Blackmagic’s eGPU with Radeon Pro 580 GPU
One of the more surprising releases from Blackmagic has been this eGPU offering. I would never have guessed they would go into this area, and certainly not with a Radeon card, but here we are.

Once you step back from the initial “why in the hell wouldn’t they let it be user-replaceable and also not brand-dependent” shock, it actually makes sense. If you are a macOS user, you can probably already do a lot in terms of external GPU power. When you buy a new iMac, iMac Pro or MacBook Pro, you are expecting it to work, full stop.

However, if you are a DIT or colorist who is more mobile than that sweet million-dollar color bay you dream of, you need more. This is where the BMD eGPU falls nicely into place. You plug it in and instantly see it populate in the menu bar. In addition, the eGPU acts as a dock with four USB 3 ports, two Thunderbolt 3 ports and an HDMI port. The MacBook Pro will charge off of the eGPU as well, which eliminates the need for your charger at your docking point.

On the go, the most decked-out MacBook Pro can hold its own. So it’s no surprise that FCP X runs remarkably fast… faster than everything else. However, you have to be invested in an FCP X workflow and paradigm — and while I’m not there yet, maybe the future will prove me wrong. Recently, I saw someone on Twitter who developed an online collaboration workflow, so people are excited about it.

Anyway, many of the nonlinear editors I work with also play well on the MacBook Pro, even with 4K Red, ARRI and, especially, ProRes footage. Keep in mind, though, that with 2K, 4K or whatever-K footage, you will need to set the debayer to around “half good” if you want a fluid timeline. Even with the 4GB Radeon 560x, I couldn’t quite play realtime 4K footage without some sort of compromise in quality.

But with the Blackmagic eGPU, I significantly improved my playback capabilities — and not just in Resolve 15. I did try plugging the eGPU into a Windows 10 PC I was reviewing at the same time, and it was recognized, but I couldn’t get all the drivers sorted out. So it’s possible it will work in Windows, but I couldn’t get it there.

Before I get to the Resolve testing, I did some benchmarking. First I ran Cinebench R15:
a. Without eGPU: OpenGL 99.21fps (reference match 99.5%), CPU 947cb, CPU (single core) 190cb, MP ratio 5.00x
b. With eGPU: OpenGL 60.26fps (reference match 99.5%), CPU 1057cb, CPU (single core) 186cb, MP ratio 5.69x

Then I ran Unigine’s Valley Benchmark 1.0:
a. Without eGPU: 21.3fps, score 890 (minimum 12.4fps/maximum 36.2fps)
b. With eGPU: 25.6fps, score 1073 (minimum 19.2fps/maximum 37.1fps)

Resolve 15 Test
I based all of my tests on a similar (although not exact for the different editing applications) 10-minute timeline, 23.98fps, 3840×2160, 4K and 8K RAW Red footage (R3D files) and Alexa (.ari and ProRes444XQ) UHD footage, all with edit page resizes, simple color correction and intermittent sharpening and temporal noise reduction (three frames, better, medium, 10, 10 and 5).

Playback: Without the eGPU I couldn’t play 23.98fps 4K Red R3D footage without being set to half-res. With the eGPU I could play back at full-res in realtime (this is what I was talking about in sentence one of this review). The ARRI footage would play at full res, but only at between 1fps and 7fps. The 8K Red footage would play in realtime when set to quarter-res.

One of the most reassuring things I noticed when watching my Activity Monitor’s GPU history readout was that Resolve uses both GPUs at once. Not all of the apps did.

Resolve 15 Export Tests
In the following tests, I disabled all cache or optimized media options, including Performance Mode.

Test 1: H.264 at 23.98fps, UHD, auto-quality, no frame reordering, force highest-quality debayer/resizes and encoding profile Main
a. Without eGPU (Radeon Pro 560x): 22 minutes, 16 seconds
b. With BMD eGPU (Radeon Pro 580): 16 minutes and 21 seconds

Test 2: H.265 10-bit at 23.98/UHD, auto quality, no frame reordering, force highest-quality debayer/resizes
a. Without eGPU: stopped rendering after 10 frames
b. With BMD eGPU: same result

Test 3: ProRes4444 at 23.98/UHD
a. Without eGPU: 27 min and 29 seconds
b. With BMD eGPU: 22 minutes and 57 seconds

Test 4: Edit page cache enabled (Smart User Cache at ProResHQ)
a. Without eGPU: 17 minutes and 28 seconds
b. With BMD eGPU: 12 minutes and 22 seconds

Adobe Premiere Pro v.12.1.2
I performed similar testing in Adobe Premiere Pro using a 10-minute timeline at 23.98fps, 3840×2160, 4K and 8K RAW Red footage (R3D files) and Alexa (DNxHR SQ 8-bit) UHD footage, all with Effect Control tab resizes and simple Lumetri color correction, including sharpening and intermittent denoise (16) under the HSL Secondary tab in Lumetri applied to shadows only.

In order to ensure your eGPU will be used inside of Adobe Premiere, you must use Metal as your renderer. To enable it, go to File > Project Settings > General and change the renderer to Mercury Playback Engine GPU Acceleration (Metal). OpenCL will only use the internal GPU for processing.

Premiere did not handle the high-resolution media as aptly as Resolve had, but it did help a little. However, I really wanted to test the export power with the added eGPU horsepower. I almost always send my Premiere sequences to Adobe Media Encoder to do the processing, so that is where my exports were processed.

Adobe Media Encoder
Test 1: H.264 (no render files used during exports; 23.98/UHD, 80Mb/s; software encoding doesn’t allow for profile setup)
a. OpenCL with no eGPU: about 140 minutes (sorry, I had to chase the kids around and couldn’t watch this snail crawl)
b. Metal with no eGPU: about 137 minutes (chased the kids around again and couldn’t watch this snail crawl, either)
c. OpenCL with eGPU: won’t work; Metal only
d. Metal with eGPU: one hour

Test 2: H.265
a. Without eGPU: failed (interesting result)
b. With eGPU: 40 minutes

Test 3: ProRes4444
a. Without eGPU: three hours
b. With eGPU: one hour and 14 minutes

FCP X
FCP X is an interesting editing app, and it is blazing fast at handling ProRes media. As I mentioned earlier, it hasn’t been in my world too much, but that isn’t because I don’t like it. It’s because professionally I haven’t run into it. I love the idea of roles and would really love to see that play out in other NLEs. However, my results speak for themselves.

One caveat to using the eGPU in FCP X is that you must force it to work inside of the NLE. At first, I couldn’t get it to work. The Activity Monitor would show no activity on the eGPU. However, thanks to a Twitter post, James Wells (@9voltDC) sent me to this, which allows you to force FCP X to use the eGPU. It took a few tries, but I did get it to work, and funnily enough, I saw times when all three GPUs were being used inside of FCP X, which was pretty good to see. This is one of those use-at-your-own-risk things, but it worked for me and is pretty slick… if you are OK with using Terminal commands. This also allows you to force the eGPU onto other apps, like Cinebench.

Anyway, here are my results with the BMD eGPU exporting from FCP X:

Test 1: H.264
a. Without eGPU: eight minutes
b. With eGPU: eight minutes and 30 seconds

Test 2: H.265: Not an option

Test 3: ProRes4444
a. Without eGPU: nine minutes
b. With eGPU: six minutes and 30 seconds

Summing Up
In the end, the Blackmagic eGPU with Radeon Pro 580 GPU is a must-buy if you use your MacBook Pro with Resolve 15. There are other options out there, though, like the Razer Core v2 or the Akitio Node Pro.

From this review I can tell you that the Blackmagic eGPU is silent even when processing 8K Red RAW footage (even when the MacBook Pro fans are going at full speed), and it just works. Plug it in and you are running, no settings, no drivers, no cards to install… it just runs. And sometimes when I have three little boys running around my house, I just want that peace of mind and I want things to just work like the Blackmagic eGPU.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Point 360 grows team with senior colorist Charlie Tucker

Senior colorist Charlie Tucker has joined Burbank’s Point 360. He comes to the facility from Technicolor, and brings with him over 20 years of color grading experience.

The UK-born Tucker’s credits include TV shows such as The Vampire Diaries and The Originals on CW, Wet Hot American Summer and A Futile & Stupid Gesture on Netflix, as well as Amazon’s Lore. He also just completed YouTube Red’s show Cobra Kai. Tucker, who joined the company just last week, will be working on Blackmagic Resolve.

Now at Point 360, Tucker reteams with Jason Kavner, who took the helm as senior VP of episodic sales in 2017. Tucker also joins fellow senior colorist Aidan Stanford, whose recent credits include the Academy Award-winning feature Get Out and the film Happy Death Day. Stanford’s recent episodic work includes the FX series You’re the Worst and ABC’s Fresh Off the Boat.

When prodded to sum up his feelings regarding joining Point 360, Tucker said, “I am chuffed to bits to now be part of and call Point 360 my home. It is a bloody lovely facility that has a welcoming, collaborative feel, which is refreshing to find within this pressure cooker we call Hollywood. The team I am privileged to join is a brilliant, talented and very experienced group of industry professionals who truly enjoy what they do, and I know my clients will love my new coloring bay and the creative vibe that Point 360 has created.”

Director HaZ Dulull on his sci-fi offering The Beyond

By Randi Altman

Director Hasraf “HaZ” Dulull is no stranger to making movies. Before jumping into writing and directing short sci-fi films, he was a visual effects supervisor and producer. His short film resume includes Project Kronos, I.R.I.S. and Sync. Recently, his first feature film, The Beyond, was released by Gravitas Ventures.

When I first met HaZ a few years back, we were both at an Adobe event — on a canal boat in Amsterdam during IBC. We started talking about visual effects, the industry and his drive to make movies.

This Brit is friendly, intelligent and incredibly hands-on in all aspects of what he does. His latest is The Beyond, which he describes as “a cerebral science-fiction feature film that blends the realism of documentary with the fantastical, ‘big idea’ nature of the science-fiction films of today.” The Beyond tells the story of a ground-breaking mission that sent astronauts — modified with advanced robotics — through a newly discovered wormhole known as the Void. When the mission returns unexpectedly, the space agency races to discover what the astronauts encountered on their first-of-its-kind interstellar space journey.

HaZ on set

HaZ was so hands-on that he provided some of the film’s visual effects and edited the film. If you like what you see in the trailer, the film is available for purchase or rent on most digital platforms.

When I reached out to HaZ to talk about The Beyond, he was in Vancouver working on an eight-part TV series for Disney called Fast Layne. “I directed episodes 1 and 2, and am currently directing episodes 7 and 8,” he says. “The beauty of starting and ending the series is it allowed me to set the show’s style and tone.”

It seems he can’t sit still! Let’s find out more about how he works and about The Beyond…

Can you talk about prepro? How much of that included visual effects prepro?
Most people who know me will say I’m obsessed with prep. I had about six months of hardcore prep on this, from doing little storyboards, known as HaZ-Grams, right through to previs of the key sequences.

But even during the script-writing stage (six months before actual prep), I was coming up with visuals to support the ideas I was writing in the script. Sometimes I would knock up a test VFX scene just to see how complex it would be to create this idea I was writing in the script. Prep worked hand in hand with the script development and the budgeting of the film. The film was self-financed, and additional financing came in later (during post production), so I wanted to ensure everything was mapped out technically, as there were no “fix it in post” scenarios in this film — I wouldn’t allow it.

During location scouting, I would have my iPhone with me and shoot a bunch of footage and still imagery, so when I went back home I could write those locations into the script to make them work with the scenarios depicted in the film.

As part of prep, we actually shot a test scene to really see if this mockumentary format would work to tell a grounded sci-fi story. This was also used to attract crew and other casting to the project, as well as to get distributors primed early on.

Many shots from that test actually made it into the final movie — I wasn’t kidding about not wasting any budget or material on this production! So prep pretty much helped shape the script too, as I knew I wasn’t in the financial position to write stuff and then go and build it. I had to reverse engineer it in a way. In the film we have tons of locations, such as the Space Centre with actual real rockets. We also had a team in Iceland shooting alien landscapes, and we even shot some scenes in Malaysia to give the film a global feel. With each of those opportunities, the script was tweaked to make full use of the locations we had.

You shot with Blackmagic cameras. Was that your choice? The DP’s? Have you shot with these before?
From the start, I knew we were going to shoot on Blackmagic cameras. This was mainly down to the fact my DP Adam Batchelor — who had shot Sync with me and the proof of concept tests we did for this film — was a Blackmagic advocate and knew the cameras inside out, but more importantly he was able to get cinematic imagery using those cameras.

Blackmagic was very supportive of the film, and has been of my career since my short films, so they came on as one of the executive producers on the film. No one had ever shot a full feature film using just the Blackmagic cameras. We also then used a Resolve pipeline through to delivery. So The Beyond is the perfect case study for it.

Can you talk about that workflow? Any hiccups? 
I think the only hiccups were the fact we were using a beta version of Resolve 14, so there were the expected crashes, etc. That would usually be seen as risky on a feature film, but luckily we didn’t have a distributor in place with a release date, so the risk was minimal.

The good thing was I would generate an error log report from Resolve and send it over to Blackmagic, who would then instantly send out a new patch. So we were looked after rather than being left on our own to scream at the monitor.

We stuck with a ProRes 4444 QuickTime workflow for all material, from footage to VFX renders, and enabled proxy on the fly within Resolve. This was great, as it meant I was working with the highest-resolution imagery within Resolve, and it was fairly fast too. Things started to slow down when I had multiple layers of VFX and composites/groups, which I then had to render out as a new clip and bring back in.
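
For a sense of what an all-Resolve ProRes pipeline looks like when automated, here is a minimal sketch using Resolve’s documented Python scripting API (HaZ worked in the GUI; this is purely illustrative, and the exact format/codec strings vary by version, so the sketch queries them first rather than trusting the assumed names below):

```python
import DaVinciResolveScript as dvr_script  # ships with Resolve Studio

resolve = dvr_script.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()

# Inspect what this Resolve build actually offers before choosing
print(project.GetRenderFormats())      # e.g. a QuickTime/"mov" entry
print(project.GetRenderCodecs("mov"))  # look for a ProRes 4444 flavor

# Assumed names: confirm against the printed lists above
project.SetCurrentRenderFormatAndCodec("mov", "ProRes4444")
project.SetRenderSettings({"TargetDir": "/renders", "CustomName": "beyond_v1"})
project.AddRenderJob()
project.StartRendering()
```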

How did you and the DP develop the look you wanted? Any scenes stick out that you guys worked on?
I was very fortunate to get Max Horton, who had worked on films like Gravity, to come on board to grade this film at the Dolby Vision lab in London’s Soho. We also did an HDR version; I think it’s the first indie film to have an HDR treatment done to it.

We had three to four days of grading with Max, and I was in the room with him the whole time. This was because I had already done a first-pass temp grade myself while editing the film in the beta version of Resolve 14. This made the workflow as simple as exporting my Resolve file and handing the material over to Max, who would load up the Resolve file, link up the material and work from there.

Max kept everything photographically like a documentary but with a slight cinematic flair to it. The big challenge was matching all the various sources of material from the various Blackmagic cameras (Ursa Mini Pro, the Production Camera and the Pocket Camera) to the DJI Osmo, drone footage and stock footage.

How many VFX shots were there? Who did them?
There were around 750 visual effects shots. I designed all the VFX scenes and handled a huge portion of the compositing myself, including invisible effects shots, all the space scenes, alien planet scenes, memory scenes and tons more — this would not have been possible without the support of my VFX team who worked on their assigned sequences and shots and also generated tons of CGI assets for me to use to create my shots in comp.

My VFX team members included my long-time collaborator John Sellings, who was the VFX supervisor for all the Human 2.0 sequences. Filmmore, in Amsterdam and Brussels, handled the Human 2.0 scenes in the transcode bay with in-house VFX supervisor Hans Van Helden. London’s Squint VFX handled the Human 2.0 scenes in the wake-up lab. Charles Wilcocks was the Human 2.0 CG supervisor, who worked on the shape and look of the Human 2.0.

Hussin Khan looked after the Malaysian team, which provided rotoscoping support and basic comps. Dan Newlands was our on-set tracking supervisor. He ensured all data was captured correctly and supervised anything tracking related in the Human 2.0 scenes.

Another long-time collaborator was Andrea Tedeschi, who handled the CG and comps for the spacecraft carrier at the end of the film, as well as rendering out the CG astronaut passes. Rhys Griffith handled the rigging for the Human 2.0 characters in Maya, and also looked after the CG passes for the alpha Human 2.0 scenes using Blender. Aleksandr Uusmees provided all the particles and simulation rendered out of Houdini as CG passes/elements, which I then used to create the wormhole effects, alien spheres and other shots that needed those elements.

JM Blay designed and created the standalone motion graphics sequences to visualize the Human 2.0 medical procedure, as well as mission trajectory graphics. He also created several “kit-bash” graphics assets for me to use, including UI graphics, from his After Effects files.

Territory Studio created the awesome end titles and credits sequence, which you can read more about on their site.

As a VFX pro yourself, do you find that you are harder to please because it’s your wheelhouse?
Oh boy. Ask any of the VFX guys on the team and they will say I am a beast to work with because I am hands-on, and also I know how long things take. But on the flip side that had its advantages, as they knew they were not going to get revision after revision, because with each brief I also presented a proposed methodology, and made sure we locked down on that first before proceeding with the shots.

Was this your biggest directing job to date? Can you talk about any surprises?
It wasn’t my biggest directing job to date, as during post production of The Beyond my second sci-fi film, Origin Unknown (starring Katee Sackhoff from Battlestar Galactica and The Flash), was green-lit, and that had its own set of challenges. We can talk more about that when the film is released theatrically and on VOD later this year via Kew Media.

This was, however, my biggest producing job to date; there were so many logistics and resources to manage whilst directing too. The cool thing about the way we made this film was that most of the crew were on my short films, including some of the key cast too, so we embraced the guerrilla nature of the production and focused on maximizing our resources to the fullest within the time and budget constraints.

What did you learn on this film that will help on your next?
The other hat I was wearing was the producer hat, and one thing I had to embrace was the sheer amount of paperwork! I may have taken the same filmmaking approach as I did on my short films — guerrilla, and thinking outside the box technically and creatively — but making a commercial feature film, I had to learn to deal with things like clearances, E&O (errors and omissions) insurance, chain of title, script reports and a whole bunch of paperwork required before a distributor will pick up your film.

Thankfully my co-producer Paula Crickard, who is currently wrapping post on Terry Gilliam’s Don Quixote, came in during the post stage of the film and helped.

The other thing I learned was the whole sales angle — getting a reputable distributor on board to sell the film in all worldwide territories, and how to navigate that process with rights and IP and more contracts, etc. The advice I got from other filmmakers is that getting the right distributor plays a big part in how your film will be released; it was important to me that the distributor was into the film and not just the trailer, and that their marketing and sales strategy made sense. The Beyond was never designed to be a theatrical film, and therefore I wanted someone with a big reach in the VOD world through their brand, especially since The Beyond doesn’t have big-name actors in it.

What was the most challenging scene or scenes? Why and how did you overcome those challenges?
The Human 2.0 scenes were the most challenging because they had to look photoreal due to the documentary narrative. We did first try to do it all in-camera using a built suit, but it wasn’t achieving the look we wanted, the actors felt uncomfortable in it, and doing it properly practically would have cost a fortune. So we went with a fully digital solution for the Human 2.0 bodies: the actors wore tight grey suits with tracking markers, and we restricted our camera moves for simplicity to enable object tracking to work as accurately as possible. We also shot reference footage from multiple angles to help with match moving. Having an on-set tracking supervisor helped massively and allowed us to make this happen within the budget, while looking and feeling real.

Our biggest issue came when our actress made very tiny movements due to breathing in close-up shots. Because our Human 2.0 was human consciousness in a synthetic shell, breathing didn’t make sense and we began making up for it by freezing the image or doing some stabilization, which ended up being nearly impossible for the very close-up shots.

In the end, I had to think outside the box, so I wrote a few lines into the script that explained that the Human 2.0 was breathing to make it psychologically more acceptable to other humans. Those two lines saved us weeks and possibly months of time.

Being a VFX movie, you would expect us to use some form of greenscreen or bluescreen, but we didn’t — in fact, the only stage used was for the “white room” astronaut scene, which was shot over at Asylum FX in London. There was an actor wearing an astronaut suit in a bright photography room, and we used brightly exposed lighting to give a surreal feeling. We used VFX to augment it.

As a writer and a director, how was it seeing your vision through from start to finish?
It didn’t really hit me until I watched the press screening of it at the Dolby Vision office in Soho. It had the fully mixed sound and the completed grade. I remember looking across at my DP and other team members thinking, “Whoa! It looks and feels like a feature film, and we did that in a year!”

You edited the film yourself?
Yes, I was the editor on the film! I shoot for the edit. I started off using Adobe Premiere CC for the early offline and then quickly moved over to Resolve 14, where I did the majority of the editing. It was great because I was doing a lot of online editorial tasks like stabilizing, basic VFX, pan and scans, as well as establishing temp looks while editing. So in a way there was no offline and online editorial, as it was all part of one workflow. We did all our deliverables out of Resolve 14, too.

Color plays big role in director Sean Baker’s The Florida Project

Director Sean Baker is drawing wide praise for his realistic portrait of life on the fringe in America in his new film The Florida Project. Baker applies a light touch to the story of a precocious six-year-old girl living in the shadow of Disney World, giving it the feel of a slice-of-life documentary. That quality is carried through in the film’s natural look. Where Baker shot his previous film, Tangerine, entirely with an iPhone, The Florida Project was recorded almost wholly on anamorphic 35mm film by cinematographer Alexis Zabe.

Sam Daley

Post finishing for the film was completed at Technicolor PostWorks New York, which called on a traditional digital intermediate workflow to accommodate Baker’s vision. The work began with scanning the 35mm negative to 2K digital files for dailies and editorial. It ended months later with rescanning at 4K and 6K resolution, editorial conforming and color grading in the facility’s 4K DI theater. Senior colorist Sam Daley applied the final grade via Blackmagic Resolve v.12.5.

Shooting on film was a perfect choice, according to Daley, as it allowed Baker and Zabe to capture the stark contrasts of life in Central Florida. “I lived in Florida for six years, so I’m familiar with the intensity of light and how it affects color,” says Daley. “Pastels are prominent in the Florida color palette because of the way the sun bleaches paint.”

He adds that Zabe used Kodak Vision3 50D and 250D stock for daylight scenes shot in the hot Florida sun, noting, “The slower stock provided a rich color canvas, so much so that at times we de-emphasized the greenery so it didn’t feel hyperreal.”

The film’s principal location is a rundown motel, ironically named the Magic Castle. It does not share the sun-bleached look of other businesses and housing complexes in the area as it has been freshly painted a garish shade of purple.

Baker asked Daley to highlight such contrasts in the grade, but to do so subtly. “There are many colorful locations in the movie,” Daley says. “The tourist traps you see along the highway in Kissimmee are brightly colored. Blue skies and beautiful sunsets appear throughout the film. But it was imperative not to allow the bright colors in the background to distract from the characters in the foreground. The very first instruction that I got from Sean was to make it look real, then dial it up a notch.”

Mixing Film and Digital for Night Shots
To make use of available light, nighttime scenes were not shot on film, but rather were captured digitally on an Arri Alexa. Working in concert with color scientists from Technicolor PostWorks New York and Technicolor Hollywood, Daley helmed a novel workflow to make the digital material blend with scenes that were film-original. He first “pre-graded” the digital shots and then sent them to Technicolor Hollywood where they were recorded out to film. After processing at FotoKem, the film outs were returned to Technicolor Hollywood and scanned to 4K digital files. Those files were rushed back to New York via Technicolor’s Production Network where Daley then dropped them into his timeline for final color grading. The result of the complex process was to give the digitally acquired material a natural film color and grain structure.

“It would have been simpler to fly the digitally captured scenes into my timeline and put on a film LUT and grain FX,” explains Daley, “but Sean wanted everything to have a film element. So, we had to rethink the workflow and come up with a different way to make digital material integrate with beautifully shot film. The process involved several steps, but it allowed us to meet Sean’s desire for a complete film DI.”

Calling on iPhone for One Scene
A scene near the end of the film was, for narrative reasons, captured with an iPhone. Daley explains that, although intended to stand out from the rest of the film, the sequence couldn’t appear so different that it shocked the audience. “The switch from 4K scanned film material to iPhone footage happens via a hard cut,” he explains. “But it needed to feel like it was part of the same movie. That was a challenge because the characteristics of Kodak motion picture stock are quite different from an iPhone.”

The iPhone material was put through the same process as the Alexa footage; it was pre-graded, recorded out to film and scanned back to digital. “The grain helps tie it to the rest of the movie,” reports Daley. “And the grain that you see is real; it’s from the negative that the scene was recorded out to. There are no artificial looks and nothing gimmicky about any of the looks in this film.”

The apparent lack of artifice is, in fact, one of the film’s great strengths. Daley notes that even a rainbow that appears in a key moment was captured naturally. “It’s a beautiful movie,” says Daley. “It’s wonderfully directed, photographed and edited. I was very fortunate to be able to add my touch to the imagery that Sean and Alexis captured so beautifully.”

My top workstation accessories

By Brady Betzel

As a working video editor, I’m at my desk and on my computer all day. So when I get home, I want my personal workstation to feel as powerful as possible, and having the right tools to support that experience is paramount.

I’m talking workstation accessories. I’ve put together a short list based on my personal experience. Some are well known, while some are slightly under the radar. Either way, they all make my editing life easier and more productive.

They make my home-based workstation feel like a full-fledged professional edit suite.

Wacom Intuos Pro Medium
In my work as an offline editor, I started to have some wrist pain when I used a mouse in conjunction with my keyboard. That is when I decided to jump headfirst into using a Wacom tablet. Within two weeks, all of my pain went away and I felt that I had way more control over drawing objects and shapes. I specifically noticed more precision when drawing accurate lines and shapes with bezier handles inside apps like Adobe’s After Effects and Photoshop.

In addition, you can program minimal macros on the express keys on the side of the tablet. While the newest Wacom Intuos Pro Medium tablet costs a cool $349.95, it will pay for itself with increased efficiency and, in my experience, less wrist pain.

Genelec 8010A Studio Monitors
One workstation accessory that will blow you away is a great set of studio monitors. Genelec is known for making some great studio monitors, and the 8010A is a set I wish I could get. These monitors are small — around 8 inches tall by 4 inches deep and 4 inches wide — but they put out some serious power at 96dB.

Don’t be fooled by their small appearance; they are a great complement to any serious video and audio power user. They connect via XLR, so you may need to get some converters if you are going straight out of your station without running through a mixer. These speakers are priced at $295 each; they aren’t cheap, but they are another important accessory that will further turn your bay into a professional suite.

Tangent Element & Blackmagic Resolve Color Correction Panels
If you work in color correction, or aspire to color correct, color correction panels are a must. They not only make it easier for you to work in apps like Blackmagic DaVinci Resolve, but they free your mind from worrying about where certain things are and let your fingers do the talking. It is incredibly liberating to use color correction panels when doing a color grade — it feels like you have another arm you can use to work.

The entire set of Tangent Element Panels costs over $3,300, but if you are just getting started, the Tangent Element Tk (just the trackballs) can be had for a little over $1,100. What’s nice about the Tangent panels is that they work with multiple apps, including Adobe Premiere, FilmLight Baselight, etc. But if you know you are only going to be using Resolve, the Resolve Micro or Mini panels are a great deal at under $1,000 and $3,000, respectively.

Logitech G13 Advanced Gameboard
This one might sound a bit odd at first, but once you do some research you will see that many professional editors use these types of pads to program macros of multiple button pushes or common tasks. Essentially, this is a macro pad that has 25 programmable keys as well as a thumb-controlled joystick. It’s a really intriguing piece of hardware that might be able to take the place of your mouse in conjunction with your keyboard. It is competitively priced at only $79.99 and, with a little Internet research on liftgammagain.com, you can even find forums of users’ custom mappings.

Logickeyboard Backlit Keyboard
Obviously, the keyboard is one of the most used workstation accessories. One difficulty is trying to work with one in a dark room. Well, Logickeyboard has a dimmable backlit keyboard series for apps like Resolve and Avid Media Composer.

In addition to being backlit, they also have two powered USB 2.0 ports that really come in handy. These retail for around $140, so they are a little pricey for a keyboard but, take it from me, they will really polish that edit suite.

OWC USB-C Dock
With ports on Mac-based systems being stripped away, a good USB-C dock is a great extension to have in your edit suite. OWC offers a Mini-DisplayPort or HDMI-equipped version in the colors that match your MacBook Pro, if you have one.

In addition, you get five USB 3.1-compatible ports (two of which are a high-powered charging port and a USB Type-C port), a Gig-E port, a front-facing SD card reader, a combo audio in/out port and a Mini-DisplayPort or HDMI port. These retail for under $150.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Behind the Title: Carbon’s head of color grading, Maria Carretero

NAME: Maria Carretero

COMPANY: Carbon (@carbonvfx)

CAN YOU DESCRIBE YOUR COMPANY?
Carbon, which has locations in New York, Chicago and LA, is a boutique design and visual effects company that focuses on art, ideas, and talent. I am based in Chicago.

WHAT’S YOUR JOB TITLE?
Head of color grading

WHAT DOES THAT ENTAIL?
I work with clients to find color palettes and looks that best tell their story.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I’m still surprised with how much creativity is involved in the grading process. I especially love coming to solutions through mixing art and technique.

WHAT SYSTEM DO YOU WORK ON?
Blackmagic DaVinci Resolve

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Sometimes I’m involved in projects from very early on, which usually entails creating tone and color palettes to be used during filming. I’ve also contributed reference images that help the creative team settle on an overall look or color, and have been present at camera tests to check light and exposure. I love being able to use my artistic background in painting and the fine arts to give projects their maximum creative potential.

Jeep “Two Words”

WHAT’S YOUR FAVORITE PART OF THE JOB?
There are moments when I’m grading a piece and I have a strange connection with it as if I’m seeing it clearly for the first time. It’s like I suddenly know it’s going to be a wonderful grade. Moments like that are magical.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Sunset, when all the lights in the city start turning on.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Painting.

WHY DID YOU CHOOSE THIS PROFESSION?
Color grading chose me. Before I had officially started grading, I spent a lot of my time focusing on my painting, and while grading was something I could technically excel at, my art was the priority at the time. Then digital grading started gaining momentum in Spain, and I gradually realized that color grading opportunities were becoming more and more important to me. I feel extremely lucky. I’m self-taught and relied on my incredible network of supporters to give me chances to go further and further.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I worked on the last five Jeep commercials with a talented group of people from DDB Chicago and the great director Tobias Granström. Some other projects included a huge campaign for Panama Tourism out of VML, the hilarious Liquid Plumbr spot from FBC Chicago and the newest Machine Gun Kelly music video, directed by Steven Caple Jr., which has more than four million views on YouTube.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I’m proud of a lot of my recent work, but Recalculating for Jeep was an incredibly challenging and fulfilling project. We did a lot of research around the idea of sunsets, focusing on the sophistication of light and keeping it as natural feeling as possible.

Jeep “Recalculating”

WHERE DO YOU FIND INSPIRATION? ART? PHOTOGRAPHY?
Art, life, reading and my previous experiences.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My cell, my Dolby monitor and Spotify.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I have Facebook, Twitter, Vimeo, Movidiam and Instagram, where I recently started @carretero.color to feature my color work.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
I always listen to music when I work. Music is great support — it can make you happy in a second! I listen to a lot of different bands, but Band of Horses, Tame Impala, Neil Young, Flaming Lips, Eels, Devendra, The Kills and Spanish music, like Niña Vintage, are some of my favorites!

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I never let stress control me. Sometimes the challenges of the job are huge, but that’s our work and our industry — we know how to get it done. Challenges are wonderful because they point directly at our creativity. Being uncomfortable sometimes is a good thing. It makes you break down your limitations. Working with a great group of people helps a lot, too, so you can always have fun, even on the hardest days.

Dog in the Night director/DP Fletcher Wolfe

By Cory Choy

Silver Sound Showdown Music + Video Festival is unique in two ways. First, it is both a music video festival and battle of the bands at the same time. Second, every year we pair up the Grand Prize-winners, director and band, and produce a music video with them. The budget is determined by the festival’s ticket sales.

I conceived of the festival, which is held each year at Brooklyn Bowl, as a way to both celebrate and promote artistic collaboration between the film and music communities — two crowds that just don’t seem to intersect often enough. One of the most exciting things for me is then working with extremely talented filmmakers and musicians who have more often than not met for the first time at our festival.

Dog in the Night (song written by winning band Side Saddle) was one of our most ambitious videos to date — using a combination of practical and post effects. It was meticulously planned and executed by director/cinematographer Fletcher Wolfe, who was not only a pleasure to work with, but was gracious enough to sit down with me for a discussion about her process and the experience of collaborating.

What was your favorite part of making Dog in the Night?
As a music video director I consider it my first responsibility to get to know the song and its meaning very intimately. This was a great opportunity to stretch that muscle, as it was the first time I was collaborating with musicians who weren’t already close friends. In fact, I hadn’t even met them before the Showdown. I found it to be a very rewarding experience.

What is Dog in the Night about?
The song Dog in the Night is, quite simply, about a time when the singer Ian (a.k.a. Angler Boy) is enamored with a good friend, but that friend doesn’t share his romantic feelings. Of course, anyone who has been in that position (all of us?) knows that it’s never that simple. You can hear him holding out hope, choosing to float between friendship and possibly dating, and torturing himself in the process.

I decided to use dusk in the city to convey that liminal space between relationship labels. I also wanted to play on the nervous and lonely tenor of the track with images of Angler Boy surrounded by darkness, isolated in the pool of light coming from the lure on his head. I had the notion of an anglerfish roaming aimlessly in an abyss, hoping that another angler would find his light and end his loneliness. The ghastly head also shows that he doesn’t feel like he has anything in common with anybody around him except the girl he’s pining after, who he envisions having the same unusual head.

What did you shoot on?
I am a DP by trade, and always shoot the music videos I direct. It’s all one visual storytelling job to me. I shot on my Alexa Mini with a set of Zeiss Standard Speed lenses. We used the 16mm lens on the Snorricam in order to see the darkness around him and to distort him to accentuate his frantic wanderings. Every lens in the set weighed in at just 1.25lbs, which is amazing.

The camera and lenses were an ideal pairing, as I love the look of both, and their light weight allowed me to get the rig down to 11lbs in order to get the Snorricam shots. We didn’t have time to build our own custom Snorricam vest, so I found one that was ready to rent at Du-All Camera. The only caveats were that it could only handle up to 11lbs, and the vest was quite large, meaning we needed to find a way to hide the shoulders of the vest under Ian’s wardrobe. So, I took a cue from Requiem for a Dream and used winter clothing to hide the bulky vest. We chose a green and brown puffy vest that held its own shape over the rig-vest, and also suited the character.

I chose a non-standard 1.5:1 aspect ratio because I felt it suited framing for the anglerfish head. To maximize resolution and minimize data, I shot 3.2K at a 1.78:1 aspect ratio and cropped the sides. It’s easy to build custom framelines in the Alexa Mini for accurate framing on set. On the Mini, you can also dial in any frame rate between 0.75 and 60fps (at 3.2K). Thanks to digital cinema cameras, it’s standard these days to overcrank and have the ability to ramp to slow motion in post. We did do some of that; each time Angler Boy sees Angler Girl, his world turns into slow motion.
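For the curious, the crop math is easy to verify. Here is a minimal sketch in Python, assuming the Alexa Mini’s 3.2K mode records a 3200×1800 frame (the exact sensor window is my assumption):

```python
# Cropping only the sides keeps the full height, so the new width
# is simply the height multiplied by the target aspect ratio.
def cropped_width(height_px: int, target_aspect: float) -> int:
    return round(height_px * target_aspect)

full_w, full_h = 3200, 1800         # assumed 3.2K recording, 1.78:1
new_w = cropped_width(full_h, 1.5)  # -> 2700
print(f"crop {full_w} -> {new_w}, trimming {full_w - new_w}px off the sides")
```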

In contrast, I wanted his walking around alone to be more frantic, so I did something much less common and undercranked to get a jittery effect. The opening shot was captured at 6fps with a 45-degree shutter, and Ian walked in slow motion to a recording of the track slowed down to quarter-time, so his steps are on the beat. Some Snorricam shots were also captured at 6fps, but with a standard 180-degree shutter; I then had Ian spin around to get long motion blur trails of lights around him. I knew exactly what frame rate I wanted for each shot, and we wound up shooting at 6fps, 12fps, 24fps, 48fps and 60fps, each for a different emotion that Angler Boy is having.
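Those frame rate and shutter angle choices map directly onto exposure time per frame via the standard formula t = (shutter angle / 360) × (1/fps). A quick sketch shows why 6fps at 45 degrees steps jarringly yet keeps each frame crisp, while 6fps at 180 degrees produces the long light trails Wolfe describes:

```python
# Exposure time per frame from frame rate and shutter angle.
def exposure_time(fps: float, shutter_angle_deg: float) -> float:
    return (shutter_angle_deg / 360.0) / fps

print(exposure_time(24, 180))  # ~0.0208s (1/48) -- the cinema standard
print(exposure_time(6, 45))    # ~0.0208s (1/48) -- jittery steps, crisp frames
print(exposure_time(6, 180))   # ~0.0833s (1/12) -- long motion blur trails
```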

Why practical vs. CG for the head?
Even though the fish head is a metaphor for Angler Boy’s emotional state, and is not supposed to be real, I wanted it to absolutely feel real to both the actor and the audience. A practical, and slightly unwieldy, helmet/mask helped Ian find his character. His isolation needed to be tangible, and how much he is drawn to Angler Girl as a kindred spirit needed to be moving. It’s a very endearing and relatable song, and there’s something about homemade, practical effects that checks both those boxes. The lonely pool of light coming from the lure was also an important part of the visuals, and it needed to play naturally on their faces and the fish mask. I wired Lite Gear LEDs into the head, which was the easy part. Our incredibly talented fabricator, Lauren Genutis, had the tough job — fabricating the mask from scratch!

The remaining VFX hurdle then was duplicating the head. We only had the time and money to make one and fit it to both actors with foam inserts. I planned the shots so that you almost never see both actors in the same shot at the same time, which kept the number of composited shots to a minimum. It also served to maintain the emotional disconnect between his reality and hers. When you do see them in the same shot, it’s to punctuate when he almost tells her how he feels. To achieve this I did simple split screens, using the Pen Tool in Premiere to cut the mask around their actions, including when she touches his knee. To be safe, I shot takes where she doesn’t touch his knee, but none of them conveyed what she was trying to tell him. So, I did a little smooshing around of the two shots and some patching of the background to make it so the characters could connect.

Where did you do post?
We were on a very tight budget, so I edited at home, and I always use Adobe Premiere. I went to my usual colorist, Vladimir Kucherov, for the grade. He used Blackmagic Resolve, and I love working with him. He can always see how a frame could be strengthened by a little shaping with vignettes. I’ll finally figure out what nuance is missing, and when I tell him, he’s already started working on that exact thing. That kind of shaping was especially helpful on the day exteriors, since I had hoped for a strong sunset, but instead got two flat, overcast days.

The only place we didn’t see eye to eye on this project was saturation — I asked him to push saturation farther than he normally would advise. I wanted a cartoon-like heightening of Angler Boy’s world and emotions. He’s going through a period in which he’s feeling very deeply, but by the time of writing the song he is able to look back on it and see the humor in how dramatic he was being. I think we’ve all been there.

What did you use VFX for?
Besides having to composite shots of the two actors together, there were just a few other VFX shots, including dolly moves that I stabilized with the Warp Stabilizer plug-in within Premiere. We couldn’t afford a real dolly, so we put a two-foot riser on a Dana Dolly to achieve wide push-ins on Ian singing. We were rushing to catch dusk between rainstorms, and it was tough to level the track on grass.

The final shot is a cartoon night sky composited with a live shot. My very good friend, Julie Gratz of Kaleida Vision, made the sky and animated it. She worked in Adobe After Effects, which communicates seamlessly with Premiere. Julie and I share similar tastes for how unrealistic elements can coexist with a realistic world. She also helped me in prep, giving feedback on storyboards.

Do you like the post process?
I never used to like post. I’ve always loved being on set, in a new place every day, moving physical objects with my hands. But, with each video I direct and edit I get faster and improve my post working style. Now I can say that I really do enjoy spending time alone with my footage, finding all the ways it can convey my ideas. I have fun combining real people and practical effects with the powerful post tools we can access even at home these days. It’s wonderful when people connect with the story, and then ask where I got two anglerfish heads. That makes me feel like a wizard, and who doesn’t like that?! A love of movie magic is why we choose this medium to tell our tales.


Cory Choy, Silver Sound Showdown festival director and co-founder of Silver Sound Studios, produced the video.

Blackmagic intros lower-cost color panels for Resolve, new camera

By Brady Betzel

Yesterday, Blackmagic held a press conference on YouTube introducing a new pro camera — the Ursa Mini Pro 4.6K, which combines high-end digital film quality with the ergonomics and features of a traditional broadcast camera — and two new portable hardware control panels for DaVinci Resolve (yes, only Resolve) designed to allow color correction workflows to be mixed in with editing workflows.

For this article, I’m going to focus on the panels.

The color correction hardware market is a small one, usually led by the same companies that produce color correction software. Tangent is one of the few independent manufacturers producing its own color correction panels. There is also the Avid/Euphonix Artist color correction panel and a few others, but the price jumps dramatically when you step up to panels like the Blackmagic DaVinci Resolve Advanced panel (just under $30,000).

I’ve previously reviewed the Tangent Ripple and Element color correction panels, and I love them. However, besides Tangent there really haven’t been any mid-level to prosumer products… until now. Blackmagic is offering the new Micro and Mini color correction panels.

Blackmagic’s Micro color correction panel (our main image) is well priced at $995, which puts it up against the Tangent Wave (over $1,500 on B&H’s site) and the Tangent Element Tk (over $1,135), and compares most closely to the Avid Artist Color Control Surface ($1,299). You’ll notice all of those are priced well above the new Micro panel. You could also throw in the Tangent Ripple for comparison, but that has much more limited functionality and a much lower price, at around $350. The Micro panel is essentially three trackballs, 12 knobs and 18 keys: a collection of the most heavily used parts of a color correction panel without any GUI screens. It connects via USB-C, although a USB 3 to USB-C converter will be included.

The Blackmagic Mini color correction panel (pictured right) is priced higher at $2,995 and can be compared to a combination of the Tangent Element Tk and one or two more panels from the Element set, which retail for $3,320 on www.bhphotovideo.com. The Mini adds two 5-inch displays, eight soft buttons and eight soft knobs to everything the Micro panel has. It also offers pass-through Ethernet to power and connect the panel, USB-C and a 4-pin XLR 12V DC power connection.

I am really excited to try these color correction panels out for myself — and I will, as the panels are on their way to me as I type. I need to emphasize that these panels only work with Resolve, not with any other software apps, so they were built with one workflow in mind.

I do wonder if in the future Blackmagic will sell additional panels that add more buttons and knobs, or something crazy like a SmartScope through the Ethernet ports so I don’t have to buy additional SDI output hardware. And will everyone be OK with the transport controls being placed on the right?

“We are always looking to design new products and features to help with the creative process,” says Blackmagic’s Bob Caniglia. “These new panels were designed to enable our growing number of Resolve users to be able to access the power of DaVinci Resolve and Resolve Studio beyond a mouse and keyboard. The Micro and Mini control panels provide the perfect complement to our existing Advanced control panels.”

Blackmagic is really coming for everyone in the production and post world with recent moves like the acquisitions of audio company Fairlight and realtime bluescreen/greenscreen removal hardware maker Ultimatte, providing Avid with its Media Composer DNx I/Os, and even releasing an updated version of the Ursa camera, the Ursa Mini Pro. Oh, yeah, and don’t forget the company makes one of the top color correction and editing apps on the market in DaVinci Resolve, and the latest color correction hardware, like the Micro and Mini panels, is primed to bring the next set of colorists into the Resolve world.

And, so as not to forget about the camera, the Ursa Mini Pro 4.6K is now available for $5,995. Here are some specs:

• Digital film camera with 15 stops of dynamic range.
• Super 35mm 4.6K sensor with third-generation Blackmagic color science processing of raw sensor data.
• Interchangeable lens mount with EF mount included as standard. Optional PL and B4 lens mount available separately.
• High-quality 2, 4 and 6 stop ND filters with IR compensation designed to specifically match the colorimetry and color science of Ursa Mini Pro.
• Fully redundant controls including ergonomically designed tactile controls that allow direct access to the most important camera settings such as external power switch, ND filter wheel, ISO, shutter, white balance, record button, audio gain controls, lens and transport control, high frame rate button and more.
• Built-in dual C-Fast 2.0 recorders and dual SD/UHS-II card recorders allow unlimited duration recording in high quality.
• LCD status display for quickly checking timecode, shutter and lens settings, battery, recording status and audio levels.
• Support for CinemaDNG 4.6K RAW files and ProRes 4444 XQ, ProRes 4444, ProRes 422 HQ, ProRes 422, ProRes 422 LT and ProRes 422 Proxy recording at Ultra HD and HD resolutions.
• Supports up to 60 fps 4.6K resolution capture in RAW.
• Features all standard connections, including dual XLR mic/line audio inputs with phantom power, 12G-SDI output for monitoring with camera status graphic overlay and separate XLR 4-pin power output for viewfinder power, headphone jack, LANC remote control and standard 4-pin 12V DC power connection.
• Built-in stereo microphones for recording sound.
• Four-inch foldout touchscreen for on-set monitoring and menu settings.

Review: Apple’s new MacBook Pro

By Brady Betzel

What do you need to know about the latest pro laptop from Apple? Well, the MacBook Pro is fast and light; the new Touch Bar is handy and sharp but not fully realized; the updated keys on the keyboard are surprisingly great; and working with ProRes QuickTime files in resolutions higher than 1920×1080 inside of FCP X, or any NLE for that matter, is blazing fast.

When I was tasked with reviewing the new MacBook Pro, I came into it with an open mind. After all, I did read a few other reviews that weren’t exactly glowing, but I love speed and innovation among professional workstation computers, so I was eager to test it myself.

I am pretty open-minded when it comes to operating systems and hardware. I love Apple products and I love Windows-based PCs. I think both have their place in our industry, and to be quite honest it’s really a bonus for me that I don’t rely heavily on one OS or get too tricked by the Command Key vs. Windows/Alt Key.

Let’s start with the call I had with the Apple folks as they gave me the lowdown on the new MacBook Pro. The Apple reps were nice, energetic, knowledgeable and extremely helpful. While I love Apple products, including this laptop, it’s not the be-all and end-all.

The Touch Bar is nice, but not a revolution. It feels like the first step in an evolution, a version 1 of an innovation that I am excited to see more of in later iterations. When I talked with the Apple folks they briefed me on what Tim Cook showed off in the reveal: emoji buttons, wide gamut display, new speakers and USB-C/Thunderbolt 3 connectivity.

NLEs
They had an FCPX expert on the call, which was nice considering I planned on reviewing the MacBook Pro with a focus on nonlinear editing apps, such as Adobe Premiere Pro, Avid Media Composer and Blackmagic’s Resolve. Don’t get me wrong, FCPX is growing on me — it’s snappy jumping around the timeline with ProRes 5K footage; assigning roles is something I wish every other app would pick up on; and the timeline is more of a breeze to use with the latest update.

The other side to this is that in my 13 years of working in television post, I have never worked on a show that primarily used FCP or FCPX to edit or finish. This doesn’t mean I don’t like the NLE, it simply means I haven’t relied on it in a professional working environment. Like I said, I really like the road it’s heading down, and if it works its way into mainstream broadcast or streaming platforms a little more, I am sure I will see it more frequently.

Furthermore, with the ever-growing reduction in reliance on large groups of editors and finishing artists, apps like FCPX are poised to shine with their innovation. After all that blabbering, in this review I will touch on FCPX, but I really wanted to see how the MacBook Pro performed with the pro NLEs I encounter the most.

Specs
Let’s jump into the specs. I was sent a top-of-the-line 15-inch MacBook Pro with Touch Bar, which costs $3,499 if configured online. It comes with a quad-core Intel Core i7 2.9GHz processor (up to 3.8GHz using Turbo Boost), 16GB of 2133MHz memory, a 1TB PCIe SSD and a Radeon Pro 460 with 4GB of memory. It’s loaded. I think the only upgrade available beyond this configuration would be a 2TB hard drive, which would add another $800 to the price tag.

Physically, the MacBook Pro is awesome — very sturdy, very thin and very light. It feels great when holding it and carrying it around. Apple even sent along a Thunderbolt 3 (USB-C) to Thunderbolt 2 adapter and a USB-C to Lightning cable, each of which costs an extra $29.

So yes, it feels great. Apple has made a great new MacBook Pro. Is it worth upgrading if you have a new-ish MacBook Pro at home already? Probably not, unless the Touch Bar really gets you going. The speed is not too far off from the previous version. However, if you have a lot of Thunderbolt 3/USB-C-connected peripherals, or plan on moving to them, then it is a good upgrade.

Testing
I ran some processor/graphics card intensive tests while I had the new MacBook Pro and came to the conclusion that FCPX is not that much faster than Adobe Premiere Pro CC 2017 when working with non-ProRes-based media. Yes, FCPX tears through ProRes QuickTimes if your media is already in that format. What if you shoot on a camera like the Red and don’t want to transcode to a more edit-friendly codec? Well, that is another story. To test the NLEs, I grabbed a sample Red 6K 6144×3160 23.98fps clip from the Red sample footage page, strung out a 10-minute-long sequence in each NLE and exported both a color-graded version and a non-color-graded version as ProRes HQ QuickTime files matching the source file’s specs.

In order to work with Red media in some of the NLEs, you must download a few patches: for FCPX you must install the Red Apple workflow installer and for Media Composer you must install the Red AMA plug-in. Premiere doesn’t need anything extra.

Test 1: Red 6K 6144×3160 23.98fps R3D — a 10-minute sequence (no color grade or FX) exported as ProRes HQ matching the source file’s specs. Premiere > Media Encoder = one hour, 55 minutes. FCPX = one hour, 57 minutes. Media Composer = two hours, 42 minutes. (Good news: Media Composer’s interface and fonts display correctly on the new display.)
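To put those times in perspective, here is my own back-of-the-napkin conversion to an effective export frame rate (my arithmetic, not a vendor benchmark): a 10-minute 23.976fps sequence is roughly 14,386 frames.

```python
# Effective export rate implied by the Test 1 times above.
frames = round(10 * 60 * 23.976)  # ~14,386 frames in a 10-minute sequence

for nle, minutes in [("Premiere > Media Encoder", 115),
                     ("FCPX", 117),
                     ("Media Composer", 162)]:
    print(f"{nle}: {frames / (minutes * 60):.2f} fps")
# Premiere > Media Encoder: 2.08 fps | FCPX: 2.05 fps | Media Composer: 1.48 fps
```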

You’ll notice that Resolve is missing from this list. That’s because I installed Resolve 12.5.4 Studio but then realized my USB dongle wouldn’t fit into the USB-C port — and I wasn’t buying an adapter for a laptop I don’t get to keep. So, unfortunately, I didn’t test a true 6K ProRes HQ export from Resolve, but in the last test you will see some Resolve results.

Overall, there was not much difference in speed. In fact, I felt that Premiere Pro CC 2017 played the Red file a little more smoothly and at a higher frames-per-second count; FCPX struggled a little. Granted, a 6K Red file with no reduced debayer settings enabled is one of the harder files for a CPU to process, but Apple touts this as a Mac Pro semi-replacement for the time being, and I am holding them to their word.

Test 2: Red 6K 6144×3160 23.98fps R3D — a 10-minute color-graded sequence exported as ProRes HQ matching the source file’s specs. Premiere > Media Encoder = one hour, 55 minutes. FCPX = one hour, 58 minutes. Media Composer = two hours, 34 minutes.

It’s important to note that the GPU definitely helped out in both Adobe Premiere and FCPX: little to no extra time was added to the ProRes HQ export. I was really excited to see this, because without a good GPU, resizing and GPU-accelerated effects like color correction will slow your system to a snail’s pace if they don’t crash it outright. Media Composer, surprisingly, sped up its export when I added the color grade as a new color layer in the timeline. By adding the color correction on another layer, Avid might have forced the Radeon to kick in and help push the file out. Not really sure what that’s about, to be honest.

Test 3: Red 6K 6144×3160 23.98fps R3D — a 10-minute color-graded sequence resized to 1920×1080 on export as ProRes HQ. Premiere > Media Encoder = one hour, 16 minutes. FCPX = one hour, 14 minutes. Media Composer = one hour, 48 minutes. Resolve = one hour, 16 minutes.

After these tests, it seems that exporting and transcoding speeds are all about the same. It doesn’t come as too big of a surprise that all the NLEs, except for Media Composer, processed the Red file in the same amount of time. Regardless of the NLE, you would need to knock the debayering down to half resolution or lower to start playing these clips back in realtime in a timeline. If you have the time to transcode to ProRes, you will get much better playback and rendering speeds. Obviously, transcoding all of your files to a codec like ProRes or Avid DNx takes way more time up front, but it could be worth it if you’re crunched for time on the back end.

In addition to Red 6K files, I also tested ProRes HQ 4K files inside of Premiere and FCPX, and both played them extremely smoothly without hiccups, which is pretty amazing. Just a few years ago I was having trouble playing back 10:1 compressed files in Media Composer, and now I can play back superb-quality 4K files without a problem — a tremendous tip of the hat to technology and, specifically, to Apple for putting so much power in a thin and light package.

While I was in the mood to test speeds, I hooked up a Thunderbolt 2 SSD RAID (OWC Thunderbay 4 mini) configured in RAID-0 to see what kind of read/write bandwidth I would get running through the Apple Thunderbolt 3 to Thunderbolt 2 adapter. I used both AJA System Test as well as the Blackmagic Disk Speed Test. The AJA test reported a write speed of 929MB/sec. and read speed of 1120MB/sec. The Blackmagic test reported a write speed of 683.1MB/sec. and 704.7MB/sec. from different tests and a read speed of 1023.3MB/sec. I set the test file for both at 4GB. These speeds are faster than what I have previously found when testing this same Thunderbolt 2 SSD RAID on other systems.

For comparison, the AJA test reported a write speed of 1921MB/sec. and read speed of 2134MB/sec. when running on the system drive. The Blackmagic test doesn’t allow for testing on the system drive.

What Else You Need to Know
So what about the other upgrades and improvements? When exporting the R3D files, I noticed the fan kicked on when resizing or adding color grades to the files. It seems the GPU kicked in and heated up, which is to be expected. The fan is not the loudest, but it is noticeable.

The battery life on the new MacBook Pro is great when just playing music, surfing the web or writing product reviews. I found that the battery lasted about two days without having to plug in the power adapter. However, when exporting QuickTimes from either Premiere or FCPX the battery life dropped — a lot. I was getting a battery life of one hour and six minutes, which is not good when your export will take two hours. Obviously, you need to plug in when doing heavy work; you don’t really have an option.

This leads me to the new USB-C/Thunderbolt 3 ports — and, yes, you still have a headphone jack (thank goodness they didn’t talk with the iPhone developers). First off, I always thought the MagSafe power adapter should have won a Nobel Peace Prize. I love it. It must be responsible for saving millions of dollars in equipment when people trip over power cords, gracefully disconnecting without breaking anything or pulling your laptop off the table. However, I am disappointed Apple didn’t create a new type of MagSafe cable for the USB-C port. I will miss it a lot. The good news is you can now plug your power adapter into either side of the MacBook Pro.

Adapters and dongles will have to be purchased if you pick up a new MacBook Pro. Each time I wanted to use an external peripheral or memory card (an SD card, the Tangent Ripple color correction panel, an external hard drive), I was disappointed that I couldn’t just plug it in. A good Thunderbolt 3 dock is a necessity in my opinion. You could survive with dongles, but my OCD starts flaring up when I have to dig around my backpack for adapters. I’m just not a fan. I love how Apple dedicated itself to a fast I/O like USB-C/Thunderbolt 3, but I really wish it had given the transition another year. Just one old-school USB port would have been nice. I might have even gotten over no SD card reader.

The Touch Bar
I like it. I would even say that I love it — in the apps that are compatible, and right now there aren’t many. Adobe released an update to Photoshop that added Touch Bar compatibility, and it is really handy, especially when you don’t have your Wacom tablet available (or a USB dongle to attach it). I love how it puts so many levels of functionality for your tools within immediate reach.

It has super-fast feedback. When I adjusted the contrast on the Touch Bar, the MacBook Pro responded immediately. This becomes even more evident in FCPX and the latest Resolve 12.5.4 update. It’s clear Apple did its homework and made its own apps like Mail and Messages work with the Touch Bar (hence emojis on the Touch Bar). FCPX has a sweet ability to scrub the timeline, zoom in to the timeline, adjust text and more from just the Touch Bar — it’s very handy, and after a while I began missing it when using other computers.

In Blackmagic’s latest DaVinci Resolve release, 12.5.4, they have added Touch Bar compatibility. If you can’t plug in your color correction panels, the Touch Bar does a nice job of easing the pain. You can do anything from contrast work to saturation, even adjusting the midtones and printer lights, all from the Touch Bar. If you use external input devices a lot, like Wacom tablets or color correction panels, the Touch Bar will be right up your alley.

One thing I found missing was a simple application launcher on the Touch Bar. If you do pick up the new MacBook Pro with Touch Bar, you might want to download Touch Switcher, a free app I found via 9to5mac.com that allows you to have an app launcher on your Touch Bar. You can hide the dock, allowing you more screen real estate and the efficient use of the Touch Bar to launch apps. I am kind of surprised Apple didn’t make something like this standard.

The Display
From a purely superficial and non-scientific point of view, the newly updated P3-compatible wide-gamut display looks great… really great, actually. The colors are rich and vibrant. I did a little digging under the hood and noticed that it is an 8-bit display (you can find this by locating the pixel depth under System Information > Graphics/Displays), which might limit color gradations when working in a wide color space like P3, compared to a 10-bit display working in that same space. Simply put, you have a wider range of colors in P3 but fewer shades to fill it with.

The MacBook Pro display is labeled as 32-bit color, meaning the red, green, blue and alpha channels each get 8 bits, for a total of 32 bits. Eight-bit color gives 256 shades per color channel, while 10-bit gives 1,024 shades per channel, allowing for much smoother transitions between colors and luminance values (imagine a sky at dusk going smoothly from orange to light blue to dark blue — more shades per channel means a smoother gradient between lights and darks). A 10-bit display would be 30-bit color, with each channel getting 10 bits.
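The difference is easy to quantify: shades per channel double with each extra bit, and the total color count is the per-channel count cubed.

```python
# Total displayable colors = (shades per channel) ^ 3 for R, G and B.
for bits in (8, 10):
    shades = 2 ** bits
    print(f"{bits}-bit: {shades:,} shades/channel, {shades ** 3:,} colors")
# 8-bit: 256 shades/channel, 16,777,216 colors
# 10-bit: 1,024 shades/channel, 1,073,741,824 colors
```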

I tried to hook up a 10-bit display, but the Thunderbolt 3 to Thunderbolt 2 dongle Apple sent me did not work with the display’s Mini DisplayPort connection. I did a little digging, and it seems people are generally not happy that Apple doesn’t allow this to work, especially since Thunderbolt 2 and Mini DisplayPort use the same connector. Some people have been able to get around this by daisy-chaining the display through something like a Thunderbolt 2 RAID.

While I couldn’t directly test an external display when I had the MacBook Pro, I’ve read that people have been able to push 10-bit color out of the USB-C/Thunderbolt 3 ports to an external monitor. So as long as you are at a desk with a monitor you can most likely have 10-bit color output from this system.

I reached out to Apple on the types of adapters they recommend for an external display and they suggest a USB-C to DisplayPort adapter made by Aukey. It retails for $9.99. They also recommend the USB-C to DisplayPort cable from StarTech, which retails for $39.99. Make sure you read the reviews on Amazon because the experience people have with this varies wildly. I was not able to test either of these so I cannot give my personal opinion.

Summing Up
In the end, the new MacBook Pro is awesome. If you own a recent release of the MacBook Pro and don’t have $3,500 to spare, I don’t know if this is the update you are looking for. If you are trying to avoid moving to a Windows-based PC because of the lack of Mac Pro updates, this may ease the pain slightly. Without more than 16GB of memory and an Intel Xeon or two, however, it might actually slow you down.

The battery life is great when doing light work; it’s one of the longest-lasting batteries I’ve used on a laptop. But when doing heavy work, you need to be near an outlet. And when you’re plugged into that outlet, be careful no one yanks the USB-C power cable, as it might throw your MacBook Pro to the ground or break off inside.

I really do love Apple products. They typically just work. I didn’t even touch on the new Touch ID Sensor that can immediately switch you to a different profile or log you in after waking up the MacBook Pro from sleep. I love that you can turn the new MacBook Pro on and it simply works, and works fast.

The latest iteration of FCPX is awesome as well, and just because I don’t see it being used a lot professionally doesn’t mean it shouldn’t be. It’s a well-built NLE that deserves a fairer shake than it has been given. If you are itching for an update to an old MacBook Pro and don’t mind having a dock or carrying around a bunch of dongles, then the 2016 MacBook Pro with the Touch Bar is for you.

The new MacBook Pro chews through ProRes-based media at 1920×1080 and up to 4K; 6K and higher will play but might slow down. If you work with Red footage, this new MacBook Pro works great, but you still might have to knock the debayering down a couple of notches.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: The HP Z1G3 All-in-One workstation

By Brady Betzel

I’ll admit it. I’ve always been impressed with HP’s workstations — from the Z840 to the ZBook mobile workstation and now the HP Z1G3. Yes, I know, the HP line of workstations is not cheap. In fact, you can save quite a bit of money building your own system, but you will probably have tons of headaches unless you are very confident in your computer-building skills. And that’s if you don’t mind standing in the return line at Fry’s Electronics.

HP spends tons of time and money on ISV certifications for its workstations. ISV certification stands for Independent Software Vendor certification. In plain English, it means HP makes sure the hardware inside your workstation works with the software you use. For an industry pro, that means apps like Adobe’s Premiere Pro and After Effects, Avid Media Composer, Autodesk products like 3ds Max and many others.

For this review, I tested apps like Avid Media Composer, FilmLight’s Baselight for Media Composer color correction plug-in, Adobe Premiere Pro, Adobe Media Encoder and Adobe After Effects, as well as Blackmagic’s Resolve 12.5.2, which chewed through basic color correction. In terms of testing time, I typically keep a review system for a couple of months, but I really wanted to test this workstation as thoroughly as possible — I’ve had it for three months and counting, and I’ve been running it through all the appropriate paces.

I always love to review workstations like the HP Z1G3 because of the raw power they possess. While HP sent me one of the top-of-the-line Z1G3 configurations, which retails for a list price of $3,486, the line has a pretty reasonable starting price of $1,349. From Intel i3, i5 and i7 configurations all the way up to the almighty Intel Xeon, the HP Z1G3 can be customized to fit your workflow, whether you just need to check your email or color correct video from your GoPro.

Here are the specs that make up the HP Z1G3 All-in-One workstation I received:

● 23.6-inch UHD/4K non-glare and non-touch display (3840×2160)
● Intel Xeon E3-1270 v5 CPU, 3.6GHz (4 Cores / 8 Threads)
● 64GB DDR4 SODIMM 2133MHz (4 x 16GB)
● Nvidia Quadro M2000M graphics (4GB)
● Two Z Turbo drives (512GB, PCIe M.2)
● Wireless keyboard and mouse
● Two Thunderbolt 3/USB 3.1 ports
● USB charging port
● Media card reader
● DisplayPort out

As I mentioned earlier, I tested the Z1G3 with many different apps, but recently I’ve been diving deeper into color correction, and luckily for my testing this fits right in. A few of the most strenuous real-world tests for computer systems are running 3D modeling apps like Maxon Cinema 4D and color correction suites like Resolve. Of course, apps like After Effects are great tests as well, but stacking nodes on nodes on nodes in Resolve will really tax your CPU as well as your GPU.

One thing that can really set apart high-end systems like the Z1G3 is how little delay there is when using a precision color correction panel like Tangent’s Element or Ripple. On some systems you will move one of the color wheel balls and the on-screen color wheel moves half a second later. I tried adding a few clips and nodes to the timeline, and when using the panels I noticed no discernible delay (at least none beyond what I would expect). While this isn’t a scientific test, it is crucial for folks looking to plug in external devices.

For more scientific tests I stuck to apps like Cinebench from Maxon, AJA’s System Test and Blackmagic’s Disk Speed Test. In Cinebench, the Z1G3 ranked at the top of the list when compared to similar systems. In AJA’s System Test, I tested the read/write speed of the non-OS drive (basically the editing or cache drive). It sustained around 1520MB/s read and 1490MB/s write. I say “around” because I couldn’t get the AJA app to display the full read/write numbers due to the high-resolution scaling in Windows; I tried scaling down to 1920×1080, but no luck. In Blackmagic’s Disk Speed Test, I was running at 1560MB/s read and 1497.3MB/s write. The drive I ran this test on is HP’s version of the M.2 PCIe SSD powered by Samsung, known more affectionately by HP as a Z Turbo drive. The only thing better at the moment would be a bunch of these drives arranged in a RAID-0 configuration. Luckily, you can do that through the Thunderbolt 3 port with some spare SSDs you have lying around.

Almost daily I ran Premiere Pro CC, Media Encoder and Resolve Studio 12.5.2. I was really happy with the performance in Premiere. When working with QuickTimes in inter-frame codecs like H.264 and AVCHD (non-edit-friendly codecs), I was able to work without too much stuttering in the timeline. When I used intra-frame codecs like ProRes HQ from Blackmagic’s Pocket Cinema Camera, Premiere worked great. I even jumped into Adobe’s Lumetri color tools while using Tangent’s Ripple external color correction panel, and it worked with little discernible delay. I did notice that Premiere had a little more delay with the external color correction panel than Media Composer and Resolve did, but that seemed to be more of a software problem than a workstation problem.

One of my favorite parts about using a system with an Nvidia graphics card, especially a Quadro card like the M2000M, is the ability to encode multiple versions of a file at once. Once I was done editing some timelapses in Premiere, I exported using Media Encoder. I would apply three presets I made: one square 600×600 H.264 for Instagram, one 3840×2160 H.264 for YouTube and an Animated GIF at 480×360 for Twitter. Once I told Media Encoder to encode, it ran all three exports concurrently — a really awesome feature. With the Nvidia Quadro card installed, it really sped along the export.

Media Composer
Another app I wanted to test was Media Composer 8.6.3. Overall, Media Composer ran great except for issues with the high-resolution display. As I’ve said in previous reviews, this isn’t really the fault of HP, but more of the software manufacturers who haven’t updated their interfaces for the latest UHD displays. I had filmed a little hike I took with my five-year-old. I gave him a GoPro while I had my own. Once we got the footage back home, I imported it into Media Composer, grouped the footage and edited it using the multi-cam edit workflow.

Simply put, the multi-camera split was on the left while the clip in the sequence played simultaneously on the right. Before I grouped the footage into a multi-group, I transcoded the H.264s into DNxHD 175, an intra-frame, edit-friendly codec. The transcode was nearly realtime, so it took 60 minutes to transcode a 60-minute H.264 — which is not bad. In the end, I was able to edit the two-camera multi-group at 1920×1080 resolution with only minor hiccups. Occasionally, I would get caught in fast-forward for a few extra seconds when J-K-L editing, but nothing that made me want to throw my keyboard or mouse against the wall.

Once done editing, I installed the FilmLight color correction plug-in for Media Composer. I had a really awesome experience coloring using Baselight in Media Composer on the Z1G3. I didn’t have any slowdowns, and the relationship between using the color correction panel and Baselight was smooth.

Resolve
The last app I tested with HP’s Z1G3 All-in-One workstation was Blackmagic’s Resolve 12.5.2. Much like my other tests, I concentrated on color correction with the Tangent Ripple and the Element-Vs iOS app. I had four or five nodes going in the color page before I started to see a slowdown. I was using the native H.264 and ProRes HQ files from the cameras, so I didn’t make it easy for Resolve, but it still worked. Once I added a little sharpening to my clips, the HP Z1G3 really had to kick into gear: I heard the faint hum of fans, which up until this point hadn’t kicked in. This is also where the system started to become sluggish.

Summing Up
The Z1G3 is one of my favorite workstations, period. A while ago, I reviewed HP’s previous All-in-One workstation, the Z1G2, and at the time it was my favorite. One of my few complaints was that, while it was easy to service, it was very heavy and bulky. When I opened the Z1G3 box, I immediately noticed how much lighter and more streamlined the design was. It almost felt like they took away 50 percent of the bulk, which is something I really appreciate. I can tell that one of the main focuses with the Z1G3 was minimizing footprint and weight while increasing power. HP really knocked it out of the park.

One of the only things I wish were different on the Z1G3 I tested is the graphics card. While the Nvidia Quadro M2000M is a great graphics card, it is a “mobile” version of a Quadro, with 128 fewer CUDA cores and 26GB/s less memory bandwidth than its desktop equivalent, the M2000. I would love the option of a full-sized Quadro instead of the mobile version, but I also understand the power consumption would go up, as would the form factor, so maybe I give HP a pass here.

In the end, I know everyone reading this review is saying to themselves, “I love my iMac, so why would I want the HP Z1G3?” If you are a die-hard Apple user, or you just saw the new Microsoft Surface Studio announcement, then it might be a hard sell, but I love both Windows- and Mac OS-based systems, and the Z1G3 is awesome. What’s even more awesome is that it is easily upgradeable. I took off the back cover, and with a simple switch I could have added a 2.5-inch hard drive or two in under a minute. If you are looking for a powerful new workstation, one that stands up to apps like Resolve and Premiere Pro CC, the HP Z1G3 is for you.


Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Red Giant’s Universe 2

By Brady Betzel

Throughout 2016, we have seen some interesting acquisitions in the world of post production software and hardware — Razer bought THX, Blackmagic bought Ultimatte and Fairlight and Boris FX bought GenArts, to name a few. We’ve also seen a tremendous consolidation of jobs. Editors are now being tasked as final audio mixers, final motion graphics creators, final colorists and much more.

Personally, I love doing more than just editing, so knowing tools like Adobe After Effects and DaVinci Resolve, in addition to Avid Media Composer, has really helped me become not only an editor but someone who can jump into After Effects or Resolve and do good work.

Unfortunately, for some people it is the nature of the post beast to know everything. Plug-ins play a gigantic part in balancing my workload, available time and the quality of the final product. If I didn’t have plug-ins like Imagineer’s Mocha Pro, Boris’s Continuum Complete, GenArts’ Sapphire and Red Giant’s Universe 2, I would be forced to turn down work because the time it would take to create a finished piece would outweigh the fee I could charge a client.

A while back, I reviewed Red Giant’s Universe when it was in version 1 (check it out here). In the beginning, Universe allowed for lifetime, annual and free memberships. It seems the belt has tightened a little at Red Giant, as Universe 2 is now $99 a year or $20 a month, with a 14-day free trial. No permanent free version or lifetime memberships are offered (if you downloaded the free Universe before June 28, you will still be able to access those free plug-ins in the Legacy group). Moreover, they have doubled the monthly fee from $10 to $20 — definitely trying to get everyone onto the annual subscription train.

Personally, I think the earlier model suffered from too much focus on the broad Universe: trying to jam in as many plug-ins, transitions and effects as possible instead of working on specific plug-ins within Universe. I actually like Red Giant’s renewed focus on a richer toolset as opposed to a fuller one.

Digging In
Okay, enough of my anecdotal narrative and on to some technical awesomeness. Red Giant’s Universe 2 is a vast plug-in collection that is compatible with Adobe’s Premiere Pro and After Effects CS6 through CC 2015.3; Apple Final Cut Pro X 10.0.9 and later; Apple Motion 5.0.7 and later; Vegas 12 and 13; DaVinci Resolve 11.1 and later; and HitFilm 3 and 4 Pro. You must have a compatible GPU installed, as Universe does not have a CPU fallback for unsupported machines. Basically, you need a GPU with 2GB of memory or more, and don’t forget about Intel, as its graphics support has improved a lot lately. For more info on OS compatibility and specific GPU requirements, check out Red Giant’s compatibility page.

Universe 2 is loaded with great plug-ins that, once you dig in, you will want to use all the time. For instance, I really like the ease of use of Universe’s RGB Separation and Chromatic Glow. If you want a full rundown of each and every effect, you should download the Universe 2 trial and check it out. In this review I am only going to go over some of the newly added plug-ins — HUD Components, Line, Logo Motion and Color Stripe — but remember, there are a ton more.

I will be bouncing around different apps like Premiere Pro and After Effects, but initially I wanted to see how well Universe 2 worked inside Blackmagic’s DaVinci Resolve 12.5.2. Resolve gave me a little trouble at first: it crashed once I clicked on OpenFX in the Color page. I rebooted completely and got an error message saying OpenFX had been disabled. I did a little research (and by research I mean I typed “Disabled OpenFX Resolve” into Google) and stumbled on a post on Blackmagic’s forum that suggested deleting “C:\ProgramData\Blackmagic Design\Davinci Resolve\Support\OFXPluginCache.xml” might fix it. Once I deleted that file and relaunched Resolve, I clicked on the OpenFX tab in the Color page, waited 10 minutes, and it started working. From that point on it loaded fast. So, barring the Resolve installation hiccup, there were no problems installing in Premiere and After Effects.
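If you hit the same disabled-OpenFX error, the fix boils down to deleting that one cache file so Resolve rescans its OpenFX plug-ins on the next launch. Here is a minimal sketch; the Windows path is the one from the forum post, while the macOS location is my assumption, so verify it on your system before deleting anything:

```python
from pathlib import Path

# Candidate OFX plugin cache locations. Windows path per the forum post;
# the macOS path is an assumption -- check it before running.
candidates = [
    Path(r"C:\ProgramData\Blackmagic Design\DaVinci Resolve\Support\OFXPluginCache.xml"),
    Path.home() / "Library/Application Support/Blackmagic Design/DaVinci Resolve/OFXPluginCache.xml",
]

for cache in candidates:
    if cache.exists():
        cache.unlink()  # Resolve rebuilds the cache on its next launch
        print(f"Deleted {cache}")
```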

Once installed, you will notice that Universe adds a few folders to your plug-ins dropdown: Universe Blur, Universe Distort, Universe Generators, Universe Glow, Universe Legacy, Universe Motion Graphics, Universe Stylize and Universe Utilities. You may recognize some of these if you have used an earlier version of Universe, but something you will not recognize is that each Universe plug-in now has a “uni.” prefix.

I am still not sure whether I like or hate this. On one hand, it’s easy to search for if you know exactly what you want in apps like Premiere. On the other hand, it runs counter to what I am used to as a grouchy old editor. In the end, I decided to run my tests in After Effects and Premiere. Resolve is great, but for tracking a HUD in 3D space I was more comfortable in After Effects.

HUD Components
First up is HUD Components, located under the Universe Motion Graphics folder and labeled “uni.HUD Components.” What used to take many Video Copilot tutorials and many inspirational views of HUD/UI master Jayse Hansen’s (@jayse_) work now takes me minutes thanks to the new HUD Components. Obviously, making anything on the level of a master like Jayse Hansen will take hundreds of hours and thousands of attempts, but still — with Red Giant’s HUD Components you can make those sci-fi in-helmet elements quickly.

When you apply HUD Components to a solid layer in After Effects, you can immediately see the start of your HUD. To see what the composite over my footage would look like, I changed the blend mode to Add, which is listed under Composite Settings. From there, you can check out some awesome pre-built looks under the Choose a Preset button. The pre-built elements are all good starting points, but I would definitely dive further into customizing; maybe layer multiple HUDs over each other with different blend modes, for example.

Diving further into HUD Components, there are four separate Elements that you can customize, each with different images, animations, colors, clone types and much more. One thing to remember about the transformation settings is that the order of operations works from the top down. For instance, if you change the rotation on Element one, it will affect each element under it, which is kind of handy if you ask me. Once you get the hang of how HUD Components works, it is really easy to make some unique UI components. I really like to use the uni.Point Zoom effect (listed under Universe Glow in Effects & Presets); it gives your HUD component a sort of projector-like effect.

There are so many ways to use and apply HUD Components in everyday work, from building dynamic lower thirds with all of the animatable arcs, clones and rotations, to building sci-fi elements, applying Holomatrix and even Glitch to create awesome motion graphics elements with multiple levels of detail and color. I did try using HUD Components in Resolve when tracking a 3D object, but I couldn’t quite get the look I wanted, so I ditched it and used After Effects.

Line
Second up is the Line plug-in. While drawing lines along a path in After Effects isn’t necessarily hard, it is kind of annoying — think having to make custom map graphics to and from different places daily. Line takes the hard work out of making line effects between different points. This plug-in also carries the uni. prefix and is located under Universe Motion Graphics, labeled uni.Line.

This plug-in is very simple to use and animate. I quickly found a map, applied uni.Line, placed my beginning and end points, animated the line using two keyframes under Draw On and bam! I had an instant travel-vlog-style graphic that showed me going from California to Australia in under three minutes (yes, I know three minutes seems a little fast to travel to Australia, but that’s really how long it took, render and all). Under the Effect Controls you can find preset looks, beginning and ending shape options like circles or arrows, line types, segmented lines and curve types. You can even move the peak of the curve using the bezier style option.

Logo Motion
Third is Logo Motion, located under Universe Motion Graphics and titled uni.LogoMotion. In a nutshell, you take a pre-built logo (or anything, for that matter), pre-compose it, throw the uni.LogoMotion effect on top, apply a preset reveal, tweak your logo animation, dynamically adjust the length of your pre-comp — which directly affects how the logo wipes on and off — and, finally, render.

This is another plug-in that makes my life as an editor who dabbles in motion graphics really easy. Red Giant even included some lower third animation presets that help create dynamic lower third movements. You can select from some of the pre-built looks, add some motion while the logo is “idle,” adjust things like rotation, opacity and blur under the start and end properties, and even add motion blur. The new preset browser in Universe 2 really helps with plug-ins like Logo Motion where you can audition animations easily before applying them. You can quickly add some life to any logo or object with one or two clicks; if you want to get detailed you can dial in the idle animation and/or transition settings.

Color Stripe
Fourth is Color Stripe, a transition that uses color layers to wipe across and reveal another layer. This one is a pretty niche use case, but it’s still worth mentioning. In After Effects, transitions are generally a little cumbersome, and I found the Universe 2 transitions infinitely easier to use in NLEs like Adobe Premiere. From the always-popular swish pan to exposure blur, there are some transitions you might use once and some you might use a bunch. Color Stripe is one you probably won’t want to use too often, but when you do need it, it will be right at your fingertips. You can choose from different color schemes, like analogous or tetradic, or even create a custom scheme to match your project.
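Terms like “analogous” and “tetradic” just describe fixed hue offsets on the color wheel. Here is a little sketch of the idea (my own illustration, not Red Giant’s implementation):

```python
import colorsys

# Build a scheme by offsetting a base hue around the color wheel.
def scheme(base_hue_deg, offsets, sat=0.8, val=0.9):
    swatches = []
    for off in offsets:
        h = ((base_hue_deg + off) % 360) / 360.0
        r, g, b = colorsys.hsv_to_rgb(h, sat, val)
        swatches.append("#%02x%02x%02x" % (round(r * 255), round(g * 255), round(b * 255)))
    return swatches

print(scheme(210, (-30, 0, 30)))       # analogous: neighbors on the wheel
print(scheme(210, (0, 90, 180, 270)))  # tetradic: two complementary pairs
```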

In the end, Universe 2 has some effects that become essential once you begin using them, like uni.Unmult, uni.RGB Separation and the awesome uni.Chromatic Glow. The new ones are great, too; I really like the ease of use of uni.HUD Components. Since these effects are GPU-accelerated, you might be surprised at how fast and fluidly they work in your project without slowdowns. For anyone who likes apps like After Effects but can’t afford to spend hours dialing in the perfect UI and HUD, Universe 2 is perfect. Check out all of the latest Red Giant Universe 2 tools here.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Microsoft Surface Pro 4 running Resolve 12.5

By Brady Betzel

Not long ago, I was asked if I wanted to check out Blackmagic’s DaVinci Resolve 12.5 on a Microsoft Surface Pro 4. I was dubious, and wondered, “Do they really think I can edit, color correct, and deliver footage on a tablet?”

I was incredulous. I really thought this seemed like a pipe dream for Microsoft and Blackmagic. Everyone who works in post knows that you need a pretty monstrous workstation to play, let alone edit, media. Especially media with resolutions over 1920×1080 and 10-bit color! Well, let’s see how all of that played out.

Thankfully, I received the higher-end version of the Microsoft Surface Pro 4 tablet. Under the hood it was packing a dual-core 2.2GHz Intel i7-6650U CPU, 8GB of RAM, an NVMe Samsung MZFLV256 (256GB SSD) and an Intel Iris Graphics 540 GPU. The display sports a beautiful 3:2 aspect ratio at 2736×1824 resolution; not quite the UHD 16:9/1.78:1 or true 4K 1.9:1 aspect ratio that would be comfortable when working in video, but it’s not bad. Keep in mind that when working with high-resolution displays like an Apple Retina 5K or this Surface Pro 4, some apps will be hard to read even with the scaling bumped up.

Resolve looks great, but words and icons might be a bit smaller than what you are used to seeing. The Surface Pro 4 weighs an incredibly light 1.73 pounds, measures 11.5 x 7.93 x 0.33 inches and has the best stand I’ve ever used on a tablet. Terrible tablet stands are a big pet peeve of mine, but the Surface sports a great one. I am on the go a lot, so I need a sturdy stand that, preferably, is attached. The Surface has the stand every other tablet manufacturer should copy.

I use Wacom products, so I am used to working with a great stylus; therefore, I didn’t have high expectations for the pen included with the Surface Pro. Boy, was I wrong! I was happily surprised at how nice it is. While it doesn’t have the 2,048 levels of pressure sensitivity present in the Wacom products you might be used to, it does have 1,024, with great palm rejection. The weight of the pen is great — like really great — and it mounts on the side of the Surface with a strong magnet.

Aside from the mouse and stylus, the Surface Pro 4 has a 10-point touchscreen, but I didn’t use it very much. I found myself defaulting to the stylus when I wanted to interact directly with the screen, like in Photoshop or when adjusting curves inside Resolve. Last, but not least, is the tremendous battery life. I was constantly running Resolve while playing music from Spotify and Pandora, and the battery would last me most of the day. Once I got into heavy grading and pumped up the brightness, the battery life dropped to between two and four hours, which I think is still great.

Resolve
OK, enough gushing about the Surface hardware; on to the real test: Blackmagic’s DaVinci Resolve 12.5 running on a tablet!

Right off the bat (and as you’ve probably already surmised), I’m going to tell you that the Surface Pro 4 is not going to stand up to a powerhouse like the HP Z840 with 64GB of RAM and an Nvidia Quadro M6000. But what I found is that the Surface Pro 4 excels at proxy-based workflows and simple color matching.

You won’t be able to play 4K clips all that cleanly, but the Surface Pro 4 and Resolve will let you color correct, grade, add a few nodes for things like a vignette or a qualifier, and even export your grade. If you want to use the Surface Pro appropriately, though, a nice simple color balance will run great.

Essentially, the Surface Pro is a great way to travel and grade your footage, thanks to Intel’s pretty amazing Iris graphics technology. You should really check out Intel’s backstory on how one of its engineers went to NAB 2015, talked with the Blackmagic crew and figured out what he needed to do to get Intel GPUs working with Resolve. Check this out. Regardless of whether there is hyperbole in that video, it is very true that almost anybody can run Resolve, whether on a Surface or an Intel-powered desktop.

Oh, and don’t forget that for many people, the free version of Resolve will be all they need. Resolve is an amazing nonlinear editor and professional-level color correction software available at anyone’s fingertips for free. This is a fact that cannot be overstated.

Testing
To test the Surface Pro 4, I found some Red 5K footage, scaled it down to 1920×1080 in a 1920×1080 23.976 project, did a simple edit, colored it and exported a final QuickTime. When I had the debayer set all the way to full resolution, my Surface started to crawl (crawl is the polite term; it was more like a melt). This is why I suggest the proxy workflow. When I played back at 1/4 debayer, and even more so at 1/8, I was actually able to work, running around 10 to 12 frames per second. While I know 12fps isn’t the best playback for a 23.976 5K clip at 1920×1080 resolution, it let me do my job while on the go. I like to call it the “Starbucks Test.” If I need more than that, I definitely should be at home using an HP Z840 or the DIY custom-built 4K workhorse I am looking to build.
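
To put those debayer settings in perspective, decode work scales roughly with the number of pixels being debayered, so dropping to 1/4 or 1/8 resolution cuts the load dramatically. Here’s a quick back-of-the-envelope sketch in Python (the 5120×2700 frame size is an assumed 5K Red dimension, and real decode cost doesn’t scale perfectly linearly):

    # Debayer work scales roughly with pixel count: 1/4-res touches
    # 1/16 of the pixels, 1/8-res touches only 1/64.
    full_w, full_h = 5120, 2700  # assumed 5K Red frame size

    for divisor in (1, 4, 8):
        w, h = full_w // divisor, full_h // divisor
        print(f"1/{divisor} debayer: {w}x{h} = {w * h / 1e6:.2f} megapixels/frame")

At 1/8 debayer the Surface is chewing on roughly 0.22 megapixels per frame instead of nearly 14, which is why 10 to 12fps suddenly becomes possible.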

If you really want to get the Surface to sing in Resolve 12.5, you should stick to footage at 1920×1080 resolution or smaller. With a couple of serial nodes I was able to consistently get 15fps playback. Yeah, I know this isn’t ideal, but if I’m on the run and can’t use a workstation with dual Nvidia Titan or GTX 1080 GPUs, 64GB of RAM and footage running off a Thunderbolt 3 external SSD RAID (a setup that would cost north of $5K), the Microsoft Surface Pro 4 is a great alternative.

Something that is tough to deal with on the Surface is the small text and icon size in Windows 10. There may be a way to fix it with registry key hacks, but I don’t want to go down that road; I want to set it and forget it. I tried bumping up the icon/text zoom within Resolve and messing around with the scaling settings in the Windows Control Panel, with no luck.

There has to be a way this can be fixed, right? If you know of a true fix, let me know on Twitter @allbetzroff. I would really love to know.
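
For what it’s worth, the workaround that was floating around at the time is exactly that kind of registry hack: tell Windows to prefer external manifest files, then drop a manifest next to the app’s executable declaring it non-DPI-aware so Windows scales the whole UI up (bigger, but softer). Here’s a sketch of the registry half in Python; this is the widely shared PreferExternalManifest trick, not an official fix, and it needs an elevated process, so proceed at your own risk:

    import winreg

    # Ask Windows to honor external .exe.manifest files placed next to
    # an executable (a widely shared high-DPI workaround; needs admin).
    key = winreg.CreateKeyEx(
        winreg.HKEY_LOCAL_MACHINE,
        r"SOFTWARE\Microsoft\Windows\CurrentVersion\SideBySide",
        0, winreg.KEY_SET_VALUE)
    winreg.SetValueEx(key, "PreferExternalManifest", 0, winreg.REG_DWORD, 1)
    winreg.CloseKey(key)

The matching manifest would declare <dpiAware>false</dpiAware> next to Resolve’s executable. As I said, I’d rather not live in the registry, so consider this one for the adventurous.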

Another issue with using a tablet to color correct and grade is that you lose the elegance and fluidity that professional color correction panels provide. If you do color at any sort of professional level, you should have, at the very least, something like the Tangent Ripple or Element panels. Using a touchscreen, mouse and/or stylus to edit and color correct gets old fast on a tablet.

Using the Tangent Ripple, which is surprisingly portable, I felt the elegance I know and love when using Resolve with a panel. (I will be doing a Tangent Ripple review later with some more in-depth analysis.) I did love the ability to use the stylus to get in and fine-tune Power Windows and curves in Resolve, but you will definitely need some extra equipment if you find yourself doing more than a couple of adjustments, much like on any computer, not just the Surface.

Summing Up
In the end, the Microsoft Surface Pro 4 (my version goes for around $1,600) is an exceptional tablet. I love it. In addition to running Resolve 12.5, I also installed the Adobe suite of tools and did some editing in Premiere, effects in After Effects and transcoding in Media Encoder, and even round-tripped my sequence between Resolve and Premiere.

The Surface Pro 4 is a great “away-from-home” computer for running very high-end apps like Resolve 12.5 and Premiere Pro CC, and even apps like After Effects with hardcore plug-ins like Imagineer Systems’ Mocha Pro 5.

While the touchscreen and stylus are great for occasional use, you should plan on investing in something like the Tangent Ripple color panel if you will be doing a ton of coloring in Resolve or any other app; it’s even priced well at $350.

From the amazing battery life to the surprisingly snappy response of the Intel Iris 540 GPU inside pro video editing and color correction apps like Resolve, the Microsoft Surface Pro 4 is the Windows tablet you need in your mobile content-creation life.


Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Sam Daley on color grading HBO’s ‘Show Me a Hero’

By Ellen Wixted

David Simon’s newest and much-anticipated six-part series Show Me a Hero premiered on HBO in the US in mid-August. Like The Wire, which Simon created, Show Me a Hero explores race and community — this time through the lens of housing desegregation in late-‘80s Yonkers, New York. Co-written by Simon and journalist William F. Zorzi, the show was directed by Paul Haggis with Andrij Parekh as cinematographer, and produced by Simon, Haggis, Zorzi, Gail Mutrux and Simon’s long-time collaborator, Nina Noble. Technicolor PostWorks‘ Sam Daley served as the colorist. I caught up with him recently to talk about the show.

A self-described “film guy,” New York-based Daley has worked as a colorist on projects ranging from Martin Scorsese’s The Departed to Lena Dunham’s Girls, with commercial work rounding out his portfolio. When I asked Daley what stood out about his experience on Show Me a Hero, his answer was quick: “The work I did on the dailies paid off hugely when we got to finishing.” Originally brought onto the project as dailies colorist, Daley quickly saw his scope expand to include finishing, and his unusual workflow set the stage for high-impact results.

Sam Daley

Daley’s background positioned him perfectly for his role. After graduating from film school and working briefly in production, Daley worked in a film lab before moving into post production. Daley’s deep knowledge of photochemical processing, cameras and filters turned him into a resource for colorists he worked alongside and piqued his interest in the craft. He spent years paying his dues before eventually becoming known for his work as a colorist. “People tend to get pigeonholed, and I was known for my work on dailies,” Daley notes. “But ultimately the cinematographers I worked with insisted that I do both dailies and finishing, as Ed Lachman (cinematographer) did when we worked together on Mildred Pierce.”

The Look
Daley and Show Me a Hero’s cinematographer, Andrij Parekh, had collaborated on previous projects, and Parekh’s clear vision from the start set the stage for success. “Andrij came up with this beautiful color treatment and created a look book that included references to Giorgio de Chirico’s painted architecture, art deco artist Tamara de Lempicka’s highly stylized faces, and films from the 1970s, including The Conformist, The Insider, The Assassination of Richard Nixon and The Yards. Sometimes look books are aspirational, but Andrij’s footage delivered the look he wanted, and that gave me permission to be aggressive with the grade,” says Daley. “Because we’ve worked together before, I came in with an understanding of where he likes his images to be.”

Parekh shot the series using the Arri Alexa and Leica Summilux-C lenses. Since the show is set in the late ‘80s, a key goal for the production was to ground its look firmly in that era. Another was to give the series’ two worlds different visual treatments to underscore how separate they are: the cool, stark political realm and the warmer, brighter world of the housing projects. The team’s relatively simple test process validated the approach and introduced Daley to the Colorfront On-Set Dailies system, which proved to be a valuable addition to his pipeline.

“Colorfront is really robust for dailies, but primitive for finishing — it offers simple color controls that can be translated by other systems later. Using it for the first time reminded me of when I was training to be a colorist — when everything tactile was very new to me — and it dawned on me that to create a period look you don’t have to add a nostalgic tint or grain. With Colorfront I was able to create the kind of look that would have been around in the ’80s with simple primary grades, contrast, and saturation adjustments.”
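
“Simple color controls that can be translated by other systems,” in dailies terms, usually means something like the ASC CDL model: a slope/offset/power adjustment per channel plus a single saturation value. Here is a minimal sketch of that math in Python; the CDL formula itself is standard, but the numbers are invented for illustration, and this isn’t a claim about Colorfront’s internals:

    # ASC CDL-style primary grade: out = (in * slope + offset) ** power,
    # applied per channel on normalized 0-1 values.
    def cdl_channel(x, slope=1.0, offset=0.0, power=1.0):
        v = max(x * slope + offset, 0.0)  # clamp before the power function
        return v ** power

    # A gentle, period-flavored primary: slightly lifted blacks and
    # softened contrast. Values are made up for illustration.
    print(f"0.50 -> {cdl_channel(0.5, slope=0.95, offset=0.02, power=1.05):.3f}")

Because those three numbers per channel travel cleanly from dailies into any finishing system, a look built this way doesn’t get lost on the way to the final grade.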

“This is the crazy thing: by limiting my toolset I was able to get super creative and deliver a look that doesn’t feel at all modern. In a sense, the system handcuffed me — but Andrij wasn’t looking for a lot of razzle-dazzle. Using Colorfront enabled me to create the spine of an appropriate period style that makes the show look like it was created in the ‘80s. Everyone loved the way the dailies looked, and they were watching them for months. By the time we got to finishing, we had something that was 90% of the way there.”

Blackmagic’s DaVinci Resolve 11 was used for finishing, a process that was unusually straightforward because of the up-front work done on the dailies. “Because all shots were already matched, final grading was done scene by scene. We changed the tone of some scenes, but the biggest decision we made was to desaturate everything by an additional 7% to make the flesh tones less buzzy and to set the look more firmly in the period.”
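
For anyone curious what desaturating “by an additional 7%” means in math terms, the standard move is to pull each pixel 7% of the way toward its own luma. Here is a minimal sketch using Rec. 709 luma weights (Resolve’s internal saturation math isn’t published, so treat this as the generic version):

    # Desaturate by 7%: keep 93% of each channel's distance from gray,
    # where "gray" is the pixel's Rec. 709 luma.
    def desaturate(r, g, b, amount=0.07):
        luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
        keep = 1.0 - amount
        return tuple(luma + keep * (c - luma) for c in (r, g, b))

    print(desaturate(0.80, 0.55, 0.45))  # a warm flesh tone, a touch less "buzzy"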

Daley was enthusiastic about the production overall and about HBO’s role in setting a terrific stage for moving the art of TV forward. “HBO was awesome — and they always seem to provide the extra breathing space needed to do great work. This show in particular felt like a symphony, where everyone had the same goal.”

I asked Daley about his perspective on collaboration, and his answer was surprising. “‘The past is prologue.’ Everything you did in the past is preparation for what you’re doing now, and that includes relationships. Andrij and I had a high level of trust and confidence going into this project. I wasn’t nervous because I knew what he wanted, and he trusted that if I was pushing a look it was for a reason. We weren’t tentative, and as a result the project turned into a dream job that went smoothly from production through post.” He assures me this holds true for every client; you always have to give 110 percent. “The project I’m working on today is the most important project I’ve ever worked on.”

Daley’s advice for aspiring colorists? “Embrace technology. I was a film guy who resisted digital for a long time, but working on Tiny Furniture threw all of my preconceptions about digital out the window. The feature was shot using a Canon 7D because the budget was micro and the producer already owned the camera. The success of that movie made me stop being an old school film snob — now I look at new tech and think ‘bring it on.’”