
Point 360 grows team with senior colorist Charlie Tucker

Senior colorist Charlie Tucker has joined Burbank’s Point 360. He comes to the facility from Technicolor, and brings with him over 20 years of color grading experience.

The UK-born Tucker’s credits include TV shows such as The Vampire Diaries and The Originals on CW, Wet Hot American Summer and A Futile & Stupid Gesture on Netflix, as well as Amazon’s Lore. He also just completed YouTube Red’s show Cobra Kai. Tucker, who joined the company just last week, will be working on Blackmagic Resolve.

Now at Point 360, Tucker reteams with Jason Kavner, who took the helm as senior VP of episodic sales in 2017. Tucker also joins fellow senior colorist Aidan Stanford, whose recent credits include the Academy Award-winning feature Get Out and the film Happy Death Day. Stanford’s recent episodic work includes the FX series You’re the Worst and ABC’s Fresh Off the Boat.

When prodded to sum up his feelings regarding joining Point 360, Tucker said, “I am chuffed to bits to now be part of and call Point 360 my home. It is a bloody lovely facility that has a welcoming, collaborative feel, which is refreshing to find within this pressure cooker we call Hollywood. The team I am privileged to join is a brilliant, talented and very experienced group of industry professionals who truly enjoy what they do, and I know my clients will love my new coloring bay and the creative vibe that Point 360 has created.”

Director HaZ Dulull on his sci-fi offering The Beyond

By Randi Altman

Director Hasraf “HaZ” Dulull is no stranger to making movies. Before jumping into writing and directing short sci-fi films, he was a visual effects supervisor and producer. His short film resume includes Project Kronos, I.R.I.S. and Sync. Recently, his first feature film, The Beyond, was released by Gravitas Ventures.

When I first met HaZ a few years back, we were both at an Adobe event — on a canal boat in Amsterdam during IBC. We started talking about visual effects, the industry and his drive to make movies.

This Brit is friendly, intelligent and incredibly hands-on in all aspects of what he does. His latest is The Beyond, which he describes as “a cerebral science-fiction feature film that blends the realism of documentary with the fantastical, ‘big idea’ nature of the science-fiction films of today.” The Beyond tells the story of a ground-breaking mission that sent astronauts — modified with advanced robotics — through a newly discovered wormhole known as the Void. When the mission returns unexpectedly, the space agency races to discover what the astronauts encountered on their first-of-its-kind interstellar space journey.

HaZ on set

HaZ was so hands-on that he provided some of the film’s visual effects and edited the film himself. If you like what you see in the trailer, the film is available for purchase or rent on most digital platforms.

When I reached out to HaZ to talk about The Beyond, he was in Vancouver working on an eight-part TV series for Disney called Fast Layne. “I directed episodes 1 and 2, and am currently directing episodes 7 and 8,” he says. “The beauty of starting and ending the series is it allowed me to set the show’s style and tone.”

It seems he can’t sit still! Let’s find out more about how he works and The Beyond…

Can you talk about prepro? How much of that included visual effects prepro?
Most people who know me will say I’m obsessed with prep. I had about six months of hardcore prep on this, from doing little storyboards, known as HaZ-Grams, right through to previs of the key sequences.

But even during the script-writing stage (six months before actual prep), I was coming up with visuals to support the ideas I was writing in the script. Sometimes I would knock up a test VFX scene just to see how complex an idea would be to create. Prep worked hand in hand with the script development and the budgeting of the film. The film was self-financed, with additional financing coming in later (during post production), so I wanted to ensure everything was mapped out technically. There were no “fix it in post” scenarios in this film — I wouldn’t allow it.

During location scouting, I would have my iPhone with me and shoot a bunch of footage and still imagery, so when I went back home I could write those locations into the script to make them work with the scenarios depicted in the film.

As part of prep we actually shot a test scene to see if this mockumentary format would work to tell a grounded sci-fi story. It was also used to attract crew and cast to the project, as well as to get distributors primed early on.

Many shots from that test actually made it into the final movie — I wasn’t kidding about not wasting any budget or material on this production! So prep pretty much helped shape the script too, as I knew I wasn’t in the financial position to write stuff and then go and build it. I had to reverse engineer it in a way. In the film we have tons of locations, such as the Space Centre with actual real rockets. We also had a team in Iceland shooting alien landscapes, and we even shot some scenes in Malaysia to give the film a global feel. With each of those opportunities, the script was tweaked to make full use of the locations we had.

You shot with Blackmagic cameras. Was that your choice? The DP’s? Have you shot with these before?
From the start, I knew we were going to shoot on Blackmagic cameras. This was mainly down to the fact my DP Adam Batchelor — who had shot Sync with me and the proof of concept tests we did for this film — was a Blackmagic advocate and knew the cameras inside out, but more importantly he was able to get cinematic imagery using those cameras.

Blackmagic was very supportive of the film, and has been supportive of my career since my short films, so they came on as one of the executive producers. No one had ever shot a full feature film using just Blackmagic cameras, and we used a Resolve pipeline through to delivery, so The Beyond is the perfect case study for it.

Can you talk about that workflow? Any hiccups? 
I think the only hiccups were the fact we were using a beta version of Resolve 14, so there were the expected crashes, etc. That would usually be seen as risky on a feature film, but luckily we didn’t have a distributor in place with a release date, so the risk was minimal.

The good thing was I would generate an error log report from Resolve and send it over to Blackmagic, who would then instantly send out a new patch. So we were looked after rather than being left on our own to scream at the monitor.

We stuck with a ProRes 4444 QuickTime workflow for all material, from footage to VFX renders, and enabled proxies on the fly within Resolve. This was great, as it meant I was working with the highest-resolution imagery within Resolve, and it was fairly fast too. Things started to slow down when I had multiple layers of VFX and composites/groups, which I then had to render out as a new clip and bring back in.
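
For readers curious what that delivery step can look like when automated, here is a minimal sketch using DaVinci Resolve’s built-in Python scripting API. This is not HaZ’s actual pipeline: the output path and clip name are hypothetical, and codec identifier strings vary between Resolve versions.

```python
# Minimal sketch: queue a ProRes render via Resolve's Python scripting API.
# Run from Workspace > Console, or externally with Resolve open.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")                 # attach to the running app
project = resolve.GetProjectManager().GetCurrentProject()

# Format/codec strings vary by version; query them rather than guessing:
# project.GetRenderFormats(), project.GetRenderCodecs("mov")
project.SetCurrentRenderFormatAndCodec("mov", "ProRes4444")

project.SetRenderSettings({
    "TargetDir": "/Volumes/delivery/renders",      # hypothetical output path
    "CustomName": "the_beyond_reel",               # hypothetical clip name
})

job_id = project.AddRenderJob()                    # queue the current timeline
project.StartRendering(job_id)                     # kick off the render
```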

How did you and the DP develop the look you wanted? Any scenes stick out that you guys worked on?
I was very fortunate to get Max Horton, who had worked on films like Gravity, to come on board to grade this film at the Dolby Vision lab in London’s Soho. We also did an HDR version of the film, which I believe makes it the first indie film to get an HDR treatment.

We had three to four days of grading with Max, and I was in the room with him the whole time. This was possible because I had already done a first-pass temp grade myself while editing the film in the beta version of Resolve 14. That made the workflow as simple as exporting my Resolve project and handing the material over to Max, who would load up the project file, relink the material and work from there.
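
For those curious about the mechanics of a handoff like that, Resolve’s scripting API exposes the same project export and import that the UI offers. A hedged sketch, with hypothetical paths and project names:

```python
# Sketch of a project handoff via Resolve's scripting API. The edit, temp
# grade and clip references travel inside the exported .drp project file;
# the colorist relinks the media on their own storage after importing.
import DaVinciResolveScript as dvr

pm = dvr.scriptapp("Resolve").GetProjectManager()

# Director's side: export the project with the temp grade included.
pm.ExportProject("TheBeyond_TempGrade",
                 "/Volumes/handoff/TheBeyond_TempGrade.drp")

# Colorist's side: import the project, then relink media in the Media Pool.
pm.ImportProject("/Volumes/handoff/TheBeyond_TempGrade.drp")
```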

Max kept everything photographic, like a documentary, but with a slight cinematic flair to it. The big challenge was matching all the sources of material from the various Blackmagic cameras (Ursa Mini Pro, the Production Camera and the Pocket Camera) to the DJI Osmo, drone footage and stock footage.

How many VFX shots were there? Who did them?
There were around 750 visual effects shots. I designed all the VFX scenes and handled a huge portion of the compositing myself, including invisible effects shots, all the space scenes, alien planet scenes, memory scenes and tons more. This would not have been possible without the support of my VFX team, who worked on their assigned sequences and shots and also generated tons of CGI assets for me to use to create my shots in comp.

My VFX team members included my long-time collaborator John Sellings, who was the VFX supervisor for all the Human 2.0 sequences. Filmmore, in Amsterdam and Brussels, handled Human 2.0 scenes in the transcode bay with in-house VFX supervisor Hans Van Helden. London’s Squint VFX handled the Human 2.0 scenes in wake-up lab. Charles Wilcocks was the Human 2.0 CG supervisor who worked on the shape and look of the Human 2.0.

Hussin Khan looked after the Malaysian team, which provided rotoscoping support and basic comps. Dan Newlands was our on-set tracking supervisor. He ensured all data was captured correctly and supervised anything tracking related in the Human 2.0 scenes.

Another long-time collaborator was Andrea Tedeschi, who handled the CG and comps for the spacecraft carrier at the end of the film, as well as rendering out the CG astronaut passes. Rhys Griffith handled the rigging for the Human 2.0 characters in Maya, and also looked after the CG passes for the alpha Human 2.0 scenes using Blender. Aleksandr Uusmees provided all the particles and simulation rendered out of Houdini as CG passes/elements, which I then used to create the wormhole effects, alien spheres and other shots that needed those elements.

JM Blay designed and created the standalone motion graphics sequences to visualize the Human 2.0 medical procedure, as well as mission trajectory graphics. He also created several “kit-bash” graphics assets for me to use, including UI graphics, from his After Effects files.

Territory Studio created the awesome end titles and credits sequence, which you can read more about on their site.

As a VFX pro yourself, do you find that you are harder to please because it’s your wheelhouse?
Oh boy. Ask any of the VFX guys on the team and they will say I am a beast to work with, because I am hands-on and I know how long things take. But on the flip side, that had its advantages, as they knew they were not going to get revision after revision, because with each brief I also presented a proposed methodology and made sure we locked that down before proceeding with the shots.

Was this your biggest directing job to date? Can you talk about any surprises?
It wasn’t my biggest directing job to date, as during post production of The Beyond my second sci-fi film, Origin Unknown (starring Katee Sackhoff of Battlestar Galactica and The Flash), was greenlit, and that had its own set of challenges. We can talk more about that when the film is released theatrically and on VOD later this year via Kew Media.

This was, however, my biggest producing job to date; there were so many logistics and resources to manage whilst directing too. The cool thing about the way we made this film was that most of the crew were on my short films, including some of the key cast too, so we embraced the guerrilla nature of the production and focused on maximizing our resources to the fullest within the time and budget constraints.

What did you learn on this film that will help on your next?
The other hat I was wearing was the producer hat, and one thing I had to embrace was the sheer amount of paperwork! I may have taken the same filmmaking approach as I did on my short films — guerrilla and thinking outside the box technically and creatively — but making a commercial feature film, I had to learn to deal with things like clearances, E&O (errors and omissions) insurance, chain of title, script reports and a whole bunch of other paperwork required before a distributor will pick up your film.

Thankfully my co-producer Paula Crickard, who is currently wrapping post on Terry Gilliam’s Don Quixote, came in during the post stage of the film and helped.

The other thing I learned was the whole sales angle — getting a reputable distributor on board to sell the film in all worldwide territories, and navigating that process with rights, IP and more contracts. The advice I got from other filmmakers is that the right distributor plays a big part in how your film will be released. To me it was important that the distributor was into the film and not just the trailer, and that their marketing and sales strategy made sense. The Beyond was never designed to be a theatrical film, so I wanted someone with a big reach in the VOD world through their brand, especially since The Beyond doesn’t have big-name actors in it.

What was the most challenging scene or scenes? Why and how did you overcome those challenges?
The Human 2.0 scenes were the most challenging because they had to look photoreal to hold up within the documentary narrative. We first tried to do it all in-camera using a practical suit, but it wasn’t achieving the look we wanted, the actors felt uncomfortable in it, and doing it properly practically would have cost a fortune. So we went with a full-digital solution for the Human 2.0 bodies: the actors wore tight grey suits with tracking markers, and we restricted our camera moves for simplicity so that object tracking would work as accurately as possible. We also shot reference footage from all angles to help with matchmoving. Having an on-set tracking supervisor helped massively and allowed us to make this happen within the budget, while looking and feeling real.

Our biggest issue came when our actress made very tiny movements from breathing in close-up shots. Because our Human 2.0 was human consciousness in a synthetic shell, breathing didn’t make sense, so we began compensating by freezing the image or doing some stabilization, which ended up being nearly impossible for the very close-up shots.

In the end, I had to think outside the box, so I wrote a few lines into the script that explained that the Human 2.0 was breathing to make it psychologically more acceptable to other humans. Those two lines saved us weeks and possibly months of time.

Being a VFX movie, you would expect us to use some form of greenscreen or bluescreen, but we didn’t. In fact, the only stage used was for the “white room” astronaut scene, which was shot over at Asylum FX in London. There was an actor wearing an astronaut suit in a bright photography room, and we used brightly exposed lighting to give it a surreal feeling. We then used VFX to augment it.

As a writer and a director, how was it seeing your vision through from start to finish?
It didn’t really hit me until I watched the press screening of it at the Dolby Vision office in Soho. It had the fully mixed sound and the completed grade. I remember looking across at my DP and other team members thinking, “Whoa! It looks and feels like a feature film, and we did that in a year!”

You edited the film yourself?
Yes, I was the editor on the film! I shoot for the edit. I started off using Adobe Premiere CC for the early offline and then quickly moved over to Resolve 14, where I did the majority of the editing. It was great because I was doing a lot of online editorial tasks like stabilizing, basic VFX, pan and scans, as well as establishing temp looks while editing. So in a way there was no offline and online editorial, as it was all part of one workflow. We did all our deliverables out of Resolve 14, too.

Color plays a big role in director Sean Baker’s The Florida Project

Director Sean Baker is drawing wide praise for his realistic portrait of life on the fringe in America in his new film The Florida Project. Baker applies a light touch to the story of a precocious six-year-old girl living in the shadow of Disney World, giving it the feel of a slice-of-life documentary. That quality is carried through in the film’s natural look. Where Baker shot his previous film, Tangerine, entirely with an iPhone, The Florida Project was recorded almost wholly on anamorphic 35mm film by cinematographer Alexis Zabe.

Sam Daley

Post finishing for the film was completed at Technicolor PostWorks New York, which called on a traditional digital intermediate workflow to accommodate Baker’s vision. The work began with scanning the 35mm negative to 2K digital files for dailies and editorial. It ended months later with rescanning at 4K and 6K resolution, editorial conforming and color grading in the facility’s 4K DI theater. Senior colorist Sam Daley applied the final grade via Blackmagic Resolve 12.5.

Shooting on film was a perfect choice, according to Daley, as it allowed Baker and Zabe to capture the stark contrasts of life in Central Florida. “I lived in Florida for six years, so I’m familiar with the intensity of light and how it affects color,” says Daley. “Pastels are prominent in the Florida color palette because of the way the sun bleaches paint.”

He adds that Zabe used Kodak Vision3 50D and 250D stock for daylight scenes shot in the hot Florida sun, noting, “The slower stock provided a rich color canvas, so much so that at times we de-emphasized the greenery so it didn’t feel hyperreal.”

The film’s principal location is a rundown motel, ironically named the Magic Castle. It does not share the sun-bleached look of other businesses and housing complexes in the area as it has been freshly painted a garish shade of purple.

Baker asked Daley to highlight such contrasts in the grade, but to do so subtly. “There are many colorful locations in the movie,” Daley says. “The tourist traps you see along the highway in Kissimmee are brightly colored. Blue skies and beautiful sunsets appear throughout the film. But it was imperative not to allow the bright colors in the background to distract from the characters in the foreground. The very first instruction that I got from Sean was to make it look real, then dial it up a notch.”

Mixing Film and Digital for Night Shots
To make use of available light, nighttime scenes were not shot on film, but rather were captured digitally on an Arri Alexa. Working in concert with color scientists from Technicolor PostWorks New York and Technicolor Hollywood, Daley helmed a novel workflow to make the digital material blend with scenes that were film-original. He first “pre-graded” the digital shots and then sent them to Technicolor Hollywood where they were recorded out to film. After processing at FotoKem, the film outs were returned to Technicolor Hollywood and scanned to 4K digital files. Those files were rushed back to New York via Technicolor’s Production Network where Daley then dropped them into his timeline for final color grading. The result of the complex process was to give the digitally acquired material a natural film color and grain structure.

“It would have been simpler to fly the digitally captured scenes into my timeline and put on a film LUT and grain FX,” explains Daley, “but Sean wanted everything to have a film element. So, we had to rethink the workflow and come up with a different way to make digital material integrate with beautifully shot film. The process involved several steps, but it allowed us to meet Sean’s desire for a complete film DI.”

Calling on iPhone for One Scene
A scene near the end of the film was, for narrative reasons, captured with an iPhone. Daley explains that, although intended to stand out from the rest of the film, the sequence couldn’t appear so different that it shocked the audience. “The switch from 4K scanned film material to iPhone footage happens via a hard cut,” he explains. “But it needed to feel like it was part of the same movie. That was a challenge because the characteristics of Kodak motion picture stock are quite different from an iPhone.”

The iPhone material was put through the same process as the Alexa footage; it was pre-graded, recorded out to film and scanned back to digital. “The grain helps tie it to the rest of the movie,” reports Daley. “And the grain that you see is real; it’s from the negative that the scene was recorded out to. There are no artificial looks and nothing gimmicky about any of the looks in this film.”

The apparent lack of artifice is, in fact, one of the film’s great strengths. Daley notes that even a rainbow that appears in a key moment was captured naturally. “It’s a beautiful movie,” says Daley. “It’s wonderfully directed, photographed and edited. I was very fortunate to be able to add my touch to the imagery that Sean and Alexis captured so beautifully.”

My top workstation accessories

By Brady Betzel

As a working video editor, I’m at my desk and on my computer all day. So when I get home, I want my personal workstation to feel as powerful as possible, and having the right tools to support that experience is paramount.

I’m talking workstation accessories. I’ve put together a short list based on my personal experience. Some are well known, while some are slightly under the radar. Either way, they all make my editing life easier and more productive.

They make my home-based workstation feel like a full-fledged professional edit suite.

Wacom Intuos Pro Medium
In my work as an offline editor, I started to have some wrist pain when I used a mouse in conjunction with my keyboard. That is when I decided to jump headfirst into using a Wacom tablet. Within two weeks, all of my pain went away, and I felt I had way more control over drawing objects and shapes. I specifically noticed more precision when drawing accurate lines and shapes with bezier handles inside apps like Adobe’s After Effects and Photoshop.

In addition, you can program minimal macros on the express keys on the side of the tablet. While the newest Wacom Intuos Pro Medium tablet costs a cool $349.95, it will pay for itself with increased efficiency and, in my experience, less wrist pain.

Genelec 8010A Studio Monitors
One workstation accessory that will blow you away is a great set of studio monitors. Genelec is known for making great studio monitors, and the 8010A is a set I wish I could get. These monitors are small — around 8 inches tall by 4 inches deep and 4 inches wide — but they put out some serious power at 96dB.

Don’t be fooled by their small appearance; they are a great complement to any serious video and audio power user. They connect via XLR, so you may need to get some converters if you are going straight out of your station without running through a mixer. These speakers are priced at $295 each; they aren’t cheap, but they are another important accessory that will further turn your bay into a professional suite.

Tangent Element & Blackmagic Resolve Color Correction Panels
If you work in color correction, or aspire to color correct, color correction panels are a must. They not only make it easier for you to work in apps like Blackmagic DaVinci Resolve, but they free your mind from worrying about where certain things are and let your fingers do the talking. It is incredibly liberating to use color correction panels when doing a color grade — it feels like you have another arm you can use to work.

The entire set of Tangent Element Panels costs over $3,300, but if you are just getting started, the Tangent Element Tk (just the trackballs) can be had for a little over $1,100. What’s nice about the Tangent panels is that they work with multiple apps, including Adobe Premiere, FilmLight Baselight, etc. But if you know you are only going to be using Resolve, the Resolve Micro or Mini panels are a great deal at under $1,000 and $3,000, respectively.

Logitech G13 Advanced Gameboard
This one might sound a bit odd at first, but once you do some research you will see that many professional editors use these types of pads to program macros for multiple button pushes or common tasks. Essentially, this is a macro pad with 25 programmable keys as well as a thumb-controlled joystick. It’s a really intriguing piece of hardware that might be able to take the place of your mouse in conjunction with your keyboard. It is competitively priced at only $79.99 and, with a little Internet research on liftgammagain.com, you can even find forums of users’ custom mappings.

Logickeyboard Backlit Keyboard
Obviously, the keyboard is one of the most used workstation accessories. One difficulty is trying to work with one in a dark room. Well, Logickeyboard has a dimmable backlit keyboard series for apps like Resolve and Avid Media Composer.

In addition to being backlit, they also have two powered USB 2.0 ports that really come in handy. These retail for around $140, so they are a little pricey for a keyboard but, take it from me, they will really polish that edit suite.

OWC USB-C Dock
With ports on Mac-based systems being stripped away, a good USB-C dock is a great extension to have in your edit suite. OWC offers a Mini-DisplayPort or HDMI-equipped version in the colors that match your MacBook Pro, if you have one.

In addition, you get five USB 3.1-compatible ports — including a high-powered charging port and a USB Type-C port — a Gigabit Ethernet port, a front-facing SD card reader, a combo audio in/out port and a Mini DisplayPort or HDMI port. These retail for under $150.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


Behind the Title: Carbon’s head of color grading, Maria Carretero

NAME: Maria Carretero

COMPANY: Carbon (@carbonvfx)

CAN YOU DESCRIBE YOUR COMPANY?
Carbon, which has locations in New York, Chicago and LA, is a boutique design and visual effects company that focuses on art, ideas, and talent. I am based in Chicago.

WHAT’S YOUR JOB TITLE?
Head of color grading

WHAT DOES THAT ENTAIL?
I work with clients to find color palettes and looks that best tell their story.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I’m still surprised with how much creativity is involved in the grading process. I especially love coming to solutions through mixing art and technique.

WHAT SYSTEM DO YOU WORK ON?
Blackmagic DaVinci Resolve

ARE YOU SOMETIMES ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
Sometimes I’m involved in projects from very early on, which usually entails creating tone and color palettes to be used during filming. I’ve also contributed reference images that help the creative team settle on an overall look or color, and have been present at camera tests to check light and exposure. I love being able to use my artistic background in painting and the fine arts to give projects their maximum creative potential.

Jeep “Two Words”

WHAT’S YOUR FAVORITE PART OF THE JOB?
There are moments when I’m grading a piece and I have a strange connection with it as if I’m seeing it clearly for the first time. It’s like I suddenly know it’s going to be a wonderful grade. Moments like that are magical.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Sunset, when all the lights in the city start turning on.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Painting.

WHY DID YOU CHOOSE THIS PROFESSION?
Color grading chose me. Before I had officially started grading, I spent a lot of my time focusing on my painting, and while grading was something I could technically excel at, my art was the priority at the time. Then digital grading started gaining momentum in Spain and I gradually realized that color grading opportunities were more and more important to me. I feel extremely lucky. I’m self-taught and relied on my incredible network of supporters to give me chances to go further and further.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I worked on the last five Jeep commercials with a talented group of people from DDB Chicago and the great director Tobias Granström. Some other projects included a huge campaign for Panama Tourism out of VML, the hilarious Liquid Plumbr spot from FBC Chicago and the newest Machine Gun Kelly music video, directed by Steven Caple Jr., which has more than four million views on YouTube.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I’m proud of a lot of my recent work, but Recalculating for Jeep was an incredibly challenging and fulfilling project. We did a lot of research around the idea of sunsets, focusing on the sophistication of light and keeping it as natural feeling as possible.

Jeep “Recalculating”

WHERE DO YOU FIND INSPIRATION? ART? PHOTOGRAPHY?
Art, life, reading and my previous experiences.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My cell, my Dolby monitor and Spotify.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I have Facebook, Twitter, Vimeo, Movidiam and Instagram, where I recently started @carretero.color to feature my color work.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
I always listen to music when I work. Music is great support — it can make you happy in a second! I listen to a lot of different bands, but Band of Horses, Tame Impala, Neil Young, Flaming Lips, Eels, Devendra, The Kills and Spanish music, like Niña Vintage, are some of my favorites!

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I never let stress control me. Sometimes the challenges of the job are huge, but that’s our work and our industry — we know how to get it done. Challenges are wonderful because they point directly at our creativity. Being uncomfortable sometimes is a good thing. It makes you break down your limitations. Working with a great group of people helps a lot, too, so you can always have fun, even on the hardest days.

Dog in the Night director/DP Fletcher Wolfe

By Cory Choy

Silver Sound Showdown Music + Video Festival is unique in two ways. First, it is both a music video festival and a battle of the bands at the same time. Second, every year we pair up the Grand Prize winners, director and band, and produce a music video with them. The budget is determined by the festival’s ticket sales.

I conceived of the festival, which is held each year at Brooklyn Bowl, as a way to both celebrate and promote artistic collaboration between the film and music communities — two crowds that just don’t seem to intersect often enough. One of the most exciting things for me is then working with extremely talented filmmakers and musicians who have more often than not met for the first time at our festival.

Dog in the Night (song written by winning band Side Saddle) was one of our most ambitious videos to date, using a combination of practical and post effects. It was meticulously planned and executed by director/cinematographer Fletcher Wolfe, who was not only a pleasure to work with, but was gracious enough to sit down with me for a discussion about her process and the experience of collaborating.

What was your favorite part of making Dog in the Night?
As a music video director I consider it my first responsibility to get to know the song and its meaning very intimately. This was a great opportunity to stretch that muscle, as it was the first time I was collaborating with musicians who weren’t already close friends. In fact, I hadn’t even met them before the Showdown. I found it to be a very rewarding experience.

What is Dog in the Night about?
The song Dog in the Night is, quite simply, about a time when the singer Ian (a.k.a. Angler Boy) is enamored with a good friend, but that friend doesn’t share his romantic feelings. Of course, anyone who has been in that position (all of us?) knows that it’s never that simple. You can hear him holding out hope, choosing to float between friendship and possibly dating, and torturing himself in the process.

I decided to use dusk in the city to convey that liminal space between relationship labels. I also wanted to play on the nervous and lonely tenor of the track with images of Angler Boy surrounded by darkness, isolated in the pool of light coming from the lure on his head. I had the notion of an anglerfish roaming aimlessly in an abyss, hoping that another angler would find his light and end his loneliness. The ghastly head also shows that he doesn’t feel like he has anything in common with anybody around him except the girl he’s pining after, who he envisions having the same unusual head.

What did you shoot on?
I am a DP by trade, and always shoot the music videos I direct. It’s all one visual storytelling job to me. I shot on my Alexa Mini with a set of Zeiss Standard Speed lenses. We used the 16mm lens on the Snorricam in order to see the darkness around him and to distort him to accentuate his frantic wanderings. Every lens in the set weighed in at just 1.25lbs, which is amazing.

The camera and lenses were an ideal pairing, as I love the look of both, and their light weight allowed me to get the rig down to 11lbs in order to get the Snorricam shots. We didn’t have time to build our own custom Snorricam vest, so I found one that was ready to rent at Du-All Camera. The only caveats were that it could only handle up to 11lbs, and the vest was quite large, meaning we needed to find a way to hide the shoulders of the vest under Ian’s wardrobe. So, I took a cue from Requiem for a Dream and used winter clothing to hide the bulky vest. We chose a green and brown puffy vest that held its own shape over the rig-vest, and also suited the character.

I chose a non-standard 1.5:1 aspect ratio because I felt it suited framing for the anglerfish head. To maximize resolution and minimize data, I shot 3.2K at a 1.78:1 aspect ratio and cropped the sides. It’s easy to build custom framelines in the Alexa Mini for accurate framing on set. On the Mini, you can also dial in any frame rate from 0.75fps to 60fps (at 3.2K). Thanks to digital cinema cameras, it’s standard these days to overcrank and have the ability to ramp to slow motion in post. We did do some of that; each time Angler Boy sees Angler Girl, his world turns into slow motion.
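
To make the crop math concrete: trimming a 1.78:1 frame to 1.5:1 costs only width, never height. A quick check, assuming “3.2K” here means a 3200×1800 sensor window:

```python
# Back-of-the-envelope: cropping 1.78:1 capture to a 1.5:1 extraction.
# Assumes "3.2K" is a 3200x1800 (16:9) recording window.
capture_w, capture_h = 3200, 1800
target_ratio = 1.5                         # desired 1.5:1 aspect

crop_w = round(capture_h * target_ratio)   # height is kept, sides are cropped
print(f"extraction: {crop_w}x{capture_h}")           # 2700x1800
print(f"width discarded: {capture_w - crop_w}px")    # 500px (250px per side)
```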

In contrast, I wanted his walking around alone to feel more frantic, so I did something much less common and undercranked to get a jittery effect. The opening shot was filmed at 6fps with a 45-degree shutter, and Ian walked in slow motion to a recording of the track slowed down to quarter-time, so his steps are on the beat. There are some Snorricam shots that were shot at 6fps with a standard 180-degree shutter; I then had Ian spin around to get long motion-blur trails of lights around him. I knew exactly what frame rate I wanted for each shot, and we wound up shooting at 6fps, 12fps, 24fps, 48fps and 60fps, each for a different emotion that Angler Boy is having.
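
A quick sanity check on those undercranked setups. Shutter angle converts to exposure time as (angle/360)/fps, and 6fps footage conformed to 24fps plays back four times fast, which is exactly why the track was slowed to quarter-time on set:

```python
# Exposure time per frame from shutter angle: (angle / 360) / fps.
def exposure_time(fps: float, shutter_angle: float) -> float:
    return (shutter_angle / 360.0) / fps

print(exposure_time(6, 45))    # 0.0208s (~1/48s): crisp frames despite 6fps
print(exposure_time(6, 180))   # 0.0833s (~1/12s): long motion-blur trails

# Conforming 6fps footage to a 24fps timeline speeds action up 4x,
# so on-set playback ran at quarter-time to keep steps on the beat.
print(24 / 6)                  # 4.0
```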

Why practical vs. CG for the head?
Even though the fish head is a metaphor for Angler Boy’s emotional state, and is not supposed to be real, I wanted it to absolutely feel real to both the actor and the audience. A practical, and slightly unwieldy, helmet/mask helped Ian find his character. His isolation needed to be tangible, and how much he is drawn to Angler Girl as a kindred spirit needed to be moving. It’s a very endearing and relatable song, and there’s something about homemade, practical effects that checks both those boxes. The lonely pool of light coming from the lure was also an important part of the visuals, and it needed to play naturally on their faces and the fish mask. I wired Lite Gear LEDs into the head, which was the easy part. Our incredibly talented fabricator, Lauren Genutis, had the tough job — fabricating the mask from scratch!

The remaining VFX hurdle then was duplicating the head. We only had the time and money to make one and fit it to both actors with foam inserts. I planned the shots so that you almost never see both actors in the same shot at the same time, which kept the number of composited shots to a minimum. It also served to maintain the emotional disconnect between his reality and hers. When you do see them in the same shot, it’s to punctuate when he almost tells her how he feels. To achieve this I did simple split screens, using the Pen Tool in Premiere to cut the mask around their actions, including when she touches his knee. To be safe, I shot takes where she doesn’t touch his knee, but none of them conveyed what she was trying to tell him. So, I did a little smooshing around of the two shots and some patching of the background to make it so the characters could connect.

Where did you do post?
We were on a very tight budget, so I edited at home, and I always use Adobe Premiere. I went to my usual colorist, Vladimir Kucherov, for the grade. He used Blackmagic Resolve, and I love working with him. He can always see how a frame could be strengthened by a little shaping with vignettes. I’ll finally figure out what nuance is missing, and when I tell him, he’s already started working on that exact thing. That kind of shaping was especially helpful on the day exteriors, since I had hoped for a strong sunset, but instead got two flat, overcast days.

The only place we didn’t see eye to eye on this project was saturation — I asked him to push saturation farther than he normally would advise. I wanted a cartoon-like heightening of Angler Boy’s world and emotions. He’s going through a period in which he’s feeling very deeply, but by the time of writing the song he is able to look back on it and see the humor in how dramatic he was being. I think we’ve all been there.

What did you use VFX for?
Besides having to composite shots of the two actors together, there were just a few other VFX shots, including dolly moves that I stabilized with the Warp Stabilizer plug-in within Premiere. We couldn’t afford a real dolly, so we put a two-foot riser on a Dana Dolly to achieve wide push-ins on Ian singing. We were rushing to catch dusk between rainstorms, and it was tough to level the track on grass.

The final shot is a cartoon night sky composited with a live shot. My very good friend, Julie Gratz of Kaleida Vision, made the sky and animated it. She worked in Adobe After Effects, which communicates seamlessly with Premiere. Julie and I share similar tastes for how unrealistic elements can coexist with a realistic world. She also helped me in prep, giving feedback on storyboards.

Do you like the post process?
I never used to like post. I’ve always loved being on set, in a new place every day, moving physical objects with my hands. But, with each video I direct and edit I get faster and improve my post working style. Now I can say that I really do enjoy spending time alone with my footage, finding all the ways it can convey my ideas. I have fun combining real people and practical effects with the powerful post tools we can access even at home these days. It’s wonderful when people connect with the story, and then ask where I got two anglerfish heads. That makes me feel like a wizard, and who doesn’t like that?! A love of movie magic is why we choose this medium to tell our tales.


Cory Choy, Silver Sound Showdown festival director and co-founder of Silver Sound Studios, produced the video.

Blackmagic intros lower-cost color panels for Resolve, new camera

By Brady Betzel

Yesterday, Blackmagic held a press conference on YouTube introducing a new pro camera — the Ursa Mini Pro 4.6K, which combines high-end digital film quality with the ergonomics and features of a traditional broadcast camera — and two new portable hardware control panels for DaVinci Resolve (yes, Resolve only) designed to allow color correction workflows to be mixed in with editing workflows.

For this article, I’m going to focus on the panels.

The color correction hardware market is a small one, usually headed by the same companies who produce color correction software. Tangent is one of the few that produces panels without making its own grading software. There is also the Avid/Euphonix Artist color correction panel and a few others, but the price jumps incredibly when you step up to panels like the Blackmagic DaVinci Resolve Advanced panel (just under $30,000).

I’ve previously reviewed the Tangent Ripple and Element color correction panels, and I love them. However, besides Tangent there really haven’t been any mid-level to prosumer products… until now. Blackmagic is offering the new Micro and Mini color correction panels.

Blackmagic’s Micro color correction panel (our main image) is well priced at $995, which can be somewhat compared to the Tangent Wave (over $1,500 on B&H‘s site), the Tangent Element Tk (over $1,135) or, more closely, the Avid Artist Color Control Surface ($1,299). You’ll notice all of those are priced way higher than the new Micro panel. You could also throw the Tangent Ripple up for comparison, but that has much more limited functionality and is much lower in price at around $350. The Micro panel is essentially three trackballs, 12 knobs and 18 keys: a collection of the most highly used parts of a color correction panel without any GUI screens. It connects via USB-C, although a USB 3 to USB-C converter will be included.

The Blackmagic Mini color correction panel (pictured right) is priced higher at $2,995 and can be compared to a combo of the Tangent Element Tk with one or two more panels from the Element set, which retails for $3,320 on www.bhphotovideo.com. The Mini adds two 5-inch displays, eight soft buttons and eight soft knobs to everything the Micro panel has. It also has pass-through Ethernet to power and connect the panel, USB-C and a 4-pin XLR 12V DC power connection.

I am really excited to try these color correction panels out on my own — and I will, as the panels are on their way to me as I type. I need to emphasize that these panels only work with Resolve, no other software apps, so they were built with one workflow in mind.

I do wonder if in the future Blackmagic will sell additional panels that add more buttons and knobs or something crazy like a Smartscope through the Ethernet ports so I don’t have to buy additional SDI output hardware. Will everyone be ok with transport controls being placed on the right?

“We are always looking to design new products and features to help with the creative process,” says Blackmagic’s Bob Caniglia. “These new panels were designed to enable our growing number of Resolve users to be able to access the power of DaVinci Resolve and Resolve Studio beyond a mouse and keyboard. The Micro and Mini control panels provide the perfect complement to our existing Advanced control panels.”

Blackmagic is really coming for everyone in the production and post world with recent moves like the acquisition of audio company Fairlight and realtime bluescreen/greenscreen removal company Ultimatte, providing Avid with its Media Composer DNx IOs, and even releasing an updated version of the Ursa camera, the Ursa Mini Pro. Oh, yeah, and don’t forget they make one of the top color correction and editing apps on the market in DaVinci Resolve. The latest color correction hardware, like the Micro and Mini panels, is primed to bring the next set of colorists into the Resolve world.

And so as not to forget about the camera: the Ursa Mini Pro 4.6K is now available for $5,995. Here are some specs:

• Digital film camera with 15 stops of dynamic range.
• Super 35mm 4.6K sensor with third-generation Blackmagic color science processing of raw sensor data.
• Interchangeable lens mount with EF mount included as standard. Optional PL and B4 lens mount available separately.
• High-quality 2, 4 and 6 stop ND filters with IR compensation designed to specifically match the colorimetry and color science of Ursa Mini Pro.
• Fully redundant controls including ergonomically designed tactile controls that allow direct access to the most important camera settings such as external power switch, ND filter wheel, ISO, shutter, white balance, record button, audio gain controls, lens and transport control, high frame rate button and more.
• Built-in dual C-Fast 2.0 recorders and dual SD/UHS-II card recorders allow unlimited duration recording in high quality.
• LCD status display for quickly checking timecode, shutter and lens settings, battery, recording status and audio levels.
• Support for CinemaDNG 4.6K RAW files and ProRes 4444 XQ, ProRes 4444, ProRes 422 HQ, ProRes 422, ProRes 422 LT and ProRes 422 Proxy recording at Ultra HD and HD resolutions.
• Supports up to 60 fps 4.6K resolution capture in RAW.
• Features all standard connections, including dual XLR mic/line audio inputs with phantom power, 12G-SDI output for monitoring with camera status graphic overlay and separate XLR 4-pin power output for viewfinder power, headphone jack, LANC remote control and standard 4-pin 12V DC power connection.
• Built-in stereo microphones for recording sound.
• Four-inch foldout touchscreen for on-set monitoring and menu settings.

Review: Apple’s new MacBook Pro

By Brady Betzel

What do you need to know about the latest pro laptop from Apple? Well, the MacBook Pro is fast and light; the new Touch Bar is handy and sharp but not fully realized; the updated keys on the keyboard are surprisingly great; and working with ProRes QuickTime files in resolutions higher than 1920×1080 inside of FCPX, or any NLE for that matter, is blazing fast.

When I was tasked with reviewing the new MacBook Pro, I came into it with an open mind. After all, I did read a few other reviews that weren’t exactly glowing, but I love speed and innovation among professional workstation computers, so I was eager to test it myself.

I am pretty open-minded when it comes to operating systems and hardware. I love Apple products and I love Windows-based PCs. I think both have their place in our industry, and to be quite honest it’s really a bonus for me that I don’t rely heavily on one OS or get too tricked by the Command Key vs. Windows/Alt Key.

Let’s start with the call I had with the Apple folks as they gave me the lowdown on the new MacBook Pro. The Apple reps were nice, energetic, knowledgeable and extremely helpful. While I love Apple products, including this laptop, it’s not the be-all and end-all.

The Touch Bar is nice, but not a revolution. It feels like the first step in an evolution, a version 1 of an innovation that I am excited to see more of in later iterations. When I talked with the Apple folks, they briefed me on what Tim Cook showed off in the reveal: emoji buttons, the wide-gamut display, new speakers and USB-C/Thunderbolt 3 connectivity.

NLEs
They had an FCPX expert on the call, which was nice considering I planned on reviewing the MacBook Pro with a focus on nonlinear editing apps, such as Adobe Premiere Pro, Avid Media Composer and Blackmagic’s Resolve. Don’t get me wrong, FCPX is growing on me — it’s snappy jumping around the timeline with ProRes 5K footage; assigning roles is something I wish every other app would pick up on; and the timeline is more of a breeze to use with the latest update.

The other side to this is that in my 13 years of working in television post, I have never worked on a show that primarily used FCP or FCPX to edit or finish. This doesn’t mean I don’t like the NLE; it simply means I haven’t relied on it in a professional working environment. Like I said, I really like the road it’s heading down, and if it works its way into mainstream broadcast or streaming platforms a little more, I am sure I will see it more frequently.

Furthermore, with post relying on ever-smaller groups of editors and finishing artists, apps like FCPX are poised to shine with their innovation. After all that blabbering: in this review I will touch on FCPX, but I really wanted to see how the MacBook Pro performed with the pro NLEs I encounter the most.

Specs
Let’s jump into the specs. I was sent a top-of-the-line 15-inch MacBook Pro with Touch Bar, which costs $3,499 if configured online. It comes with a quad-core Intel Core i7 2.9GHz (up to 3.8GHz using Turbo Boost) processor, 16GB of 2133MHz memory, a 1TB PCIe SSD and a Radeon Pro 460 with 4GB of memory. It’s loaded. I think the only thing that could be upgraded beyond this configuration would be a 2TB hard drive, which would add another $800 to the price tag.

Physically, the MacBook Pro is awesome — very sturdy, very thin and very light. It feels great when holding it and carrying it around. Apple even sent along a Thunderbolt 3 (USB-C) to Thunderbolt 2 adapter, which costs an extra $29, and a USB-C to Lightning cable, which costs another $29.

So yes, it feels great. Apple has made a great new MacBook Pro. Is it worth upgrading if you have a new-ish MacBook Pro at home already? Probably not, unless the Touch Bar really gets you going. The speed is not too far off from the previous version. However, if you have a lot of Thunderbolt 3/USB-C-connected peripherals, or plan on moving to them, then it is a good upgrade.

Testing
I ran some processor- and graphics-intensive tests while I had the new MacBook Pro and came to the conclusion that FCPX is not that much faster than Adobe Premiere Pro CC 2017 when working with non-ProRes-based media. Yes, FCPX tears through ProRes QuickTimes if you already have your media in that format. But what if you shoot on a camera like the Red and don’t want to transcode to a more edit-friendly codec? That is another story. To test the NLEs, I grabbed a sample Red 6K 6144×3160 23.98fps clip from the Red sample footage page, strung out a 10-minute-long sequence in each NLE and exported both a color-graded version and a non-color-graded version as ProRes HQ QuickTime files matching the source file’s specs.

In order to work with Red media in some of the NLEs, you must download a few patches: for FCPX you must install the Red Apple workflow installer and for Media Composer you must install the Red AMA plug-in. Premiere doesn’t need anything extra.

Test 1: Red 6K 6144×3160 23.98fps R3D — 10-minute sequence (no color grade or FX) exported as ProRes HQ matching the source file’s specs. Premiere > Media Encoder = 1 hour, 55 minutes. FCPX = 1 hour, 57 minutes. Media Composer = 2 hours, 42 minutes. (Good news: Media Composer’s interface and fonts display correctly on the new display.)

You’ll notice that Resolve is missing from this list. That is because I installed Resolve 12.5.4 Studio but then realized my USB dongle wouldn’t fit into the USB-C port — and I wasn’t buying an adapter for a laptop I don’t get to keep. So, unfortunately, I didn’t test a true 6K ProRes HQ export from Resolve, but in the last test you will see some Resolve results.

Overall, there was not much difference in speeds. In fact, I felt that Premiere Pro CC 2017 played the Red file a little more smoothly and at a higher frames-per-second count, while FCPX struggled a little. Granted, a 6K Red file at full debayer is one of the harder files for a CPU to process, but Apple touts this as a semi-replacement for the Mac Pro for the time being, and I am holding them to their word.

Test 2: Red 6K 6144×3160 23.98fps R3D — 10-minute color-graded sequence exported as ProRes HQ matching the source file’s specs. Premiere > Media Encoder = 1 hour, 55 minutes. FCPX = 1 hour, 58 minutes. Media Composer = 2 hours, 34 minutes.

It’s important to note that the GPU definitely helped out in both Adobe Premiere and FCPX: little to no extra time was added to the ProRes HQ export. I was really excited to see this, as without a good GPU, resizing and GPU-accelerated effects like color correction will slow your system to a snail’s pace, if it doesn’t fully crash. Media Composer surprisingly sped up its export when I added the color grade as a new color layer in the timeline. Putting the color correction on another layer might have forced the Radeon to kick in and help push the file out. Not really sure what that is about, to be honest.

Test 3: Red 6K 6144×3160 23.98fps R3D — 10-minute color-graded sequence resized to 1920×1080 on export as ProRes HQ. Premiere > Media Encoder = 1 hour, 16 minutes. FCPX = 1 hour, 14 minutes. Media Composer = 1 hour, 48 minutes. Resolve = 1 hour, 16 minutes.

After these tests, it seems that exporting and transcoding speeds are all about the same. It doesn’t come as too big of a surprise that all the NLEs, except for Media Composer, processed the Red file in the same amount of time. Regardless of the NLE, you would need to knock the debayering down to half or lower to start playing these clips back in realtime in a timeline. If you have the time to transcode to ProRes, you will get much better playback and rendering speeds. Obviously, transcoding all of your files to a codec like ProRes or Avid DNx takes way more time up front, but it could be worth it if you’re crunched for time on the back end.
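
For anyone doing that up-front transcode outside an NLE, here is a hedged sketch of a batch ProRes pass driven from Python with ffmpeg’s prores_ks encoder. The folder paths are hypothetical, and R3D decode support varies by ffmpeg build, so this step is often done in REDCINE-X or Resolve instead.

```python
# Hedged sketch: batch-transcode camera originals to ProRes 422 HQ proxies.
import subprocess
from pathlib import Path

SRC = Path("/media/source")    # hypothetical camera-original folder
DST = Path("/media/prores")    # hypothetical destination folder
DST.mkdir(parents=True, exist_ok=True)

for clip in sorted(SRC.glob("*.mov")):
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "prores_ks", "-profile:v", "3",   # profile 3 = ProRes 422 HQ
        "-vf", "scale=1920:1080",                 # optional offline downres
        "-c:a", "copy",                           # pass audio through untouched
        str(DST / clip.name),
    ], check=True)
```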

In addition to Red 6K files, I also tested ProRes HQ 4K files inside Premiere and FCPX, and both played them extremely smoothly without hiccups, which is pretty amazing. Just a few years ago I was having trouble playing back 10:1 compressed files in Media Composer, and now I can play back superb-quality 4K files without a problem — a tremendous tip of the hat to technology and, specifically, to Apple for putting so much power in a thin and light package.

While I was in the mood to test speeds, I hooked up a Thunderbolt 2 SSD RAID (OWC Thunderbay 4 mini) configured in RAID-0 to see what kind of read/write bandwidth I would get running through the Apple Thunderbolt 3 to Thunderbolt 2 adapter. I used both AJA System Test as well as the Blackmagic Disk Speed Test. The AJA test reported a write speed of 929MB/sec. and read speed of 1120MB/sec. The Blackmagic test reported a write speed of 683.1MB/sec. and 704.7MB/sec. from different tests and a read speed of 1023.3MB/sec. I set the test file for both at 4GB. These speeds are faster than what I have previously found when testing this same Thunderbolt 2 SSD RAID on other systems.

For comparison, the AJA test reported a write speed of 1921MB/sec. and read speed of 2134MB/sec. when running on the system drive. The Blackmagic test doesn’t allow for testing on the system drive.
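
Those bandwidth figures translate roughly into stream counts. As a back-of-the-envelope sketch (the ~90MB/sec per-stream figure is an assumption in the neighborhood of Apple’s published ProRes 422 HQ UHD data rates, not a measurement from this review):

```python
# Rough estimate: simultaneous ProRes 422 HQ UHD streams the RAID can feed.
# Assumes ~90MB/sec per stream; actual rates vary with frame rate and content.
read_mb_s = 1023.3             # Blackmagic Disk Speed Test read result
write_mb_s = 683.1             # Blackmagic Disk Speed Test write result
stream_mb_s = 90               # assumed per-stream data rate

print(int(read_mb_s // stream_mb_s))    # ~11 playback streams
print(int(write_mb_s // stream_mb_s))   # ~7 capture/render streams
```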

What Else You Need to Know
So what about the other upgrades and improvements? When exporting the R3D files, I noticed the fan kicked on when resizing or adding color grading to the files. It seems the GPU kicked in and heated up, which is to be expected. The fan is not the loudest, but it is noticeable.

The battery life on the new MacBook Pro is great when just playing music, surfing the web or writing product reviews. I found that the battery lasted about two days without having to plug in the power adapter. However, when exporting QuickTimes from either Premiere or FCPX, the battery life dropped — a lot. I was getting a battery life of one hour and six minutes, which is not good when your export will take two hours. Obviously, you need to plug in when doing heavy work; you don’t really have an option.

This leads me to the new USB-C/Thunderbolt 3 ports — and, yes, you still have a headphone jack (thank goodness they didn’t talk with the iPhone developers). First off, I thought the MagSafe power adapter should have won a Nobel Peace Prize. I love it. It must be responsible for saving millions of dollars in equipment when people trip over a power cord — gracefully disconnecting without breaking or pulling your laptop off the table. However, I am disappointed Apple didn’t create a new type of MagSafe cable with the USB-C port. I will miss it a lot. The good news is you can now plug in your power adapter to either side of the MacBook Pro.

Adapters and dongles will have to be purchased if you pick up a new MacBook Pro. Each time I went to use an external peripheral or memory card, like an SD card, the Tangent Ripple color correction panel or an external hard drive, I was disappointed that I couldn’t plug it in directly. A good Thunderbolt 3 dock is a necessity, in my opinion. You could survive with dongles, but my OCD starts flaring up when I have to dig around my backpack for adapters. I’m just not a fan. I love how Apple dedicated itself to a fast I/O like USB-C/Thunderbolt 3, but I really wish they had given it another year. Just one old-school USB port would have been nice. I might have even gotten over no SD card reader.

The Touch Bar
I like it. I would even say that I love it — in the apps that are compatible. Right now there aren’t many. Adobe released an update to Photoshop that added Touch Bar compatibility, and it is really handy, especially when you don’t have your Wacom tablet available (or a USB dongle to attach it). I love how it puts so many levels of functionality for your tools within immediate reach.

The feedback is super fast. When I adjusted the contrast from the Touch Bar, the MacBook Pro responded immediately. This becomes even more evident in FCPX and the latest Resolve 12.5.4 update. It's clear Apple did their homework and made their own apps like Mail and Messages work with the Touch Bar (hence emojis on the Touch Bar). FCPX has a sweet ability to scrub the timeline, zoom in to the timeline, adjust text and more from just the Touch Bar — it's very handy, and after a while I began missing it when using other computers.
In Blackmagic’s latest DaVinci Resolve release, 12.5.4, they have added Touch Bar compatibility. If you can’t plug in your color correction panels, the Touch Bar does a nice job of easing the pain. You can do anything from contrast work to saturation, even adjust the midtones and printer lights, all from the Touch Bar. If you use external input devices a lot, like Wacom tablets or color correction panels, the Touch Bar will be right up your alley.

One thing I found missing was a simple application launcher on the Touch Bar. If you do pick up the new MacBook Pro with Touch Bar, you might want to download Touch Switcher, a free app I found via 9to5mac.com that allows you to have an app launcher on your Touch Bar. You can hide the dock, allowing you more screen real estate and the efficient use of the Touch Bar to launch apps. I am kind of surprised Apple didn’t make something like this standard.

The Display
From a purely superficial and non-scientific point of view, the newly updated P3-compatible wide-gamut display looks great… really great, actually. The colors are rich and vibrant. I did a little digging under the hood and noticed that it is an 8-bit display (you can find this by locating the pixel depth under System Information > Graphics/Displays), which might limit color gradations when working in a wide color space like P3 compared to a 10-bit display. Simply put, you have a wider array of colors available in P3 but fewer shades to fill it with.

The MacBook Pro display is labeled as 32-bit color, meaning the red, green, blue and alpha channels each get 8 bits, for a total of 32. Eight-bit color gives 256 shades per color channel, while 10-bit gives 1,024 shades per channel, allowing for much smoother transitions between colors and luminance values (imagine a sky at dusk going smoothly from orange to light blue to dark blue — more shades per channel means a smoother gradient between lights and darks). A 10-bit display would be labeled 30-bit color, with each channel having 10 bits.
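To put numbers on that, the back-of-the-envelope math takes only a few lines of Python (ignoring the alpha channel, which carries transparency rather than color):

```python
# Shades per channel double with every extra bit; total displayable
# colors are the cube of that (one factor each for R, G and B).
for bits in (8, 10):
    shades = 2 ** bits        # 256 for 8-bit, 1,024 for 10-bit
    colors = shades ** 3      # every R/G/B combination
    print(f"{bits}-bit: {shades:,} shades per channel, {colors:,} colors")

# 8-bit: 256 shades per channel, 16,777,216 colors
# 10-bit: 1,024 shades per channel, 1,073,741,824 colors
```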

I tried to hook up a 10-bit display, but the Thunderbolt 3 to Thunderbolt 2 dongle Apple sent me did not work with Mini DisplayPort. I did a little digging, and it seems people are generally not happy that Apple doesn't allow this to work, especially since Thunderbolt 2 and Mini DisplayPort share the same physical connector. Some people have been able to get around this by daisy chaining the display through something like a Thunderbolt 2 RAID.

While I couldn’t directly test an external display when I had the MacBook Pro, I’ve read that people have been able to push 10-bit color out of the USB-C/Thunderbolt 3 ports to an external monitor. So as long as you are at a desk with a monitor you can most likely have 10-bit color output from this system.

I reached out to Apple about the types of adapters they recommend for an external display, and they suggested a USB-C to DisplayPort adapter made by Aukey, which retails for $9.99, as well as the USB-C to DisplayPort cable from StarTech, which retails for $39.99. Make sure you read the reviews on Amazon, because people's experiences with these vary wildly. I was not able to test either, so I can't give a personal opinion.

Summing Up
In the end, the new MacBook Pro is awesome. If you own a recent MacBook Pro and don't have $3,500 to spare, I don't know that this is the update you are looking for. If you are trying to avoid a move to a Windows-based PC because of the lack of Mac Pro updates, this may ease the pain slightly. Without more than 16GB of memory and an Intel Xeon or two, however, it might actually slow you down.

The battery life is great when doing light work, and it's one of the longest-lasting batteries I've used on a laptop. But when doing heavy work, you need to be near an outlet. And when plugged into that outlet, be careful no one yanks out your USB-C power adapter, as it might throw your MacBook Pro to the ground or break off inside.

I really do love Apple products. They typically just work. I didn’t even touch on the new Touch ID Sensor that can immediately switch you to a different profile or log you in after waking up the MacBook Pro from sleep. I love that you can turn the new MacBook Pro on and it simply works, and works fast.

The latest iteration of FCPX is awesome as well, and just because I don't see it being used a lot professionally doesn't mean it shouldn't be. It's a well-built NLE that deserves a fairer shake than it has gotten. If you are itching for an update to an old MacBook Pro and don't mind having a dock or carrying around a bunch of dongles, then the 2016 MacBook Pro with the Touch Bar is for you.

The new MacBook Pro chews through ProRes-based media at 1920×1080 and up to 4K; 6K and higher will play but might slow down. If you are a Red footage user, this new MacBook Pro works great, but you still might have to knock the debayering down a couple of notches.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: The HP Z1G3 All-in-One workstation

By Brady Betzel

I'll admit it: I've always been impressed with HP's workstations — from the Z840 to the ZBook mobile workstation and now the HP Z1G3 All-in-One. Yes, I know, the HP line of workstations is not cheap. In fact, you can save quite a bit of money building your own system, but you will probably have tons of headaches unless you are very confident in your computer-building skills and don't mind standing in the return line at Fry's Electronics.

HP spends tons of time and money on ISV certifications for its workstations. ISV stands for Independent Software Vendor, and in plain English the certification means that HP makes sure the hardware inside your workstation works with the software you use. For an industry pro, that means apps like Adobe's Premiere Pro and After Effects, Avid Media Composer, Autodesk products like 3ds Max and many others.

For this review, I tested apps like Avid Media Composer, FilmLight's Baselight for Media Composer color correction plug-in, Adobe Premiere Pro, Adobe Media Encoder and Adobe After Effects, as well as Blackmagic's Resolve 12.5.2, which chewed through basic color correction. In terms of testing time, I typically keep a review system for a couple of months, but I really wanted to put this workstation through its paces as thoroughly as possible — I've had it for three months and counting.

I always love to review workstations like the HP Z1G3 because of the raw power they possess. While HP sent me one of the top-of-the-line Z1G3 configurations, which retails for a list price of $3,486, the line has a pretty reasonable starting price of $1,349. From Intel i3, i5 and i7 configurations all the way up to the almighty Intel Xeon, the HP Z1G3 can be customized to fit your workflow, whether you just need to check your email or color correct video from your GoPro.

Here are the specs that make up the HP Z1G3 All-in-One workstation I received:

● 23.6-inch UHD/4K non-glare and non-touch display (3840×2160)
● Intel Xeon E3-1270 v5 CPU, 3.6GHz (4 Cores / 8 Threads)
● 64GB DDR4 SODIMM 2133MHz (4 x 16GB)
● Nvidia Quadro M2000M graphics (4GB)
● Two Z Turbo drives (512GB, PCIe M.2)
● Wireless keyboard and mouse
● Two Thunderbolt 3/USB 3.1 ports
● USB charging port
● Media card reader
● DisplayPort out

As I mentioned earlier, I tested the Z1G3 with many different apps, but recently I've been diving deeper into color correction, which luckily fits right into my testing. A few of the most strenuous real-world tests for computer systems are running 3D modeling apps like Maxon Cinema 4D and color correction suites like Resolve. Of course, apps like After Effects are great tests as well, but stacking nodes on nodes on nodes in Resolve will really tax your CPU as well as your GPU.

One thing that can really set apart high-end systems like the Z1G3 is how little delay there is when using a precision color correction panel like Tangent's Elements or Ripple. On lesser systems, you will move one of the color wheel balls and, half a second later, the color wheel moves on screen. I added a few clips and nodes to the timeline, and when using the panels I noticed no discernible delay (at least no more than I would expect). While this isn't a scientific test, it is crucial for folks looking to plug in external devices.

For more scientific tests, I stuck to apps like Maxon's Cinebench, AJA's System Test and Blackmagic's Disk Speed Test. In Cinebench, the Z1G3 ranked at the top of the list when compared to similar systems. In AJA's System Test, I tested the read/write speed of the non-OS drive (basically the editing or cache drive). It sustained around 1520MB/s read and 1490MB/s write. I say around because I couldn't get the AJA app to display the full read/write numbers due to the high-resolution display scaling in Windows; I tried scaling down to 1920×1080, but no luck. In Blackmagic's Disk Speed Test, I was running at 1560MB/s read and 1497.3MB/s write. The drive I ran these tests on is HP's version of the Samsung-powered M.2 PCIe SSD, known more affectionately by HP as a Z Turbo drive. The only thing better at the moment would be a bunch of these drives arranged in a RAID-0 configuration. Luckily, you can do that through the Thunderbolt 3 port with some spare SSDs you have lying around.

Almost daily I ran Premiere Pro CC, Media Encoder and Resolve Studio 12.5.2. I was really happy with the performance in Premiere. When working with QuickTimes in inter-frame codecs like H.264 and AVCHD (non-edit-friendly codecs), I was able to work without too much stuttering in the timeline. When I used intra-frame codecs like ProRes HQ from a Blackmagic Pocket Cinema Camera, Premiere worked great. I even jumped into Adobe's Lumetri color tools while using Tangent's Ripple external color correction panel, and it worked with little discernible delay. I did notice that Premiere had a little more panel delay than Media Composer and Resolve, but that seemed to be a software problem rather than a workstation problem.

One of my favorite parts about using a system with an Nvidia graphics card, especially a Quadro card like the M2000M, is the ability to encode multiple versions of a file at once. Once I was done editing some timelapses in Premiere, I exported using Media Encoder, applying three presets I made: a square 600×600 H.264 for Instagram, a 3840×2160 H.264 for YouTube and a 480×360 animated GIF for Twitter. Once I told Media Encoder to encode, it ran all three exports concurrently — a really awesome feature. With the Nvidia Quadro card installed, it really sped along the exports.
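Media Encoder handles the queuing internally, but if you want a feel for what concurrent exports look like outside of Adobe's tools, here is a rough Python sketch using ffmpeg as a stand-in; the source filename is a hypothetical placeholder, the three outputs mirror the presets above, and ffmpeg needs to be installed and on your PATH.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

SOURCE = "timelapse_master.mov"   # hypothetical master file

# Three deliverables, roughly mirroring the presets described above.
jobs = [
    ["ffmpeg", "-y", "-i", SOURCE, "-vf", "scale=600:600",
     "-c:v", "libx264", "instagram.mp4"],
    ["ffmpeg", "-y", "-i", SOURCE, "-vf", "scale=3840:2160",
     "-c:v", "libx264", "youtube.mp4"],
    ["ffmpeg", "-y", "-i", SOURCE, "-vf", "scale=480:360", "twitter.gif"],
]

# Each ffmpeg process runs independently, so all three encodes
# proceed at the same time, much like Media Encoder's concurrent queue.
with ThreadPoolExecutor(max_workers=len(jobs)) as pool:
    for result in pool.map(subprocess.run, jobs):
        print(result.args[-1], "finished with exit code", result.returncode)
```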

Media Composer
Another app I wanted to test was Media Composer 8.6.3. Overall, Media Composer ran great, except for its handling of the high-resolution display. As I've said in previous reviews, this isn't really the fault of HP, but of the software manufacturers who haven't updated their interfaces for the latest UHD displays. I had filmed a little hike I took with my five-year-old. I gave him a GoPro while I had my own. Once we got the footage back home, I imported it into Media Composer, grouped the footage and edited it using the multi-cam edit workflow.

Simply put, the multi-camera split was on the left and the clip in my sequence played simultaneously on the right. Before I grouped the footage into a multi-group, I transcoded the H.264s into DNxHD 175, an intra-frame, edit-friendly codec. The transcode ran at nearly realtime, so it took 60 minutes to transcode a 60-minute H.264 — not bad. In the end, I was able to edit the two-camera multi-group at 1920×1080 resolution with only minor hiccups. Occasionally, I would get caught in fast forward for a few extra seconds when J-K-L editing, but nothing that made me want to throw my keyboard or mouse against the wall.

Once done editing, I installed the FilmLight color correction plug-in for Media Composer. I had a really awesome experience coloring using Baselight in Media Composer on the Z1G3. I didn’t have any slowdowns, and the relationship between using the color correction panel and Baselight was smooth.

Resolve
The last app I tested with HP's Z1G3 All-in-One workstation was Blackmagic's Resolve 12.5.2. Much like my other tests, I concentrated on color correction with the Tangent Ripple and the Element-Vs iOS app. I had four or five nodes going on the color page before I started to see a slowdown. I was using the native H.264 and ProRes HQ files from the cameras, so I didn't make it easy for Resolve, but it still worked. Once I added a little sharpening to my clips, the Z1G3 really had to dig in: I heard the faint hum of fans, which up until this point hadn't kicked in, and this is also where the system started to slow down and become sluggish.

Summing Up
The Z1G3 is one of my favorite workstations, period. A while ago, I reviewed HP's previous All-in-One workstation, the Z1G2, and at the time it was my favorite. One of my few complaints was that, while it was easy to service, it was very heavy and bulky. When I opened the Z1G3 box, I immediately noticed how much lighter and more streamlined the design was. It almost felt like they took away 50 percent of the bulk, which is something I really appreciate. I can tell that one of the main focuses with the Z1G3 was minimizing its footprint and weight while increasing the power. HP really knocked it out of the park.

One of the only things I wish were different on the Z1G3 I tested is the graphics card. While the Nvidia Quadro M2000M is a great graphics card, it is the mobile version of a Quadro, with 128 fewer CUDA cores and 26GB/s less memory bandwidth than its desktop equivalent, the M2000. I would love the option of a full-sized Quadro instead of the mobile version, but I also understand that power consumption would go up, as would the form factor, so maybe I give HP a pass here.

In the end, I know everyone reading this review is saying to themselves, “I love my iMac, so why would I want the HP Z1G3?” If you are a die-hard Apple user, or you just saw the new Microsoft Surface Studio announcement, it might be a hard sell, but I love both Windows- and Mac OS-based systems, and the Z1G3 is awesome. What's even more awesome is that it is easily upgradeable: I took off the back cover, and with a simple latch I could have added a 2.5-inch hard drive or two in under a minute. If you are looking for a powerful new workstation that stands up to Resolve and Premiere Pro CC, the HP Z1G3 is for you.


Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Red Giant’s Universe 2

By Brady Betzel

Throughout 2016, we have seen some interesting acquisitions in the world of post production software and hardware — Razer bought THX, Blackmagic bought Ultimatte and Fairlight, and Boris FX bought GenArts, to name a few. We've also seen a tremendous consolidation of jobs: editors are now being tasked as final audio mixers, final motion graphics creators, final colorists and much more.

Personally, I love doing more than just editing, so knowing tools like Adobe After Effects and DaVinci Resolve, in addition to Avid Media Composer, has really helped me become not only an editor but someone who can jump into After Effects or Resolve and do good work.

Unfortunately for some people, it is the nature of the post beast to know everything. Plug-ins play a gigantic part in balancing my workload, my available time and the quality of the final product. If I didn't have plug-ins like Imagineer's Mocha Pro, Boris FX's Continuum Complete, GenArts' Sapphire and Red Giant's Universe 2, I would be forced to turn down work, because the time it would take to create a finished piece would outweigh the fee I could charge a client.

A while back, I reviewed Red Giant's Universe when it was in version 1 (check it out here). In the beginning, Universe offered lifetime, annual and free memberships. It seems the belt has tightened a little at Red Giant, as Universe 2 is now $99 a year or $20 a month, with a 14-day free trial. No permanent free version or lifetime memberships are offered (if you downloaded the free Universe before June 28, you will still be able to access those free plug-ins in the Legacy group). Moreover, they have doubled the monthly fee from $10 to $20 — definitely trying to get everyone onto the annual subscription train.

Personally, I think the old model resulted in too much focus on the breadth of Universe, jamming in as many plug-ins, transitions and effects as possible, and not enough focus on specific plug-ins within it. I actually like Red Giant's renewed focus on a richer toolset as opposed to a merely fuller one.

Digging In
Okay, enough of my anecdotal narrative and on to some technical awesomeness. Red Giant's Universe 2 is a vast plug-in collection that is compatible with Adobe's Premiere Pro and After Effects CS6-CC 2015.3; Apple Final Cut Pro X 10.0.9 and later; Apple Motion 5.0.7 and later; Vegas 12 and 13; DaVinci Resolve 11.1 and later; and HitFilm 3 and 4 Pro. You must have a compatible GPU installed, as Universe has no CPU fallback for unsupported machines. Basically, you need a GPU with 2GB of memory or more. And don't forget about Intel, as its graphics support has improved a lot lately. For more info on OS compatibility and specific GPU requirements, check out Red Giant's compatibility page.

Universe 2 is loaded with great plug-ins that, once you dig in, you will want to use all the time. For instance, I really like the ease of use of Universe's RGB Separation and Chromatic Glow. If you want a full rundown of each and every effect, you should download the Universe 2 trial and check this out. In this review I am only going to go over some of the newly added plug-ins — HUD Components, Line, Logo Motion and Color Stripe — but remember, there are a ton more.

I will be bouncing around different apps like Premiere Pro and After Effects. Initially, I wanted to see how well Universe 2 worked inside of Blackmagic's DaVinci Resolve 12.5.2. Resolve gave me a little trouble at first: it crashed once I clicked on OpenFX in the color page. I rebooted completely and got an error message saying that OpenFX had been disabled. I did a little research (and by research I mean I typed “Disabled OpenFX Resolve” into Google) and stumbled on a post on Blackmagic's forum that suggested deleting “C:\ProgramData\Blackmagic Design\Davinci Resolve\Support\OFXPluginCache.xml” might fix it. Once I deleted that file and relaunched Resolve, I clicked on the OpenFX tab in the color page, waited 10 minutes, and it started working. From that point on it loaded fast. So, barring the Resolve installation hiccup, there were no problems installing in Premiere and After Effects.
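For what it's worth, that fix is easily scriptable. Here is a tiny Python sketch that removes the cache file quoted above so Resolve rebuilds it on its next launch; adjust the path if your install lives somewhere else.

```python
import os

# The OpenFX plug-in cache called out in the Blackmagic forum post;
# deleting it forces Resolve to rescan its OFX plug-ins on next launch.
cache = r"C:\ProgramData\Blackmagic Design\Davinci Resolve\Support\OFXPluginCache.xml"

if os.path.exists(cache):
    os.remove(cache)
    print("OFX plug-in cache deleted; relaunch Resolve to rebuild it.")
else:
    print("No cache file found at", cache)
```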

Once installed, you will notice that Universe adds a few folders inside your plug-in drop-down: Universe Blur, Universe Distort, Universe Generators, Universe Glow, Universe Legacy, Universe Motion Graphics, Universe Stylize and Universe Utilities. You may recognize some of these if you have used an earlier version of Universe, but something you will not recognize is that each Universe plug-in now has a “uni.” prefix.

I am still not sure whether I like this or hate it. On one hand, it's easy to search for if you know exactly what you want in apps like Premiere. On the other hand, it runs counter to what I am used to as a grouchy old editor. In the end, I decided to run my tests in After Effects and Premiere. Resolve is great, but for tracking a HUD in 3D space I was more comfortable in After Effects.

HUD Components
First up is HUD Components, located under the Universe Motion Graphics folder and labeled “uni.HUD Components.” What used to take many Video Copilot tutorials and many inspirational views of HUD/UI master Jayse Hansen's (@jayse_) work now takes me minutes. Obviously, making anything on the level of a master like Hansen will take hundreds of hours and thousands of attempts, but still — with HUD Components you can make those sci-fi in-helmet elements quickly.

When you apply HUD Components to a solid layer in After Effects, you can immediately see the start of your HUD. To see what the composite over my footage would look like, I changed the blend mode to Add, which is listed under Composite Settings. From there, you can find some awesome pre-built looks under the Choose a Preset button. The pre-built elements are all good starting points, but I would definitely dive further into customizing, maybe layering multiple HUDs over each other with different blend modes, for example.

Diving further into HUD Components, there are four separate Elements that you can customize, each with different images, animations, colors, clone types and much more. One thing to remember is that the transformation settings and order of operations work from the top down. For instance, if you change the rotation on Element one, it will affect each element under it, which is kind of handy if you ask me. Once you get the hang of how HUD Components works, it is really easy to make some unique UI components. I really like to use the uni.Point Zoom effect (listed under Universe Glow in Effects & Presets); it gives your HUD component a sort of projector-like effect.

There are so many ways to use and apply HUD Components in everyday work, from building dynamic lower thirds with all of the animatable arcs, clones and rotations, to building sci-fi elements and then applying Holomatrix or even Glitch to create awesome motion graphics with multiple levels of detail and color. I did try using HUD Components in Resolve when tracking a 3D object, but I couldn't quite get the look I wanted, so I ditched it and used After Effects.

Line
Second up is the Line plug-in. While drawing lines along a path in After Effects isn't necessarily hard, it is kind of annoying — think having to make custom map graphics to and from different places daily. Line takes the hard work out of making line effects between different points. This plug-in also carries the uni. prefix and is located under Universe Motion Graphics, labeled uni.Line.

This plug-in is very simple to use and animate. I quickly found a map, applied uni.Line, placed my beginning and end points, animated the line using two keyframes under Draw On and bam! I had an instant travel-vlog-style graphic that showed me going from California to Australia in under three minutes (yes, I know three minutes seems a little fast to travel to Australia, but that's really how long it took, render and all). Under the Effect Controls, you can find preset looks; beginning and ending shape options like circles or arrows; line types, including segmented lines; and curve types. You can even move the peak of the curve using the bezier style option.

Logo Motion
Third is Logo Motion, located under Universe Motion Graphics and titled uni.LogoMotion. In a nutshell, you take a pre-built logo (or anything, for that matter), pre-compose it, throw the uni.LogoMotion effect on top, apply a preset reveal, tweak your logo animation, dynamically adjust the length of your pre-comp — which directly affects how the logo wipes on and off — and, finally, render.

This is another plug-in that makes my life as an editor who dabbles in motion graphics really easy. Red Giant even included some lower-third animation presets that help create dynamic lower-third movements. You can select from the pre-built looks, add some motion while the logo is “idle,” adjust things like rotation, opacity and blur under the start and end properties, and even add motion blur. The new preset browser in Universe 2 really helps with plug-ins like Logo Motion, letting you audition animations easily before applying them. You can quickly add some life to any logo or object with one or two clicks, and if you want to get detailed, you can dial in the idle animation and/or transition settings.

Color Stripe
Fourth is Color Stripe, a transition that uses color layers to wipe across and reveal another layer. This one is a pretty niche use case, but it is still worth mentioning. In After Effects, transitions are generally a little cumbersome; I found the Universe 2 transitions infinitely easier to use in NLEs like Adobe Premiere. From the always-popular swish pan to exposure blur, there are some transitions you might use once and some you might use a bunch. Color Stripe is a transition you probably won't want to use too often, but when you do need it, it will be right at your fingertips. You can choose from different color schemes, like analogous or tetradic, or even create a custom scheme to match your project.

In the end, Universe 2 has some effects that become essential once you begin using them, like uni.Unmult, uni.RGB Separation and the awesome uni.Chromatic Glow. The new ones are great too; I really like the ease of use of uni.HUD Components. Since these effects are GPU-accelerated, you might be surprised at how fast and fluidly they work in your project without slowdowns. For anyone who likes apps like After Effects but can't afford to spend hours dialing in the perfect UI interface and HUD, Universe 2 is perfect. Check out all of the latest Red Giant Universe 2 tools here.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.