
Review: MZed.com’s Directing Color With Ollie Kenchington

By Brady Betzel

I am constantly looking to educate myself, no matter the source — or subject. Whether I am learning how to make a transition in Adobe After Effects from an eSports editor on YouTube or watching Warren Eagles teach color correction in Blackmagic’s DaVinci Resolve on FXPHD.com, I’m always beefing up my skills. I even learn from bad tutorials — they teach you what not to do!

But when you come across a truly remarkable learning experience, it is only fair to share it with the rest of the world. Last year I saw an ad for an MZed.com course called “Directing Color With Ollie Kenchington” and was immediately interested. These days you can pretty much find any technical tutorial you can dream of on YouTube, but truly professional, theory-based education series in the vein of higher education are very hard to come by. Even the ones you pay for aren’t always worth their price of admission, which is a huge letdown.

Ollie sharing his wisdom.

Once I gained access to MZed.com I wanted to watch every educational series they had. From lighting techniques with ASC member Shane Hurlbut to the ARRI Amira Camera Primer, there are over 150 hours of education available from industry leaders. However, I found my way to Directing Color…

I am often asked if I think people should go to college or a film school. My answer? If you have the money and time, you should go to college followed by film school (or do both together, if the college offers it). Not only will you learn a craft, but you will most likely spend hundreds of hours studying and visualizing the theory behind it. For example, when someone asks me about the science behind camera lenses, I can confidently answer them thanks to my physics class based on lenses and optics from California Lutheran University (yes, a shameless plug).

In my opinion, a two-, four- or even 10-year education allows me to live in the grey. I am comfortable arguing both sides of a debate, as well as the options in between — the grey. I feel like my post-high school education really allowed me to recognize and thrive in the nuances of debate. It left me playing devil’s advocate maybe a little too much, but also able to have civil and productive discussions with others without being demeaning or nasty — something we are actively missing these days. So if living in the grey is for you, I really think a college education supplemented by online or film school education is valuable (assuming you decide the debt is worth it, like I did).

However, I know that is not an option for everyone since it can be very expensive — trust me, I know. I am almost done paying off my undergraduate fees while continuing to pay off my graduate ones, and I am still two or three classes away from finishing. That being said, Directing Color With Ollie Kenchington is the only online education series I have seen so far that is on the same level as some of my higher education classes. Not only is the content beautifully shot and color corrected, but Ollie gives confident and accessible lessons on how color can be used to draw the viewer’s attention to multiple parts of the screen.

Ollie Kenchington is a UK-based filmmaker who runs Korro Films. From the trailer of his Directing Color series, you can immediately see the beauty of Ollie’s work and know that you will be in safe hands. (You can read more about his background here.)

The course raises the online education bar and will elevate the audience’s idea of professional insight. The first module, “Creating a Palette,” covers the thinking behind creating a color palette for a small catering company. You may even want to start with the last module, the bonus “Ox & Origin,” to get a look at what Ollie will be creating throughout the seven modules and roughly hour and a half of content.

While Ollie goes over “looks,” the beauty of this course is that he walks through his internal thought process, including deciding on palettes based on color theory. He didn’t just choose teal and orange because they look good; he chose his color palette based on complementary colors.
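Complementary colors sit opposite each other on the color wheel, which is why a teal-and-orange palette holds together: rotate a hue 180 degrees and you land on its complement. As a rough illustration (this is generic color math, not anything from Ollie’s course), here is that rotation in Python using the standard library’s colorsys module:

```python
import colorsys

def complementary(r, g, b):
    """Return the complementary RGB color by rotating hue 180 degrees."""
    h, l, s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    h = (h + 0.5) % 1.0  # half a turn around the color wheel
    r2, g2, b2 = colorsys.hls_to_rgb(h, l, s)
    return round(r2 * 255), round(g2 * 255), round(b2 * 255)

# A warm orange maps to a teal-leaning blue on the opposite side of the wheel
print(complementary(255, 128, 0))  # → (0, 127, 255)
```

Run the complement back through the function and you land on (roughly) the original color, which is the “opposite on the wheel” relationship Ollie is exploiting.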

Throughout the course Ollie covers some technical knowledge, including calibrating monitors and cameras, white balancing and shooting color charts to avoid wrong color balance in post. This is so important because if you skip these simple steps, your color correction session will be much harder — and fixing incorrect color balance takes time away from the fun of color grading. All of this is done through easily digestible modules that range from two to 20 minutes.

The modules include Creating a Palette; Perceiving Color; Calibrating Color; Color Management; Deconstructing Color 1 – 3 and the Bonus Module Ox & Origin.

Without giving away the entire contents of Ollie’s catalog, my favorite modules in this course are the on-set ones. Maybe it’s because I am not on set that often, but I found the “thinking out loud” about colors helpful. Knowing why reds represent blood, which raises your heart rate a little, is fascinating. He even goes through practical examples of color use in films such as Whiplash.

In the final “Deconstructing Color” modules, Ollie goes into a color bay (complete with practical candle backlighting) and dives into Blackmagic’s DaVinci Resolve. He takes the course full circle: because he had to rush through a scene, he shows how he can go into Resolve and add lighting to different sides of someone’s face; and because he took the time to set up proper lighting on set, he can focus on other parts of his commercial.

Summing Up
I want to watch every tutorial MZed.com has to offer, from “Philip Bloom’s Cinematic Masterclass” to Ollie’s other course, “Mastering Color.” Unfortunately, as of this review, you have to pay an additional fee to watch the “Mastering Color” series. It seems like an unfortunate trend in online education to charge a subscription fee and then charge more when an extra-special class comes along, but this class will supposedly be released to standard subscribers in due time.

MZed.com has two subscription models: MZed Pro, which is $299 for one year of streaming the standard courses, and MZed Pro Premium for $399. This includes the standard courses for one year and the ability to choose one “Premium” course.

“Philip Bloom’s Cinematic Masterclass” was the Premium course I was signed up for initially, but you can decide between this one and the “Mastering Color” course. You will not be disappointed regardless of which one you choose. Even their first course, “How to Photograph Everyone,” is chock-full of lighting and positioning instruction that can be applied in many aspects of videography.

I was really impressed with Directing Color With Ollie Kenchington, and if the other courses are this good, MZed.com will definitely become a permanent bookmark for me.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Red Giant’s Trapcode Suite 15

By Brady Betzel

We are now comfortably into 2019 and enjoying the Chinese Year of the Pig — or at least I am! So readers, you might remember that with each new year comes a Red Giant Trapcode Suite update. And Red Giant didn’t disappoint with Trapcode Suite 15.

Every year Red Giant adds more impressive features to its already amazing particle generator and emitter toolset, Trapcode Suite, and this year is no different. Trapcode Suite 15 keeps tools like 3D Stroke, Shine, Starglow, Sound Keys, Lux, Tao, Echospace and Horizon while significantly updating Particular, Form and Mir.

I won’t be covering each plugin in this review, but you can check out what each individual plugin does on Red Giant’s website.

Particular 4
The bread and butter of the Trapcode Suite has always been Particular, and Version 4 continues to be a powerhouse. The biggest differences between using a true 3D app like Maxon’s Cinema 4D or Autodesk Maya and Adobe After Effects (besides After Effects being pseudo-3D) are features like true raytraced rendering and particle systems that interact with fluid dynamics. As I alluded to, After Effects isn’t technically a 3D app, but with plugins like Particular you can create pseudo-3D particle systems that can affect and be affected by different particle emitters in your scenes. Trapcode Suite 15 — and, in particular (all the pun intended), Particular 4 — has evolved to another level with the latest update’s addition of Dynamic Fluids. Dynamic Fluids essentially allows particle systems that have the fluid-physics engine enabled to interact with one another, as well as create mind-blowing liquid-like simulations inside of After Effects.

What’s even more impressive is that with the Particular Designer and over 335 presets, you don’t need a master’s degree to make impressive motion graphics. While I love to work in After Effects, I don’t always have eight hours to make a fluidly dynamic particle system bounce off 3D text, or have two systems interact with each other for a text reveal. This is where Particular 4 really pays for itself. With a little research and tutorial watching, you will be up and rendering within 30 minutes.

When I was using Particular 4, I simply wanted to recreate the Dynamic Fluid interaction I had seen in one of Red Giant’s promos: two emitters crashing into each other in a viscous fluid, then interacting. While it isn’t necessarily easy, if you have a slightly above-beginner amount of After Effects knowledge you will be able to do this. Apply the Particular plugin to a new solid and open the Particular Designer in Effect Controls. From there you can designate emitter type, motion, particle type, particle shadowing, particle color and dispersion types, as well as add multiple instances of emitters, adjust physics and much more.

The presets for all of these options can be accessed by clicking the “>” symbol in the upper left of the Designer interface. You can access all of the detailed settings and building “Blocks” of each of these categories by clicking the “<” in the same area. With a few hours spent watching tutorials on YouTube, you can be up and running with particle emitters and fluid dynamics. The preset emitters are pretty amazing, including my favorite, the two-emitter fluid dynamic systems that interact with one another.

Form 4
The second plugin in Trapcode Suite 15 that has been updated is Trapcode Form 4. Form is a plugin that creates forms out of particles that live forever in a unified 3D space, allowing for interaction. Form 4 adds the updated Designer, which makes particle grids a little more accessible and easier to construct for non-experts. Form 4 also includes the same Fluid Dynamics update that Particular gained. The Fluid Dynamics engine really adds another level of beauty to Form projects, allowing you to create fluid-like particle grids from the 150 included presets or even your own .obj files.

My favorite settings to tinker with are Swirl and Viscosity. Using both in tandem can help create an ooey-gooey liquid particle grid that can interact with other Form systems to build pretty incredible scenes. To test how .obj models work within Form, I clicked over to www.sketchfab.com and downloaded an .obj 3D model. If you search for free, downloadable models, you can use them in your projects under Creative Commons licensing, as long as you credit the creator. When in doubt, always read the licensing. (You can find more info on Creative Commons licensing here.) In this case, they make great practice models.

Anyway, Form 4 allows us to import .obj files, including animated .obj sequences, as well as their textures. I found a Day of the Dead-type skull created by JMUHIST, pointed Form to the .obj as well as its included texture, added a couple of After Effects lights and a camera, and I was in business. Form has a great replicator feature (much like Element 3D). There are a ton of options, including fog distance under visibility, animation properties and even the ability to quickly add a null object linked to your model for quick alignment of other elements in the scene.

Mir 3
Up last is Trapcode Mir 3. Mir 3 is used to create 3D terrains, objects and wireframes in After Effects. In this latest update, Mir has added the ability to import .obj models and textures. Using fractal displacement mapping, you can quickly create some amazing terrains. From mountain-like peaks to alien terrains, Mir is a great supplement when using plugins like Video Copilot Element 3D to add endless tunnels or terrains to your 3D scenes quickly and easily.

And if you don’t own Element 3D, you will really enjoy the particle replication system: use one 3D object, then duplicate, twist, distort and animate multiple instances of it quickly. The best part about all of these Trapcode Suite tools is that they interact with the cameras and lighting native to After Effects, making for a unified animating experience (instead of animating separate camera and lighting rigs like in the old days). Two of my favorite features from the last update are the ability to use quad- or triangle-based polygons to texture your surfaces — which can quickly give an 8-bit or low-poly feel — and a second-pass wireframe that adds a grid-like surface to your terrain.

Summing Up
Red Giant’s Trapcode Suite 15 is amazing. If you have a previous version of the Trapcode Suite, you’re in luck: the upgrade is “only” $199. If you need to purchase the full suite, it will cost you $999. Students get a bit of a break at $499.

If you are on the fence about it, go watch Daniel Hashimoto’s Cheap Tricks: Aquaman Underwater Effects tutorial (Part 1 and Part 2). He explains how you can use all of the Red Giant Trapcode Suite effects with other plugins like Video Copilot’s Element 3D and Red Giant’s Universe, and he offers up some pro tips for using www.sketchfab.com to find 3D models.

I think I even saw him using Video Copilot’s FX Console, a free plugin that makes accessing plugins much faster in After Effects. You may have seen his work as @ActionMovieKid on Twitter or @TheActionMovieKid on Instagram. He does some amazing VFX with his kids — he’s a must follow. Red Giant made a power move getting him to make tutorials for them! Anyway, his Aquaman Underwater Effects tutorial takes you step by step through how to use each part of Trapcode Suite 15 in an amazing way. He makes it look a little too easy, but I guess that is a combination of his VFX skills and the Trapcode Suite toolset.

If you are excited about 3D objects, particle systems and fluid dynamics you must check out Trapcode Suite 15 and its latest updates to Particular, Mir and Form.

After I finished the Trapcode Suite 15 review, Red Giant released the Trapcode Suite 15.1 update, which includes Text and Mask Emitters for Form and Particular 4.1, an updated Designer, Shadowlet particle-type matching, Shadowlet softness and 21 additional presets.

This is a free update that can be downloaded from the Red Giant website.




Review: Boris FX’s Continuum and Mocha Pro 2019

By Brady Betzel

I realize I might sound like a broken record, but if you are looking for the best plugin to help with object removals or masking, you should seriously consider the Mocha Pro plugin. And if you work inside of Avid Media Composer, you should also seriously consider Boris Continuum and/or Sapphire, which can use the power of Mocha.

As an online editor, I consistently use Continuum along with Mocha for tight blur and mask tracking. If you use After Effects, there is even a whittled-down version of Mocha built in for free. For those pros who don’t want to deal with Mocha inside of an app, it also comes as a standalone software solution where you can copy and paste tracking data between apps or even export the masks, object removals or insertions as self-contained files.

The latest releases of Continuum and Mocha Pro 2019 continue the evolution of Boris FX’s role in post production image restoration, keying and general VFX plugins, at least inside of NLEs like Media Composer and Adobe Premiere.

Mocha Pro

As an online editor I am always calling on Continuum for its great Chroma Key Studio, Flicker Fixer and blurring. Because Mocha is built into Continuum, I am able to quickly track (backwards and forwards) difficult shapes and even erase shapes that the built-in Media Composer tools simply can’t handle. But if you are lucky enough to own Mocha Pro, you also get access to some amazing tools that go beyond planar tracking — such as automated object removal, object insertion, stabilizing and much more.

Boris FX’s latest updates to Boris Continuum and Mocha Pro go even further than what I’ve already mentioned and have resulted in new version naming — this round we are at 2019 (think of it as Version 12). They have also created the new Application Manager, which makes it a little easier to find the latest downloads. You can find them here. It really helps when jumping between machines and needing to quickly activate and deactivate licenses.

Boris Continuum 2019
I often get offline edits with effects from a variety of plugins — lens flares, random edits, light flashes, whip transitions and many more — so I need Continuum to be compatible with offline clients’ work. I also need it for image repair and compositing.

In this latest version of Continuum, Boris FX has not only kept plugins like Primatte Studio, it has brought back Particle Illusion and updated Mocha and Title Studio. Overall, Continuum and Mocha Pro 2019 feel a lot snappier when applying and rendering effects, probably because of the overall GPU-acceleration improvements.

Particle Illusion has been brought back from the brink of death in Continuum 2019 as a 64-bit, keyframe-able particle emitter system that can even be tracked and masked with Mocha. The revamp includes an updated interface, realtime GPU-based particle generation, an expanded and improved emitter library (complete with motion-blur-enabled particle systems) and even a standalone app that can design systems to be used in the host app — though you cannot render systems inside the standalone app.

While Particle Illusion is part of the entire Continuum toolset that works with OFX apps like Blackmagic’s DaVinci Resolve, Media Composer, After Effects and Premiere, it seems to work best in applications like After Effects, which can handle composites simply and naturally. Inside the Particle Illusion interface you can find all of the pre-built emitters. If you only have a handful, make sure you download the additional emitters, which you can find in the Boris FX App Manager.

Particle Illusion: Before and After

I had a hard time seeing my footage in a Media Composer timeline inside of Particle Illusion, but I could still pick my emitter, change specs like life and opacity, exit out and apply to my footage. I used Mocha to track some fire from Particle Illusion to a dumpster I had filmed. Once I dialed in the emitter, I launched Mocha and tracked the dumpster.

The first time I went into Mocha I didn’t see the preset tracks for the emitter or the world in which the emitter lives; the second time I launched it, I saw track points. From there you can track the area where you want your emitter placed. Once you are done and happy with your track, jump back to your timeline, where it should be reflected. In Media Composer I noticed that I had to go into the Mocha options and change the setting from Mocha Shape to no shape — essentially, the Mocha shape acts like a matte and cuts off anything outside it.

If you are inside of After Effects, most parameters can now be keyframed and parented (aka pick-whipped) natively in the timeline. The Particle Illusion plugin is a quick, easy and good-looking tool to add sparks, Milky Way-like star trails or even fireworks to any scene. Check out @SurfacedStudio’s tutorial on Particle Illusion to get a good sense of how it works in Adobe Premiere Pro.

Continuum Title Studio
When inside of Media Composer (prior to the latest release, 2018.12), there were very few ways to create titles at a higher resolution than HD (1920×1080) — New Blue Titler was the only other option if you wanted to stay within Media Composer.

Title Studio within Media Composer

At first, the Continuum Title Studio interface appeared to be a mildly updated Boris Red interface — and I am allergic to the Boris Red interface. Some of the icons for keyframing and the way properties are adjusted look similar and threw me off. I tried really hard to jump into Title Studio and love it, but I never got comfortable with it.

On the flip side, there are hundreds of presets that can help build quick titles, and they render a lot faster than New Blue Titler did. In some of the presets I noticed the text was placed outside of 16×9 title safety, which is odd since that is a long-standing rule in television. In the authors’ defense, they are within action safety, but still.

If you need a quick way to make 4K titles, Title Studio might be what you want. The updated Title Studio includes realtime playback using the GPU instead of the CPU, new materials, new shaders and external monitoring support using Blackmagic hardware (AJA support will be coming at some point). There are some great presets, including pre-built slates, lower thirds, kinetic text and even progress bars.

If you don’t have Mocha Pro, Continuum can still access and use Mocha to track shapes and masks. Almost every plugin can access Mocha and can track objects quickly and easily.

That brings me to the newly updated Mocha, which has some extremely helpful new features, including a Magnetic Spline tool, prebuilt geometric shapes and more.

Mocha Pro 2019
If you loved the previous version of Mocha, you are really going to love Mocha Pro 2019. Not only do you get the Magnetic Lasso, pre-built geometric shapes, the Essentials interface and high-resolution display support, but Boris FX has rewritten the Remove Module code to use GPU video hardware, which increases render speeds about four to five times. In addition, there is no longer a separate Mocha VR software suite; all of the VR tools are included inside of Mocha Pro 2019.

If you are unfamiliar with what Mocha is, then I have a treat for you. Mocha is a standalone planar tracking app as well as a native plugin that works with Media Composer, Premiere and After Effects, or through OFX in Blackmagic’s Fusion, Foundry’s Nuke, Vegas Pro and Hitfilm.

Mocha tracking

In addition (and unofficially) it will work with Blackmagic DaVinci Resolve by way of importing the Mocha masks through Fusion. While I prefer to use After Effects for my work, importing Mocha masks is relatively painless. You can watch colorist Dan Harvey run through the process of importing Mocha masks to Resolve through Fusion, here.

But really, Mocha is a planar tracker, which means it tracks multiple points in a defined area and works best on flat surfaces — or at least segmented surfaces, like the side of a face: ear, nose, mouth and forehead tracked separately instead of all at once. From blurs to mattes, Mocha tracks objects like glue and can be a great asset for an online editor or colorist.
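Under the hood, “planar” means the tracker models the frame-to-frame motion of that defined area as a single 3x3 homography rather than as a cloud of independent point tracks. As a rough sketch of the idea (generic projective-geometry math, not Mocha’s actual code — the matrix values here are made up for illustration), this is how one homography moves all four corners of a tracked plane at once:

```python
import numpy as np

# Hypothetical homography: a slight zoom (1.1x) plus a small translation,
# the kind of motion a planar tracker might solve for between two frames.
H = np.array([
    [1.1, 0.0, 5.0],   # x scales by 1.1 and shifts 5px right
    [0.0, 1.1, 2.0],   # y scales by 1.1 and shifts 2px down
    [0.0, 0.0, 1.0],
])

# Corners of a 100x50 tracked region in the previous frame
corners = np.array([[0, 0], [100, 0], [100, 50], [0, 50]], dtype=float)

def warp(H, pts):
    """Apply a homography to 2D points via homogeneous coordinates."""
    homo = np.hstack([pts, np.ones((len(pts), 1))])  # lift to (x, y, 1)
    out = homo @ H.T                                  # project through H
    return out[:, :2] / out[:, 2:3]                   # divide out w

print(warp(H, corners))
```

Solving for one matrix per frame is what makes planar tracking so stable on flat-ish surfaces: every pixel in the region votes for the same motion, instead of each point track drifting on its own.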

If you have read any of my plugin reviews you probably are sick of me spouting off about Mocha, saying how it is probably the best plugin ever made. But really, it is amazing — especially when incorporated with plugins like Continuum and Sapphire. Also, thanks to the latest Media Composer with Symphony option you can incorporate the new Color Correction shapes with Mocha Pro to increase the effectiveness of your secondary color corrections.

Mocha Pro Remove module

So how fast is Mocha Pro 2019’s Remove Module these days? It used to be a very slow process, taking lots of time to calculate an object’s removal. With the latest release, including improved GPU support, the render time has been cut down tremendously — in my estimation, three to four times the speed (and that’s on the safe side). Removal jobs that take under 30 seconds in Mocha Pro 2019 would have taken four to five minutes in previous versions. It’s quite a big improvement in render times.

There are a few changes in the new Mocha Pro, including interface changes and some amazing tool additions. There is a new drop-down tab that offers different workflow views once you are inside of Mocha: Essentials, Classic, Big Picture and Roto. I really wish the Essentials view was out when I first started using Mocha, because it gives you the basic tools you need to get a roto job done and nothing more.

For instance, just giving access to the track-motion parameters (Translation, Scale, Rotate, Skew and Perspective) with big shiny buttons eliminates my need to watch YouTube videos on how to navigate the Mocha interface. However, if, like me, you are more than just a beginner, the Classic interface is still available and is the one I reach for most often — it’s literally the old interface. Big Picture hides the tools and gives you the most screen real estate for your roto work. My favorite after Classic is Roto, which shows just the project window and the classic top toolbar. It’s the best of both worlds.

Mocha Pro 2019 Essentials Interface

Beyond the interface changes are some additional tools that will speed up any roto work. I imagine the most requested feature Boris FX gets for Mocha is the addition of basic shapes, such as rectangles and circles — one of its longest-running user requests. In my work, I am often drawing rectangles around license plates or circles around faces with X-splines, so why not eliminate a few clicks and have that done already? Answering that need, Mocha now has elliptical and rectangular shapes ready to go in both X-splines and B-splines with one click.

I use Continuum and Mocha hand in hand. Inside of Media Composer I will use tools like Gaussian Blur or Remover, which typically need tracking and roto shapes created. Once I apply the Continuum effect, I launch Mocha from the Effect Editor and bam, I am inside Mocha. From here I track the objects I want to affect, as well as any objects I don’t want to affect (think of it like an erase track).

Summing Up
I can save tons of time and also exponentially improve the effectiveness of my work when using Continuum 2019 and Mocha Pro 2019. It’s amazing how much more intuitive tracking is in Mocha than with the built-in Media Composer and Symphony trackers.

In the end, I can’t say enough great things about Continuum and especially Mocha Pro. Mocha saves me tons of time in my VFX and image restoration work. From removing camera people behind the main cast in the wilderness to blurring faces and license plates, using Mocha in tandem with Continuum is a match made in post production heaven.

Rendering in Continuum and Mocha Pro 2019 is a lot faster than in previous versions, really giving me a leg up on efficiency. Time is money, right? On top of that, using Mocha Pro’s magical Remove Module takes my image restoration work to the next level, separating me from other online editors who use standard paint and tracking tools.

In Continuum, Primatte Studio gives me a leg up on greenscreen keys with its exceptional ability to auto-analyze a scene and perform 80% of the keying work before I dial in the details. Whenever anyone asks me what tools I couldn’t live without, I without a doubt always say Mocha.

If you want a real Mocha Pro education, you need to watch all of Mary Poplin’s tutorials. You can find them on YouTube. Check out this one on how to track and replace a logo using Mocha Pro 2019 in Adobe After Effects. You can also find great videos at Borisfx.com.

Mocha point parameter tracking

I always feel like there are tons of tools inside of the Mocha Pro toolset that go unused simply because I don’t know about them. One I recently learned about in a Surfaced Studio tutorial was the Quick Stabilize function. It essentially stabilizes the video around the object you are tracking allowing you to more easily rotoscope your object with it sitting still instead of moving all over the screen. It’s an amazing feature that I just didn’t know about.

As I was finishing up this review I saw that Boris FX came out with a training series, which I will be checking out. One thing I always wanted was a top-down set of tutorials like the ones on Mocha’s YouTube page but organized and sent along with practical footage to practice with.

You can check out Curious Turtle’s “More Than The Essentials: Mocha in After Effects” on their website where I found more Mocha training. There is even a great search parameter called Getting Started on BorisFX.com. Definitely check them out. You can never learn enough Mocha!



Review: iOgrapher Multi Case for mobile filmmaking

By Brady Betzel

Thanks to the amazing iPhone X, Google Pixel and Samsung Galaxy, almost everyone has a high-end video camera on their person at all times, and this is helping spur on mobile filmmaking and vlogging.

From YouTube to Instagram to movies like Unsane (Steven Soderbergh) or Tangerine (Sean Baker) — and regardless of whether you think a $35,000 camera setup tells a story better than a $1,000 cell phone (looking at you, Apple iPhone XS Max) — mobile filmmaking is here to stay and will only get better.

iOgrapher’s latest release is the iOgrapher Multi Case, a compact mobile filmmaking mounting solution that works with today’s most popular phones. iOgrapher has typically created solutions that were tied to the mobile device being used for filmmaking, such as an iPhone, the latest Samsung Galaxy phones, iPads or even action cameras like a GoPro Hero 7 Black.

With the new iOgrapher Multi Case you can fit any mobile device that measures more than 5 ½” x 2 ¼” and less than 6 ½” x 3 ⅜”. Unfortunately, you won’t be fitting an iPad or a GoPro in the Multi Case, but don’t fret — iOgrapher makes rigs for those as well. On top of the Multi Case are two cold shoe mounts for lights, microphones or any other device, like a GoPro. To mount gear that uses ¼”-20 screw threads in the cold shoes, you will need a cold shoe-to-¼”-20 adapter, which is available on iOgrapher’s accessory page. You can also find these at Monoprice or Amazon for real cheap.

And if you are looking to order more mounts you may want to order some extra cold shoe adapters that can be mounted on the handles of the iOgrapher Multi Case in the additional ¼” 20 screw mounts. The mounts on the handles are great for adding in additional lighting or microphones. I’ve even found that if you are going to be doing some behind-the-scenes filming or need another angle for your shooting, a small camera like a GoPro can be easily mounted and angled. With all this mounting you should assume that you are going to be using the iOgrapher on a sturdy tripod. Just for fun, I mounted the iOgrapher Multi Case onto a GoPro 3-Way Grip, which can also be used as a light tripod. It wasn’t exactly stable but it worked. I wouldn’t suggest using it for more than an emergency shooting situation though.

On the flip side (all pun intended), the iOgrapher can be solidly mounted vertically with the ¼”-20 screw mounts on the handles. With Instagram making headway with vertical video in its Instagram Stories, iOgrapher took that idea and built it into the Multi Case — despite the grumbling from old folks who just don’t get vertical video.

Testing
I tried out both a Samsung Galaxy S8+ and an iPhone 7+, with their cases on, inside the iOgrapher Multi Case. Both fit. The iPhone 7+ stretched the boundaries of the Multi Case, but it did fit and worked well. Phones are inserted into the Multi Case via a spring-loaded bottom piece: from the left side (or the top, if you are shooting vertically), you push the bottom of the device into the covered corner slots until the opposite side can be secured under the edge of the Multi Case. It’s really easy.

I was initially concerned with the spring loading of the case; I wasn’t sure if the springs would be resilient enough to handle the constant pulling in and out of the phones, but the springs are high quality and held up beautifully. I even tried inserting my mobile phones tons of times and didn’t notice any issues with the springs or my phones.

Take care when inserting your phone into the Multi Case if you have a protective shield on the screen of your device. If you aren’t extra careful it can pull or snag on the cover — especially with the tight fit of a case. Just pay attention and there will be nothing to worry about. The simple beauty of the iOgrapher is that with a wider grip of your filmmaking device, you have a larger area to distribute any shaking coming from your hands, essentially helping stabilize your filmmaking without the need for a full-fledged gimbal.

If you accidentally drop your iOgrapher you may get a scratch, but for the most part these cases are built sturdy and can withstand punishment, whether it’s from your four-year-old or from the weather. If you want to get a little fancy, you can buy affordable lights like the Litra Torch (check out my review) to attach to the cold shoe mounts, or even a Rode microphone (don’t forget the TRS-to-TRRS adapter if you are plugging into an iPhone), and you are off and running.

Summing Up
I have been really intrigued with iOgrapher’s products since day one. They are an affordable and sturdy way to jump into filmmaking using cameras everyone carries with them every day: their phones.

Whether you are a high school student looking to get steady and professional mobile video, or a journalist looking for a quick way to make the most of your shots with just a phone, light, mic and tripod mount, the iOgrapher Multi Case will unlock your mobile filmmaking potential.

The iOgrapher Multi Case is a very durable protective case for your mobile filmmaking devices, and a steal at $79.99. If you are a parent looking for an inexpensive way to pique your child’s interest in video, take a look at www.iographer.com and grab a few accessories, like a Manfrotto light and a Rode VideoMicro, to add some subtle lighting and pick up better-quality audio.

Make sure to check out the demo of the iOgrapher Multi Case from Dave Basulto, the creator of iOgrapher, which includes trying out the fit of different phones.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

You can now export ProRes on a PC with Adobe’s video apps

By Brady Betzel

Listen up post pros! You can now natively export ProRes from a Windows 10-based PC for $20.99 with the latest release of Adobe’s Premiere, After Effects and Media Encoder.

I can’t overstate how big of a deal this is. Previously, the only ways to export ProRes from a PC were to use a knock-off, reverse-engineered codec that mimicked the process (creating footage that would often fail QC checks at networks) or to be in possession of a high-end app like Fusion, Nuke, Nucoda or Scratch. The only other option was to have a Cinedeck on hand and output your files through it in realtime. But, starting today, you can export native ProRes 4444 and ProRes 422 from Adobe Creative Cloud apps like Premiere Pro, After Effects and Media Encoder. Have you wanted to use those two or three Nvidia GTX 1080ti graphics cards that you can’t stuff into a Mac Pro? Well, now you can. No more being tied to AMD for ProRes exports.

Apple seems to be leaving its creative clients in the dust. Unless you purchased an iMac Pro or MacBook Pro, you have been stuck using a 2013 Mac Pro to export or encode your files to ProRes specifications. A lot of customers who had given Apple the benefit of the doubt, sticking around for a year or two longer than they probably should have while waiting for a new Mac Pro (allegedly being released in 2019), began to transition to Windows-based platforms. All the while, most would keep that older Mac just to export ProRes files while using the more powerful and updated Windows PC for their daily tasks.

Well, that day is now over, and in my opinion it suggests that Apple is less concerned with keeping its professional clients than ever before. That being said, I love that Apple has finally opened its ProRes codecs up to the Adobe Creative Cloud.

Let’s hope it becomes a system-wide feature, or is at least added to Blackmagic’s Resolve and Avid’s Media Composer. You can individually rent Adobe Premiere Pro or After Effects for $20.99 a month, rent the entire Adobe Creative Cloud library for $52.99 a month or, if you are a student or teacher, take advantage of the best deal around: $19.99 a month for ALL the Creative Cloud apps.

Check out Adobe’s blog about the latest Windows ProRes export features.



Review: DJI’s Mavic Air lightweight drone

By Brady Betzel

Since the first DJI Phantom was released in January of 2013, drones have found a place in our industry. Turn on almost any television show airing on National Geographic and you will see some sort of drone videography at work. DJI and GoPro have revolutionized how everyone films over the last decade.

Nowadays, drones are expected to be part of every cameraperson’s kit. Once DJI released its second-generation flagship drone, the Mavic Pro, the quality of footage and still images went from prosumer-level to professional. One thing that has always been a tough sell for me is the physical size of these unmanned aerial vehicles. The original DJI flagship drone, the Phantom, is a little big; you essentially need a duffle-sized backpack to carry it and its accessories. But now DJI has upped the ante with a smaller footprint: the Mavic Air.

The Mavic Air is absolutely the best drone I have ever had my hands on, from being the size of a few iPhones stacked on top of each other to recording high-quality footage that is being used on television shows airing around the world. It’s not only the easiest drone to use, with or without a remote, but it also delivers by far the best picture I have seen from a consumer-level drone under $1,000.

The Mavic Air is small, lightweight, and packed with amazing technology to help itself avoid slamming into the sides of buildings or trees. You can find all the nerdy tech specs here.

While there are super-high-end drones flying Red Monstros around, sometimes restrictions require the crew or cameraperson to downsize their equipment to only what is vital. So a drone that takes up a fraction of your carry-on luggage and still yields 4K footage acceptable for broadcast is a win. Obviously, you won’t be getting the same sensor quality you would find in those high-end cinema cameras, but the size and convenience can make up for it.

Digging In
The Mavic Air has many features that set it apart from the pack. SmartCapture allows anyone to fly the drone without a remote; instead, you use a few specific hand gestures. An updated slow-motion feature allows the Mavic Air to shoot 1080p at up to 120fps for those uber-epic slow-motion sweeps. There are multiple Quickshot modes in the DJI app, including the two newest: Asteroid and Boomerang.

DJI is known for advancing drone technology and keeping their prices relatively low. One of the most advanced features DJI consistently works on is flight sensors. Flight Autonomy 2.0 and Advanced Pilot Assistance Systems are the latest advances in technology for the Mavic Air. Flight Autonomy 2.0 takes information from the seven onboard infrared sensors to create its own 3D environmental map to avoid crashing. The Advanced Pilot Assistance System (APAS), which has to be enabled, will automatically tell the Mavic Air to avoid obstacles while flying.

Taking Flight
So really, how is the Mavic Air to fly and work with? It’s very easy. The drone is ultra-portable, and the remote folds up nicely and tightly as well; you unfold it and install the newly removable joysticks for flight. You mount your phone on the bottom and connect it with one of the three cables provided. I have a Samsung Galaxy, so I used a USB-C connection. I downloaded and updated the DJI Go app, connected the USB-C cable to my phone (which is a little clumsy and could hopefully be better integrated in the future), paired the remote to the Mavic Air and was flying… unless I had to update firmware. Almost every time I went to fly, at least one piece of equipment needed an update. While it doesn’t take long, it is annoying, especially when you have three young boys staring at you, waiting for you to fly this bad boy around. But once you are up and running, the Mavic is simple to fly.

I was most impressed with how it handled wind. The Mavic Air lives up to its name; while it is definitely tiny, it can hold its position in wind with the best of them. The sensors are amazing as well. You can try your hardest to run into stuff and the Mavic Air stops dead in its tracks (unless you are in sports mode, which disables the sensors, so don’t try to fly into anything).

The Mavic Air’s filming capabilities are just as impressive as its flying capabilities. I like to set my DJI footage to D-Cinelike to get a flatter color profile in my video, allowing for a little more range in the shadows and highlights when I am color correcting. However, the stock DJI color settings are amazing. Another trick is to knock the sharpening down to medium or off and add that back in when finishing your video. The Mavic Air records using a 3-axis stabilized camera for ultra-smooth video up to 4K (UHD) at 30fps in the newly upped 100Mb/s H.264/MPEG-4 AVC recording format. Not quite the H.265 compression, but I’m sure that will come in the next version. I would love to see DJI offer a built-in neutral density filter on their drones — this would really help get that cinematic look without sacrificing highlight and shadow detail.

In terms of batteries, I was sent two, which I desperately needed; they only lasted about 20 minutes apiece. The batteries take around an hour to charge, but when you buy the Fly More Combo for $999 you also get a sweet four-battery charger to charge them all at once. Check out all the goodies you get with the Fly More Combo.

You will want to buy a decent-sized memory card, probably 128GB, though 64GB would be fine. Inserting the memory card into the Air can take a little practice; the slot and cover are a little clunky and hard to use.

Summing Up
In the end, the DJI Mavic Air is the best drone I have used hands down. From the ultra-portable size (due to its compact folding ability) to the amazing shooting modes, you get everything you would want in a drone for under $1,000 with the Fly More Combo. The Mavic Air is just the right balance of technology and fun that will make you want to fly your drone.

Sometimes I get intimidated when flying a drone because they are so large and distracting, but not the Mavic Air — it is tiny and unassuming but packed with raw power to capture amazing images for broadcast or personal use.

While we typically don’t rate our reviewed products, I will just this once: I would rate the Mavic Air a 10, and can only hope that the next iteration embraces the Hasselblad history to stretch the Mavic Air in even more professional directions.



Apple updates FCPX, Motion and Compressor

By Brady Betzel

While Apple was busy releasing the new Mac mini last month, it was also quietly prepping some new updates. Apple has released free updates to FCPX, Motion and Compressor.

CatDV inside of FCPX

The FCPX 10.4.4 update includes Workflow Extensions, batch sharing, a Comparison Viewer, built-in video noise reduction, a timecode window and more. The Workflow Extensions are sure to take the bulk of the update cake: at launch, Apple announced that Shutterstock, Frame.io and CatDV will have extensions directly usable inside FCPX instead of through a web browser. Frame.io looks to be the most interesting extension, with a realtime reflection of who is watching your video and at what timecode, a.k.a. “Presence.”

Because Frame.io was rebuilt from the ground up using Swift, its venture inside FCPX should be extremely streamlined and fast. Internet bandwidth notwithstanding, Frame.io inside FCPX looks to be the go-to approval system for FCPX editors. I am not quite sure why Apple didn’t create its own approval and note-taking system, but it didn’t go wrong working with Frame.io. Since many editors already use it as their main approval system, FCPX users will surely love this implementation directly inside the app.

When doing color correction, it is essential to compare your current work with other images or the source image, and luckily for FCPX colorists you can now do this with the all-new Comparison Viewer. Essentially, the Comparison Viewer allows you to compare anything to the clip you are color grading.

One feature of this that I really love is that you can have access to scopes on both the reference image and your working image. If you understand how scopes work, color matching via parade or waveforms can often be quicker than by eyeball match.

Frame.io inside of FCPX

Final Cut Pro 10.4.4 has a few other updates: Batch Share, which allows you to cue up a bunch of exports or projects in one step; a Timecode window (a “why wasn’t this there already” feature that is essential when editing video); and built-in video noise reduction with adjustable amount and sharpness. There are a few other additions, like Tiny Planet, which lets you quickly create that spherical 360-degree video look. Not an important technical update, but fun nonetheless.

Motion
With Version 5.4.2, Apple has put the advanced color correction toolset from FCPX directly inside Motion. In addition, you can now add custom LUTs to your work. Apple has also added the Tiny Planet effect as well as a Comic filter. Those aren’t incredibly impressive, but the color correction toolkit is an essential addition to Motion and will get a lot of use.

Compressor
Compressor 4.4.2 is, in my opinion, the sleeper update. Apple has finally moved Compressor to a 64-bit engine to take advantage of all of your memory and improve overall performance with huge files, and it will still work with legacy 32-bit formats. Closed captions can now be burned into a video, including from the SRT format. Compressor also adds automatic configuration, applying the correct frame rate, field order and color space to your MXF and QuickTime outputs.

The FCPX, Motion and Compressor updates are available now for free if you have previously purchased the apps. If not, FCPX retails for $299.99, while Motion and Compressor are $49.99 each.



Review: Puget Systems Genesis I custom workstation

By Brady Betzel

With so many companies building custom Windows-based PCs these days, what really makes for a great build? What would make me want to pay someone to build me a PC versus building it myself? In this review, I will be going through a custom-built PC sent to me by Puget Systems. In my opinion, beyond the physical components, Puget Systems is the cream of the crop of custom-built PCs. Over the next few paragraphs I will focus on how Puget Systems identified the right custom-built PC solution for me (specifically for post), how my experience was before, during and after receiving the system and, finally, the specs and benchmarks of the system itself.

While quality components are definitely a high priority when building a new workstation, the big thing that sets Puget Systems apart from the rest of the custom-built PC pack is its personal and highly thorough support. I usually don’t get the full customer experience when reviewing custom builds. Typically, I am sent a workstation and maybe a one-sheet to accompany the system. To Puget Systems’ credit, they went from top to tail in helping me put together the system I would test. Not only did I receive a completely new, fully tested build, but I talked to a customer service rep, Jeff Stubbers, who followed up with me along the way.

First, I spoke with Jeff over the phone. We talked about my price range and what I was looking to do with the system. I usually get told what I should buy (and, by the way, I am not a person who likes to be told what I want). I have a lot of experience working on high-end workstations, have been building and supporting them essentially my entire life, and actively research the latest and greatest technology. Jeff from Puget Systems definitely took the correct approach; he started by asking which apps I use and how I use them. When using After Effects, am I doing more 3D work, or simple lower thirds and titles? Do I plan to continue using Avid Media Composer, Adobe Premiere Pro or Blackmagic’s DaVinci Resolve the most?

Essentially, my answers were that I use After Effects sparingly, but I do use it. I use Avid Media Composer professionally more than Premiere, but I see more and more Premiere projects coming my way. However, I think Resolve is the future, so I would love to tailor my system toward that. Oh and I dabble in Maxon Cinema 4D as well. So in theory, I need a system that does everything, which is kind of a tall order.

I told Jeff that I would love to stay below $10,000, but need the system to last a few years. Essentially, I was taking the angle of a freelance editor/colorist buying an above mid-range system. After we configured the system, Jeff continued to detail benchmarks that Puget Systems performs on a continuing basis and why two GTX 1080ti cards are going to benefit me instead of just one, as well as why an Intel i9 processor would specifically benefit my work in Resolve.

After we finished on the phone, I received an email from Jeff containing a link to a webpage that continually updated me on how my workstation was being built, complete with pictures of my actual system. There are also links to very interesting articles and benchmarks on the Puget Systems website. They perform more benchmarks pertinent to post production pros than I have seen from any other company. Usually you see a few generic Premiere or Resolve benchmarks, but nothing like Puget Systems’; even if you don’t buy a system from them, you should read their benchmarks.

While my system went through the build and ship process, I saw pictures and comments about who did what in the process over at Puget Systems. Beth was my installer. She finished and sent the system to Kyle who ran benchmarks. Kyle then sent it to Josh for quality control. Josh discovered the second GTX 1080ti was installed in a reduced bandwidth PCIe slot and would be sent back to Beth for correction. I love seeing this transparency! It not only gives me the feeling that Puget Systems is telling me the truth, but that they have nothing to hide. This really goes a long way with me. Once my system was run through a second quality control pass, it was shipped to me in four days. From start to finish, I received my system in 12 days. Not a short amount of time, but for what Puget Systems put the system through, it was worth it.

Opening the Box
I received the Genesis I workstation in a double box. A nice large box with sturdy foam corners encasing the Fractal Design case box. There was also an accessories box. Within the accessories box were a few cables and an awesome three-ring binder filled with details of my system, the same pictures of my system, including thermal imaging pictures from the website, all of the benchmarks performed on my system (real-world benchmarks like Cinebench and even processing in Adobe Premiere) and a recovery USB 3.0 drive. Something I really appreciated was that I wasn’t given all of the third-party manuals and cables I didn’t need, only what I needed. I’ve received other custom-built PCs where the company just threw all of the manuals and cables into a Ziploc and called it a day.

I immediately hooked the system up and turned it on… it was silent. Incredibly silent. The Fractal Design Define R5 Titanium case was lined with a sound-deadening material that took whatever little sound was there and made it zero.

Here are the specs of the Puget Systems Genesis I that I was sent:
– Gigabyte X299 Designare EX motherboard
– Intel Core i9 7940X 3.1GHz 14 Core 19.25MB 165W CPU
– Eight 16GB Crucial DDR4-2666 RAM modules (128GB total)
– Two EVGA GeForce GTX 1080 Ti 11GB video cards
– Onboard sound card
– Integrated WiFi+Bluetooth networking
– Samsung 860 Pro 512GB SATA3 2.5-inch SSD hard drive — primary drive
– Samsung 970 Pro 1TB M.2 SSD hard drive — secondary drive
– Asus 24x DVD-RW SATA (Black) CD / DVD-ROM
– Fractal Design Define R5 titanium case
– EVGA SuperNova 1200W P2 power supply
– Noctua NH-U12DX i4 CPU cooling
– Arctic Cooling MX-2 thermal compound
– Windows 10 Pro 64-bit operating system
– Warranty: Lifetime labor and tech support, one-year parts warranty
– LibreOffice software: courtesy install
– Chrome software: courtesy install
– Adobe Creative Cloud Desktop App software: courtesy Install
– Resolve 1-3 GPU

System subtotal: $8,358.38. The price is right in my opinion, and combined with the support and build detail, it’s a bargain.

System Performance
I ran some system benchmarks and tests that I find helpful as a video editor and colorist who uses plugins and other tools on a daily basis. I am becoming a big fan of Resolve, so I knew I needed to test this system inside of Blackmagic’s Resolve 15. I used a similar sequence between Adobe Premiere and Resolve 15: a 10-minute, 23.98fps, UHD/3840×2160 sequence with mixed format footage from 4K and 8K Red, ARRI Raw UHD and ProRes4444. I added some Temporal Noise Reduction to half of the clips, including the 8K Red footage, resizes to all clips, all on top of a simple base grade.

First, I did a simple cache test by enabling the user cache at DNxHR HQX 10-bit to the secondary Samsung 1TB drive. It took about four minutes and 34 seconds. From there, I tried to play back the media un-cached, and I was able to play everything except the 8K media in realtime. The 8K Red media played back at Quarter Res Good (Half Res would hover between 18-20fps). The sequence played back well. I also wanted to test export speeds. The first test was an H.264 export without cache on the same sequence. I set the H.264 output in Resolve to 23.98fps, UHD, auto quality, no frame reordering, force highest quality debayer/resizes and encoding profile: main. The export took 11 minutes and 57 seconds. The second test was a DNxHR HQX 10-bit QuickTime of the same sequence; it took seven minutes and 44 seconds.

To compare these numbers: I recently ran a similar test on an Intel i9-based MacBook Pro with the Blackmagic eGPU (Radeon Pro 580) attached. The H.264 export took 16 minutes and 21 seconds, while a ProRes 4444 export took 22 minutes and 57 seconds. While not an apples-to-apples comparison, it is still a good indication of the speed increase you can get with a desktop system and a pair of Nvidia GTX 1080ti graphics cards. With the impending release of Nvidia’s RTX 2080 cards, you may want to consider those instead.

While in Premiere, I ran similar tests with a very similar sequence. Exporting an H.264 (23.98fps, UHD, no cache used during export, VBR 10Mb/s target rate, no frame reordering) took nine minutes and 15 seconds. Going a step further, an H.265 export took 47 minutes. Similarly, a DNxHR HQX 10-bit QuickTime export took 24 minutes.
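To put the Resolve and Premiere export times in context, here is the simple arithmetic converting each into a realtime ratio for the 10-minute sequence (this is just math on the numbers reported above, not output from any benchmark tool):

```python
# Convert the export times quoted above into realtime ratios for the
# 10-minute UHD sequence (simple arithmetic on the reported numbers).
SEQ_SECONDS = 10 * 60

exports = {
    "Resolve H.264": 11 * 60 + 57,
    "Resolve DNxHR HQX": 7 * 60 + 44,
    "Premiere H.264": 9 * 60 + 15,
    "Premiere H.265": 47 * 60,
    "Premiere DNxHR HQX": 24 * 60,
}

for name, secs in exports.items():
    ratio = SEQ_SECONDS / secs  # >1.0 means faster than realtime
    print(f"{name}: {ratio:.2f}x realtime")
```

By this measure, only the DNxHR export in Resolve and the H.264 export in Premiere come out faster than realtime on this mixed 4K/8K material.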

I also ran the AJA System Test on the 1TB secondary drive (UHD, 16GB test file size, ProRes HQ). The read speed was 2951MB/sec and the write speed was 2569MB/sec. Those are very respectable drive speeds, especially for a cache or project drive. If possible, you would probably want to add another drive for exports, or for storing your raw media, in order to maximize input/output speeds.
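As a quick sanity check on those AJA numbers, the arithmetic below shows how fast the 16GB test file moves at the measured rates (plain math, not part of the AJA tool):

```python
# Sanity-check the AJA System Test numbers: at the measured rates,
# the 16GB test file moves in just a few seconds (arithmetic only).
FILE_MB = 16 * 1024          # 16GB test file, in MB
read_mb_s, write_mb_s = 2951, 2569

print(f"read:  {FILE_MB / read_mb_s:.1f} s")   # ~5.6 s
print(f"write: {FILE_MB / write_mb_s:.1f} s")  # ~6.4 s
```

Speeds like these are why an NVMe M.2 drive makes such a good cache target for Resolve.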

Up next was Cinebench R15: OpenGL — 153.02fps, Ref. Match 99.6%; CPU — 2905 cb; CPU (single core) — 193 cb; MP Ratio — 15.03x. Lastly, I ran a test I recently stumbled upon: the Superposition Benchmark from Unigine. While it is more of a gaming benchmark, a lot of people use it and might glean some useful information from it. The overall score was 7653 (fps: min 45.58, avg 57.24, max 72.11; GPU temperature: min 36°C, max 85°C; GPU use: max 98%).

Summing Up
In the end, I am very skeptical of custom-build PC shops. Typically, I don’t see the value in the premium they charge when you can probably build the same system yourself with parts chosen from PCPartPicker.com. However, Puget Systems is the exception; their support and build quality are top-notch. From the initial phone conversation, to the up-to-the-minute images and custom-build updates online, to the final delivery and even follow-up conversations, Puget Systems is by far the most thorough and worthwhile custom-build PC maker I have encountered.

Check out their high-end custom build PCs and tons of benchmark testing and recommendations on their website.



HP offerings from Adobe Max 2018

By Brady Betzel

HP workstations have been a staple in the post community, especially for anyone not using a Mac or the occasional DIY/custom build from companies like Puget Systems or CyberPowerPC. The difference comes with customers who need workstation-level components and support. Typically, a workstation is run through much tougher, more stringent tests so the client can be assured of 24/7/365 uptime. HP continues to evolve and has become, in my opinion, a leader for dedicated non-Apple workflows.

At Adobe Max 2018, HP announced updated components for its Z by HP line of mobile workstations, including the awesome ZBook Studio x360, ZBook Studio, ZBook 15 and ZBook 17. I truly love HP’s mobile workstation offerings. The only issue I constantly come up against is whether I, or any freelance worker for that matter, can justify the cost of these systems.

I always want the latest and greatest, and I feel I can get that with the updated performance options in this latest refresh of the ZBook line. They include 6-core Intel i9 processors; expanded memory of up to 32GB (or 128GB in some configurations); a really interesting factory M.2 SSD RAID-1 configuration that constantly mirrors your boot drive (if one drive fails, the other takes over right where you left off); a GPU bump to the Nvidia Quadro P2000 for the ZBook Studio and Studio x360; and an anti-glare touchscreen on the x360. This is all in addition to HP’s DreamColor display option, which allows for 100% Adobe RGB coverage and 600 nits of brightness. But again, this all comes at a high cost once you max out the workstation with enough RAM and GPU horsepower. There is some good news, though, for those who don’t have a corporate budget to pull from: HP has introduced a pilot program called Z Club.

The Z Club is essentially a leasing program for HP’s Z series products. For now, HP will take 100 creators into this pilot program, which allows you to select a bundle of Z products and accessories that fit your creative lifestyle for a monthly cost. This is exactly how you solve the problem of reaching prosumer and freelance workers who can’t quite justify a $5,000 purchase price but can justify a $100 monthly payment. HP has touted product categories for editors, photographers and many others. With monthly payments ranging from $100 to $250, depending on what you order, this is much more manageable for mid-range users who need the power of a workstation but until now couldn’t afford it.
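As a rough illustration of why the monthly model matters, this back-of-the-envelope sketch computes how many payments it takes to match the $5,000 purchase figure mentioned above (illustrative arithmetic only; actual Z Club terms may differ):

```python
# Rough break-even math for the Z Club idea: how many monthly payments
# equal the upfront price of a maxed-out workstation (illustrative only).
purchase_price = 5000          # the $5,000 figure mentioned above
for monthly in (100, 250):     # the quoted range of monthly payments
    months = purchase_price / monthly
    print(f"${monthly}/month: {months:.0f} months to match ${purchase_price}")
```

In other words, even at the top of the quoted range it takes well over a year and a half of payments to reach the upfront cost, which is a much easier pill for a freelancer to swallow.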

So what do you get if you are accepted into the Z Club pilot program? You can choose the products you want and pay nothing for the first three months. You can keep or return your products, switch products, and you will have access to a Z Club concierge service for any questions and troubleshooting.

On the call I had with HP, they mentioned that a potential bundle for a video editor could be an HP Z series mobile workstation or desktop, along with a DreamColor display, and an external RAID storage system to top it off.

In the end, I think HP (much like Blackmagic’s Resolve in the NLE/color world) is at the front of the pack. They are listening to what creatives are saying about Apple: how this giant company is not serving its customers in an efficient and price-conscious way. Creating what is essentially a leasing program for mid- to high-range products, with support, is the future. It’s essentially Apple’s own iPhone upgrade program, but with computers!

Hopefully this program takes off, and if you are lucky enough to be accepted into the pilot program, I would be curious to hear about your experience, so please reach out. With strides in workstation security initiatives like Sure Start, a privacy mode for mobile systems and military-grade (MIL-spec) testing, HP is cementing its place as a standard in the media and entertainment post industry. If you are leaving Apple for a Windows-based PC, you should apply for the Z Club pilot program. Go to www.hp.com to find out more, or follow along on Twitter @AdobeMax, @HP or via #AdobeMax.



Review: Blackmagic’s eGPU and Intel i9 MacBook Pro 2018

By Brady Betzel

Blackmagic’s eGPU is worth the $699 price tag. You can buy it from Apple’s website, where it is being sold exclusively for the time being. Wait? What? You wanted some actual evidence as to why you should buy the BMD eGPU?

Ok, here you go…

MacBook Pro With Intel i9
First, I want to go over the latest Apple MacBook Pro, which was released (or really just updated) this past July. With some controversial fanfare, the 2018 MacBook Pro can now be purchased with the blazingly fast Intel i9, 2.6GHz (Turbo Boost up to 4.3GHz) six-core processor. In addition, you can add up to 32GB of 2400MHz DDR4 onboard memory, a Radeon Pro 560X GPU with 4GB of GDDR5 memory and even a 4TB SSD. It has four Thunderbolt 3 ports and, for some reason, a headphone jack. Apple is also touting its improved butterfly keyboard switches as well as its True Tone display technology. If you want to read more of that glossy info, head over to Apple’s site.

The 2018 MacBook Pro is a beast. I am a big advocate for the ability to upgrade and repair computers, so Apple’s venture to create what is essentially a leased computer ecosystem that needs to be upgraded every year or two usually puts a bad taste in my mouth.

However, the latest MacBook Pros are really amazing… and really expensive. The top-of-the-line MacBook Pro I was provided for this review would cost $6,699! Yikes! If I were buying it myself, I would purchase everything but the $2,000 upgrade from the 2TB SSD to the 4TB, and it would still cost $4,699. But I suppose that's not a terrible price for such an intense processor (albeit not technically workstation-class).

Overall, the MacBook Pro is a workhorse that I put through its video editing and color correcting paces using three of the top four professional nonlinear editors: Adobe Premiere, Apple FCP X and Blackmagic's Resolve 15 (the official release). More on those results in a bit, but for now I'll just say a few things: I love how light and thin it is. I don't like how hot it can get. I love how fast it charges. I don't like how fast it loses charge when doing things like transcoding or exporting clips. A 15-minute export can drain the battery over 40%, while playing Spotify for eight hours will hardly dent the battery at all (maybe 20%).

Blackmagic’s eGPU with Radeon Pro 580 GPU
One of the more surprising releases from Blackmagic has been this eGPU offering. I would never have guessed they would have gone into this area, and certainly would never have guessed they would have gone with a Radeon card, but here we are.

Once you step back from the initial "Why in the hell wouldn't they make it user-replaceable, and why is it brand-dependent?" shock, it actually makes sense. If you are a macOS user, plug-and-play is the expectation. When you buy a new iMac, iMac Pro or MacBook Pro, you are expecting it to work, full stop.

However, if you are a DIT or colorist that is more mobile than that sweet million-dollar color bay you dream of, you need more. This is where the BMD eGPU falls nicely into place. You plug it in and instantly see it populate in the menu bar. In addition, the eGPU acts as a dock with four USB 3 ports, two Thunderbolt 3 ports and an HDMI port. The MacBook Pro will charge off of the eGPU as well, which eliminates the need for your charger at your docking point.

On the go, the most decked-out MacBook Pro can hold its own. So it's no surprise that FCP X runs remarkably fast on it, faster than everything else. However, you have to be invested in the FCP X workflow and paradigm, and while I'm not there yet, maybe the future will prove me wrong. Recently, I saw someone on Twitter who developed an online collaboration workflow for FCP X, so people are excited about it.

Anyway, many of the nonlinear editors I work with will also play 4K Red, ARRI and, especially, ProRes footage on the MacBook Pro. Keep in mind, though, that with 2K, 4K or whatever-K Red footage, you will need to set the debayer quality to around half-res (good) if you want a fluid timeline. Even with the 4GB Radeon Pro 560X, I couldn't quite play 4K footage in realtime without some sort of compromise in quality.

But with the Blackmagic eGPU, I significantly improved my playback capabilities — and not just in Resolve 15. I did try and plug the eGPU into a PC with Windows 10 I was reviewing at the same time and it was recognized, but I couldn’t get all the drivers sorted out. So it’s possible it will work in Windows, but I couldn’t get it there.

Before I get to the Resolve testing, I did some benchmarking. First I ran Cinebench R15 without the eGPU attached and got the following scores: OpenGL 99.21fps, reference match 99.5%, CPU 947cb, CPU (single core) 190cb and an MP ratio of 5.00x. With the eGPU attached: OpenGL 60.26fps, reference match 99.5%, CPU 1057cb, CPU (single core) 186cb and an MP ratio of 5.69x. Then I ran Unigine's Valley Benchmark 1.0 without the eGPU, which got 21.3fps and a score of 890 (minimum 12.4fps/maximum 36.2fps). With the eGPU it got 25.6fps and a score of 1073 (minimum 19.2fps/maximum 37.1fps).

Resolve 15 Test
I based all of my tests on a similar (although not exact for the different editing applications) 10-minute timeline, 23.98fps, 3840×2160, 4K and 8K RAW Red footage (R3D files) and Alexa (.ari and ProRes444XQ) UHD footage, all with edit page resizes, simple color correction and intermittent sharpening and temporal noise reduction (three frames, better, medium, 10, 10 and 5).

Playback: Without the eGPU I couldn't play 23.98fps 4K Red R3D footage without being set to half-res. With the eGPU I could play it back at full-res in realtime (this is what I was talking about in sentence one of this review). The ARRI footage would play at full res, but only at between 1fps and 7fps. The 8K Red footage would play in realtime when set to quarter-res.

One of the most reassuring things I noticed while watching my Activity Monitor's GPU history readout was that Resolve uses both GPUs at once. Not all of the apps did.

Resolve 15 Export Tests
In the following tests, I disabled all cache or optimized media options, including Performance Mode.

Test 1: H.264 (23.98fps, UHD, auto quality, no frame reordering, force highest-quality debayer/resizes and encoding profile Main)
a. Without eGPU (Radeon Pro 560x): 22 minutes, 16 seconds
b. With BMD eGPU (Radeon Pro 580): 16 minutes and 21 seconds

Test 2: H.265 10-bit (23.98/UHD, auto quality, no frame reordering, force highest-quality debayer/resizes)
a. Without eGPU: stopped rendering after 10 frames
b. With BMD eGPU: same result

Test 3: ProRes4444 at 23.98/UHD
a. Without eGPU: 27 minutes and 29 seconds
b. With BMD eGPU: 22 minutes and 57 seconds

Test 4: Edit page cache enabled (Smart User Cache at ProResHQ)
a. Without eGPU: 17 minutes and 28 seconds
b. With BMD eGPU: 12 minutes and 22 seconds
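For a sense of scale, here is a quick back-of-the-envelope calculation (a plain Python sketch, with the times copied from the Resolve results above) of how much export time the eGPU shaved off:

```python
# Resolve 15 export times as (minutes, seconds) without and with the
# Blackmagic eGPU, copied from the test results above.
tests = {
    "H.264":                  ((22, 16), (16, 21)),
    "ProRes4444":             ((27, 29), (22, 57)),
    "Smart Cache (ProResHQ)": ((17, 28), (12, 22)),
}

def seconds(t):
    """Convert a (minutes, seconds) tuple to total seconds."""
    m, s = t
    return m * 60 + s

for name, (without, with_egpu) in tests.items():
    saved = seconds(without) - seconds(with_egpu)
    pct = 100 * saved / seconds(without)
    print(f"{name}: {saved}s faster ({pct:.0f}% less export time)")
# → H.264: 355s faster (27% less export time)
# → ProRes4444: 272s faster (16% less export time)
# → Smart Cache (ProResHQ): 306s faster (29% less export time)
```

So across the three successful Resolve exports, the eGPU cut render times by roughly 16% to 29%.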

Adobe Premiere Pro v.12.1.2
I performed similar testing in Adobe Premiere Pro using a 10-minute timeline at 23.98fps, 3840×2160, 4K and 8K RAW Red footage (R3D files) and Alexa (DNxHR SQ 8-bit) UHD footage, all with Effect Control tab resizes and simple Lumetri color correction, including sharpening and intermittent denoise (16) under the HSL Secondary tab in Lumetri applied to shadows only.

In order to ensure your eGPU will be used inside of Adobe Premiere, you must use Metal as your renderer. To enable it, go to File > Project Settings > General and change the renderer to Mercury Playback Engine GPU Acceleration (Metal). OpenCL will only use the internal GPU for processing.

Premiere did not handle the high-resolution media as aptly as Resolve had, but it did help a little. However, I really wanted to test the export power with the added eGPU horsepower. I almost always send my Premiere sequences to Adobe Media Encoder to do the processing, so that is where my exports were processed.

Adobe Media Encoder
Test 1: H.264 (No render used during exports: 23.98/UHD, 80Mb/s, software encoding doesn’t allow for profile setup)
a. OpenCL with no eGPU: about 140 minutes (sorry, I had to chase the kids around and couldn't watch this snail crawl)
b. Metal with no eGPU: about 137 minutes (chased the kids around again, and couldn't watch this snail crawl, either)
c. OpenCL with eGPU: won't work, Metal only
d. Metal with eGPU: one hour

Test 2: H.265
a. Without eGPU: failed (interesting result)
b. With eGPU: 40 minutes

Test 3: ProRes4444
a. Without eGPU: three hours
b. With eGPU: one hour and 14 minutes

FCP X
FCP X is an interesting editing app, and it is blazingly fast at handling ProRes media. As I mentioned earlier, it hasn't been in my world much, but that isn't because I don't like it; it's because professionally I haven't run into it. I love the idea of roles and would really love to see that play out in other NLEs. However, my results speak for themselves.

One caveat to using the eGPU in FCP X is that you must force it to work inside of the NLE. At first, I couldn't get it to work; the Activity Monitor would show no activity on the eGPU. However, thanks to a Twitter post, James Wells (@9voltDC) sent me to this, which allows you to force FCP X to use the eGPU. It took a few tries, but I did get it to work, and funnily enough I saw times when all three GPUs were being used inside of FCP X, which was pretty good to see. This is one of those use-at-your-own-risk things, but it worked for me and is pretty slick… if you are OK with using Terminal commands. This also allows you to force the eGPU onto other apps, like Cinebench.

Anyway, here are my results with the BMD eGPU exporting from FCP X:

Test 1: H.264
a. Without eGPU: eight minutes
b. With eGPU: eight minutes and 30 seconds

Test 2: H.265: Not an option

Test 3: ProRes4444
a. Without eGPU: nine minutes
b. With eGPU: six minutes and 30 seconds
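Running the same arithmetic over the Media Encoder and FCP X numbers (again a plain Python sketch, using the times reported above, with the "about 140 minutes" AME figure treated as approximate) makes the pattern obvious: the eGPU is a big multiplier for Premiere/AME exports, while FCP X's H.264 export was actually a touch slower with it:

```python
# Export times in seconds (without eGPU, with eGPU), from the results above.
# The AME H.264 time is approximate, as noted in the text.
results = {
    "AME H.264":        (140 * 60, 60 * 60),
    "AME ProRes4444":   (3 * 60 * 60, 74 * 60),
    "FCP X H.264":      (8 * 60, 8 * 60 + 30),
    "FCP X ProRes4444": (9 * 60, 6 * 60 + 30),
}

for name, (without, with_egpu) in results.items():
    factor = without / with_egpu
    print(f"{name}: {factor:.2f}x speedup")
# → AME H.264: 2.33x speedup
# → AME ProRes4444: 2.43x speedup
# → FCP X H.264: 0.94x speedup
# → FCP X ProRes4444: 1.38x speedup
```

In other words, Media Encoder exports ran well over twice as fast with the eGPU, while FCP X only benefited on the ProRes export.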

Summing Up
In the end, the Blackmagic eGPU with Radeon Pro 580 GPU is a must buy if you use your MacBook Pro with Resolve 15. There are other options out there though, like the Razer Core v2 or the Akitio Node Pro.

From this review I can tell you that the Blackmagic eGPU is silent even when processing 8K Red RAW footage (and while the MacBook Pro's fans are going at full speed), and it just works. Plug it in and you are running: no settings, no drivers, no cards to install… it just runs. And sometimes, when I have three little boys running around my house, I just want that peace of mind; I want things to just work, like the Blackmagic eGPU.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.