
Review: Boris FX’s Continuum and Mocha Pro 2019

By Brady Betzel

I realize I might sound like a broken record, but if you are looking for the best plugin to help with object removals or masking, you should seriously consider the Mocha Pro plugin. And if you work inside of Avid Media Composer, you should also seriously consider Boris Continuum and/or Sapphire, which can use the power of Mocha.

As an online editor, I consistently use Continuum along with Mocha for tight blur and mask tracking. If you use After Effects, there is even a whittled-down version of Mocha built in for free. For those pros who don’t want to deal with Mocha inside of an app, it also comes as a standalone software solution where you can copy and paste tracking data between apps or even export the masks, object removals or insertions as self-contained files.

The latest releases of Continuum and Mocha Pro 2019 continue the evolution of Boris FX’s role in post production image restoration, keying and general VFX plugins, at least inside of NLEs like Media Composer and Adobe Premiere.

Mocha Pro

As an online editor, I am always calling on Continuum for its great Chroma Key Studio, Flicker Fixer and blurring. Because Mocha is built into Continuum, I am able to quickly track (backwards and forwards) difficult shapes and even erase objects in ways that the built-in Media Composer tools simply can't. But if you are lucky enough to own Mocha Pro, you also get access to some amazing tools that go beyond planar tracking — such as automated object removal, object insertion, stabilizing and much more.

Boris FX's latest updates to Boris Continuum and Mocha Pro go even further than what I've already mentioned and have resulted in a new version naming scheme; this round we are at 2019 (think of it as Version 12). They have also created the new Application Manager, which makes it a little easier to find the latest downloads. You can find them here. This really helps when you're jumping between machines and need to quickly activate and deactivate licenses.

Boris Continuum 2019
I often get offline edits with effects from a variety of plugins — lens flares, random edits, light flashes, whip transitions and many more — so I need Continuum to be compatible with what offline clients send me. I also need to use it for image repair and compositing.

In this latest version of Continuum, Boris FX has not only kept plugins like Primatte Studio, they have also brought back Particle Illusion and updated Mocha and Title Studio. Overall, Continuum and Mocha Pro 2019 feel a lot snappier when applying and rendering effects, probably because of the overall GPU-acceleration improvements.

Particle Illusion has been brought back from the brink of death in Continuum 2019 as a 64-bit, keyframe-able particle emitter system that can even be tracked and masked with Mocha. This revamp of Particle Illusion brings an updated interface, realtime GPU-based particle generation, an expanded and improved emitter library (complete with motion-blur-enabled particle systems) and even a standalone app that can design systems to be used in the host app — you cannot render systems inside of the standalone app.

While Particle Illusion is a part of the entire Continuum toolset that works with OFX apps like Blackmagic's DaVinci Resolve, Media Composer, After Effects and Premiere, it seems to work best in applications like After Effects, which can handle composites simply and naturally. Inside the Particle Illusion interface you can find all of the pre-built emitters. If you only have a handful, make sure you download the additional emitters, which you can find in the Boris FX App Manager.

Particle Illusion: Before and After

I had a hard time seeing my footage in a Media Composer timeline inside of Particle Illusion, but I could still pick my emitter, change specs like life and opacity, exit out and apply it to my footage. I used Mocha to track some fire from Particle Illusion onto a dumpster I had filmed. Once I dialed in the emitter, I launched Mocha and tracked the dumpster.

The first time I went into Mocha I didn't see the preset tracks for the emitter or the world in which the emitter lives. The second time I launched Mocha, I saw track points. From there you can track the area where you want your emitter to be placed. Once you are done and happy with your track, jump back to your timeline, where it should be reflected. In Media Composer I noticed that I had to go to the Mocha options and change the setting from Mocha Shape to no shape. Essentially, the Mocha shape acts like a matte and cuts off anything outside of it.

If you are inside of After Effects, most parameters can now be keyframed and parented (aka pick-whipped) natively in the timeline. The Particle Illusion plugin is a quick, easy and good-looking tool to add sparks, Milky Way-like star trails or even fireworks to any scene. Check out @SurfacedStudio’s tutorial on Particle Illusion to get a good sense of how it works in Adobe Premiere Pro.

Continuum Title Studio
When inside of Media Composer (prior to the latest release, 2018.12), there were very few ways to create titles at a higher resolution than HD (1920×1080) — the NewBlue Titler was the only other option if you wanted to stay within Media Composer.

Title Studio within Media Composer

At first, the Continuum Title Studio interface appeared to be a mildly updated Boris Red interface — and I am allergic to the Boris Red interface. Some of the icons for keyframing and the way properties are adjusted look similar, and that threw me off. I tried really hard to jump into Title Studio and love it, but I never got comfortable with it.

On the flip side, there are hundreds of presets that can help build quick titles that render a lot faster than NewBlue Titler did. In some of the presets I noticed the text was placed outside of 16×9 Title Safety, which is odd since that is kind of a long-standing rule in television. In the preset authors' defense, the titles are within Action Safety, but still.

If you need a quick way to make 4K titles, Title Studio might be what you want. The updated Title Studio includes realtime playback using the GPU instead of the CPU, new materials, new shaders and external monitoring support using Blackmagic hardware (AJA support will be coming at some point). There are some great presets, including pre-built slates, lower thirds, kinetic text and even progress bars.

If you don’t have Mocha Pro, Continuum can still access and use Mocha to track shapes and masks. Almost every plugin can access Mocha and can track objects quickly and easily.
That brings me to the newly updated Mocha, which has some new features that are extremely helpful including a Magnetic Spline tool, prebuilt geometric shapes and more.

Mocha Pro 2019
If you loved the previous version of Mocha, you are really going to love Mocha Pro 2019. Not only do you get the Magnetic Spline tool, pre-built geometric shapes, the Essentials interface and high-resolution display support, but Boris FX has also rewritten the Remove Module code to use GPU video hardware. This increases render speeds about four to five times. In addition, there is no longer a separate Mocha VR software suite; all of the VR tools are included inside of Mocha Pro 2019.

If you are unfamiliar with what Mocha is, then I have a treat for you. Mocha is a standalone planar tracking app as well as a native plugin that works with Media Composer, Premiere and After Effects, or through OFX in Blackmagic's Fusion, Foundry's Nuke, Vegas Pro and HitFilm.

Mocha tracking

In addition (and unofficially) it will work with Blackmagic DaVinci Resolve by way of importing the Mocha masks through Fusion. While I prefer to use After Effects for my work, importing Mocha masks is relatively painless. You can watch colorist Dan Harvey run through the process of importing Mocha masks to Resolve through Fusion, here.

But really, Mocha is a planar tracker, which means it tracks multiple points in a defined area and works best on flat or at least segmented surfaces — think of the side of a face, ear, nose, mouth and forehead tracked separately instead of all at once. From blurs to mattes, Mocha sticks to objects like glue and can be a great asset for an online editor or colorist.

If you have read any of my plugin reviews you probably are sick of me spouting off about Mocha, saying how it is probably the best plugin ever made. But really, it is amazing — especially when incorporated with plugins like Continuum and Sapphire. Also, thanks to the latest Media Composer with Symphony option you can incorporate the new Color Correction shapes with Mocha Pro to increase the effectiveness of your secondary color corrections.

Mocha Pro Remove module

So how fast is Mocha Pro 2019's Remove Module these days? Well, it used to be a very slow process, taking lots of time to calculate an object's removal. With the latest Mocha Pro 2019 release, including improved GPU support, the render time has been cut down tremendously; in my estimation, by three to four times at minimum. In Mocha Pro 2019, removal jobs that now take under 30 seconds would have taken four to five minutes in previous versions. It's quite a big improvement in render times.

There are a few changes in the new Mocha Pro, including interface changes and some amazing tool additions. There is a new drop-down tab that offers different workflow views once you are inside of Mocha: Essentials, Classic, Big Picture and Roto. I really wish the Essentials view had been around when I first started using Mocha, because it gives you the basic tools you need to get a roto job done and nothing more.

For instance, just giving access to the track motion options (Translation, Scale, Rotate, Skew and Perspective) with big shiny buttons helps eliminate my need to watch YouTube videos on how to navigate the Mocha interface. However, if, like me, you are more than just a beginner, the Classic interface is still available and is the one I reach for most often — it's literally the old interface. Big Picture hides the tools and gives you the most screen real estate for your roto work. My favorite after Classic is Roto. The Roto interface shows just the project window and the classic top toolbar. It's the best of both worlds.

Mocha Pro 2019 Essentials Interface

Beyond the interface changes are some additional tools that will speed up any roto work. I imagine the most requested feature Boris FX gets for Mocha is the addition of basic shapes, such as rectangles and circles — it has been one of the longest-running user requests. In my work, I am often drawing rectangles around license plates or circles around faces with X-splines, so why not eliminate a few clicks and have that done already? Answering my need, Mocha now has elliptical and rectangular shapes ready to go in both X-splines and B-splines with one click.

I use Continuum and Mocha hand in hand. Inside of Media Composer I will use tools like Gaussian Blur or Remover, which typically need tracking and roto shapes created. Once I apply the Continuum effect, I launch Mocha from the Effect Editor and bam, I am inside Mocha. From here I track the objects I want to affect, as well as any objects I don’t want to affect (think of it like an erase track).

Summing Up
I can save tons of time and also improve the effectiveness of my work exponentially when working in Continuum 2019 and Mocha Pro 2019. It's amazing how much more intuitive Mocha is to track with compared to the built-in Media Composer and Symphony trackers.

In the end, I can’t say enough great things about Continuum and especially Mocha Pro. Mocha saves me tons of time in my VFX and image restoration work. From removing camera people behind the main cast in the wilderness to blurring faces and license plates, using Mocha in tandem with Continuum is a match made in post production heaven.

Rendering in Continuum and Mocha Pro 2019 is a lot faster than in previous versions, really giving me a leg up on efficiency. Time is money, right? On top of that, using Mocha Pro's near-magic Remove Module and the other modules takes my image restoration work to the next level, separating me from other online editors who use standard paint and tracking tools.

In Continuum, Primatte Studio gives me a leg up on greenscreen keys with its exceptional ability to auto-analyze a scene and perform 80% of the keying work before I dial in the details. Whenever anyone asks me what tools I couldn't live without, I without a doubt always say Mocha.
If you want a real Mocha Pro education, you need to watch all of Mary Poplin's tutorials. You can find them on YouTube. Check out this one on how to track and replace a logo using Mocha Pro 2019 in Adobe After Effects. You can also find great videos at BorisFX.com.

Mocha point parameter tracking

I always feel like there are tons of tools inside of the Mocha Pro toolset that go unused simply because I don't know about them. One I recently learned about in a Surfaced Studio tutorial was the Quick Stabilize function. It essentially stabilizes the video around the object you are tracking, allowing you to more easily rotoscope the object while it sits still instead of moving all over the screen. It's an amazing feature that I just didn't know about.

As I was finishing up this review I saw that Boris FX came out with a training series, which I will be checking out. One thing I always wanted was a top-down set of tutorials like the ones on Mocha’s YouTube page but organized and sent along with practical footage to practice with.

You can check out Curious Turtle’s “More Than The Essentials: Mocha in After Effects” on their website where I found more Mocha training. There is even a great search parameter called Getting Started on BorisFX.com. Definitely check them out. You can never learn enough Mocha!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: iOgrapher Multi Case for mobile filmmaking

By Brady Betzel

Thanks to the amazing iPhone X, Google Pixel and Samsung Galaxy, almost everyone has a high-end video camera on their person at all times, and this is helping to spur on mobile filmmaking and vlogging.

From YouTube to Instagram to movies like Unsane (Steven Soderbergh) or Tangerine (Sean Baker) — and regardless of whether you think a $35,000 camera setup tells a story better than a $1,000 cell phone (looking at you, Apple iPhone XS Max) — mobile filmmaking is here to stay and will only get better.

iOgrapher’s latest release is the iOgrapher Multi Case, a compact mobile filmmaking mounting solution that works with today’s most popular phones. iOgrapher has typically created solutions that were tied to the mobile device being used for filmmaking, such as an iPhone, the latest Samsung Galaxy phones, iPads or even action cameras like a GoPro Hero 7 Black.

With the new iOgrapher Multi Case you can fit any mobile device that measures more than 5 ½” x 2 ¼” and less than 6 ½” by 3 ⅜”. Unfortunately, you won’t be fitting an iPad or a GoPro in the iOgrapher Multi Case, but don’t fret! iOgrapher makes rigs for those as well. On the top of the Multi Case are two cold shoe mounts for lights, microphones or any other device, like a GoPro. To mount things with ¼” 20 screw mounts in the cold shoes you will need to find a cold shoe to ¼” 20 adapter, which is available on iOgrapher’s accessory page. You can also find these at Monoprice or Amazon for real cheap.
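If you want to double-check whether your particular phone (with its case on) falls inside those bounds, the math is a simple comparison. Here is a quick sketch of that check; the sample phone dimensions at the end are made up purely for illustration:

```python
# iOgrapher Multi Case fit check: a device footprint (in inches) must be larger
# than the minimum and smaller than the maximum sizes quoted above.
MIN_W, MIN_H = 5.5, 2.25    # 5 1/2" x 2 1/4"
MAX_W, MAX_H = 6.5, 3.375   # 6 1/2" x 3 3/8"

def fits_multi_case(width_in, height_in):
    """Return True if a phone measuring width x height inches should fit."""
    return MIN_W < width_in < MAX_W and MIN_H < height_in < MAX_H

# Hypothetical example: a phone measuring 6.2" x 3.0" with its case on.
print(fits_multi_case(6.2, 3.0))  # True
```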

If you are looking to add even more accessories, you may want to order some extra cold shoe adapters that can be mounted on the handles of the iOgrapher Multi Case in the additional ¼” 20 screw mounts. The mounts on the handles are great for adding additional lighting or microphones. I've even found that if you are going to be doing some behind-the-scenes filming or need another angle for your shoot, a small camera like a GoPro can be easily mounted and angled. With all this mounting, you should assume that you are going to be using the iOgrapher on a sturdy tripod. Just for fun, I mounted the iOgrapher Multi Case onto a GoPro 3-Way Grip, which can also be used as a light tripod. It wasn't exactly stable, but it worked. I wouldn't suggest using it for more than an emergency shooting situation, though.

On the flip side (pun intended), the iOgrapher can be solidly mounted vertically with the ¼” 20 screw mounts on the handles. With Instagram making headway with vertical video in their Instagram Stories, iOgrapher took that idea and built it into the Multi Case, further cementing vertical video's place despite grumbling from the old folks who just don't get it.

Testing
I tried out both a Samsung Galaxy S8+ and an iPhone 7+, with their cases on, inside of the iOgrapher Multi Case. Both fit. The iPhone 7+ was stretching the boundaries of the Multi Case, but it did fit and worked well. The phones are inserted into the Multi Case via a spring-loaded bottom piece. From the left (or the top, if you are shooting vertically), you push the bottom of the mobile device into the corner-covered slots of the iOgrapher Multi Case until the top or left side can be secured under the left or top side of the Multi Case. It's really easy.

I was initially concerned with the spring loading of the case; I wasn’t sure if the springs would be resilient enough to handle the constant pulling in and out of the phones, but the springs are high quality and held up beautifully. I even tried inserting my mobile phones tons of times and didn’t notice any issues with the springs or my phones.

Take care when inserting your phone into the Multi Case if you have a protective shield on the screen of your device. If you aren't extra careful, the case can pull or snag the screen protector — especially given the tight fit. Just pay attention and there will be nothing to worry about. The simple beauty of the iOgrapher is that with a wider grip on your filmmaking device, you have a larger area to distribute any shaking coming from your hands, essentially helping stabilize your filmmaking without the need for a full-fledged gimbal.

If you accidentally drop your iOgrapher you may get a scratch, but for the most part they are built sturdy and can withstand punishment, whether it's from your four-year-old or from the weather. If you want to get a little fancy, you can buy affordable lights like the Litra Torch (check out my review) to attach to the cold shoe mounts, or even a Rode microphone (don't forget the TRS-to-TRRS adapter if you are plugging into an iPhone), and you are off and running.

Summing Up
I have been really intrigued with iOgrapher’s products since day one. They are an affordable and sturdy way to jump into filmmaking using cameras everyone carries with them every day: their phones.

Whether you are a high school student looking to get steady and professional mobile video, or a journalist looking for a quick way to make the most of your shots with just a phone, light, mic and tripod mount, the iOgrapher Multi Case will unlock your mobile filmmaking potential.

The iOgrapher Multi Case is a very durable protective case for your mobile filmmaking devices and a steal at $79.99. If you are a parent looking for an inexpensive way to tease out your child's interest in video, take a look at www.iographer.com and grab a few accessories, like a Manfrotto light and a Rode VideoMicro, to add some subtle lighting and pick up better-quality audio.

Make sure to check out iOgrapher creator Dave Basulto's demo of the iOgrapher Multi Case, including his test of how different phones fit.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

You can now export ProRes on a PC with Adobe’s video apps

By Brady Betzel

Listen up, post pros! You can now natively export ProRes from a Windows 10-based PC for $20.99 a month with the latest release of Adobe's Premiere, After Effects and Media Encoder.

I can’t overstate how big of a deal this is. Previously, the only way to export ProRes from a PC was to use a knock-off reverse-engineered codec that would mimic the process — creating footage that would often fail QC checks at networks — or be in possession of a high-end app like Fusion, Nuke, Nucoda or Scratch. The only other way would be to have a Cinedeck in your hands and output your files in realtime through it. But, starting today, you can export native ProRes 4444 and ProRes 422 from your Adobe Creative Cloud Suite apps like Premiere Pro, After Effects, and Media Encoder. Have you wanted to use those two or three Nvidia GTX 1080ti graphics cards that you can’t stuff into a Mac Pro? Well, now you can. No more being tied to AMD for ProRes exports.

Apple seems to be leaving their creative clients in the dust. Unless you purchased an iMac Pro or MacBook Pro, you have been stuck using a 2013 Mac Pro to export or encode your files to ProRes specifications. A lot of customers who had given Apple the benefit of the doubt and stuck around for a year or two longer than they probably should have, waiting for a new Mac Pro — allegedly being released in 2019 — began to transition over to Windows-based platforms. All the while, most would keep that older Mac just to export ProRes files while using a more powerful and updated Windows PC to do their daily tasks.

Well, that day is now over, and, in my opinion, it shows that Apple is less concerned with keeping their professional clients than ever before. That being said, I love that Apple has finally opened their ProRes codecs up to the Adobe Creative Cloud.

Let's hope it can become a system-wide feature, or at least get added to Blackmagic's Resolve and Avid's Media Composer. You can individually rent Adobe Premiere Pro or After Effects for $20.99 a month, rent the entire Adobe Creative Cloud library for $52.99 a month or, if you are a student or teacher, take advantage of the best deal around for $19.99 a month, which gives you ALL the Creative Cloud apps.

Check out Adobe’s blog about the latest Windows ProRes export features.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: DJI’s Mavic Air lightweight drone

By Brady Betzel

Since the first DJI Phantom was released in January of 2013, drones have found a place in our industry. Turn on almost any television show airing on National Geographic and you will see some sort of drone videography at work. DJI and GoPro have revolutionized how everyone films over the last decade.

Nowadays, drones are expected to be a part of every cameraperson's kit. Once DJI released their second-generation flagship drone, the Mavic Pro, the quality of footage and still images went from prosumer-level to professional. One thing that has always been a tough sell for me with drones is the physical size of the unmanned aerial vehicles. The original DJI flagship drone, the Phantom, is a little big; you essentially need a duffel-sized backpack to carry it and its accessories. But now DJI has upped the ante with a smaller footprint — the Mavic Air.

The Mavic Air is absolutely the best drone I have ever had my hands on — from being the size of a few iPhones stacked on top of each other to recording high-quality footage that is 100% being used on television shows airing around the world. It's not only the easiest drone to use with or without a remote, but it also delivers by far the best picture I have seen from a consumer-level drone for under $1,000.

The Mavic Air is small, lightweight, and packed with amazing technology to help itself avoid slamming into the sides of buildings or trees. You can find all the nerdy tech specs here.

While there are super high-end drones flying Red Monstros around, sometimes there are restrictions that require the crew or cameraperson to downsize their equipment to only what is vital. So a drone that takes up a fraction of your carry-on luggage and still yields 4K footage acceptable for broadcast is a win. Obviously, you won't be getting the same sensors that you will find in those high-end rigs.

Digging In
The Mavic Air has many features that set it apart from the pack. SmartCapture allows anyone to fly the drone without a remote; instead, you just use a few specific hand gestures. An updated slow-motion feature allows the Mavic Air to shoot 1080p at up to 120fps for those uber-epic sweeps in slow motion. There are multiple QuickShot modes you can find in the DJI app — like the two newest: Asteroid and Boomerang.

DJI is known for advancing drone technology and keeping their prices relatively low. One of the most advanced features DJI consistently works on is flight sensors. Flight Autonomy 2.0 and Advanced Pilot Assistance Systems are the latest advances in technology for the Mavic Air. Flight Autonomy 2.0 takes information from the seven onboard infrared sensors to create its own 3D environmental map to avoid crashing. The Advanced Pilot Assistance System (APAS), which has to be enabled, will automatically tell the Mavic Air to avoid obstacles while flying.

Taking Flight
So how is the Mavic Air to fly and work with? It's very easy. The drone is ultra-portable, and the remote folds up nicely as well — nice and tight, in fact — and you can then unfold it and install the newly removable joysticks for flight. You mount your phone on the bottom and connect it with one of the three cables provided. I have a Samsung Galaxy, so I used a USB-C connection. I downloaded and updated the DJI Go app, connected the USB-C cable to my phone (which is a little clumsy and could hopefully be better integrated in the future), paired the remote to the Mavic Air and was flying… that is, unless I had to update firmware. Almost every time I went to fly, one piece of equipment — if not more — needed to be updated. While it doesn't take a long time, it is annoying, especially when you have three young boys staring at you, waiting for you to fly this bad boy around. But once you get up and running, the Mavic is simple to fly.

I was most impressed with how it handled wind. The Mavic Air lives up to its name, and while it is definitely tiny, it can fight for its position in wind with the best of them. The sensors are amazing as well. You can try your hardest (unless you are in sports mode — don’t try to fly into anything as the sensors are disabled) to run into stuff and the Mavic Air stops dead in its tracks.

The Mavic Air’s filming capabilities are just as impressive as its flying capabilities. I like to set my DJI footage to D-Cinelike to get a flatter color profile in my video, allowing for a little more range in the shadows and highlights when I am color correcting. However, the stock DJI color settings are amazing. Another trick is to knock the sharpening down to medium or off and add that back in when finishing your video. The Mavic Air records using a 3-axis stabilized camera for ultra-smooth video up to 4K (UHD) at 30fps in the newly upped 100Mb/s H.264/MPEG-4 AVC recording format. Not quite the H.265 compression, but I’m sure that will come in the next version. I would love to see DJI offer a built-in neutral density filter on their drones — this would really help get that cinematic look without sacrificing highlight and shadow detail.

In terms of batteries, I was sent two, which I desperately needed; they only lasted about 20 minutes apiece. The batteries take around an hour to charge, but when you buy the Fly More Combo for $999 you also get a sweet four-battery charger to charge them all at once. Check out all the goodies you get with the Fly More Combo.

You will want to buy a decent-sized memory card, probably 128GB, though 64GB would be fine. Inserting the memory card into the Air can take a little practice; the slot and cover are a little clunky and hard to use.

Summing Up
In the end, the DJI Mavic Air is the best drone I have used hands down. From the ultra-portable size (due to its compact folding ability) to the amazing shooting modes, you get everything you would want in a drone for under $1,000 with the Fly More Combo. The Mavic Air is just the right balance of technology and fun that will make you want to fly your drone.

Sometimes I get intimidated when flying a drone because they are so large and distracting, but not the Mavic Air — it is tiny and unassuming but packed with raw power to capture amazing images for broadcast or personal use.

While we typically don't rate our reviewed products, I will just this once. I would rate the Mavic Air a 10, and I can only hope that the next iteration embraces the Hasselblad history to stretch the Mavic Air in even more professional directions.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Apple updates FCPX, Motion and Compressor

By Brady Betzel

While Apple was busy releasing the new Mac Minis last month, they were also quietly prepping some new updates. Apple has released free updates to FCPX, Motion and Compressor.

CatDV inside of FCPX

The FCPX 10.4.4 update includes Workflow Extensions, batch sharing, the Comparison Viewer, built-in video noise reduction, a timecode window and more. The Workflow Extensions are sure to take the bulk of the update cake: At launch, Apple announced that Shutterstock, Frame.io and CatDV will have extensions directly usable inside of FCPX instead of through a web browser. Frame.io looks to be the most interesting extension, with a realtime reflection of who is watching your video and at what timecode they are, a.k.a. "Presence."

Because Frame.io was rebuilt from the ground up using Swift, its venture inside of FCPX should be extremely streamlined and fast. Internet bandwidth notwithstanding, Frame.io inside of FCPX looks to be the go-to approval system for FCPX editors. I am not quite sure why Apple didn't create their own approval and note-taking system, but they didn't go wrong working with Frame.io. Since many editors use this as their main approval system, FCPX users will surely love this implementation directly inside of the app.

When doing color correction, it is essential to compare your current work with either other images or the source image, and luckily for FCPX colorists you can now do this with the all new Comparison Viewer. Essentially, the Comparison Viewer will allow you to compare anything to the clip you are color grading.

One feature of this that I really love is that you can have access to scopes on both the reference image and your working image. If you understand how scopes work, color matching via parade or waveforms can often be quicker than by eyeball match.

Frame.io inside of FCPX

Final Cut Pro 10.4.4 has a few other updates. Batch Share allows you to queue a bunch of exports or projects in one step; the Timecode Window (a "why wasn't this there already" feature) is essential when editing video footage; and video noise reduction has been added as a built-in feature with adjustable amount and sharpness. There are a few other additions like Tiny Planet, which allows you to quickly make that spherical 360-degree video look — not really an important technical update, but fun nonetheless.

Motion
With Version 5.4.2, Apple has put the advanced color correction toolset from FCPX directly inside of Motion. In addition, you can now add custom LUTs to your work. Apple has also added the Tiny Planet effect as well as a Comic filter inside Motion. Those aren't incredibly impressive, but the color correction toolkit is an essential addition to Motion and will get a lot of use.

Compressor
Compressor 4.4.2 in my opinion is the sleeper update. Apple has finally updated Compressor to a 64-bit engine to take advantage of all of your memory, as well as improved overall performance with huge files. And it will still work with legacy 32-bit formats. Closed captions can now be burned into a video, including the SRT format. Compressor has also added automatic configuration to apply correct frame rate, field order and color space to your MXF and QuickTime outputs.

The FCPX, Motion and Compressor updates are available now for free if you have previously purchased the apps. If not, FCPX retails for $299.99, while Motion and Compressor are $49.99 each.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Puget Systems Genesis I custom workstation

By Brady Betzel

With so many companies building custom Windows-based PCs these days, what really makes for a great build? What would make me want to pay someone to build me a PC versus building it myself? In this review, I will be going through a custom-built PC sent to me by Puget Systems. In my opinion, beyond the physical components, Puget Systems is the cream of the crop of custom-built PCs. Over the next few paragraphs I will focus on how Puget Systems identified the right custom-built PC solution for me (specifically for post), what my experience was like before, during and after receiving the system and, finally, the specs and benchmarks of the system itself.

While quality components are definitely a high priority when building a new workstation, the big thing that sets Puget Systems apart from the rest of the custom-built PC pack is the personal and highly thorough support. I usually don't get the full customer experience when reviewing custom builds. Typically, I am sent a workstation and maybe a one-sheet to accompany the system. To Puget Systems' credit, they went from top to tail when helping me put together the system I would test. Not only did I receive a completely new, built and tested system, but I talked to a customer service rep, Jeff Stubbers, who followed up with me along the way.

First, I spoke with Jeff over the phone. We talked about my price range and what I was looking to do with the system. I usually get told what I should buy — and, by the way, I am not a person who likes to be told what I want. I have a lot of experience not only working on high-end workstations but also building and supporting them, essentially my entire life, and I actively research the latest and greatest technology. Jeff from Puget Systems definitely took the correct approach; he started by asking which apps I use and how I use them. When using After Effects, am I doing more 3D work or simple lower thirds and titles? Which do I use, and plan to continue using, the most: Avid Media Composer, Adobe Premiere Pro or Blackmagic's DaVinci Resolve?

Essentially, my answers were that I use After Effects sparingly, but I do use it. I use Avid Media Composer professionally more than Premiere, but I see more and more Premiere projects coming my way. However, I think Resolve is the future, so I would love to tailor my system toward that. Oh and I dabble in Maxon Cinema 4D as well. So in theory, I need a system that does everything, which is kind of a tall order.

I told Jeff that I would love to stay below $10,000 but need the system to last a few years. Essentially, I was taking the angle of a freelance editor/colorist buying an above-mid-range system. After we configured the system, Jeff detailed the benchmarks that Puget Systems performs on an ongoing basis, why two GTX 1080ti cards are going to benefit me instead of just one, and why an Intel i9 processor would specifically benefit my work in Resolve.

After we finished on the phone, I received an email from Jeff containing a link to a webpage that would continually update me on the details of how my workstation was being built — complete with pictures of my actual system. There are also links to some very interesting articles and benchmarks on the Puget Systems website. They perform more pertinent benchmarks for post production pros than I have seen from any other company. Usually you see a few generic Premiere or Resolve benchmarks, but nothing like Puget Systems'. Even if you don't buy a system from them, you should read their benchmarks.

While my system went through the build and ship process, I saw pictures and comments about who did what along the way over at Puget Systems. Beth was my installer. She finished and sent the system to Kyle, who ran benchmarks. Kyle then sent it to Josh for quality control. Josh discovered the second GTX 1080ti was installed in a reduced-bandwidth PCIe slot, so the system was sent back to Beth for correction. I love seeing this transparency! It not only gives me the feeling that Puget Systems is telling me the truth, but that they have nothing to hide. This really goes a long way with me. Once my system was run through a second quality control pass, it was shipped to me in four days. From start to finish, I received my system in 12 days. Not a short amount of time, but for what Puget Systems put the system through, it was worth it.

Opening the Box
I received the Genesis I workstation in a double box: a nice large box with sturdy foam corners encasing the Fractal Design case box. There was also an accessories box. Within the accessories box were a few cables, an awesome three-ring binder filled with details of my system (the same pictures of my system from the website, including thermal imaging shots, plus all of the benchmarks performed on my system, including real-world tests like Cinebench and even processing in Adobe Premiere) and a recovery USB 3.0 drive. Something I really appreciated was that I wasn't given all of the third-party manuals and cables I didn't need, only what I needed. I've received other custom-built PCs where the company just threw all of the manuals and cables into a Ziploc and called it a day.

I immediately hooked the system up and turned it on… it was silent. Incredibly silent. The Fractal Design Define R5 Titanium case was lined with a sound-deadening material that took whatever little sound was there and made it zero.

Here are the specs of the Puget Systems Genesis I that I was sent:
– Gigabyte X299 Designare EX motherboard
– Intel Core i9 7940X 3.1GHz 14 Core 19.25MB 165W CPU
– Eight Crucial DDR4-2666 16GB RAM modules (128GB total)
– Two EVGA GeForce GTX 1080 Ti 11GB gaming video cards
– Onboard sound card
– Integrated WiFi+Bluetooth networking
– Samsung 860 Pro 512GB SATA3 2.5-inch SSD hard drive — primary drive
– Samsung 970 Pro 1TB M.2 SSD hard drive — secondary drive.
– Asus 24x DVD-RW SATA (Black) CD / DVD-ROM
– Fractal Design Define R5 titanium case
– EVGA SuperNova 1200W P2 power supply
– Noctua NH-U12DX i4 CPU cooling
– Arctic Cooling MX-2 thermal compound
– Windows 10 Pro 64-bit operating system
– Warranty: Lifetime labor and tech support, one-year parts warranty
– LibreOffice software: courtesy install
– Chrome software: courtesy install
– Adobe Creative Cloud Desktop App software: courtesy Install
– Resolve 1-3 GPU

System subtotal: $8,358.38. The price is right, in my opinion, and combined with the support and build detail, it's a bargain.

System Performance
I ran some system benchmarks and tests that I find helpful as a video editor and colorist who uses plugins and other tools on a daily basis. I am becoming a big fan of Resolve, so I knew I needed to test this system inside of Blackmagic's Resolve 15. I used a similar sequence in both Adobe Premiere and Resolve 15: a 10-minute, 23.98fps, UHD/3840×2160 sequence with mixed-format footage from 4K and 8K Red, ARRI Raw UHD and ProRes4444. I added some temporal noise reduction to half of the clips, including the 8K Red footage, and resizes to all clips, all on top of a simple base grade.

First, I did a simple Smart/User cache test by enabling the User Cache at DNxHR HQX 10-bit to the secondary Samsung 1TB drive. Caching took about four minutes and 34 seconds. From there I tried to play back the media un-cached, and I was able to play everything except the 8K media in realtime. I was able to play the 8K Red media at Quarter Res Good (Half Res would hover between 18-20fps). The sequence played back well. I also wanted to test export speeds. The first test was an H.264 export without cache on the same sequence. I set the H.264 output in Resolve to 23.98fps, UHD, auto quality, no frame reordering, force highest-quality debayer/resizes and encoding profile: main. The export took 11 minutes and 57 seconds. The second test was a DNxHR HQX 10-bit QuickTime of the same sequence; it took seven minutes and 44 seconds.

To compare these numbers: I recently ran a similar test on an Intel i9-based MacBook Pro with the Blackmagic eGPU (Radeon Pro 580) attached. The H.264 export took 16 minutes and 21 seconds, while a ProRes4444 export took 22 minutes and 57 seconds. While not an apples-to-apples comparison, it still gives a good sense of the speed increase you can get with a desktop system and a pair of Nvidia GTX 1080ti graphics cards. With the impending release of Nvidia's RTX 2080 cards, you may want to consider getting those instead.
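For a rough sense of that gap, here is the back-of-the-envelope math on those export times. This is just a quick Python sketch using the numbers quoted above; the mezzanine codecs differ (DNxHR HQX on the Genesis I versus ProRes4444 on the MacBook Pro), so treat the second comparison loosely:

```python
# Convert the quoted export times (minutes, seconds) into rough speed-up factors.
def to_seconds(minutes, seconds):
    return minutes * 60 + seconds

h264_genesis = to_seconds(11, 57)    # Genesis I, H.264 UHD export
h264_macbook = to_seconds(16, 21)    # i9 MacBook Pro + Blackmagic eGPU, H.264 UHD export

dnx_genesis = to_seconds(7, 44)      # Genesis I, DNxHR HQX 10-bit export
prores_macbook = to_seconds(22, 57)  # MacBook Pro + eGPU, ProRes4444 export (different codec)

print(f"H.264: Genesis I is {h264_macbook / h264_genesis:.2f}x faster")       # ~1.37x
print(f"Mezzanine: Genesis I is {prores_macbook / dnx_genesis:.2f}x faster")  # ~2.97x, loose comparison
```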

While in Premiere, I ran similar tests with a very similar sequence. Exporting an H.264 (23.98fps, UHD, no cache used during export, VBR 10Mb/s target rate, no frame reordering) took nine minutes and 15 seconds. Going a step further, it took 47 minutes to export an H.265. Similarly, a DNxHR HQX 10-bit QuickTime export took 24 minutes.

I also ran the AJA System Test on the 1TB secondary drive (UHD, 16GB test file size, ProRes HQ). The read speed was 2951MB/sec and the write speed was 2569MB/sec. Those are some very respectable drive speeds, especially for a cache or project drive. If possible, you would probably want to add another drive for exports or for storing your raw media in order to maximize input/output speeds.

Up next was Cinebench R15: OpenGL — 153.02fps, Ref. Match 99.6%, CPU — 2905cb, CPU (single core) — 193cb and MP Ratio 15.03x. Lastly, I ran a test that I recently stumbled upon: the Superposition benchmark from Unigine. While it is more of a gaming benchmark, a lot of people use it and might glean some useful information from it. The overall score was 7653 (fps: min 45.58, avg 57.24, max 72.11; GPU temperature: min 36°C, max 85°C; GPU use: max 98%).

Summing Up
In the end, I am very skeptical of custom-built PC shops. Typically, I don't see the value in the premium they charge when you can probably build the system yourself with parts you choose from PCPartPicker.com. However, Puget Systems is the exception — their support and build quality are top notch. From the initial phone conversation, to the up-to-the-minute images and custom-build updates online, to the final delivery and even follow-up conversations, Puget Systems is by far the most thorough and worthwhile custom-PC builder I have encountered.

Check out their high-end custom-built PCs, along with tons of benchmark testing and recommendations, on their website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

HP offerings from Adobe Max 2018

By Brady Betzel

HP workstations have been a staple in the post community, especially for anyone not using a Mac or the occasional DIY/custom build from companies like Puget Systems or CyberPowerPC. The difference comes with customers who need workstation-level components and support. Typically, a workstation is run through much tougher and more stringent tests so the client can be assured of 24/7/365 uptime. HP continues to evolve and has become, in my opinion, a leader for all non-Apple workflows.

At Adobe Max 2018, HP announced updated components for its Z by HP line of mobile workstations, including the awesome ZBook Studio x360, ZBook Studio, ZBook 15 and ZBook 17. I truly love HP's mobile workstation offerings. The only issue I constantly come up against is whether I — or any freelance worker, for that matter — can justify the cost of their systems.

I always want the latest and greatest, and I feel I can get that with the updated performance options in this latest update to the ZBook line. They include the increased 6-core Intel i9 processors; expanded memory of up to 32GB (or 128GB in some instances); a really interesting M.2 SSD RAID-1 configuration from the factory that allows for constant mirroring of your boot drive (if one drive fails, the other will take over right where you left off); the ZBook Studio and Studio x360 getting a GPU increase with the Nvidia Quadro P2000; and the anti-glare touchscreen on the x360. This is all in addition to HP’s DreamColor option, which allows for 100% Adobe RGB coverage and 600 nits of brightness. But again, this all comes at a high cost when you max out the workstation with enough RAM and GPU horsepower. But there is some good news for those that don’t have a corporate budget to pull from: HP has introduced the pilot program Z Club.

The Z Club is essentially a leasing program for HP's Z series products. At the moment, HP will take 100 creators for this pilot program, which will allow you to select a bundle of Z products and accessories that fit your creative lifestyle for a monthly cost. This is exactly how you win over prosumer and freelance workers who can't quite justify a $5,000 purchase price but can justify a $100-a-month payment. HP has touted categories of products for editors, photographers and many others. With monthly payments that range from $100 to $250, depending on what you order, this is much more manageable for mid-range end users who need the power of a workstation but up until now couldn't afford it.

So what will you get if you are accepted to the Z Club pilot program? You can choose the products you want and pay nothing for the first three months. You can continue with or return your products, you can switch products, and you will have access to a Z Club concierge service for any questions and troubleshooting.

On the call I had with HP, they mentioned that a potential bundle for a video editor could be an HP Z series mobile workstation or desktop, along with a DreamColor display, and an external RAID storage system to top it off.

In the end, I think HP (much like Blackmagic’s Resolve in the NLE/color world) is at the front of the pack. They are listening to what creatives are saying about Apple — how this giant company is not listening to their customers in an efficient and price-conscious way. Creating essentially a leasing program for mid- to high-range products with support is the future. It’s essentially Apple’s own iPhone program but with computers!

Hopefully this program takes off, and if you are lucky enough to be accepted into the pilot program, I would be curious to hear about your experience, so please reach out. With HP making strides in workstation security through initiatives like Sure Start, a privacy mode for mobile systems and military-grade testing known as MIL-spec, HP is going beyond being just a standard in the media and entertainment post industry. If you are leaving Apple for a Windows-based PC, you should apply for the Z Club pilot program. Go to www.hp.com to find out more or follow along on Twitter @AdobeMax, @HP or using #AdobeMax.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Blackmagic’s eGPU and Intel i9 MacBook Pro 2018

By Brady Betzel

Blackmagic’s eGPU is worth the $699 price tag. You can buy it from Apple’s website, where it is being sold exclusively for the time being. Wait? What? You wanted some actual evidence as to why you should buy the BMD eGPU?

Ok, here you go…

MacBook Pro With Intel i9
First, I want to go over the latest Apple MacBook Pro, which was released (or really just updated) this past July. With some controversial fanfare, the 2018 MacBook Pro can now be purchased with a blazingly fast Intel i9, 2.6GHz (Turbo Boost up to 4.3GHz) six-core processor. In addition, you can add up to 32GB of 2400MHz DDR4 onboard memory, a Radeon Pro 560X GPU with 4GB of GDDR5 memory and even a 4TB SSD storage drive. It has four Thunderbolt 3 ports and, for some reason, a headphone jack. Apple is also touting its improved butterfly keyboard switches as well as its True Tone display technology. If you want to read more about that glossy info, head over to Apple's site.

The 2018 MacBook Pro is a beast. I am a big advocate for the ability to upgrade and repair computers, so Apple's venture to create what is essentially a leased computer ecosystem that needs to be upgraded every year or two usually leaves a bad taste in my mouth.

However, the latest MacBook Pros are really amazing… and really expensive. The top-of-the-line MacBook Pro I was provided with for this review would cost $6,699! Yikes! If I were serious, I would purchase everything but the $2,000 upgrade from the 2TB SSD drive to the 4TB, and it would still cost $4,699. But I suppose that's not a terrible price for such an intense processor (albeit not technically a workstation-class one).

Overall, the MacBook Pro is a workhorse that I put through its video editing and color correcting paces using three of the top four professional nonlinear editors: Adobe Premiere, Apple FCP X and Blackmagic’s Resolve 15 (the official release). More on those results in a bit, but for now, I’ll just say a few things: I love how light and thin it is. I don’t like how hot it can get. I love how fast it charges. I don’t like how fast it loses charge when doing things like transcoding or exporting clips. A 15-minute export can drain the battery over 40% while playing Spotify for eight hours will hardly drain the battery at all (maybe 20%).

Blackmagic’s eGPU with Radeon Pro 580 GPU
One of the more surprising releases from Blackmagic has been this eGPU offering. I would never have guessed they would have gone into this area, and certainly would never have guessed they would have gone with a Radeon card, but here we are.

Once you step back from the initial "why in the hell wouldn't they let it be user-replaceable and also not brand-dependent" shock, it actually makes sense. If you are a macOS user, you can probably already do a lot in terms of external GPU power. When you buy a new iMac, iMac Pro or MacBook Pro, you are expecting it to work, full stop.

However, if you are a DIT or colorist who is more mobile than that sweet million-dollar color bay you dream of, you need more. This is where the BMD eGPU falls nicely into place. You plug it in and instantly see it populate in the menu bar. In addition, the eGPU acts as a dock with four USB 3 ports, two Thunderbolt 3 ports and an HDMI port. The MacBook Pro will charge off of the eGPU as well, which eliminates the need for your charger at your docking point.

On the go, the most decked-out MacBook Pro can hold its own. So it's no surprise that FCP X runs remarkably fast… faster than everything else. However, you have to be invested in the FCP X workflow and paradigm — and while I'm not there yet, maybe the future will prove me wrong. Recently, I saw someone on Twitter who developed an online collaboration workflow around it, so people are excited about it.

Anyway, many of the nonlinear editors I work with also run well on the MacBook Pro, even with 4K Red, ARRI and, especially, ProRes footage. Keep in mind, though, that with 2K, 4K or whatever-K footage, you will need to set the debayer to around half-res (good) if you want a fluid timeline. Even with the 4GB Radeon 560X, I couldn't quite play 4K footage in realtime without some sort of compromise in quality.

But with the Blackmagic eGPU, I significantly improved my playback capabilities — and not just in Resolve 15. I did try and plug the eGPU into a PC with Windows 10 I was reviewing at the same time and it was recognized, but I couldn’t get all the drivers sorted out. So it’s possible it will work in Windows, but I couldn’t get it there.

Before I get to the Resolve testing, I did some benchmarking. First I ran Cinebench R15 without the eGPU attached and got the following scores: OpenGL — 99.21fps, reference match 99.5%, CPU — 947cb, CPU (single core) — 190cb and MP ratio of 5.00x. With the eGPU attached: OpenGL — 60.26fps, reference match 99.5%, CPU — 1057cb, CPU (single core) — 186cb and MP ratio of 5.69x. Then I ran Unigine's Valley Benchmark 1.0 without the eGPU, which got 21.3fps and a score of 890 (minimum 12.4fps/maximum 36.2fps). With the eGPU it got 25.6fps and a score of 1073 (minimum 19.2fps/max 37.1fps).

Resolve 15 Test
I based all of my tests on a similar (although not exact for the different editing applications) 10-minute timeline, 23.98fps, 3840×2160, 4K and 8K RAW Red footage (R3D files) and Alexa (.ari and ProRes444XQ) UHD footage, all with edit page resizes, simple color correction and intermittent sharpening and temporal noise reduction (three frames, better, medium, 10, 10 and 5).

Playback: Without the eGPU I couldn't play 23.98fps 4K Red R3D footage without being set to half-res. With the eGPU I could play back at full-res in realtime (this is what I was talking about in sentence one of this review). The ARRI footage would play at full res, but only at between 1fps and 7fps. The 8K Red footage would play in realtime when set to quarter-res.

One of the most reassuring things I noticed when watching my Activity Monitor's GPU history readout was that Resolve uses both GPUs at once. Not all of the apps did.

Resolve 15 Export Tests
In the following tests, I disabled all cache or optimized media options, including Performance Mode. (I run the quick speed-up math on these times in the sketch after the test results.)

Test 1: H.264 (23.98fps, UHD, auto quality, no frame reordering, force highest-quality debayer/resizes and encoding profile: Main)
a. Without eGPU (Radeon Pro 560x): 22 minutes, 16 seconds
b. With BMD eGPU (Radeon Pro 580): 16 minutes and 21 seconds

Test 2: H.265 (10-bit, 23.98/UHD, auto quality, no frame reordering, force highest-quality debayer/resizes)
a. Without eGPU: stopped rendering after 10 frames
b. With BMD eGPU: same result

Test 3: ProRes4444 at 23.98/UHD
a. Without eGPU: 27 min and 29 seconds
b. With BMD eGPU: 22 minutes and 57 seconds

Test 4: Edit page cache (Smart/User Cache enabled at ProRes HQ)
a. Without eGPU: 17 minutes and 28 seconds
b. With BMD eGPU: 12 minutes and 22 seconds
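As promised above, here is the quick math that turns those with/without eGPU times into speed-up factors. This is just a small Python sketch over the numbers reported in the tests; it is not an official benchmark:

```python
# Convert each test's export times (minutes, seconds) into an eGPU speed-up factor.
tests = {
    "H.264 UHD":        ((22, 16), (16, 21)),  # (without eGPU, with eGPU)
    "ProRes4444 UHD":   ((27, 29), (22, 57)),
    "Smart/User cache": ((17, 28), (12, 22)),
}

for name, (without, with_egpu) in tests.items():
    t_without = without[0] * 60 + without[1]
    t_with = with_egpu[0] * 60 + with_egpu[1]
    speedup = t_without / t_with
    saved_pct = (1 - t_with / t_without) * 100
    print(f"{name}: {speedup:.2f}x faster, {saved_pct:.0f}% less render time")

# H.264 UHD: 1.36x faster, 27% less render time
# ProRes4444 UHD: 1.20x faster, 17% less render time
# Smart/User cache: 1.41x faster, 29% less render time
```

The same arithmetic applies to the Adobe Media Encoder and FCP X times below, where the eGPU gains are even larger for the Media Encoder exports.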

Adobe Premiere Pro v.12.1.2
I performed similar testing in Adobe Premiere Pro using a 10-minute timeline at 23.98fps, 3840×2160, 4K and 8K RAW Red footage (R3D files) and Alexa (DNxHR SQ 8-bit) UHD footage, all with Effect Control tab resizes and simple Lumetri color correction, including sharpening and intermittent denoise (16) under the HSL Secondary tab in Lumetri applied to shadows only.

In order to ensure your eGPU will be used inside of Adobe Premiere, you must use Metal as your renderer. To enable it, go to File > Project Settings > General and change the renderer to Mercury Playback Engine GPU Acceleration (Metal); OpenCL will only use the internal GPU for processing.

Premiere did not handle the high-resolution media as aptly as Resolve had, but the eGPU did help a little. However, I really wanted to test the export power with the added eGPU horsepower. I almost always send my Premiere sequences to Adobe Media Encoder to do the processing, so that is where my exports were processed.

Adobe Media Encoder
Test 1: H.264 (No render used during exports: 23.98/UHD, 80Mb/s, software encoding doesn’t allow for profile setup)
a. OpenCL with no eGPU: about 140 minutes (sorry, I had to chase the kids around and couldn't watch this snail crawl)
b. Metal with no eGPU: about 137 minutes (chased the kids around again and couldn't watch this snail crawl, either)
c. OpenCL with eGPU: won't work, Metal only
d. Metal with eGPU: one hour

Test 2: H.265
a. Without eGPU: failed (interesting result)
b. With eGPU: 40 minutes

Test 3: ProRes4444
a. Without eGPU: three hours
b. With eGPU: one hour and 14 minutes

FCP X
FCP X is an interesting editing app, and it is blazing fast at handling ProRes media. As I mentioned earlier, it hasn't been in my world too much, but that isn't because I don't like it. It's because professionally I haven't run into it. I love the idea of roles and would really love to see that play out in other NLEs. However, my results speak for themselves.

One caveat to using the eGPU in FCP X is that you must force it to work inside of the NLE. At first, I couldn't get it to work; the Activity Monitor would show no activity on the eGPU. However, thanks to a Twitter post, James Wells (@9voltDC) sent me to this, which allows you to force FCP X to use the eGPU. It took a few tries, but I did get it to work, and funnily enough I saw times when all three GPUs were being used inside of FCP X, which was pretty great to see. This is one of those use-at-your-own-risk things, but it worked for me and is pretty slick… if you are OK with using Terminal commands. This also allows you to force the eGPU onto other apps, like Cinebench.

Anyways, here are my results with the BMD eGPU exporting from FCP X:

Test 1: H.264
a. Without eGPU: eight minutes
b. With eGPU: eight minutes and 30 seconds

Test 2: H.265: Not an option

Test 3: ProRes4444
a. Without eGPU: nine minutes
b. With eGPU: six minutes and 30 seconds

Summing Up
In the end, the Blackmagic eGPU with its Radeon Pro 580 GPU is a must-buy if you use your MacBook Pro with Resolve 15, though there are other options out there, like the Razer Core v2 or the Akitio Node Pro.

From this review I can tell you that the Blackmagic eGPU is silent, even when processing 8K Red RAW footage and the MacBook Pro fans are going at full speed, and it just works. Plug it in and you are running: no settings, no drivers, no cards to install. It just runs. And sometimes, when I have three little boys running around my house, I just want that peace of mind; I want things to just work, like the Blackmagic eGPU does.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Maxon Cinema 4D R19 — an editor’s perspective

By Brady Betzel

It’s time for my yearly review of Maxon’s Cinema 4D. Currently in Release 19, Cinema 4D comes with a good amount of under-the-hood updates. I am an editor, first and foremost, so while I dabble in Cinema 4D, I am not an expert. There are a few things in the latest release, however, that directly correlate to editors like me.

Maxon offers five versions of Cinema 4D, not including BodyPaint 3D. There is the Cinema 4D Lite, which comes free with Adobe After Effects. It is really an amazing tool for discovering the world of 3D without having to invest a bunch of money. But, if you want all the goodies that come packed into Cinema 4D you will have to pay the piper and purchase one of the other four versions. The other versions include Prime, Broadcast, Visualize and Studio.

Cinema 4D Prime is the first version that includes features like lighting, cameras and animation. Cinema 4D Broadcast includes all of Cinema 4D Prime’s features as well as the beloved MoGraph tools and the Broadcast Library, which offers pre-built objects and cameras that will work with motion graphics. Cinema 4D Visualize includes Cinema 4D Prime features as well, but is geared more toward architects and designers. It includes Sketch and Toon, as well as an architecturally focused library of objects and presets. Cinema 4D Studio includes everything in the other versions plus unlimited Team Render nodes, a hair system, a motion/object tracker and much more. If you want to see a side-by-side comparison you can check out Maxon’s website.

What’s New
As usual, there are a bunch of new updates to Cinema 4D Release 19, but I am going to focus on my top three, which relate to the workflows and processes I might use as an editor: New Media Core, Scene Reconstruction and the Spherical Camera. Obviously, there are a lot more updates — including the incredible new OpenGL Previews and the cross-platform ProRender, which adds the ability to use AMD or Nvidia graphics cards — but to keep this review under 30 pages I am focusing on the three that directly impact my work.

New Media Core
Buckle up! You can now import animated GIFs into Cinema 4D Release 19. That, however, is just one tiny aspect of this update; the really big addition is QuickTime-free support for MP4 video. MP4s can now be imported and used as textures, as well as exported with different compression settings, directly from within Cinema 4D’s interface, all without the need to have QuickTime installed. What is cool about this is that you no longer need to export image-based file sequences to get your movie inside of Cinema 4D. The only slowdown will be how long it takes Cinema 4D R19 to cache your MP4 so that you have realtime playback… if possible.

In my experience, it doesn’t take that much time, but that will be dependent on your system performance. While this is a big under-the-hood type of update, it is great for those quick exports of a scene for approval. No need to take your export into Adobe Media Encoder, or something else, to squeeze out an MP4.
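
If you like to script your setups, the same MP4-as-texture idea is reachable from Cinema 4D’s Python API. Below is a minimal sketch meant for the Script Manager (where doc is predefined); the file path and material name are placeholders of my own, not anything shipped with R19:

```python
import c4d

def mp4_as_texture(doc, path):
    # Create a standard material and a bitmap shader pointing at the movie file.
    mat = c4d.BaseMaterial(c4d.Mmaterial)
    mat.SetName("MP4 Texture")

    shader = c4d.BaseShader(c4d.Xbitmap)
    shader[c4d.BITMAPSHADER_FILENAME] = path  # an .mp4 instead of an image sequence

    # Wire the shader into the color channel and register it with the material.
    mat[c4d.MATERIAL_COLOR_SHADER] = shader
    mat.InsertShader(shader)

    doc.InsertMaterial(mat)
    c4d.EventAdd()  # refresh the interface
    return mat

if __name__ == '__main__':
    # Placeholder path; point this at your own MP4 export.
    mp4_as_texture(doc, "/path/to/approval_export.mp4")
```

From there you can apply the material to any object in your scene, and R19 handles caching the MP4 for playback as described above.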

Scene Reconstruction
First off, for any new Cinema 4D users out there, Scene Reconstruction is convoluted and a little thick to wade through. However, if you work with footage and want to add motion graphics work to a scene, you will want to learn this. You can check out this Cineversity.com video for an eight-minute overview.

Cinema 4D’s Scene Reconstruction works by tracking your footage to generate point clouds; then, once you go back and enable Scene Reconstruction, it creates a mesh from the scene calculation Cinema 4D computes. In the end, depending on how compatible your footage is with Scene Reconstruction (contrasting textures and good lighting will help), you get a camera view with matching scene vertices that are fully animatable. I, unfortunately, did not have enough time to recreate a set or scene inside of Cinema 4D R19. However, it feels like Maxon is getting very close to fully automated scene reconstruction, which would be very, very interesting.

I’ve seen a lot of ideas from pros on Twitter and YouTube that really blow my mind, like 3D scanning with a prosumer camera to recreate objects inside of Cinema 4D. Scene Reconstruction could be a game-changing update, especially if it becomes more automated, as it would allow base users like me to recreate a set in Cinema 4D without having to physically rebuild one. A pretty incredible motion graphics and compositing future is really starting to emerge from Cinema 4D.

In addition, the Motion Tracker has received some updates, including manual tracking on the R, G or B channel or a custom channel (viewed in the Tracker View), and the tracker can now work with a circular tracking pattern.

Spherical Camera
Finally, the last update, which seems incredible, is the new Spherical Camera. It’s probably because I have been testing and using a lot more 360 video lately, but I’m glad the ability to render your scene using a spherical camera is here. You can now create a scene, add a camera and enable spherical mapping, including equirectangular, cubic string, cubic cross or even Facebook’s 3×2 cubic 360 video format. In addition, there is now support for stereo VR as well as dome projection.

Other Updates
In addition to the three top updates I’ve covered, there are numerous other updates that are just as important, if not more so, to those who use Cinema 4D in other ways. In my opinion, the rendering updates take the cake. As mentioned before, there is support for both Nvidia and AMD GPUs, multi-GPU support, and incredible viewport enhancements, including physical rendering and interactive preview renders right in the viewport.

Under MoGraph, there is an improved Voronoi Fracture system (the ability to destroy an object quickly), including better performance for high polygon counts and added detailing to give fractures a more realistic look. There is also a new Sound Effector that allows for interactive MoGraph creation to the beat of the music. One final note: a new, modern modeling kernel has been introduced, which improves operations like polygon reduction and levels of detail.

In the end, Cinema 4D Release 19 is a huge under-the-hood update that will please legacy users but will also attract new users with AMD-based GPUs. Moreover, Maxon seems to be slowly morphing Cinema 4D into a total 2D and 3D modeling and motion graphics powerhouse, much like the way Blackmagic’s Resolve is for colorists, video editors, VFX creators and audio mixers.

Summing Up
With updates like Scene Reconstruction and improved motion tracking, Maxon gives users like me the ability to work way above our pay grade, compositing 3D objects onto our 2D footage. If any of this sounds interesting to you and you are a paying Adobe Creative Cloud user, download and open Cinema 4D Lite along with After Effects, then run over to Cineversity and brush up on the basics. Cinema 4D Release 19 is an immensely powerful 3D application that is blurring the boundaries between 3D and 2D compositing. With its large library of objects, preset scenes and lighting setups, you can be experimenting in no time, and I didn’t even touch on the modeling and sculpting power!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: The Litra Torch for pro adventure lighting

By Brady Betzel

If you are on Instagram you’ve definitely seen your fair share of “adventure” photography and video. Typically, it’s those GoPro-themed action-adventure shots of someone cliff diving off a million-mile-high waterfall. I definitely get jealous. Nonetheless, one thing I love about GoPro cameras is their size. They are small enough to fit in your pocket, and they will reliably produce a great image. Where those actioncams suffer is low-light performance. While it is getting better every day, you just can’t pull a reliably clean and noise-free image from a camera sensor that small. This is where actioncam lights come into play as a perfect companion, including the Litra Torch.

The Litra Torch is an 800-lumen, 1.5-by-1.5-inch magnetic light. I first started seeing the tiny-light trend on Instagram, where people were shooting slow-shutter photos at night while painting certain objects with a tiny bit of light. Check out Litra on Instagram (@litragear) to see some of the incredible images people are producing with this tiny light. I saw an action sports person showing off some incredible nighttime pictures using the GoPro Hero. He mentioned in the post that he was using the Litra Torch, so I immediately contacted Litra, and here I am reviewing the light. Litra sent me the Litra Paparazzi Bundle, which retails for $129.99. The bundle includes the Litra Torch, along with a filter kit and cold shoe mount.

The Litra Torch has four modes, all accessible by clicking the button on top of the light: 800-lumen brightness, 450 lumens, 100 lumens and flashing. The Torch has a consistent color temperature of 5700K; essentially, the light is a crisp white, right in between blue and yellow. The rechargeable lithium-ion battery charges via micro USB and will last 30 minutes or more depending on the brightness selected. With a backup battery attached, you could be going for hours.

Over a month of intermittent use I only charged it once. One night I had to check out something under the hood of my car and used the Litra Torch to see what I was doing. It is very bright, and when I placed the light onto the car I realized it was magnetic! Holy cow. Why doesn’t GoPro put magnets into their cameras for mounting? The Torch also has two ¼-20 camera screw mounts so you can mount it just about anywhere. The construction of the Torch is amazing: it is drop-proof, waterproof and made of a highly resilient aluminum. You can feel the high quality of the components the first time you touch the Torch.

In addition to the Torch itself, the cold shoe mount and the diffuser, the Paparazzi Bundle comes with the photo filter kit. The kit includes five frames for mounting the color filters onto the Torch; three sets of Rosco Tungsten 4600K filters; three sets of Rosco Tungsten 3200K filters; one white diffuser filter; and one each of a red, yellow and green color filter. Essentially, they give you a cheap way to change white balance temperatures, plus some awesome color filters to play around with. I can really see the benefit of having at least two, if not three, Litra Torches in your bag with the filter sets; you could easily set up a properly lit product shoot, or even a headshot session, with nothing more than three tiny Torch lights.

Putting It To The Test
To test out the light in action, I asked my son to set up a Lego scene for me. One hour later I had some Lego models to help me out. I always love seeing people’s Lego scenes on Instagram, so I figured this would also be a good way to show off the light and the extra color filters sent in the Paparazzi Bundle. One thing I discovered is that I would love a slide-in filter holder built onto the light; it would definitely save the time spent popping filters into frames.

All in all, this light is awesome. The only problem is I wish I had three so I could do a full three-point lighting setup. However, with some natural light and one Litra Torch I had enough to pull off some cool lighting. I really liked the Torch as a colored spotlight; you can get that blue or red shade on different objects in a scene quickly.

Summing Up
In the end, the Litra Torch is an amazing product. In the future I would really love to see multiple white balance temperatures built into the Torch without having to use photo filters. Another exciting, but probably expensive, prospect would be building in a Bluetooth connection and multiple colors. Better yet, make this light a full-color-spectrum, app-enabled light… oh wait, they just announced the Litra Pro on Kickstarter. You should definitely check that out as well for its advanced options and color profile.

I am spoiled by all of those at-home lights, like the LIFX brand, that change to any color you want, so I’m greedy and want those features in a sub-$100 light. But those are just wishes; the Litra Torch is a must-have for your toolkit in my opinion. From mounting it on top of my Canon DSLR using the cold shoe mount, to sticking it magnetically in unique places, to attaching it to a tripod with the screw mount, the Litra Torch is a mind-melting game changer for anyone who has had to lug around a 100-pound light kit, which makes the new Litra Pro Kickstarter all the more enticing.

Check out their website for more info on the Torch and new Litra Pro, as well as a bunch of accessories. This is a must-have for any shooter looking to carry a tiny but powerful light anywhere, especially for summer and the outdoors!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.