
My Top Five Ergonomic Workstation Accessories

By Brady Betzel

Instead of writing up my normal “Top Five Workstation Accessories” column this year, I wanted to take a slightly different route and focus on products that might lessen pain and maybe even improve your creative workflow — whether you are working at a studio or, more likely these days, working from home.

As an editor, I sit in a chair for most of my day, and that is on top of my three- to four-hour round-trip commute to work. As aches and pains build up (I’m 36, and I’m sure it doesn’t just get better), I had to start looking for solutions to alleviate the pain I can see coming in the future. In the past I have mentioned products like the Wacom Intuos Pro Pen tablet, which is great and helped me lessen wrist pain, or color correction panels such as the Loupedeck, which speed up creative workflows and keep you from relying solely on the mouse, again lessening wrist pain.

This year I wanted to look at how the actual setup of your workstation environment might prevent or alleviate pain. So get out of your seat and move around a little, take a walk around the block, and when you get back, maybe rethink how your workstation environment could become more conducive to a creativity-inspiring flow.

Autonomous SmartDesk 2 
One of the most useful things in my search for flexibility in the edit bay is the standup desk. Originally, I went to Ikea and found a clearance tabletop in the “dents” section and then found a kitchen island stand that was standing height. It has worked great for over 10 years; the only issue is that it isn’t easily adjustable, and sometimes I need to sit to really get my editing “flow” going.

Many companies offer standing desk solutions, including manual options like the classic VariDesk desk riser. If you have been in the offline editing game over the past five to 10 years, then you have definitely seen these come and go. But at almost $400, you might as well look for a robotic standing desk. This is where the Autonomous SmartDesk 2 comes into play. Depending on whether you want the Home Office version, which stands between 29.5 inches and 48 inches, or the Business Office version, which stands between 26 inches and 52 inches, you are looking to spend $379 or $479, respectively (with free shipping included).

The SmartDesk 2 desktop itself is made of MDF (medium-density fiberboard), which helps lower the overall cost but is still sturdy and will hold up to 300 pounds. From black to white oak, there are multiple color options, and the desk can even be a conversation piece in the edit bay. I have the Business version in black along with a matching black chair, and I love that it looks clean and modern. The SmartDesk 2 is operated using a front-facing switch plate complete with up, down and four height-level presets. It operates smoothly and, to be honest, impressively. It gives a touch of class to any environment. Setup took about half an hour, and it came with easy-to-follow instructions, screws/washers and tools.

Keep an eye out for my full review of the Autonomous SmartDesk 2 and ErgoChair 2, but for now think about how a standup desk will at least alleviate some of the sitting you do all day while adding some class and conversation to the edit bay.

Autonomous ErgoChair 2 
Along with a standup desk — and more important, in my opinion — is a good chair. Most offline editors and assistant editors work at a company that either values their posture and buys Herman Miller Aeron chairs, or cheaps out and buys the $49 special at Office Depot. I never quite understood the benefit of saving a few bucks on a chair, especially if a company pays for health insurance — because in the end, they will be paying for it. Not everyone likes or can afford the $1,395 Aeron chairs, but there are options that don’t involve ruining your posture.

Along with the Autonomous SmartDesk 2, you should consider buying the ErgoChair 2, which costs $349 — a similar price to other chairs, like the Secretlab Omega series gaming chair that retails for $359. But the ErgoChair 2 has the best of both worlds: an Aeron chair-feeling mesh back and neck support plus a super-comfortable seat cushion with all the adjustments you could want. Even though I have only had the Autonomous products for a few weeks now, I can already feel the difference when working at home. It seems like a small issue in the grand scheme of things, but being comfortable allows my creativity to flow. The chair took under 30 minutes to build and came with easy-to-follow instructions and good tools, just like the SmartDesk 2.

A Footrest
When I first started in the industry, as soon as I began a freelance job, I would look for an old Sony IMX tape packing box. (Yes, the green tapes. Yes, I worked with tape. And yes, I can operate an MSW-2000 tape deck.) Typically, the boxes would be full of tapes because companies bought hundreds and never used them, and they made great footrests! I would line up a couple boxes under my feet, and it made a huge difference for me. Having a footrest relieves lower back pressure that I find hard to relieve any other way.

As I continue my career into my senior years, I finally discovered that there are actual footstools! Not just old boxes. One of my favorites is on Amazon. It is technically an adjustable nursing footstool but works great for use under a desk. And if you have a baby on the way, it’s a two-for-one deal. Either way, check out the “My Brest Friend” on Amazon. It goes for about $25 with free one-day Amazon Prime shipping. Or if you are a woodworker, you might be able to make your own.

GoFit Muscle Hook 
After sitting in an edit bay for multiple hours, multiple days in a row, I really like to stretch and use a massager to un-stuff my back. One of the best massagers I have seen in multiple edit bays is called the GoFit Muscle Hook.

Luckily for us it’s available at almost any Target or on the Target website for about $25. It’s an alien-looking device that can dig deep into your shoulder blades, neck and back. You can use it a few different ways — the large hook for middle-of-the-back issues, the smaller hook that I like to use on the neck and upper back, and the neck massager on the bar (that one feels a little weird to me).

There are other massage devices similar to the Muscle Hook, but in my opinion the GoFit Muscle Hook is the best. The plastic composite seems indestructible and almost feels like it could double as a self-defense tool. But it can work out almost any knots you have worked up after a long day. If you don’t buy anything else for self-care, buy the Muscle Hook. You will be glad you did. Anyone who gets one has that look of pain and relief when they use it for the first time.

Foam Roller
Another item that I just started using was a foam roller. You can find them anywhere for the most part, but I found one on Amazon for $13.95 plus free Amazon Prime one-day shipping. It’s also available on the manufacturer’s website for about $23. Simply, it’s a high-density foam cylinder that you roll on top of. It sounds a little silly, but once you get one, you will really wonder how you lived without one. I purchased an 18-inch version, but they range from 12 inches to 36 inches. And if you have three young sons at home, they can double as fat lightsabers (but they hurt, so keep an eye out).

Summing Up
In the end, there are so many ways to try keeping a flexible editing lifestyle, from kettlebells to stand-up desks. I’ve found that just getting over the mental hurdle of not wanting to move is the biggest catalyst. There are so many great tech accessories for workstations, but we hardly mention ones that can keep our bodies moving and our creativity flowing. Hopefully, some of these ergonomic accessories for your workstation will spark an idea to move around and get your blood flowing.

For some workout inspiration, Onnit has some great free workouts featuring weird stuff like maces, steel clubs and sandbags, but also kettlebells. The site also has nutritional advice. For foam roller stretches, I would check out the same Onnit Academy site.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Litra Pro’s Premium 3 Point Lighting Bundle

By Brady Betzel

With LED lights showing up everywhere these days, it’s not always easy to find the balance between affordability, power output and size. I have previously reviewed itty-bitty LED lights like the Litra Torch, which for its size is amazing. Litra has now expanded its LED offerings, adding the Litra Pro and the Litra Studio.

Litra Studio ($650) is at the top of the Litra mountain with not only varying color temperatures — from 2,000 to 10,000 kelvin with adjustable green/magenta settings — but also RGBWW (RGB + cool white + warm white), CCT (kelvin adjustments), HSL (hue + saturation + lightness), color gel presets, flash effects and more.

But for today’s review, I wanted to focus on the Litra Pro LED, which comes in the Premium 3 Point Lighting Bundle, complete with light stands, lights, soft boxes and a carrying case. I had reached out to Litra about reviewing this bundle because I am tired of having to lug around big, clunky lights for quick interviews or smaller product shots. And to be honest, it was right before I was heading to Sundance to shoot some interviews for postPerspective, and I didn’t want to check a bag at the airport. (Check out my interviews with editors at Sundance here.)

For the trip, I wanted to bring lights, a Blackmagic Pocket Cinema Camera 6K, my Canon L-series zoom lens, a small tripod and some hard drives, all stuffed into my backpack. I knew I’d be in the snow, so I needed lights that could potentially withstand all types of precipitation. Also, I would be throwing these lights around, so I needed them to be durable. The Litra Pro lights fit the bill. They measure 2.75 x 2 x 1.2 inches — smaller than a phone — weigh 6 ounces and have upwards of a 10-hour battery life when set to 5% power. Each Litra Pro costs just under $220 but can be purchased in different bundle assortments. Individually, each Litra Pro comes with a rubberized diffuser, a USB-A-to-Micro-USB charging cable (very short, maybe 3 to 4 inches in length), a DSLR mount (to be mounted in a hot/cold shoe), a GoPro mount and a little zipper bag.

I wish Litra would package not only the GoPro-mount-to-¼”-20 adapter but also a female ¼”-20-to-GoPro-mount adapter for attaching the light to something like a tripod. If you don’t already own them, you’d need to purchase the GoPro mounts separately. Alternatively, it would be nice to have a mini ball-head mount like the one Litra sells separately on its site.

I was sent the Litra Pro Premium 3 Point Lighting Bundle. This essentially gives you everything you need for a standard three-point lighting setup — key light, fill light and back light. In addition, you get three light stands with carrying bags, three soft boxes, a customizable foam-insert carrying case and the standard accessories. This package retails for $779.95, which is a pretty good discount. If bought separately, the components would add up to about $820, not including the light stands, which aren’t available on Litra’s site and cost around $26 for two on Amazon. That means with the bundle you are essentially getting a free carrying case and light stands. The carrying case fits most of the products, except for the light stands. I had some trouble fitting all of the soft boxes along with the original accessories into the carrying case, but with a little force, I got it zipped up.

Do Specs Live Up to Output?
The Litra Pro lights pack an amazing amount of output into a small package, but with a somewhat expensive price tag — think of the saying, “Better, faster, cheaper: Pick two because you can’t have all three.” They have a CRI (Color Rendering Index) of greater than 95, which on the surface means they will show accurate colors. They can output up to 1,200 lumens (adjustable from 0-100% in 5% increments) either by app or on the light itself; have a 70-degree beam angle; can be adjusted from 3,000K to 6,000K color temperature in 100K increments; and have zero flicker, no matter the shutter speed (or shutter angle). The top OLED screen displays battery info, Bluetooth connection info, kelvin temperature and brightness values.
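To put those spec numbers in rough context, here is a back-of-the-envelope illuminance estimate (not a published Litra spec), assuming all 1,200 lumens are spread evenly across a 70-degree cone:

```python
import math

# Rough estimate only, not a Litra spec: assumes a perfectly uniform 70-degree beam.
lumens = 1200.0          # max output claimed for the Litra Pro
beam_angle_deg = 70.0    # stated beam angle
distance_m = 1.0         # distance from light to subject

# Solid angle of a cone with full apex angle theta: 2*pi*(1 - cos(theta/2))
solid_angle_sr = 2 * math.pi * (1 - math.cos(math.radians(beam_angle_deg / 2)))

# Illuminance (lux) = luminous flux / (solid angle * distance squared)
lux = lumens / (solid_angle_sr * distance_m ** 2)
print(f"~{lux:.0f} lux at {distance_m:.0f}m")  # roughly 1,050 lux under these assumptions
```

Real beams are never perfectly uniform, so treat this strictly as a sanity check on why such a small light can carry an indoor interview.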

One of my two favorite features of the Litra Pro lights is the rugged exterior and the impacts it can withstand, based on MIL-STD-810 testing. The Litra Pros can take a lot of punishment, typically more than any filmmaker will dish out. For me, I need lights that can live in a pocket, ride in a backpack or sit mounted on a lighting stand in the rain, and these lights will withstand all of those elements.

They stood up to my practical production abuse: dropping, water, snow, rain, general throwing around in my backpack on an airplane, and my three sons — all under 10 — tossing them around. In fact, they are waterproof down to 30 meters.

My second favorite feature is the ability to control color temperature and brightness among a group of lights simultaneously or individually through the Litra app. When purchasing the 3 Point Lighting Bundle, this makes a lot of sense. Controlling all of the lights from one app simultaneously can allow you to watch your output image on the camera without moving around the room adjusting each light.

When I first started writing this review, the Litra app was one of the most important factors. When I was at Sundance, I needed to change lighting temperatures or brightness levels without leaving my interview position. I wasn’t able to bring an external monitor, so I only had the monitor on the back of the BMPCC6K camera to judge my lighting decisions. But with the updated Litra app, I was able to quickly add the three Litra Pro lights into a group and adjust the temperature and brightness easily. I tested the app on both Android and iOS devices, and as of mid-February, they both worked.

There can be a little lag when adjusting the brightness and temperature of the lights in a group, but they quickly catch up. The Litra app also has “CTO” (Color Temperature Orange) common preset temperatures of Daylight 5600, ⅛ CTO 4900K, ¼ CTO 4500K, ½ CTO 3800K and ¾ CTO 3200K to quickly jump to the more common color temperatures. If those don’t work, you can also set your favorites. An interesting function is to flash the lights — you can set a brightness minimum/maximum, color temperature and strobe per second in Hz.

When shooting product and interview photography or videography, I like to use diffusion. As I mentioned earlier, the light comes with a rubberized diffusion cover that sits right on the light. But if you need a little more distance between the light and your diffusion to draw out the softness of the light, the Litra 3 Point Lighting Bundle includes soft boxes that snap together and snap onto the Litra Pro. At first, I was a little thrown off by the soft boxes because you have to build them and break them down if you want to travel with them — I guess I was hoping for more of a collapsible setup. But they come with a padded, zippered pouch for transport, and they lie very flat when broken down. They actually work pretty well when snapped together and are pretty durable. The soft boxes are indispensable for interviews. Without the soft boxes, it is hard to get an even light; add the rubberized diffusion and you will almost get there, but the soft boxes really spread the light nicely.

Over Christmas, I helped out at an event for The Bumblebee Foundation, which supports families with kids going through pediatric cancer treatments. They needed someone to take pictures, so I grabbed my camera and mounted one of the Litra Pro lights with a soft box onto the hot shoe of my Canon 5D Mark II with the included mount. The Litra Pro was easy to use, and it didn’t startle people like a flash might. It was a really key item to have in that environment.

I also do some product photography/videography for my wife, who sews and makes hair bows, tutus and more. I needed to light a few Girl Scout Cookie hair bows she had made, so I mounted two of the lights using the lighting stands and soft boxes and just stood one of the Litra Pros behind the products. You can see the video here.

What was interesting is that I wanted more light vertically, and because the Litra Pros have two ¼”-20 mounts (one on the bottom and one on the side), I could quickly mount the lights vertically. I never really realized how helpful mounting the Litra Pros vertically would be until I actually needed it. At the same time, I had left the lights on at 60-80% power, and after a few minutes, I felt the heat the Litra Pros can put out. It isn’t quite burning, but the Litra Pros do get hot to the touch if left on for a while… just something to keep in mind.

Summing Up
From the military-grade-feeling exterior aluminum construction to the CRI color accuracy, the Litra Pro lights are truly amazing. Whether you use them to light interviews at the 2020 Sundance Film Festival (like I did), add one to a GoPro shoot to take the load off of the sensor with a high ISO, or use them to light product photography, the Litra Pro 3 Point Lighting Bundle is worth your money. They can fit into your pocket and withstand being dropped on the ground or in water.

All in all, this is a great bundle. The Litra Pros are not cheap, but the peace of mind you get knowing they will still work if you drop them or get them wet is worth every penny. When flying to Sundance, I had no fear throwing them around. I was setting up the lighting for my interviews and noticed a water ring on the table from a glass of water. I didn’t think twice and put the Litra Pro right in the water. In fact, when I was shooting some videos for this review, I put the Litra Pros in a vase of water. At first I was nervous, but then I went for it, and they worked flawlessly.

If you are looking for super-compact lighting that is bright enough to use outdoors, light interviews indoors, film underwater, and even double as photography lighting, the Litra Pros are for you. If you are like me and need to do a lot of product videography and interview lighting quickly, the Litra Pro Premium 3 Point Lighting Bundle is where you should look.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: GoPro’s Hero 8 and GoPro Max 360 cameras

By Brady Betzel

Every year GoPro outdoes itself, giving users more reasons to either upgrade an old GoPro action camera or invest in the GoPro ecosystem for the first time. Late last year, GoPro introduced the GoPro Hero 8 (reviewed here in its Black Edition) and the GoPro Max — a.k.a. the re-imagined Fusion 360 camera.

If you aren’t sure whether you want to buy one or both of the new GoPro cameras, you should take a look at the GoPro TradeUp program, where you can send in any camera with an original retail value of $99.99 or more and receive $100 off the Hero 8 or $50 off the Hero 7. Check out the TradeUp program. That at least will take the sting off of the $399.99-or-higher price.

GoPro Hero 8 Black Edition
Up first, I’ll take a look at the GoPro Hero 8 Black Edition. If you own the Hero 7 Black Edition you will be familiar with many of the Hero 8 features. But there are some major improvements from the Hero 7 to the Hero 8 that will make you think hard about upgrading. The biggest update, in my opinion, is the increase in the max bit rate to 100 Mb/s. With that increase comes better quality video (more data = more information = more details). But GoPro also allows you to use the HEVC (H.265) codec to compress videos, which is a much improved version of the antiquated H.264. You can get into the weeds on the H.264 vs H.265 codecs over at Frame.io’s blog where they have some really great (and nerdy) info.

Anyway, with the bit rate increased, any video from the GoPro Hero 8 Black Edition has the potential to look better than video from the Hero 7. I say “potential” because if the bit rate doesn’t need to be that high, the GoPro won’t force it to be — it’s a variable bit rate codec that only uses data when it needs to.
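To make that ceiling concrete, here is a quick, purely illustrative bit of arithmetic converting bit rate ceilings into storage per minute (the 78Mb/s figure is the GoPro Max ceiling mentioned later in this review; real clips usually land below these numbers because of the variable bit rate):

```python
# Illustrative arithmetic only: variable bit rate footage usually sits below these ceilings.
def megabytes_per_minute(bit_rate_mbps: float) -> float:
    """Convert a video bit rate in megabits per second to megabytes per minute."""
    return bit_rate_mbps / 8.0 * 60.0

for label, mbps in [("100 Mb/s ceiling (Hero 8)", 100.0), ("78 Mb/s ceiling (GoPro Max)", 78.0)]:
    print(f"{label}: up to ~{megabytes_per_minute(mbps):.0f} MB per minute")
# 100 Mb/s works out to roughly 750 MB per minute vs. about 585 MB at 78 Mb/s.
```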

Beyond the increased bit rate, there are some other great features. The menu system has been improved even further than the Hero 7’s, making it easier to get shooting without digging through a bunch of advanced settings. There are presets made to set up your GoPro quickly and easily, depending on what you are filming: Standard (1080, 60, Wide), Activity (2.7K, 60, SuperView) and Cinematic (4K, 30, Linear). Live streaming has been upped from 720p to 1080p, but we are still stuck with streaming from the GoPro through your phone natively to YouTube and Facebook. You can stream to other sites like Twitch by setting up an RTMP URL — but Instagram is still off the list. This is unfortunate because, I think, live Instagram streams/stories would be its biggest feature.

Hypersmooth 2.0 is an improved version of Hypersmooth 1.0 (still on the Hero 7). You can stabilize even more if you enable the “Boost” function, which adds more stabilization but at the cost of a larger crop on your video. Hypersmooth is an incredible, gimbal-replacing stabilization that GoPro has engineered with the help of automatic horizon leveling, GPS and other telemetry data.

Coming soon are external attachments called “Mods,” which will include a Media Mod adding an HDMI output, a 3.5mm microphone jack and two cold-shoe mounts; a Display Mod adding a flip-up screen so you can see yourself from the front (think vlogging); and a Light Mod, which will add an LED to the camera for lighting. The Light Mod ($49.99) and Display Mod ($79.99) will require the Media Mod ($79.99) to be purchased as well. Keep an eye on the GoPro store for info on pre-orders and purchases.

GoPro cameras have always taken great pictures and video… when the lighting is perfect. Even with the Hero 8, low light is GoPro’s Achilles’ heel — the video starts to become muddy and hard to view. The Hero 8 shares the same camera sensor as the Hero 7 and even the same GP1 processor, but GoPro was able to squeeze more out of them with features like TimeWarp 2.0 and Hypersmooth 2.0. So you will get similar images and videos out of both cameras at a base level, but with the added technology in the Hero 8 Black Edition, if you have the extra $100 you should get the latest version. Overall, the Hero 8 Black Edition is physically thinner (albeit a little taller), the battery compartment is no longer on the bottom, and the Hero 8 now has a built-in mount! No more need for extra cages if you don’t need them!

GoPro Max
The last time I used a GoPro 360 camera, it was the Fusion. The Fusion was a hard-to-use, bulky 360 camera that required two separate memory cards. It was not my favorite camera to test; in fact, it felt like it needed another round of beta testing. Fast-forward to today and the recently released GoPro Max, which is what the Fusion should have been. While I think the Max is a marked improvement over the Fusion, it is not necessarily for everyone. If you specifically need to make 360 content, or want a new GoPro but also want to keep your options open to filming in 360 degrees, then the Max is for you.

The GoPro Max costs $499.99, so it’s not much more than a Hero 8 Black and it can shoot in 360 degrees. One of the best features is the ability to shoot like a traditional GoPro (i.e. one lens). Unfortunately, you are limited to 1080p/1440p 60/30/24fps — and maybe worst of all no slow-mo. You don’t always need to shoot in 360 with the Max, but you don’t quite get the full line-up of frame rates offered by the traditional Hero 8 Black Edition.

In addition, the bit rate of the Max maxes out (pun intended) at 78Mb/s, not the 100Mb/s of the Hero 8. But if you want to shoot in 360, the GoPro Max is pretty simple to get running. It will even capture 360 audio with its six-mic array, shoot 24fps or 30fps at 5.6K resolution for stitched spherical video and, best of all, editing the spherical video is much easier in the latest GoPro phone app update. Unfortunately, editing on a Windows computer is not as easy as on the phone.

I am using a computer with Windows 10, an Intel i7 6-core CPU, 32GB of memory and an Nvidia RTX 2080 GPU — so not a slow laptop. You can find all of the Windows software and plugins for 360 video on GoPro’s website. It seems like GoPro is trying to bury its computer software because it wasn’t easy to find these Windows apps. Nonetheless, you will want to download the GoPro Max Exporter and the Adobe GoPro FX Reframe plugin if you plan on editing your footage. Before I go any further, if you want to watch a video tutorial, I suggest Abe Kislevitz’s tutorial on the GoPro Max workflow in Adobe Premiere. He is really good at explaining the workflow. Abe works at GoPro in media production and always has great videos; check out his website while you’re at it.

Moving on: In macOS, iOS and Android, you can work with the GoPro Max’s native 360 file format via GoPro’s own apps. The 360 file format uses a Google-created mapping known as EAC, or equi-angular cubemap. Get all the nerdy bits here.

Unfortunately, in Windows you need to convert the 360 videos into the “old school” equirectangular format that we’re used to. To do this, you run your 360 videos through the GoPro Max Exporter, which can use the power of both the CPU and GPU to create new files. This extra step stinks; I can’t sugarcoat it. But with better compression and mapping structures come growing pains… I guess. GoPro is supposedly building a Windows-based reframer and exporter like the macOS version, but after trawling some GoPro forums, it seems like the company has a lot of work to do and/or has concluded that most users are going to use their phones or a macOS-based system.
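For tinkerers, ffmpeg’s v360 filter can do the same kind of remap from EAC to equirectangular. This is a hedged sketch, not GoPro’s workflow: the filenames are hypothetical, and (as I understand it) the Max’s native .360 files store the sphere across two video tracks, so the GoPro Max Exporter remains the supported path. This just shows the remapping step on a single EAC-mapped stream:

```python
import subprocess

# Hedged sketch, not GoPro's official workflow. Filenames are hypothetical, and
# GoPro Max .360 files store the sphere across two video tracks, so this only
# illustrates the EAC-to-equirectangular remap on a single-track source.
subprocess.run([
    "ffmpeg",
    "-i", "input_eac.mp4",                    # hypothetical single-track EAC source
    "-vf", "v360=input=eac:output=equirect",  # remap equi-angular cubemap to equirectangular
    "-c:v", "libx264", "-crf", "18",          # re-encode for editing or upload
    "output_equirect.mp4",
], check=True)
```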

Either way, the process to reframe your 360 video and export it as 1920×1080 or 1080×1920 goes like this: convert your 360 files to a more usable Cineform, H.265 or H.264 QuickTime; import into Adobe Premiere; create a sequence that matches your largest output size (e.g., 1920×1080); apply the GoPro FX Reframe plugin and set your output size; keyframe your rotation, edit and reframe; then export.

It’s not awful as long as you know what you are getting yourself into. But take that with a grain of salt; I am a professional video editor who works in video 10-12 hours a day, at least five days a week. I am very comfortable with this kind of stuff. It would take some more work if I weren’t an editor by trade. If I were to try to explain this to my wife or kids, they would probably roll their eyes at me and just ask me to do it… understandably. If you want to upload to a site like YouTube without editing, you will still need to convert the 360 videos to something more manageable for YouTube, like H.264 or H.265.

When working in Premiere with the GoPro Reframe plugin I actually found the effect controls overlay intuitive when panning and tilting. The hard part when keyframing panning and tilting in Premiere is adjusting the bezier curve keyframe graph. It’s not exactly intuitive, which makes it a challenge to get smooth starts and ends to your pans and tilts… even with Easy Ease In/Out set. With a little work you can get it done, but I really hope GoPro gets their Windows-based GoPro Max 360 editor up and running so I don’t have to use Adobe.

After I did a couple of keyframed pans and tilts on footage I took while riding It’s a Small World at Disneyland with my three sons, I exported a quick H.264. About 50 seconds of footage took four minutes to export. Not exactly speedy. But playing back my footage in a 1920×1080 timeline in Premiere at ¼ resolution with the GoPro plugin applied was actually pretty smooth.

You can check out the video I edited with the GoPro FX Reframe plugin in Premiere on my YouTube channel: https://youtu.be/frhJ3T8fzmE. I then wanted to take my video of the Tea Cup ride at Disneyland straight from the camera and upload it to YouTube. Unfortunately, you still need to convert the video in the GoPro Max Exporter to an H.264 or the like, but this time it only took 1:45 to export a two-minute 4K video. You can check out the 4K 360 Tea Cup ride on YouTube.

Final Thoughts
Are the GoPro Hero 8 Black Edition and GoPro Max 360 camera worth an upgrade or a new purchase? I definitely think the Hero 8 is worth it. GoPro always makes great cameras that are waterproof down to 33 feet, take great pictures when there is enough available light, create amazing hyperlapses and TimeWarps in-camera with little work, and let you fine-tune your shutter speed or shoot flat-color footage with the ProTune settings. And all of this fits in your pocket.

I used to take a DSLR to Disneyland, but with the Hypersmooth 2.0 and the improved HDR images that come out of the Hero 8, this is the only camera you will need.

The GoPro Max is a different story. I like what the GoPro Max does technically; it films 360 video easily and can double as a 1440p Hero camera. But I found myself fumbling a little with the Max because of its larger footprint when compared to the Hero 8 (it’s definitely smaller than the old Fusion 360 camera though). And when editing on your phone or doing a straight upload, the videos are relatively easy to process, as long as you don’t mind the export time. Unfortunately, using Premiere to edit these videos is a little rough; it’s better than with the Fusion but it’s not simple.

Maybe I’m just getting older and don’t want to give in to VR/360 video yet, or maybe it’s just not as smooth a process as I had hoped for. But if you want to work in 360 and don’t mind the lack of slow motion, I wouldn’t tell you not to buy the GoPro Max. It’s a great camera with durable lenses and a rugged exterior.

Check out the videos on YouTube and see what amazing people like Abe Kislevitz are doing; they may just show you what you need to see. And check out www.gopro.com for more info including when those new Hero 8 Mods will be available.


 

Sundance Videos: Editor to Editor

Our own Brady Betzel headed out to Park City this year to talk to a few editors whose films were being screened at the Sundance Film Festival.

As an editor himself, Betzel wanted to know about the all-important workflow, but also about how they got their start in the business and how important it is to find a balance between work life and personal life.

Among those he sat down with were Scare Me editor Patrick Lawrence, Boys State editors Jeff Gilbert and Connor Hall, Save Yourselves! editor Sofi Marshall, Aggie editor Gil Seltzer, Miss Juneteenth editor Courtney Ware, Black Bear editor Matthew L. Weiss, Spree editor Benjamin Moses Smith and Dinner in America writer/director/editor Adam Carter Rehmeier.

Click here to see them all.

An online editor’s first time at Sundance

By Brady Betzel

I’ve always wanted to attend the Sundance Film Festival, and my trip last month did not disappoint. Not only is it an iconic industry (and pop-culture) event, but the energy surrounding it is palpable.

Once I got to Park City and walked Main Street — with the sponsored stores (Canon and Lyft among others) and movie theaters, like the Egyptian — I started to feel an excitement and energy that I haven’t felt since I was making videos in high school and college… when there were no thoughts of limits and what I should or shouldn’t do.

A certain indescribable nervousness and love started to bubble up. Sitting in the luxurious Park City Burger King with Steve Hullfish (Art of the Cut) and Joe Herman (Cinemontage) before my second screening of Sundance 2020: Dinner in America, I was thinking how I was so lucky to be in a place that is packed with creatives. It sounds cliché and trite, but it really is reinvigorating to surround yourself with positive energy — especially if you can get caught up in cynicism like me.

It brought me back to my college classes, taught by Daniel Restuccio (another postPerspective writer), at California Lutheran University, where we would cut out pictures from magazines, draw pictures, blow up balloons, eat doughnuts and do whatever we could to get our ideas out in the open.

While Sundance occasionally felt like an amalgamation of the thirsty-hipster Coachella crowd mixed with a high school video production class (but with million-dollar budgets), it still had me excited to create. Sundance 2020 in Park City was a beautiful resurgence of ideas and discussions about how we as an artistic community can offer accessibility to everyone and anyone who wants to tell their own story on screen.

Inclusiveness Panel
After arriving in Park City, my first stop was a panel hosted by Adobe called “Empowering Every Voice in Film and the World.” Maybe it was a combination of the excitement of Sundance and the discussion about accessibility, but it really got me thinking. The panel was expertly hosted by Adobe’s Meagan Keane and included producer/director Yance Ford (Disclosure: Trans Lives on Screen; Oscar-nominated for Strong Island), editor Eileen Meyer (Crip Camp), editor Stacy Goldate (Disclosure: Trans Lives on Screen) and director Crystal Kayiza (See You Next Time).

I walked away feeling inspired and driven to increase my efforts in accessibility. Eileen said one of her biggest opportunities came from the Karen Schmeer Film Editing Fellowship, a year-long fellowship for emerging documentary editors.

Yance drove home the idea of inclusivity and re-emphasized the importance of access to equipment. But it’s not simply about access — you also have to make a great story and figure out things like distribution. I was really struck by all the speakers onstage, but Yance really spoke to me. He feels like the voice we need representing marginalized groups, and I want to see more content from these creatives. The more content we see, the better.

Crystal spoke about the community needing to tell stories that don’t necessarily have standard plot points and stakes. The idea is to encourage people to create their stories, and for those in power to help and support those stories and trust the filmmakers, regardless of whether they identify with the ideas and themes.

Rebuilding Paradise

Screenings
One screening I attended was Rebuilding Paradise, directed by Ron Howard. He was at the premiere, along with some of the people who lost everything in the Paradise, California fires. In the first half of November 2018, there were several fires that raged out of control in California. One surrounded the city of Simi Valley and worked its way toward the Pacific Coast. (It was way too close for my comfort in Simi Valley. We eventually evacuated but were fine.)

Another fire was in the town of Paradise, which burned almost the entire city to the ground. Watching Rebuilding Paradise filled me with great sadness for those who lost family members and their homes. Some of the “found footage” was absolutely breathtaking. One shot in particular was of a father racing out of what appeared to be hell, surrounded by flames, in his car with his child asking if they were going to die. Absolutely incredible and heart-wrenching.

Dinner in America

Another film I saw was Dinner in America, as referenced earlier in this piece. I love a good dark comedy/drama, so when I got a ticket to Adam Carter Rehmeier’s Dinner in America I was all geared up. Little did I know it would start off with a disgruntled 20-something throwing a chair through a window and lighting the front sidewalk on fire. Kudos to composer John Swihart, who took a pretty awesome opening credit montage and dropped the heat with his soundtrack.

Dinner in America is a mid-‘90s Napoleon Dynamite cross-pollinated with the song “F*** Authority” by Pennywise. Coincidentally, Swihart composed the soundtrack for Napoleon Dynamite. Seriously, the soundtrack to Dinner in America is worth the ticket price alone, in my opinion. It adds so much to one of the main characters’ attitude. The parallel editing mixed with the fierce anti-authoritarian love story, lived by Kyle Gallner and Emily Skeggs, makes for a movie you probably won’t forget.

Adam Rehmeier

During the Q&A at the end, writer, director and editor Rehmeier described how he essentially combined two ideas that led to Dinner in America. As I watched the first 20 minutes, it felt like two separate movies, but once it came together it really paid off. Much like the cult phenomenon Napoleon Dynamite, Dinner in America will resonate with a wide audience. It’s worth watching when it comes to a theater (or streaming platform) near you. In the meantime, check out my video interview with him.

Adobe Productions
During Sundance, Adobe announced an upcoming feature for Premiere called “Productions.” While in Park City, I got a small demo of the new Productions at Adobe’s Sundance Production House. It took about 15 minutes before I realized that Adobe has added the one feature that has set Avid Media Composer apart for over 20 years — bin locking. Heads up, Avid: Adobe is about to release a multi-user workflow that is much easier to understand and use than in previous iterations of Premiere.

The only thing that caught me off guard was the nomenclature — Productions and Projects. Productions is the title, but really a “Production” is a project, and what they call a “project” is a bin. If you’re familiar with Media Composer, you can create a project and inside have folders and bins. Bins are what house media links, sequences, graphics and everything else. In the new Productions update, a “Production” will house all of your “Projects” (i.e. a Project with bins).

Additionally, you will be able to lock “Projects.” This means that in a multi-user environment (which can be something like a SAN or even an Avid Nexis), a project and media can live on the shared server and be accessed by multiple users. These users can be named and identified inside of the Premiere Preferences. And much like Blackmagic’s DaVinci Resolve, you can update the “projects” when you want to — individually or all projects at once. On its face, Productions looks like the answer to what every editor has said is one of the only reasons Avid is still such a powerhouse in “Hollywood” — the ability to work relatively flawlessly among tons of editors simultaneously. If what I saw works the way it should, Adobe is looking to take a piece of the multi-user environment pie Avid has controlled for so long.

Summing Up
In the end, the Sundance Film Festival 2020 in Park City was likely a once-in-a-lifetime experience for me. From seeing celebrities, meeting other journalists, getting some free beanies and hand warmers (it was definitely not 70 degrees like California), to attending parties hosted by Canon and Light Iron — Sundance can really reinvigorate your filmmaking energy.

It’s hard to keep going when you get burnt out by just how hard it is to succeed and break through the barriers in film and multimedia creation. But seeing indie films and meeting like-minded creatives, you can get excited to create your own story. And you realize that there are good people out there, and sometimes you just have to fly to Utah to find them.

Walking down Main Street, I found a coffee shop named Atticus Coffee and Tea House. My oldest son’s name is Atticus, so I naturally had to stop in and get him something. I ended up getting him a hat and myself a coffee. It was good. But what I really did was sit out front pretending to shoot B-roll and eavesdropping on some conversations. It really is true that being around thoughtful energy is contagious. And while some parts of Sundance feel like a hipster popularity contest, there are others who are there to improve and absorb culture from all around.

The 2020 Sundance Film Festival’s theme in my eyes was to uplift other people’s stories. As Harper Lee wrote in “To Kill a Mockingbird” when Atticus Finch is talking with Scout: “First of all, if you learn a simple trick, Scout, you’ll get along a lot better with all kinds of folks. You never really understand a person until you consider things from his point of view . . . until you climb into his skin and walk around in it.”


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Neat Video 5 noise reduction plugin

By Brady Betzel

One of the best (and most underrated) tricks in an online editor’s tool kit is to have good image restoration techniques. Removing digital video imperfections — from flicker to digital video noise — is not easy, and not easy to do well. That is, unless you have good noise reduction software like Neat Video.

While Neat Video might not be that well-known, once you see how simply (or intricately) Neat Video 5 works inside of apps like Blackmagic’s DaVinci Resolve, it will be hard to forget the company’s name.

(While the software was recently updated to 5.1.5 — with expanded GPU support as well as support for new versions of Resolve, Adobe and Nuke — nothing really changes for this review. You can check out a detailed list of the updates here.)

Neat Video 5 is a noise reduction plugin. In a Windows OS environment, Neat Video is compatible with apps like Adobe After Effects, Adobe Premiere Pro, DaVinci Resolve, Avid Media Composer, Vegas, Magix, Edius, Virtual Dub, and the OFX-compatible apps Nuke, Fusion, Scratch, HitFilm, Mamba, Natron, Flame, Baselight and DustBuster. In a macOS environment, Neat Video 5 is compatible with After Effects, Premiere, Final Cut Pro X, Motion 5, OFX, Resolve and Media Composer. In Linux, the software is compatible with OFX-compatible apps and Resolve.

Neat Video 5 comes in three flavors: Demo, Home and Pro. The Demo version works in up to 1280×720 resolution with a watermark. Home is literally made for the home user: It will process video up to 1920×1080 resolutions, it will use up to one GPU, and it is for non-commercial use. The cost is just $74.90 for most apps (Resolve is $89.90). The Pro version has no resolution restrictions, will work on two or more GPUs simultaneously, and can be used commercially. The Pro version starts at $129.90 per app ($159.90 for Resolve). Because Neat Video 5 for OFX works with so many apps, it only comes in Pro ($249.90) and Studio ($349.90) versions. The Studio version adds the ability for a floating license. You can see all of the pricing details here.

If there is one line you should take away from this review, it is this: Neat Video 5 is by far the easiest and best noise reduction software I have used in any application to date. And while this review is focusing on the Resolve version of Neat Video 5, all other apps work in much the same way. You can find Neat Video’s software-specific Quick Start Guides to help. Once you install and register your Neat Video 5 license, removing digital video noise is as easy as applying Neat Video 5 to a node in the color tab, clicking on “Prepare Noise Profile,” clicking on “Auto Profile,” and clicking “Apply.” Then, unless you want to fine-tune your noise reduction, you are done. Obviously, I have somewhat simplified how Neat Video 5 works, but essentially it can be done in as little as three steps per clip, and the results are typically amazing. If they aren’t amazing, you can jump back into Neat Video 5 and manually adjust specifics until the noise reduction looks correct. But I will say that in about 90% of cases, the Auto Profiling will do all of the noise reduction work necessary.

For tinkerers, or for those who need to go far beyond an Auto Profile, you can manually adjust your settings. But taking a step back, Neat Video needs an area of your image that has a uniform color and noise profile to process how it removes noise. The automatic profiling will do its best to find an area, but it doesn’t always work. What you need to keep in mind when building a good noise profile inside of Neat Video is that the area being processed needs to be as uniform as possible (think dark night sky or a wall painted in one color) — meaning no prominent features, a high noise level (something in the high four area is better), the largest possible sample area and no warnings from Neat Video.

So, if your automatic profile doesn’t do the job, you can find an area of your image that meets the above requirements and then build a profile. From there you can use one of the Neat Video 5 features, like “Profile Check.” Profile Check will highlight details that aren’t being affected by Neat Video, giving you a clear representation of what noise is being reduced and whether you need to adjust your profile to better reduce video noise.

At this point you might be wondering where you tweak advanced settings. When you load Neat Video, you will be in Beginner mode. To get into Advanced mode, go to the Tools menu, where you will see a lot of advanced functions that can help you fine-tune your noise profile. And if you still can’t get a good noise reduction profile, you can try out the “Generic Profile,” which can help you build a profile even if your video doesn’t have a large enough area of uniform noise. There are also presets — such as light flicker, moire flicker, repeated frame issues, dust and scratch filters (including scan lines), jitter of details, artifact removal and more — that can solve specific problems.

Neat Video 5 is faster than previous generations. As in previous versions, there is even a tool inside the Neat Video preferences that will run your CPU and GPU through a benchmark to determine whether you should run on the CPU only, the GPU only or a combination of both. In Neat Video 5, if you have trouble with a clip, you can use up to four “Variants” of noise reduction in the new playback window to see how each profile works with your clip.

In terms of playback and rendering, noise reduction is never fast. However, inside of Neat Video the new playback window will typically play back your footage to preview the noise reduction before you jump back into Resolve. Inside of Resolve, even in just 1080p, my sequence would crawl to just a few frames of playback per second. It is one of the most processor- and GPU-intensive tasks you will run on your computer.

In my testing I applied Neat Video 5 to the first node in my color correction tree, followed by a basic color correction in a one-minute timeline. I took those same clips and compared my Neat Video results to Resolve’s Temporal and Spatial noise reduction tools. In terms of visual results, Neat Video 5 was superior. If that’s not the case for you, then jump into YCbCr viewer mode inside of Neat Video 5, isolate each channel and tweak each channel individually so you won’t affect your overall noise reduction if it isn’t necessary. Not only did Neat Video 5 handle normal noise in the shadows well but on clips with very tight lines, it was able to keep a lot of the details while removing the noise. Resolve’s noise reduction tools had a harder time removing noise but keeping detail. Temporal noise reduction really didn’t do much, and while Spatial noise reduction did work it would heavily blur and distort the image — essentially not acceptable.

To get a good example of how Neat Video 5 slams a computer system, I exported 1080p MP4s. Resolve’s built-in Temporal noise reduction took 1:03, while the Spatial noise reduction took 1:05. The Neat Video 5 render of the same one-minute timeline took 3:51 — almost four times as long. I was curious how much longer a 4K render would take. Using 4K (UHD) media, I applied a simple color correction and, on a previous serial node, applied Neat Video 5. I exported a 4K (UHD) MP4, which took 52 seconds without Neat Video 5 applied and 16:27 with Neat Video applied — nearly 19 times the render time! So while Neat Video 5 is an amazing tool, there is a trade-off in high render times.
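For anyone who wants to sanity-check those multipliers, here is the quick math on the render times quoted above:

```python
# Quick math on the render times quoted above (mm:ss strings converted to seconds).
def to_seconds(timecode: str) -> int:
    minutes, seconds = timecode.split(":")
    return int(minutes) * 60 + int(seconds)

tests = {
    "1080p (Resolve Temporal NR vs. Neat Video 5)": ("1:03", "3:51"),
    "4K UHD (color only vs. color + Neat Video 5)": ("0:52", "16:27"),
}
for label, (baseline, with_neat) in tests.items():
    factor = to_seconds(with_neat) / to_seconds(baseline)
    print(f"{label}: {factor:.1f}x longer")
# Roughly 3.7x longer for the 1080p test and about 19x longer for the 4K test.
```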

To find additional training on more advanced noise reduction techniques in Neat Video, check out the video tutorials. I find myself watching these just because of how much you can learn about noise reduction in general. They aren’t as exciting as watching Game of Thrones or The Handmaid’s Tale, but they will push your knowledge in noise reduction to the next level.

Summing Up
I’ve used Neat Video for a while, so when I was approached to review Version 5, I immediately said yes. Noise reduction is a post skill that not many possess.

If you are an online editor or colorist looking to separate yourself from the pack, learn all the noise reduction techniques you can and definitely check out Neat Video 5. Not only can Neat Video 5 work automatically, but you can fine-tune your noise reduction as much as you want.

And when demoing your color correction services, think about using Neat Video 5 to remove camera noise, flickering and chroma issues; color correcting your footage; and, finally, adding some grain back into your shot. Not only will your footage look better, but you’ll have a technical workflow that will definitely impress clients. Just don’t forget to account for the extra render time.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: FilmConvert Nitrate for film stock emulation

By Brady Betzel

If you’ve been around any sort of color grading forums or conferences, you’ve definitely heard some version of this: Film is so much better than digital. While I don’t completely disagree with the sentiment, let’s be real. We are in a digital age, and the efficiency and cost savings associated with digital recording are, in most cases, far superior to film.

Personally, I love the way film looks; it has an essence that is very difficult to duplicate — from the highlight roll-offs to the organic grain — but it is very costly. That is why film is hard to imitate digitally, and that is why so many companies try and often fail.

Sony A7iii footage

One company that has had grassroots success with digital film stock emulation is FilmConvert. The original plugin, known as FilmConvert Pro, works with Adobe’s Premiere and After Effects, Avid Media Composer and as an OFX plugin for apps like Blackmagic’s DaVinci Resolve.

Recently, FilmConvert expanded its lineup with the introduction of Nitrate, a film emulation plugin that can take Log-based video and transform it into fully color-corrected media with a natural grain similar to that of commonly loved film stocks. Currently, Nitrate works with Premiere and After Effects, with an OFX version for Resolve. A plugin for FCPX is coming in March.

The original FilmConvert Pro plugin works great, but it adjusts your image through an sRGB pipeline. That means FilmConvert Pro applies any color effects after your “base” grade is locked in, while living in an sRGB world. While you can download camera-specific “packs” that apply the film emulation — custom-made for your sensor and color space — you are still locked into an sRGB pipeline, with little wiggle room. This means sometimes blowing out your highlights and muddying your shadows, with little ability to recover any details.

Sony A7iii footage

I imagine FilmConvert Pro was introduced at a time when a lot of users shot with cameras like Canon 5D or other sRGB cameras that weren’t shooting in a Log color space. Think of using a LUT and trying to adjust the highlights and shadows after the LUT; typically, you will have a hard time getting any detail back, losing dynamic range even if your footage was shot Log. But if you color before a LUT (think Log footage), you can typically recover a lot of information as long as your shot was recorded properly. That blown-out sky might be able to be recovered if shot in a Log colorspace. This is what FilmConvert is solving with its latest offering, Nitrate.
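Here is a tiny, generic illustration of why the order of operations matters. This is not FilmConvert’s actual math; it just shows that two bright values that differ in the log source become identical once a display transform clips them, so grading after the transform can’t recover the separation, while grading before it can:

```python
import numpy as np

# Generic illustration only, not FilmConvert's pipeline: two distinct highlight
# values from a log-encoded source.
log_values = np.array([0.90, 0.98])

def display_transform(x):
    # Stand-in for a LUT/display transform that expands highlights and clips at 1.0.
    return np.clip(x * 1.3, 0.0, 1.0)

# Grading AFTER the transform: both values were already clipped to 1.0,
# so pulling them down just darkens them identically. The detail is gone.
graded_after = display_transform(log_values) * 0.8

# Grading BEFORE the transform (i.e., on the log values): the two stay distinct.
graded_before = display_transform(log_values * 0.8)

print("graded after the transform: ", graded_after)   # [0.8   0.8  ]
print("graded before the transform:", graded_before)  # [0.936 1.0  ]
```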

How It Works
FilmConvert’s Nitrate works in a Cineon Log processing pipeline for its emulation, as well as a full Log image-processing pipeline. This means your highlights and shadows are not being heavily compressed into an sRGB color space, which allows you to fine-tune them without losing as much detail. Simply put, the plugin works more naturally with your footage.

In additional updates, FilmConvert has overhauled its GUI to be more natural and fluid. The Color Wheels have been redesigned, a new color tint slider has been added to quickly remove any green or magenta color cast, a new Color Curve control has been added, and there is now a Grain Response curve.

Grain Response

The Grain Response curve takes adding grain to your footage up a notch. Not only can you select between 8mm and 35mm grain sizes (with many more in between) but you can adjust the application of that grain from shadows to highlights. If you want your highlights to have more grain response, just point the Grain Response curve higher up. In the same window you can adjust the grain size, softness, strength and saturation via sliders.

Among the 19 film emulation options, there are many unique and great-looking presets. From the “KD 5207 Vis3” to the “Plrd 600,” multiple brands and film stocks are offered. For instance, the “Kodak 5207 Vis3” is described on Kodak’s website in more detail:

“Vision3 250D Film offers outstanding performance in the extremes of exposure — including increased highlight latitude, so you can move faster on the set and pull more detail out of the highlights in post. You’ll also see reduced grain in shadows, so you can push the boundaries of underexposure and still get outstanding results.”

One of my favorite emulations in Nitrate — “Fj Velvia 100” or Fujichrome Velvia 100 — is described on FilmConvert’s website:

“FJ Velvia 100 is based on the Fujichrome Velvia 100 photographic film stock. Velvia is a daylight-balanced color reversal film that provides brighter ultra-high-saturation color reproduction. The Velvia is especially suited to scenery and nature photography as well as other subjects that require precisely modulated vibrant color reproduction.”

Accurate Grain

FilmConvert’s website offers a full list of the 19 film stocks, as well as examples and detailed descriptions of each film stock.

Working With FilmConvert Nitrate
I used Nitrate strictly in Premiere Pro because the OFX version (specifically for Resolve) wasn’t available at the time of this review.

Nitrate works pretty well inside of Premiere, and surprisingly plays back fluidly — this is probably thanks to its GPU acceleration. Even with Sony a7 III UHD footage, Premiere was able to keep up with Lumetri Color layered underneath the FilmConvert Nitrate plugin. To be transparent I tested Nitrate on a laptop with an Intel i7 CPU and an Nvidia RTX 2080 GPU, so that definitely helps.

At first, I struggled to see where I would fit FilmConvert’s Nitrate plugin into my normal workflow so I could color correct my own footage and add a grade later. However, when I started cycling through the different film emulations, I quickly saw that they were adding a lot of life to the images and videos. Whether it was the grain that comes from the updated 6K grain scans in Nitrate or the ability to identify which camera and color profile you used when filming via the downloadable camera packs, FilmConvert’s Nitrate takes well-colored footage and elevates it to finished film levels.

It’s pretty remarkable; I came in thinking FilmConvert was essentially a preset LUT plugin and wasn’t expecting it to be great. To my surprise, it was, and it will quickly and easily add an extra edge of professional polish to your footage.

Test 1
In my first test, I threw some clips I had shot on a Sony a7 III in UHD (at SLog3/SGamut3) into a timeline, applied the FilmConvert Nitrate plugin and realized I needed to download the Sony camera pack. This pack was about 1GB, but others — like the Canon 5D Mark II pack — came in at just over 300MB. Not the end of the world, but if you have multiple cameras, you are going to need to download quite a few packs, and the download size will add up.

Canon 5D

I tried using just the Nitrate plugin to do color correction and film emulation from start to finish, but I found the tools a little cumbersome and not really my style. I am not the biggest fan of Lumetri color correction tools, but I used those to get a base grade and apply Nitrate over that grade. I tend to like to keep looks to their own layer, so coloring under Nitrate was a little more natural to me.

A quick way to cycle through a bunch of looks is to apply Nitrate to an adjustment layer and hit the up or down arrows. As I was flicking through the different looks, I noticed that FilmConvert does a great job processing the film emulations for the specified camera. All of the emulations looked good with or without a color balance done ahead of time.

It’s like adding a LUT and then a grade all in one spot. I was impressed by how quickly this worked and how good they all looked. When I was done, I rendered my one-minute sequence out of Adobe Media Encoder, which took 45 seconds to encode a ProResHQ and 57 seconds for an H.264 at 10Mb/s. For reference, the uncolored version of this sequence took 1:17 for the ProResHQ and :56 for the H.264 at 10Mb/s. Interesting, because the Nvidia RTX 2080 GPU definitely kicked in more when the FilmConvert Nitrate effect was added. That’s a definite plus.

Test 2
I also shot some clips using the Blackmagic Pocket Cinema Camera (BMPCC) and the Canon 5D Mark II. With the BMPCC, I recorded CinemaDNG files in the film color space, essentially Log. With the 5D, the videos were recorded as H.264 QuickTime Movie (MOV) files (unless you shoot with the Magic Lantern hack, which allows you to record in raw). I brought in the BMPCC CinemaDNG files via the Media Browser, imported the 5D MOVs and applied the FilmConvert Nitrate plugin to the clips. Keep in mind you will need to download and install those camera packs if you haven't already.

Pocket Cinema Camera

For the BMPCC clips I identified the camera and model as appropriate and chose “Film” under profile. It seemed to turn my CinemaDNG files a bit too orange, which could have been my white balance settings and/or the CinemaDNG processing done by Premiere. I could swing the orange hue out by using the temperature control, but it seemed odd to have to knock it down to -40 or -50 for each clip. Maybe it was a fluke, but with some experimentation I got it right.

With the Canon 5D Mark II footage, I chose the corresponding manufacturer and model as well as the “Standard” profile. This worked as it should. But I also noticed some other options like Prolost, Marvel, VisionTech, Technicolor, Flaat and Vision Color — these are essentially color profiles people have made for the 5D Mark II. You can find them with a quick Google search.

Summing Up
In the end, FilmConvert’s Nitrate will elevate your footage. The grain looks smooth and natural, the colors in the film emulation add a modern take on nostalgic color corrections (that don’t look too cheesy), and most cameras are supported via downloads. If you don’t have a large budget for a color grading session you should be throwing $139 at FilmConvert for its Nitrate plugin.

Nitrate in Premiere

When testing Nitrate on a few different cameras, I noticed that it even made color matching between cameras a little bit more consistent. Even if you have a budget for color grading, I would still suggest buying Nitrate; it can be a great starting block to send to your colorist for inspiration.

Check out FilmConvert’s website and definitely follow them on Instagram, where they are very active and show a lot of before-and-afters from their users  — another great source of inspiration.

Main Image: Two-year-old Oliver Betzel shot with a Canon 5D with KD P400 Ptra emulsion applied.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: The Sensel Morph hardware interface

By Brady Betzel

As an online editor and colorist, I have tried a lot of hardware interfaces designed for apps like Adobe Premiere, Avid Media Composer, Blackmagic DaVinci Resolve and others. With the exception of professional color correction surfaces like the FilmLight Baselight, the Resolve Advanced Panel and Tangent’s Element color correction panels, it’s hard to get exactly what I need.

While they typically work well, there is always a drawback for my workflow; usually they are missing one key shortcut or feature. Enter Sensel Morph, a self-proclaimed morphable hardware interface. In reality, it is a pressure-sensitive trackpad that uses individually purchasable magnetic rubber overlays and keys for a variety of creative applications. It can also be used as a pressure-sensitive trackpad without any overlays.

For example, inside of the Sensel app you can identify the Morph as a trackpad and click “Send Map to Morph,” and it will turn itself into a large trackpad. If you are a digital painter, you can turn the Morph into “Paintbrush Area” and use a brush and/or your fingers to paint! Once you understand how to enable the different mappings you can quickly and easily Morph between settings.

For this review, I am going to focus on how you can use the Sensel Morph with Adobe Premiere Pro. For the record, you can actually use it with any NLE by creating your own map inside of the Sensel app. The Morph essentially works with keyboard shortcuts for NLEs. With that in mind, if you customize your keyboard shortcuts you are going to want to enable the default mapping inside of Premiere or adjust your settings to match the Sensel Morph’s settings.
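Since the Morph is effectively firing keyboard shortcuts, you can reason about custom layouts as a simple lookup problem. Below is a hypothetical sketch of checking an overlay's key map against a remapped shortcut set; the button names, keystrokes and map format are invented for illustration and are not the Sensel app's actual data.

```python
# Hypothetical sketch: reconcile an overlay's button map with a custom NLE keyboard
# layout. Button names and shortcuts below are assumptions for illustration only.

MORPH_BUTTONS = {            # overlay button -> keystroke it sends
    "Play/Stop": "space",
    "Ripple Left": "q",
    "Ripple Right": "w",
    "Cut": "ctrl+k",
    "Timeline": "shift+3",
}

def find_conflicts(morph_map: dict[str, str], custom_shortcuts: dict[str, str]) -> list[str]:
    """Return Morph buttons whose keystroke is no longer assigned in the NLE."""
    assigned = set(custom_shortcuts.values())
    return [button for button, key in morph_map.items() if key not in assigned]

# If you've remapped your keyboard, anything returned here needs either the default
# layout restored or the Morph map edited in the Sensel app.
custom = {"Play": "space", "Add Edit": "ctrl+k", "Ripple Trim Previous": "q"}
print(find_conflicts(MORPH_BUTTONS, custom))   # -> ['Ripple Right', 'Timeline']
```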

Before you plug in your Morph, you will need to click over to https://sensel.com/pages/support, where you can get a quick-start guide in addition to the Sensel app you will need to install before you get working. After it’s downloaded and installed, you will want to plug in the Morph via the USB and let it charge before using the Bluetooth connection. It took a while for the Morph to fully charge, about two hours, but once I installed the Sensel app, added the Video Editing Overlay and opened Adobe Premiere, I was up and working.

To be honest, I was a little dubious about the Sensel Morph. A lot of these hardware interfaces have come across my desk, and they usually have poor software implementation, or the hardware just doesn’t hold up. But the Sensel Morph broke through my preconceived ideas of hardware controllers for NLEs like Premiere, and for the first time in a long time, I was inspired to use Premiere more often.

It’s no secret that I learned professional editing in Avid Media Composer and Symphony. And most NLEs can’t quite rise to the level of professional experience that I have experienced in Symphony. One of those experiences is how well and fluid the keyboard and Wacom tablet work together. The first time I plugged in the Sensel Morph, overlayed the Video Editing Overlay on top of the Morph and opened Premiere, I began to have that same feeling but inside of Premiere!

While there are still things Premiere has issues with, the Sensel Morph really got me feeling good about how well this Adobe NLE worked. And to be honest, some of those issues relate to me not learning Premiere’s keyboard shortcuts like I did in Avid. The Sensel Morph felt like a natural addition to my Premiere editing workflow. It was the first time I started to feel that “flow state” inside of Premiere that I previously got into when using Media Composer or Symphony, and I started trimming and editing like a mad man. It was kind of shocking to me.

You may be thinking that I am blowing this out of proportion, and maybe I am, a little, but the Morph immediately improved my lazy Premiere editing. In fact, I told someone that Adobe should package these with first-time Premiere users.

I really like the way the timeline navigation works (much like the touch bar). I also like the quick Ripple Left/Right commands, and I like how you can quickly switch timelines by pressing the "Timeline" button multiple times to cycle through them. I did feel like I needed a mouse and keyboard some of the time, but for about 60% of the time I could edit without them. Much like how I had to force myself to use a Wacom tablet for editing, if you try not to use a mouse I think you will get by just fine. I did try to use a Wacom stylus with the Sensel Morph and, unfortunately, it did not work.

What improvements could the Sensel Morph make? Specifically in Premiere, I wish they had a full-screen shortcut (“`”) labeled on the Morph. It’s one of those shortcuts I use all the time, whether I want to see my timeline full screen, the effects controls full screen or the Program feed full screen. And while I know I could program it using the Sensel app, the OCD in me wants to see that reflected onto the keys. While we are on the keys subject, or overlay, I do find it a little hard to use when I customize the key presses. Maybe ordering a custom printed overlay could assuage this concern.

One thing I found odd was the GPU usage that the Sensel app needed. My laptop’s fans were kicking on, so I opened up Task Manager and saw that the Sensel app was taking 30% of my Nvidia RTX 2080. Luckily, you really only need it open when changing overlays or turning it into a trackpad, but I found myself leaving it open by accident, which could really hurt performance.

Summing Up
In the end, is the Sensel Morph really worth the $249? It does come with one free overlay of your choice with the $249 purchase price, along with a one-year warranty; but if you want more overlays those will set you back from $35 to $59 depending on the overlay.

The Video Editing overlay is $35, while the new Buchla Thunder overlay is $59. From traditional Keyboard, Piano Key and Music Production overlays to a Drum Pad overlay, there are a few different options to choose from. If you are a one-person band that goes between Premiere and apps like Ableton, then it's 100 percent worth it. If you use Premiere a lot, I still think it is worth it. The iPad Mini-like size and weight are really nice, and when using it over Bluetooth you feel untethered. Its sleek and thin design really allows you to bring this morphable hardware interface with you anywhere you take your laptop or tablet.

The Sensel Morph is not like any of the other hardware interfaces I have used. Not only is it extremely mobile, but it works well and is compatible with a lot of content creation apps that pros use daily. They really delivered on this one.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Lenovo Yoga A940 all-in-one workstation

By Brady Betzel

While more and more creators are looking for alternatives to the iMac, iMac Pro and Mac Pro, there are few options with high-quality built-in monitors: the Microsoft Surface Studio, HP Envy and Dell 7000 among them. There are even fewer choices if you want touch and pen capabilities. It's with that need in mind that I decided to review the Lenovo Yoga A940, a 27-inch, UHD, pen- and touch-capable Intel Core i7 computer with an AMD Radeon RX 560 GPU.

While I haven’t done a lot of all-in-one system reviews like the Yoga A940, I have had my eyes on the Microsoft Surface Studio 2 for a long time. The only problem is the hefty price tag of around $3,500. The Lenovo’s most appealing feature — in addition to the tech specs I will go over — is its price point: It’s available from $2,200 and up. (I saw Best Buy selling a similar system to the one I reviewed for around $2,299. The insides of the Yoga and the Surface Studio 2 aren’t that far off from each other either, at least not enough to make up for the $1,300 disparity.)

Here are the parts inside the Lenovo Yoga A940:
– Intel Core i7-8700 3.2GHz processor (up to 4.6GHz with Turbo Boost), six cores (12 threads) and 12MB cache
– 27-inch 4K UHD IPS multitouch display with 100% Adobe RGB coverage
– 16GB DDR4 2666MHz (SODIMM) memory
– 1TB 5400 RPM drive plus 256GB PCIe SSD
– AMD Radeon RX 560 4GB graphics processor
– 25-degree monitor tilt angle
– Dolby Atmos speakers
– Dimensions: 25 inches by 18.3 inches by 9.6 inches; weight: 32.2 pounds
– 802.11AC and Bluetooth 4.2 connectivity
– Side panel inputs: Intel Thunderbolt, USB 3.1, 3-in-1 card reader and audio jack
– Rear panel inputs: AC-in, RJ45, HDMI and four USB 3.0
– Bluetooth active pen (appears to be the Lenovo Active Pen 2)
– QI wireless charging technology platform

Digging In
Right off the bat, I just happened to put my Android Galaxy phone on the odd little flat platform located on the right side of the all-in-one workstation, just under the monitor, and I saw my phone begin to charge wirelessly. QI wireless charging is an amazing little addition to the Yoga; it really comes through in a pinch when I need my phone charged and don’t have the cable or charging dock around.

Other than that nifty feature, why would you choose a Lenovo Yoga A940 over any other all-in-one system? Well, as mentioned, the price point is very attractive, but you are also getting a near-professional-level system in a very tiny footprint — including Thunderbolt 3 and USB connections, HDMI port, network port and SD card reader. While it would be incredible to have an Intel i9 processor inside of the Yoga, the i7 clocks in at 3.2GHz with six cores. Not a beast, but it's enough to get the job done inside of Adobe Premiere and Blackmagic's DaVinci Resolve, though maybe with transcoded files instead of Red raw or the like.

The Lenovo Yoga A940 is outfitted with a front-facing Dolby Atmos audio speaker as well as Dolby Vision technology in the IPS display. The audio could use a little more low end, but it is good. The monitor is surprisingly great — the whites are white and the blacks are black; something not everyone can get right. It has 100% Adobe RGB color coverage and is Pantone-validated. The HDR is technically Dolby Vision and looks great at about 350 nits (not the brightest, but it won’t burn your eyes out either). The Lenovo BT active pen works well. I use Wacom tablets and laptop tablets daily, so this pen had a lot to live up to. While I still prefer the Wacom pen, the Lenovo pen, with 4,096 levels of sensitivity, will do just fine. I actually found myself using the touchscreen with my fingers way more than the pen.

One feature that sets the A940 apart from the other all-in-one machines is the USB Content Creation dial. With the little time I had with the system, I only used it to adjust speaker volume when playing Spotify, but in time I can see myself customizing the dials to work in Premiere and Resolve. The dial has good action and resistance. To customize the dial, you can jump into the Lenovo Dial Customization Assistant.

Besides the Intel i7, there is an AMD Radeon RX 560 with 4GB of memory, two 3W and two 5W speakers, 32 GB of DDR4 2666 MHz memory, a 1 TB 5400 RPM hard drive for storage, and a 256GB PCIe SSD. I wish the 1TB drive was also an SSD, but obviously Lenovo has to keep that price point somehow.

Real-World Testing
I use Premiere Pro, After Effects and Resolve all the time and can understand the horsepower of a machine through these apps. Whether editing and/or color correcting, the Lenovo A940 is a good medium ground — it won’t be running much more than 4K Red raw footage in real time without cutting the debayering quality down to half if not one-eighth. This system would make a good “offline” edit system, where you transcode your high-res media to a mezzanine codec like DNxHR or ProRes for your editing and then up-res your footage back to the highest resolution you have. Or, if you are in Resolve, maybe you could use optimized media for 80% of the workflow until you color. You will really want a system with a higher-end GPU if you want to fluidly cut and color in Premiere and Resolve. That being said, you can make it work with some debayer tweaking and/or transcoding.
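If you go the transcode route, the proxy pass is easy to batch. Here is a rough sketch that shells out to ffmpeg from Python, assuming ffmpeg is installed and on your PATH and that your originals are in a codec it can decode (Red raw generally needs RedCine-X or Resolve for this step); the folder names are placeholders.

```python
import subprocess
from pathlib import Path

SOURCE_DIR = Path("camera_originals")   # placeholder paths
PROXY_DIR = Path("proxies")
PROXY_DIR.mkdir(exist_ok=True)

for clip in SOURCE_DIR.glob("*.mov"):
    out = PROXY_DIR / f"{clip.stem}_proxy.mov"
    # ProRes 422 HQ at UHD; swap in a DNxHR encode if that's your house mezzanine.
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "prores_ks", "-profile:v", "3",   # profile 3 = ProRes 422 HQ
        "-vf", "scale=3840:2160",
        "-c:a", "copy",
        str(out),
    ], check=True)
```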

In my testing I downloaded some footage from Red’s sample library, which you can find here. I also used some BRAW clips to test inside of Resolve, which can be downloaded here. I grabbed 4K, 6K, and 8K Red raw R3D files and the UHD-sized Blackmagic raw (BRAW) files to test with.

Adobe Premiere
Using the same Red clips as above, I created two one-minute-long UHD (3840×2160) sequences. I also clicked "Set to Frame Size" for all the clips. Sequence 1 contained these clips with a simple contrast, brightness and color cast applied. Sequence 2 contained these same clips with the same color correction applied, but also a 110% resize, 100 sharpen and 20 Gaussian Blur. I then exported them to various codecs via Adobe Media Encoder using OpenCL for processing. Here are my results:

QuickTime (.mov) H.264, No Audio, UHD, 23.98 Maximum Render Quality, 10 Mb/s:
Color Correction Only: 24:07
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 26:11
DNxHR HQX 10 bit UHD
Color Correction Only: 25:42
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 27:03

ProRes HQ
Color Correction Only: 24:48
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 25:34

As you can see, the export time is pretty long. And let me tell you, once the sequence with the Gaussian Blur and Resize kicked in, so did the fans. While it wasn’t like a jet was taking off, the sound of the fans definitely made me and my wife take a glance at the system. It was also throwing some heat out the back. Because of the way Premiere works, it relies heavily on the CPU over GPU. Not that it doesn’t embrace the GPU, but, as you will see later, Resolve takes more advantage of the GPUs. Either way, Premiere really taxed the Lenovo A940 when using 4K, 6K and 8K Red raw files. Playback in real time wasn’t possible except for the 4K files. I probably wouldn’t recommend this system for someone working with lots of higher-than-4K raw files; it seems to be simply too much for it to handle. But if you transcode the files down to ProRes, you will be in business.

Blackmagic Resolve 16 Studio
Resolve seemed to take better advantage of the AMD Radeon RX 560 GPU in combination with the CPU, as well as the onboard Intel GPU. In this test I added in Resolve’s amazing built-in spatial noise reduction, so other than the Red R3D footage, this test and the Premiere test weren’t exactly comparing apples to apples. Overall the export times will be significantly higher (or, in theory, they should be). I also added in some BRAW footage to test for fun, and that footage was way easier to work and color with. Both sequences were UHD (3840×2160) 23.98. I will definitely be looking into working with more BRAW footage. Here are my results:

Playback: 4K realtime playback at half-res premium debayer, 6K no realtime playback, 8K no realtime playback

H.264 no audio, UHD, 23.98fps, force sizing and debayering to highest quality
Export 1 (Native Renderer)
Export 2 (AMD Renderer)
Export 3 (Intel QuickSync)

Color Only
Export 1: 3:46
Export 2: 4:35
Export 3: 4:01

Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur
Export 1: 36:51
Export 2: 37:21
Export 3: 37:13

BRAW 4K (4608×2592) Playback and Export Tests

Playback: Full-res would play at about 22fps; half-res plays at realtime

H.264 No Audio, UHD, 23.98 fps, Force Sizing and Debayering to highest quality
Color Only
Export 1: 1:26
Export 2: 1:31
Export 3: 1:29
Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur
Export 1: 36:30
Export 2: 36:24
Export 3: 36:22

DNxHR 10 bit:
Color Correction Only: 3:42
Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur: 39:03

One takeaway from the Resolve exports is that the color-only export was much more efficient than in Premiere, taking just over three or four times realtime for the intensive Red R3D files, and just over one and a half times real time for BRAW.
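For context, the "times realtime" figure is just export time divided by timeline length. A quick sketch of that arithmetic, assuming the same one-minute timelines used in the Premiere test:

```python
def realtime_multiple(export_mm_ss: str, sequence_seconds: int = 60) -> float:
    """How many times longer than the timeline an export took."""
    minutes, seconds = export_mm_ss.split(":")
    return (int(minutes) * 60 + int(seconds)) / sequence_seconds

# Color-only exports from the tables above.
for label, t in [("Red R3D, native", "3:46"), ("Red R3D, AMD", "4:35"), ("BRAW, native", "1:26")]:
    print(f"{label}: {realtime_multiple(t):.2f}x realtime")
# Red R3D, native: 3.77x realtime
# Red R3D, AMD: 4.58x realtime
# BRAW, native: 1.43x realtime
```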

Summing Up
In the end, the Lenovo A940 is a sleek-looking all-in-one touchscreen- and pen-compatible system. While it isn't jam-packed with the latest high-end AMD GPUs or Intel i9 processors, the A940 is a mid-level system with an incredibly good-looking IPS Dolby Vision monitor and Dolby Atmos speakers. It has some other features — like the IR camera, QI wireless charger and USB dial — that you might not necessarily be looking for but love to find.

The power adapter is like a large laptop power brick, so you will need somewhere to stash that, but overall the monitor has a really nice 25-degree tilt that is comfortable when using just the touchscreen or pen, or when using the wireless keyboard and mouse.

Because the Lenovo A940 starts at around $2,299, I think it really deserves a look when you're searching for a new system. If you are working primarily in HD video and/or graphics, this is the all-in-one system for you. Check out more on Lenovo's website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: PixelTools V.1 PowerGrade presets for Resolve

By Brady Betzel

Color correction and color grading can be tricky (especially for those of us who don't work as dedicated colorists). And being good at one doesn't necessarily mean you will be good at the other. After watching hundreds of hours of tutorials, the only answer to getting better at color correction and color grading is to practice. As trite and cliche as it sounds, it's the truth. There is also the problem of a creative block. I can sometimes get around a creative block when color correcting or editing by trying out-of-the-box ideas, like adding a solid color on top of footage and changing blend modes to spark some ideas.

An easier way to get a bunch of quick looks onto your footage is with LUTs (Look Up Tables) and preset color grades. LUTs can sometimes work at getting your footage into an acceptable spot color correction-wise, or technically into the correct color space (the old technical vs. creative LUTs discussion). They often need to be (or at least should be) tweaked to fit the footage you are using.

Dawn

This is where PixelTool’s PowerGrade presets for Blackmagic’s DaVinci Resolve come in to play. PixelTool’s presets give you that instant wow of a color grade, sharpening and even grain, but with the flexibility to tweak and adjust to your own taste.

PixelTool’s PowerGrade V.1 are a set of Blackmagic’s DaVinci Resolve PowerGrades (essentially pre-built color grades sometimes containing noise reduction, glows or film grain) that retail for $79.99. Once purchased, the PowerGrade presets can be downloaded immediately. If you aren’t sure about the full commitment to purchase for $79.99, you can download eight sample PowerGrade presets to play with by signing up for PixelTools’ newsletter.

While it doesn’t typically matter what version of Resolve you are using with the PixelTool PowerGrade, you will probably want to make sure you are using Resolve Studio 15 (or higher) or you may miss out on some of the noise reduction or film. I’m running Resolve 16 Studio.

What are PowerGrades? In Resolve, you can save and access pre-built color correction node trees across all projects in a single database. This way, if you have an amazing orange-and-teal, bleach bypass, or maybe a desaturated look with a vignette and noise reduction that you don't want to rebuild inside every project, you can save them in the PowerGrades folder in the color correction tab. Easy! Just go into the Color Correction tab > Gallery (in the upper left corner) > click the little split window icon > right-click and "Add PowerGrade Album."

Golden

Installing the PixelTools presets is pretty easy, but there are a few steps you are going to want to follow if you've never made a PowerGrades folder before. Luckily, there is a video just for that. Once you've added the presets into your database, you can access over 110 grades in both Log and Rec 709 color spaces. In addition, there is a folder of "Utilities," which offers some helpful tools like Scanlines (Mild-Intense), various Vignettes, Sky Debanding, preset Noise Reductions, two- and three-way Grain Nodes and much more. Some of the color grading presets fit on one node, but some have five or six nodes, like the "2-Strip Holiday." They will sometimes be applied as a Compound Node for organization's sake but can be decomposed to see all the goodness inside.

The best part of PixelTools, other than the great looks, is the ability to decompose the Compound Node structure and see what's under the hood. Not only does it make you appreciate all of the painstaking work that has already been done for you, but you can study it, tweak it and learn from it. I know a lot of companies don't like to reveal how things are done, but with PixelTools you can break the grades apart. It follows my favorite motto: "A rising tide lifts all boats."

From the understated "2-Strip Holiday" look to the crunchy "Bleach Duotone 2" with the handy "Saturation Adjust" node on the end of the tree, PixelTools is the prime example of pre-built looks that can be as easy as dragging and dropping onto a clip or as intricate as adjusting each node to the way you like it. One of my favorite looks is a good old Bleach Bypass — use two layer nodes (one desaturated and one colored), layer mix with a composite mode set to Overlay and adjust saturation to taste. The Bleach Bypass setup is not a tightly guarded secret, but PixelTools gets you right to the Bleach Bypass look with the Bleach Duotone 2 and also adds a nice orange and teal treatment on top.
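Because the recipe is spelled out, the underlying math is easy to sketch. Here is a rough numpy version of that desaturate-plus-Overlay idea with a saturation trim at the end. It is the textbook formula, not PixelTools' actual node tree.

```python
import numpy as np

def overlay(base: np.ndarray, blend: np.ndarray) -> np.ndarray:
    """Standard Overlay composite for float images in [0, 1]."""
    return np.where(base < 0.5,
                    2.0 * base * blend,
                    1.0 - 2.0 * (1.0 - base) * (1.0 - blend))

def bleach_bypass(image: np.ndarray, saturation: float = 0.5) -> np.ndarray:
    """Desaturated copy Overlay-mixed over the original, then saturation trimmed."""
    weights = np.array([0.2126, 0.7152, 0.0722])            # Rec. 709 luminance
    gray = np.repeat((image @ weights)[..., None], 3, axis=-1)   # desaturated layer
    mixed = overlay(image, gray)                                 # crunchy contrast
    mixed_luma = (mixed @ weights)[..., None]
    # Final saturation trim, like a "Saturation Adjust" node at the end of the tree.
    return np.clip(mixed_luma + saturation * (mixed - mixed_luma), 0.0, 1.0)
```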

2-Strip Holiday

Now I know what you are thinking — "Orange and teal! Come on, what are we, Michael Bay making Transformers 30?!" Well, the answer is, obviously, yes. But to really dial the look to taste on my test footage, I brought down the Saturation node at the end of the node tree to around 13%, and it looks fantastic! The moral of the story is: always dial in your looks, especially with presets. Just a little customization can quickly take a preset look to a personalized look. Plus, you won't be the person who just throws on a preset and walks away.

Will these looks work with my footage? If you shot in a Log-ish style like SLog, BMD Film, Red Log Film or even GoPro Flat, you can use the Log presets and dial them to taste. If you shot footage in Rec. 709 with your Canon 5D Mark II, you can just use the standard looks. And if you want to create your own base grade on Log footage, just add the PixelTools PowerGrade nodes after!

Much like my favorite drag-and-drop tools from Rampant Design, PixelTools will give you a quick jump on your color grading and, if nothing else, can maybe shake loose some of that colorist creative block that creeps in. Throw on that "Fuji 1" or "Fuji 2" look, add a serial node in the beginning and crank up the red highlights … who knows, it may give you the creative jumpstart you are looking for. Know the rules to break the rules, but also break the rules to get those creative juices flowing.

Saturate-Glow-Shadows

Summing Up
In the end, PixelTools is not just a set of PowerGrades for DaVinci Resolve; they can also be creative jumpstarts. If you think your footage is mediocre, you will be surprised at what a good color grade can do. It can save your shoot. But don't forget about rendering when you are finished. Rendering speed will still depend on your CPU and GPU setup. In fact, using an Asus ConceptD 7 laptop with an Nvidia RTX 2080 GPU, I exported a one-minute-long Blackmagic Raw sequence (containing six clips) with only color correction to 10-bit DPX files in 46 seconds; with a random PixelTools PowerGrade applied to each clip, it took 40 seconds! In this case the Nvidia RTX 2080 really aided in the fast export, but your mileage may vary.

Check out pixeltoolspost.com and make sure to at least download the sample pack. From one of the five Kodak looks or two Fuji looks to Tobacco Newspaper or Old Worn VHS 2 with a hint of chromatic aberration, you are sure to find something that fits your footage.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Samsung’s 970 EVO Plus 500GB NVMe M.2 SSD

By Brady Betzel

It seems that SSDs are dropping in price by the hour. (This might be a slight exaggeration, but you understand what I mean.) Over the last year or so there has been a huge drop in pricing, including for high-speed NVMe SSDs. One of those is the highly touted Samsung EVO Plus NVMe line.

In this review, I am going to go over Samsung's 500GB version of the 970 EVO Plus NVMe M.2 SSD. The Samsung 970 EVO Plus NVMe M.2 SSD comes in four sizes — 250GB, 500GB, 1TB and 2TB — and retails (according to www.samsung.com) for $74.99, $119.99, $229.99 and $479.99, respectively. For what it's worth, I really didn't see much of a price difference on other sites I visited, namely Amazon.com and Best Buy.

On paper, the EVO Plus line of drives can achieve speeds of up to 3,500MB/s read and 3,300MB/s write. Keep in mind that the lower the storage size, the lower the read/write speeds will be. For instance, the EVO Plus 250GB SSD can still get up to 3,500MB/s in sequential read speeds, while the sequential write speeds dwindle down to a max of 2,300MB/s. Comparatively, the "standard" EVO line can get 3,400MB/s to 3,500MB/s sequential read speeds and 1,500MB/s sequential write speeds on the 250GB EVO SSD. The 500GB standard EVO costs just $89.99, but if you need more storage, you will have to pay more.

There is another SSD to compare the 970 EVO Plus to, and that is the 970 Pro, which only comes in 512GB and 1TB sizes — costing around $169.99 and $349.99, respectively. While the Pro version has similar read speeds to the Plus (up to 3,500MB/s) and actually slower write speeds (up to 2,700MB/s), the real ticket to admission for the Samsung 970 Pro is the Terabytes Written (TBW) warranty. Samsung warranties the 970 line of drives for five years or the rated Terabytes Written, whichever comes first. In the 500GB class of 970 drives, the "standard" and Plus cover 300TBW, while the Pro covers a whopping 600TBW.
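If the TBW figures feel abstract, translating them into an average daily write budget over the five-year warranty window helps:

```python
def daily_write_budget_gb(tbw: int, years: int = 5) -> float:
    """Average writes per day (in GB) before the TBW limit is reached."""
    return tbw * 1000 / (years * 365)   # 1TB treated as 1,000GB

print(f"970 EVO Plus 500GB (300TBW): {daily_write_budget_gb(300):.0f} GB/day")  # ~164 GB/day
print(f"970 Pro 512GB (600TBW):      {daily_write_budget_gb(600):.0f} GB/day")  # ~329 GB/day
```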

Samsung says its use of the latest V-NAND technology, in addition to its Phoenix controller, provides the highest speeds and power efficiency of the EVO NVMe drives. Essentially, V-NAND is a way to vertically stack memory instead of the previous method of stacking memory in a planar way. Stacking vertically allows for more memory in the same space in addition to longer life spans. You can read more about the Phoenix controller here.

If you are like me and want both a good warranty (or, really, faith in the product) and blazing speeds, check out the Samsung 970 EVO Plus line of drives. It's a great price point with almost all of the features of the Pro line. The 970 line of NVMe M.2 SSDs fits the 2280 form factor (meaning 22mm x 80mm) and uses an M key-style interface. It's important to understand which interface your SSD is compatible with: either M key (or M) or B key. Cards in the Samsung 970 EVO line are all M key. Most newer motherboards will have at least one if not two M.2 ports to plug drives into. You can also find PCIe adapters for under $20 or $30 on Amazon that will give you essentially the same read/write speeds. External USB 3.1 Gen 2 USB-C enclosures can also be found that give you an easier way to swap drives when needed without having to open your case.

One really amazing way to use these newly lower-priced drives: When color correcting, editing, and/or performing VFX miracles in apps like Adobe Premiere Pro or Blackmagic Resolve, use NVMe drives for only cache, still stores, renders and/or optimized media. With the low cost of these NVMe M.2 drives, you might be able to include the price of one when charging a client and throw it on the shelf when done, complete with the project and media. Not only will you have a super-fast way to access the media, but you can easily get another one in the system when using an external drive.

Summing Up
In the end, the price points of the Samsung 970 EVO Plus NVMe M.2 drives are right in the sweet spot. There are, of course, competing drives that run a little bit cheaper, like the Western Digital Black SN750 NVMe SSDs (at around $99 for the 500GB model), but they come with a slightly slower read/write speed. So for my money, the Samsung 970 line of NVMe drives is a great combination of speed and value that can take your computer to the next level.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Accusonus Era 4 Pro audio repair plugins

By Brady Betzel

With each passing year it seems that the job title of “editor” changes. It’s not just someone responsible for shaping the story of the show but also for certain aspects of finishing, including color correction and audio mixing.

In the past, when I was offline editing more often, I learned just how important sending a properly mixed and leveled offline cut was. Whether it was a rough cut, fine cut or locked cut — the mantra to always put my best foot forward was constantly repeating in my head. I am definitely a “video” editor but, as I said, with editors becoming responsible for so many aspects of finishing, you have to know everything. For me this means finding ways to take my cuts from the middle of the road to polished with just a few clicks.

On the audio side, that means using tools like the Accusonus Era 4 Pro audio repair plugins. Accusonus advertises the Era 4 plugins as one-button solutions, and they are as easy as one button, but you can also nuance the audio if you like. The Era 4 Pro plugins not only work with your typical DAW, like Pro Tools 12.x and higher, but also within nonlinear editors like Adobe Premiere Pro CC 2017 or higher, FCP X 10.4 or higher and Avid Media Composer 2018.12.

Digging In
Accusonus’ Era 4 Pro Bundle will cost you $499 for the eight plugins included in its audio repair offering. This includes De-Esser Pro, De-Esser, Era-D, Noise Remover, Reverb Remover, Voice Leveler, Plosive Remover and De-Clipper. There is also an Era 4 (non-pro) bundle for $149 that includes everything mentioned previously except for De-Esser Pro and Era-D. I will go over a few of the plugins in this review and why the Pro bundle might warrant the additional $350.

I installed the Era 4 Pro Bundle on a Wacom MobileStudio Pro tablet that is a few years old but can still run Premiere. I did this intentionally to see just how light the plugins would run. To my surprise my system was able to toggle each plug-in off and on without any issue. Playback was seamless when all plugins were applied. Now I wasn’t playing anything but video, but sometimes when I do an audio pass I turn off video monitoring to be extra sure I am concentrating on the audio only.

De-Esser
First up is the De-Esser, which tackles harsh sounds resulting from "s," "z," "ch," "j" and "sh." So if you run into someone who has some ear-piercing "s" pronunciations, apply the De-Esser plugin and choose from narrow, normal or broad. Once you find which mode helps remove the harsh sounds (otherwise known as sibilance), you can enable "intense" to add more processing power (but doing this can potentially require rendering). In addition, there is an output gain setting, as well as a "Diff" mode that plays only the parts De-Esser is affecting. If you want to just try the "one button" approach, the Processing dial is really all you need to touch. In realtime, you can hear the sibilance diminish. I personally like a little reality in my work, so I might dial the processing to the "perfect" amount and then dial it back 5% or 10%.
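If you are curious what a de-esser is doing under the hood, the textbook version is simple: isolate the sibilance band, follow its level and duck the signal when it spikes. The sketch below shows that idea in Python with scipy; it is a crude illustration, not Accusonus' processing, and the band and threshold values are arbitrary.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def simple_de_esser(audio: np.ndarray, sr: int, band=(5000.0, 9000.0),
                    threshold=0.05, reduction=0.5) -> np.ndarray:
    """Textbook de-esser sketch for mono float audio in [-1, 1]:
    duck the whole signal wherever sibilant-band energy spikes."""
    sos = butter(4, band, btype="bandpass", fs=sr, output="sos")
    sibilance = sosfiltfilt(sos, audio)

    # Crude envelope follower: rectify, then smooth with a short moving average.
    window = max(1, int(0.005 * sr))
    envelope = np.convolve(np.abs(sibilance), np.ones(window) / window, mode="same")

    # Apply gain reduction only where the sibilant band is hot.
    gain = np.where(envelope > threshold, 1.0 - reduction, 1.0)
    return audio * gain
```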

De-Esser Pro
Next up is De-Esser Pro. This one is for the editor who wants the one-touch processing but also the ability to dive into the specific audio spectrum being affected and see how the falloff is being performed. In addition, there are presets such as male vocals, female speech, etc. to jump immediately to where you need help. I personally find the De-Esser Pro more useful than the De-Esser because I can really shape the plugin. However, if you don't want to be bothered with the more intricate settings, the De-Esser is still a great solution. Is it worth the extra $350? I'm not sure, but combining it with Era-D might make you want to shell out the cash for the Era 4 Pro bundle.

Era-D
Speaking of the Era-D, it's the only plugin not described by its own title, funnily enough, but it is a joint de-noise and de-reverberation plugin. However, Era-D goes way beyond simple hum or hiss removal. With Era-D, you get "regions" (I love saying that because of the audio mixers who constantly talk in regions and not timecode) that can not only be split at certain frequencies — and have a different percentage of the plugin applied to each region — but also have individual frequency cutoff levels.

Something I had never heard of before is the ability to use two mics to fix a suboptimal recording on one of the two mics, which can be done in the Era-D plugin. There is a signal path window that you can use to mix the amount of de-noise and de-reverb. It’s possible to only use one or the other, and you can even run the plugin in parallel or cascade. If that isn’t enough, there is an advanced window with artifact control and more. Era-D is really the reason for that extra $350 between the standard Era 4 bundle and the Era 4 Bundle Pro — and it is definitely worth it if you find yourself removing tons of noise and reverb.

Noise Remover
My second favorite plugin in the Era 4 Bundle Pro is the Noise Remover. Not only is the noise removal pretty high-quality (again, I dial it back to avoid robot sounds), but it is painless. Dial in the amount of processing and you are 80% done. If you need to go further, then there are five buttons that let you focus where the processing occurs: all-frequencies (flat), high frequencies, low frequencies, high and low frequencies and mid frequencies. I love clicking the power button to hear the differences — with and without the noise removal — but also dialing the knob around to really get the noise removed without going overboard. Whether removing noise in video or audio, there is a fine art in noise reduction, and the Era 4 Noise Removal makes it easy … even for an online editor.

Reverb Remover
The Reverb Remover operates very much like the Noise Remover, but instead of noise, it removes echo. Have you ever gotten a line of ADR clearly recorded on an iPhone in a bathtub? I’ve worked on my fair share of reality, documentary, stage and scripted shows, and at some point, someone will send you this — and then the producers will wonder why it doesn’t match the professionally recorded interviews. With Era 4 Noise Remover, Reverb Remover and Era-D, you will get much closer to matching the audio between different recording devices than without plugins. Dial that Reverb Remover processing knob to taste and then level out your audio, and you will be surprised at how much better it will sound.

Voice Leveler
To level out your audio, Accusonus has also included the Voice Leveler, which does just what it says: It levels your audio so you won't get one line blasting in your ears while the next one disappears because the speaker backed away from the mic. Much like the De-Esser, you get a waveform visual of what is being affected in your audio. In addition, there are two modes, tight and normal, to help normalize your dialog. Think of the tight mode as being much more distinctive than a normal interview conversation; Accusonus describes tight as a more focused "radio" sound. The Emphasis button helps to address issues when the speaker turns away from a microphone and introduces tonal problems. There is also a simple breath control option, so quiet breaths don't get boosted along with the dialog.
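Conceptually, a leveler is a slow gain rider: measure the level of each stretch of dialog and push it toward a target. Here is a bare-bones sketch of that idea using windowed RMS with a capped boost; real levelers smooth the gain changes to avoid clicks, and this is not Accusonus' implementation.

```python
import numpy as np

def level_voice(audio: np.ndarray, sr: int, target_rms=0.1,
                window_s=0.5, max_gain=4.0) -> np.ndarray:
    """Ride gain so each window's RMS lands near target_rms. Mono float audio."""
    window = int(window_s * sr)
    out = np.copy(audio)
    for start in range(0, len(audio), window):
        chunk = audio[start:start + window]
        rms = np.sqrt(np.mean(chunk ** 2)) + 1e-9
        gain = min(target_rms / rms, max_gain)   # cap the boost so silence and breaths don't explode
        out[start:start + window] = chunk * gain
    return np.clip(out, -1.0, 1.0)
```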

De-Clipper and Plosive Remover
The final two plugins in the Era 4 Bundle Pro are the Plosive Remover and De-Clipper. De-Clipper is an interesting little plugin that tries to restore lost audio due to clipping. If you recorded audio at high gain and it came out horribly, then it’s probably been clipped. De-Clipper tries to salvage this clipped audio by recreating overly saturated audio segments. While it’s always better to monitor your audio recording on set and re-record if possible, sometimes it is just too late. That’s when you should try De-Clipper. There are two modes: normal/standard use and one for trickier cases that take a little more processing power.

The final plugin, Plosive Remover, focuses on artifacting that's typically caused by "p" and "b" sounds. This can happen if no pop screen is used and/or if the person being recorded is too close to the microphone. There are two modes: normal and extreme. Subtle pops will easily be repaired in normal mode, but extreme pops will definitely need the extreme mode. Much like De-Esser, Plosive Remover has an audio waveform display to show what is being affected, while the "Diff" mode plays back only what is being affected. However, if you just want to stick to that "one button" mantra, the Processing dial is really all you need to mess with. The Plosive Remover is another amazing plugin that, when you need it, does a great job quickly and easily.

Summing Up
In the end, I downloaded all of the Accusonus audio demos found on the Era 4 website, along with installers. This is the same place you can download the installers if you want to take part in the 14-day trial. I purposely limited my audio editing time to under one minute per clip and plugin to see what I could do. Check out my work with the Accusonus Era 4 Pro audio repair plugins on YouTube and see if anything jumps out at you. In my opinion, the Noise Remover, Reverb Remover and Era-D are worth the price of admission, but each plugin from Accusonus does great work.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Dell’s Precision T5820 workstation

By Brady Betzel

Multimedia creators are looking for faster, more robust computer systems and seeing an increase in computing power among all brands and products. Whether it’s an iMac Pro with a built-in 5K screen or a Windows-based, Nvidia-powered PC workstation, there are many options to consider. Many of today’s content creation apps are operating-system-agnostic, but that’s not necessarily true of hardware — mainly GPUs. So for those looking at purchasing a new system, I am going to run through one of Dell’s Windows-based offerings: the Dell Precision T5820 workstation.

The most important distinction between a “standard” computer system and a workstation is the enterprise-level quality and durability of internal parts. While you might build or order a custom-built system for less money, you will most likely not get the same back-end assurances that “workstations” bring to the party. Workstations aren’t always the fastest, but they are built with zero downtime and hardware/software functionality in mind. So while non-workstations might use high-quality components, like an Nvidia RTX 2080 Ti (a phenomenal graphics card), they aren’t necessarily meant to run 24 hours a day, 365 days a year. On the other hand, the Nvidia Quadro series GPUs are enterprise-level graphics cards that are meant to run constantly with low failure rates. This is just one example, but I think you get the point: Workstations run constantly and are warrantied against breakdowns — typically.

Dell Precision T5820
Dell has a long track record of building everyday computer systems that work. Even more impressive are its next-level workstation computers that not only stand up to constant use and abuse but are also certified with independent software vendors (ISVs). ISV is a designation that suggests Dell has not only tested but supports the end-user’s primary software choices. For instance, in the nonlinear editing software space I found out that Dell had tested the Precision T5820 workstation with Adobe Premiere Pro 13.x in Windows 10 and has certified that the AMD Radeon Pro WX 2100 and 3100 GPUs with 18.Q3.1 drivers are approved.

You can see for yourself here. Dell also has driver suggestions from some recent versions of Avid Media Composer, as well as other software packages. That being said, Dell not only tests but will support hardware configurations in the approved software apps.

Beyond the ISV certifications and the included three-year hardware warranty with on-site/in-home service after remote diagnostics, how does the Dell Precision T5820 perform? Well, it’s fast and well-built.

The specs are as follows:
– Intel Xeon W-2155 3.3GHz, 4.5GHz Turbo, 10-core, 13.75MB cache with hyperthreading
– Windows 10 Pro for Workstations (four cores plus) — this is an additional cost
– Precision 5820 Tower with 950W chassis
– Nvidia Quadro P4000, 8GB, four DisplayPorts (5820T)
– 64GB (8x8GB) 2666MHz DDR4 RDIMM ECC memory
– Intel vPro technology enabled
– Dell Ultra-Speed Drive Duo PCIe SSD x8 Card, 1 M.2 512GB PCIe NVMe class 50 Solid State Drive (boot drive)
– 3.5-inch 2TB 7200rpm SATA hard drive (secondary drive)
– Wireless keyboard and mouse
– 1Gb network interface card
– USB 3.1 G2 PCIe card (two Type C ports, one DisplayPort)
– Three years hardware warranty with onsite/in-home service after remote diagnosis

All of this costs around $5,200 without tax or shipping and not including any sale prices.

The Dell Precision T5820 is the mid-level workstation offering from Dell; it finds the balance between affordability, performance and reliability — kind of the "better, cheaper, faster" concept. It is one of the quietest Dell workstations I have tested. Besides the spinning hard drive that was included on the model I was sent, there aren't many loud cards or fans to distract me when I turn on the system. Dell is touting its new multichannel thermal design for advanced cooling and acoustics.

The actual 5820 case is about the size of a mid-sized tower system but feels much slimmer. I even cracked open the case to tinker around with the internal components. The inside fans and multichannel cooling are sturdy and even a little hard to remove without some force — not necessarily a bad thing. You can tell that Dell made it so that when something fails, it is a relatively simple replacement. The insides are very modular. The front of the 5820 has an optical drive, some USB ports (including two USB-C ports) and an audio port. If you get fancy, you can order the systems with what Dell calls “Flex Bays” in the front. You can potentially add up to six 2.5-inch or five 3.5-inch drives and front-accessible storage of up to four M.2 or U.2 PCIe NVMe SSDs. The best part about the front Flex Bays is that, if you choose to use M.2 or U.2 media, they are hot-swappable. This is great for editing projects that you want to archive to an M.2 or save to your Blackmagic DaVinci Resolve cache and remove later.

In the back of the workstation, you get audio in/out, one serial port, PS/2, Ethernet and six USB 3.1 Gen 1 Type A ports. This particular system was outfitted with an optional USB 3.1 Gen 2 10GB/s Type C card with one DisplayPort passthrough. This is used for the Dell UltraSharp 32-inch 4K (UHD) USB-C monitor that I received along with the T5820.

The large Dell UltraSharp 32-inch monitor (U3219Q) offers a slim footprint and a USB-C connection that is very intriguing, but they aren’t giving them away. They cost $879.99 if ordered through Dell.com. With the ultra-minimal Infinity Edge bezel, 400 nits of brightness for HDR content, up to UHD (3840×2160) resolution, 60Hz refresh rate and multiple input/output connections, you can see all of your work in one large IPS panel. For those of you who want to run two computers off one monitor, this Dell UltraSharp has a built-in KVM switch function. Anyone with a MacBook Pro featuring USB-C/Thunderbolt 3 ports can in theory use one USB-C cable to connect and charge. I say “in theory” only because I don’t have a new MacBook Pro to test it on. But for PCs, you can still use the USB-C as a hub.

The monitor comes equipped with DisplayPort 1.4, HDMI, four USB 3.0 Type A ports and a USB-C port. Because I use my workstation mainly for video and photo editing, I am always concerned with proper calibration. Dell says the U3219Q covers 99% of sRGB, 95% of DCI-P3 and 99% of Rec. 709, so if you are using Resolve and outputting through a DeckLink, you will be able to get some decent accuracy and even use it for HDR. Over the years, I have really fallen in love with Dell monitors. They don't break the bank, and they deliver crisp and accurate images, so there is a lot to love. Check out more of this monitor here.

Performance
Working in media creation I jump around between a bunch of apps and plugins, from Media Composer to Blackmagic’s DaVinci Resolve and even from Adobe After Effects to Maxon’s Cinema 4D. So I need a system that can not only handle CPU-focused apps like After Effects but GPU-weighted apps like Resolve. With the Intel Xeon and Nvidia Quadro components, this system should work just fine. I ran some tests in Premiere Pro, After Effects and Resolve. In fact, I used Puget Systems’ benchmarking tool with Premiere and After Effects projects. You can find one for Premiere here. In addition, I used the classic 3D benchmark Cinebench R20 from Maxon, and even did some of my own benchmarks.

In Premiere, I was able to play 4K H.264 (50MB and 100MB 10-bit) and ProRes files (HQ and 4444) in realtime at full resolution. Red raw 4K was able to play back at full-quality debayer. But as Puget Systems' Premiere benchmark shows, 8K (as well as heavily effected clips) started to bog the system down. With 4K, the addition of Lumetri color correction slowed down playback and export a little bit — just a few frames under realtime. It was close, though. At half quality I was essentially playing in realtime. According to the Puget Systems benchmark, the overall CPU score was much higher than the GPU score. Adobe uses a lot of single-core processing. While certain effects, like resizes and blurs, will open up the GPU pipes, I saw the CPU (single-core) kicking in here.

In the Premiere Pro tests, the T5820 really shined bright when working with mezzanine codec-based media like ProRes (HQ and 4444) and even in Red 4K raw media. The T5820 seemed to slow down when multiple layers of effects, such as color correction and blurs, were added on top of each other.

In After Effects, I again used Puget Systems' benchmark — this time the After Effects-specific version. Overall, the After Effects score was a B or B-, which isn't terrible considering it was up against the prosumer powerhouse Nvidia RTX 2080. (Puget Systems used the 2080 as the 100% score.) The tracking score on the Dell T5820 was around 90%, while the Render and Preview scores were around 80%. While this is just what it says — a benchmark — it's a great way to compare machines against the benchmark standard: an Intel i9, an RTX 2080 GPU, 64GB of memory and so on.

In Resolve 16 Beta 7, I ran multiple tests on the same 4K (UHD), 29.97fps Red Raw media that Puget Systems used in its benchmarks. I created four 10-minute sequences:
Sequence 1: no effects or LUTs
Sequence 2: three layers of Resolve OpenFX Gaussian blurs on adjustment layers in the Edit tab
Sequence 3: five serial nodes of Blur Radius (at 1.0) created in the Color tab
Sequence 4: in the Color tab, spatial noise reduction was set at 25 radius to medium, blur set to 1.0 and sharpening in the Blur tab set to zero (it starts at 0.5).

Sequence 1, without any effects, would play at full debayer quality in real time and export at a few frames above real time, averaging about 33fps. Sequence 2, with Resolve’s OpenFX Gaussian blur applied three times to the entire frame via adjustment layers in the Edit tab, would play back in real time and export at between 21.5fps and 22.5fps. Sequence 3, with five serial nodes of blur radius set at 1.0 in the Blur tab in the Color tab, would play realtime and export at about 23fps. Once I added a sixth serial blur node, the system would no longer lock onto realtime playback. Sequence 4 — with spatial noise reduction set at 25 radius to medium, blur set to 1.0 and sharpening in the Blur tab set to zero in the Color tab — would play back at 1fps to 2fps and export at 6.5fps.

All of these exports were QuickTime-based H.264s exported using the Nvidia encoder (the native encoder would slow it down by 10 frames or so). The settings were UHD resolution; "automatic — best" quality; disabled frame reordering; force sizing to highest quality; force debayer to highest quality; and no audio. Once I stacked two layers of raw Red 4K media, I started to drop below realtime playback, even without color correction or effects. I even tried to play back some 8K media: I would get about 14fps at full-res premium debayer, 14 to 16fps at half-res premium, 25fps at half-res good, and 29.97fps (realtime) at quarter-res good.

Using the recently upgraded Maxon Cinebench R20 benchmark, I found the workstation performed adequately, around the fourth-place spot. Keep in mind, there are thousands of combinations of results that can be had depending on CPU, GPU, memory and more; these are only sample results that 3D artists can compare against their own. The Cinebench R20 results were CPU: 4682, CPU (single-core): 436, and MP ratio: 10.73x. If you Google or check out some threads of Cinebench R20 result comparisons, you will eventually find some results to compare mine against. My results are a B to B+. A much higher-end Intel Xeon or i9 or an AMD Threadripper processor would really punch this system up a weight class.
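For what it's worth, the MP ratio is just the multicore score divided by the single-core score, so you can sanity-check it:

```python
cpu_all_cores = 4682
cpu_single_core = 436
print(f"MP ratio: {cpu_all_cores / cpu_single_core:.2f}x")  # 10.74x, matching the reported 10.73x within rounding
```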

Summing Up
The Dell Precision T5820 workstation comes with a lot of enterprise-level benefits that simply don’t come with your average consumer system. The components are meant to be run constantly, and Dell has tested its systems against current industry applications using the hardware in these systems to identify the best optimizations and driver packages with these ISVs. Should anything fail, Dell’s three-year warranty (which can be upgraded) will get you up and running fast. Before taxes and shipping, the Dell T5820 I was sent for review would retail for just under $5,200 (maybe even a little more with the DVD drive, recovery USB drive, keyboard and mouse). This is definitely not the system to look at if you are a DIYer or an everyday user who does not need to be running 24 hours a day, seven days a week.

But in a corporate environment, where time is money and no one wants to be searching for answers, the Dell T5820 workstation with accompanying three-year ProSupport with next-day on-site service will be worth the $5,200. Furthermore, it’s invaluable that optimization with applications such as the Adobe Creative Suite is built-in, and Dell’s ProSupport team has direct experience working in those professional apps.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

 

Review: LaCie Mobile, a high-speed 1TB SSD

By Brady Betzel

With the flood of internal and external hard drives hitting the market at relatively low prices, it is sometimes hard to wade through the swamp and find the drive that is right for your workflow. In terms of external drives, do you need a RAID? USB-C? Is Thunderbolt 3 the same as USB-C? Should I save money and go with a spinning drive? Are spinning drives even cheaper than SSD drives these days? All of these questions are valid and, hopefully, I will answer them.

For this review, I’m taking a look at the LaCie Mobile SSD  which comes in three versions: 500GB, 1TB and 2TB, costing around $129.95, $219.95 and $399.95, respectively. According to LaCie’s website the mobile SSD drives are exclusive to Apple, but with some searching on Amazon you can find all three available as well and at lower prices than I’ve mentioned. The 1TB version I am seeing for $152.95 is being sold on Amazon through LaCie, so I assume the warranty still holds up.

I was sent the 1TB version of the LaCie Mobile SSD for review and testing. Along with the drive itself, you get two connection cables: a USB-A to USB-C cable (USB 3.0 speed) as well as a USB-C to USB-C cable (USB 3.1 Gen 2 speed). For clarity, USB-C is the type of connection — the oval-like shape and technology used to transfer data. While USB-C devices will work on Thunderbolt 3 connections, Thunderbolt 3-only devices will not work on plain USB-C connections. Yes, that is super-confusing considering they look the same. But in the real world, Thunderbolt 3 is more Mac OS-based while USB-C is more Windows-based. You can find rare Thunderbolt 3 connections on Windows-based PCs, but you are more likely to find USB-C. That being said, the LaCie Mobile SSD is compatible with both USB-C and Thunderbolt 3, as well as USB 3.0. Keep in mind you will not get the highest transfer speed with the USB-A to USB-C (USB 3.0) cable. You will only get that with the USB-C to USB-C (USB 3.1 Gen 2) cable. The drive comes formatted as exFAT, which is immediately compatible with both Mac OS and Windows.

So, are spinning drives worth the cheaper price? In my opinion, no. Spinning drives are more fragile when moved around a lot, and they transfer at much slower speeds. Advertised speeds run from about 130MB/s for a portable spinning drive to 540MB/s for an SSD, so today roughly $100 more buys a significant speed increase.
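
To put those advertised numbers in perspective, here is a rough back-of-the-envelope comparison of how long a 100GB media folder would take at each speed. This is my own illustration; real-world copies will vary with file sizes and overhead.

```python
# Rough transfer-time comparison at the advertised speeds quoted above.
folder_gb = 100
speeds = {"spinning drive": 130, "SSD": 540}  # MB/s

for label, mb_per_s in speeds.items():
    minutes = (folder_gb * 1000) / mb_per_s / 60
    print(f"{label}: ~{minutes:.0f} minutes")  # ~13 min vs. ~3 min
```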

A very valuable piece of the LaCie Mobile SSD purchase is the limited three-year warranty, which includes three years of data recovery services for free. No matter how your data becomes corrupted, Seagate (LaCie’s parent company) will try to recover it. Each product is eligible for one in-lab data recovery attempt, which can be turned around in as little as two days, depending on the type of recovery. The recovered media will then be sent back to you on a storage device as well as be available to you from a cloud-based account hosted online for 60 days. This is a great feature that’s included in the price.

The drive itself is small, measuring approximately .35” x 3” x 3.8” and weighing only .22 lbs. The outside has sharp lines, much in the vein of a faceted diamond, and it feels solid and great to carry. The aluminum enclosure is finished in a space gray that is about the same color as a MacBook Pro.

Transfer Speeds
Alright, let’s get to the nitty-gritty: transfer speeds. I tested the LaCie Mobile SSD on both a Windows-based PC with USB-C and an iMac Pro with Thunderbolt 3/USB-C. On the Windows PC, I initially connected the drive to a port on the front of my system and was only getting around 150MB/s write speeds (about the speed of USB 3.0). Immediately, I knew something was wrong, so I connected to a USB-C port on a PCIe card in the rear of my PC. On that port I was getting 440.9MB/s write and 516.3MB/s read speeds. Moral of the story: make sure your USB-C ports are not just for charging or simply USB-C connectors running at USB 3.0 speeds.

On the iMac Pro, I was getting write speeds of 487.2MB/s and read speeds of 523.9MB/s, definitely on par with the correct Windows PC transfer speeds. The retail packaging on the LaCie Mobile SSD states a 540MB/s speed (it doesn’t differentiate between read and write), but much like the miles-per-gallon figures on car sales brochures, you have to take those numbers with a few grains of salt. And while I have previously tested drives (not from LaCie) that would initially transfer at a high rate and then drop down, the LaCie Mobile SSD sustained its high transfer rates.
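
If you want to sanity-check a drive yourself without the AJA or Blackmagic apps, a crude sequential test is easy to script. This is only a minimal sketch; the path is a placeholder for wherever your drive mounts, and OS caching can inflate the read number, so treat the results loosely.

```python
import os
import time

# Crude sequential write/read timing on a mounted drive.
# "/Volumes/LaCie/speedtest.bin" is a placeholder path; point it at your drive.
TEST_FILE = "/Volumes/LaCie/speedtest.bin"
SIZE_MB = 1024
CHUNK = b"\0" * (1024 * 1024)  # 1MB of zeros

start = time.time()
with open(TEST_FILE, "wb") as f:
    for _ in range(SIZE_MB):
        f.write(CHUNK)
    f.flush()
    os.fsync(f.fileno())  # make sure the data actually hits the drive
print(f"write: {SIZE_MB / (time.time() - start):.1f} MB/s")

start = time.time()
with open(TEST_FILE, "rb") as f:
    while f.read(1024 * 1024):
        pass  # note: the OS cache can make this read look faster than it is
print(f"read: {SIZE_MB / (time.time() - start):.1f} MB/s")

os.remove(TEST_FILE)
```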

Summing Up
In the end, the size and design of the LaCie Mobile SSD will be one of the larger factors in determining whether you buy this drive. It’s small. Like, real small. But it feels sturdy. I don’t think anyone would argue against the LaCie Rugged drives (the orange-rubber-encased ones) being a staple of the post industry. I really wish LaCie had kept that tradition and added a tiny little orange rubberized edge. Not only does it feel safer for some reason, but it is a trademark that immediately says, “I’m a professional.”

Besides the appearance, the $152.95 price tag for a 1TB SSD that can easily fit into your shirt pocket without being noticed is pretty reasonable. At the $219.95 list price, I might say keep looking around. In addition, if you aren’t already an Adobe Creative Cloud subscriber, you will get a free 30-day trial (normally seven days) included with purchase.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: FXhome’s HitFilm Pro 12 for editing, compositing, VFX

By Brady Betzel

If you have ever worked in Adobe Premiere Pro, Apple FCP X or Avid Media Composer and wished you could just flip a tab and be inside After Effects, with access to 3D objects directly in your timeline, you are going to want to take a look at FXhome’s HitFilm Pro 12.

Similar to how Blackmagic brought Fusion inside of its most recent versions of DaVinci Resolve, HitFilm Pro offers a nonlinear editor, a composite/VFX suite and a finishing suite combined into one piece of software. Haven’t heard about HitFilm yet? Let me help fill in some blanks.

Editing and 3D model import

What is HitFilm Pro 12?
Technically, HitFilm Pro 12 is a non-subscription-based nonlinear editor, compositor and VFX suite that costs $299. Not only does that price include 12 months of updates and tech support, but one license can be used on up to three computers simultaneously. In my eyes, HitFilm Pro is a great tool set for independent filmmakers, social media content generators and any editor who goes beyond editing and dives into topics like 3D modeling, tracking, keying, etc. without having to necessarily fork over money for a bunch of expensive third-party plugins. That doesn’t mean you won’t want to buy third-party plugins, but you are less likely to need them with HitFilm’s expansive list of native features and tools.

At my day job, I use Premiere, After Effects, Media Composer and Resolve. I often come home and want to work in something that has everything inside, and that is where HitFilm Pro 12 lives. Not only does it have the professional functionality that I am used to, such as trimming, color scopes and more, but it also has BorisFX’s Mocha planar tracking plugin built in for no extra cost. This is something I use constantly and love.

One of the most interesting and recent updates to HitFilm Pro 12 is the ability to use After Effects plugins. Not all plugins will work since there are so many, but in a video released after NAB 2019, HitFilm said plugins like Andrew Kramer’s Video CoPilot Element3D and ones from Red Giant are on the horizon. If you are within your support window, or you continue to purchase HitFilm, FXhome will work with you to get your favorite After Effects plugins working directly inside of HitFilm.

Timeline and 3D model editor

Some additional updates to HitFilm Pro 12 include a completely redesigned user interface that resembles Premiere Pro… kind of. Threaded rendering has also been added, so Windows users with Intel and Nvidia hardware will see increased GPU speeds. There is also the ability to add titles directly in the editor, and more.

The Review
So how does HitFilm Pro 12 compare to today’s modern software packages? That is an interesting question. I have become more and more of a Resolve convert over the past two years, so I am constantly comparing everything to it. In addition, having been an Avid user for over 15 years, I am used to a rock-solid NLE with only a few hiccups here and there. In my opinion, HitFilm Pro 12 lands itself right where Premiere and FCP X live.

It feels prosumer-y, in a YouTuber or content-generator capacity. Would it stand up to 10 hours of abuse with content over 45 minutes? It probably would, but much like with Premiere, I would probably split my edits in scenes or acts to avoid slowdowns, especially when importing things like OBJ files or composites.

The nonlinear editor portion feels like Premiere and FCP X had a baby, but left out FCP X’s Magnetic Timeline feature. The trimming in the timeline feels smooth, and after about 20 minutes of getting comfortable with it I felt like it was what I am generally used to. Cutting in footage feels good using three-point edits or simply dragging and dropping. Using effects feels very similar to the Adobe world, where you can stack them on top of clips and they each affect each other from the top down.

Mocha within HitFilm Pro

Where HitFilm Pro 12 shines is in the inclusion of typically third-party plugins directly in the timeline. From the ability to create a scene with 3D cameras and particle generators to being able to track using BorisFX’s Mocha, HitFilm Pro 12 has many features that will help take your project to the next level. With HitFilm Pro 12’s true 3D cameras, you can take flat text and enhance it with raytraced lighting, shadows and even textures. You can even use the included BorisFX Continuum 3D Objects to make great titles relatively easily. To take it a step further, you can even track and animate them.

Color Tools
By day, I am an online editor/colorist who deals with the finishing aspect of media creation. Throughout the process, from color correction to exporting files, I need tools that are not only efficient but accurate. When I started to dig into the color correction side of HitFilm Pro 12, things slowed down for me. The color correction tools are very close to what you’ll find in other NLEs, like Premiere and FCP X, but they don’t quite rise to the level of Resolve. HitFilm Pro 12 does operate inside of a 32-bit color pipeline, which really helps avoid banding and other errors when color correcting. However, I didn’t feel that the toolset was making me more efficient; in fact, it was the opposite. I felt like I had to learn FXhome’s way of doing it. It wasn’t that it totally slowed me down, but I felt it could be better.

Color

Summing Up
In the end, HitFilm Pro 12 will fill a lot of holes for individual content creators. If you love learning new things (like I do), then HitFilm Pro 12 will be a good investment of your time. In fact, FXhome posts tons of video tutorials on all sorts of good and topical stuff, like how to create a Stranger Things intro title.

If you are a little more inclined to work with a layer-based workflow, like in After Effects, then HitFilm Pro 12 is the app you’ll want to learn. Check out HitFilm Pro 12 on FXhome’s website and definitely watch some of the company’s informative tutorials.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Western Digital’s Blue SN500 NVMe SSD

By Brady Betzel

Since the transition from the old standard SATA 3.5-inch hard drives to SSDs began, multimedia-focused computer users have seen a dramatic uptick in read and write speeds. The only issue has been price. You can still find a 3.5-inch brick drive, ranging in size from 2TB to 4TB, for under $200 (maybe closer to $100), but if you upgraded to an SSD over the past five years, you were looking at a huge jump in price: hundreds, if not thousands, of dollars. These days you are looking at just a couple hundred dollars for a 1TB SSD and even less for 256GB or 512GB.

Western Digital hopes you’ll think of NVMe SSD drives as more of an automatic purchase than a luxury with the Western Digital Blue SN500 NVMe M.2 2280 line of SSD drives.

Before you get started, you will need a somewhat modern computer with an NVMe M.2-compatible motherboard (also referred to as a PCIe Gen 3 interface). This NVMe SSD is a “B+M key” configuration, so you will need to make sure your slot is compatible. Once you confirm that your motherboard is compatible, you can start shopping around. The Western Digital Blue series has always been the budget-friendly level of hard drives; Western Digital also offers the next level up, the Black series. In terms of NVMe M.2 SSDs, the Blue series drives are budget-friendly, but they also use two fewer PCIe lanes, which results in slower read/write speeds. The Black series uses up to four PCIe lanes and has a heat sink to dissipate heat. But for this review, I am focusing on the Blue series and how it performs.

On paper, the Western Digital Blue SN500 NVMe SSD is available in either 250GB or 500GB sizes, measures approximately 80mm long and uses the M.2 2280 form factor on a PCIe Gen 3 interface with up to two lanes. Technically, the 500GB drive can achieve up to 1,700MB/s read and 1,450MB/s write speeds, and the 250GB can achieve up to 1,700MB/s read and 1,300MB/s write speeds.
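
Those ceilings line up with what two lanes of PCIe Gen 3 can physically carry. A rough calculation, assuming the usual 8GT/s per lane and 128b/130b encoding, shows why the drive is rated well under what a four-lane Black series drive could reach.

```python
# Back-of-the-envelope bus ceiling for a two-lane PCIe Gen 3 drive.
gt_per_s = 8e9        # 8 gigatransfers per second, per lane
encoding = 128 / 130  # 128b/130b line-coding overhead
lanes = 2             # Blue SN500 uses two lanes; the Black series uses up to four

bytes_per_s = gt_per_s * encoding / 8 * lanes
print(f"theoretical ceiling: ~{bytes_per_s / 1e6:.0f} MB/s")
# ~1969 MB/s, so the 1,700MB/s rating sits comfortably under the two-lane bus limit
```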

As of this review, the 250GB version sells for $53.99, while the 500GB version sells for $75.99. You can find specs on the Western Digital website and learn more about the Black series as well.

One of the coolest things about these NVMe drives is that they come standard with a five-year limited warranty (or until you hit the max endurance limit, whichever comes first). The max endurance (aka TBW — terabytes written) for the 250GB SSD is 150TB, while the max endurance for the 500GB version is 300TB. Both versions have an MTTF (mean time to failure) of 1.75 million hours.
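
To give those endurance numbers some real-world context, here is what they work out to per day if you spread the writes evenly across the five-year warranty period. This is my own arithmetic, not a Western Digital figure.

```python
# Average daily writes allowed over a five-year span at each TBW rating.
tbw = {"250GB": 150, "500GB": 300}  # terabytes written, per the spec sheet

for model, tb in tbw.items():
    gb_per_day = tb * 1000 / (5 * 365)
    print(f"{model}: ~{gb_per_day:.0f} GB of writes per day")  # ~82 and ~164
```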

In addition, the drive uses an in-house controller and 3D NAND logic. Those terms might sound like nonsense, but the in-house controller is what tells the NVMe drive what to do and when to do it (it’s essentially a dedicated processor), while 3D NAND is a way of cramming more memory into smaller spaces. Instead of adding more memory on the same plane along the x- or y-axis, manufacturers achieve more storage space by stacking layers vertically on top — on the z-axis.

Testing Read and Write Speeds
Keep in mind that I ran these tests on a Windows-based PC. Doing a straight file transfer, I was getting about 1GB/s. When using Crystal Disk Mark, I would get a burst of speed at the top, slow down a little and then mellow out. Using a 4GB sample, my speeds were:
“Seq Q32T1” – Read: 1749.5 MB/s – Write: 1456.6 MB/s
“4KiB Q8T8” – Read: 1020.4 MB/s – Write: 1039.9 MB/s
“4KiB Q32T1” – Read: 732.5 MB/s – Write: 676.5 MB/s
“4KiB Q1T1” – Read: 35.77 MB/s – Write: 185.5 MB/s

If you would like to read exactly what these types of tests entail, check out the Crystal Disk Mark info page. In the AJA System Test I had a little drop off, but with a 4GB test file size, I got an initial read speed of 1457MB/s and a write speed of 1210MB/s, which seems to fall more in line with what Western Digital is touting. The second time I ran the AJA System Test, I got a read speed of 1458MB/s and write speed of 883MB/s. I wanted a third opinion, so I ran the Blackmagic Design Disk Speed Test (you’ll have to install drivers for a Blackmagic card, like the Ultrastudio 4K). On my first run, I got a read speed of 1359.6MB/s and write speed of 1305.8MB/s. On my second run, I got a read speed of 1340.5MB/s and write speed of 968.3MB/s. My read numbers were generally above 1300MB/s, and my write numbers varied between 800 and 1000MB/s. Not terrible for a sub-$100 hard drive.

Summing Up
In the end, the Western Digital Blue SN500 NVMe SSD is an amazing value at under $100, and hopefully we will get expanded sizes in the future. The drive is a B+M key configuration, so when you are looking at compatibility, make sure to check which key your PCIe card, external drive case or motherboard supports. It is typically M or B+M key, but I found a PCI card that supported both. If you need more space and speed than the WD Blue series can offer, check out Western Digital’s Black series of NVMe SSDs.

The sticker price starts to go up significantly when you hit the 1TB or 2TB marks — $279.99 and $529.99, respectively (with the heat sink attachment). If you stick to the 500GB version, you are looking at a more modest price tag.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Tips for inside — and outside — the edit suite

By Brady Betzel

Over the past 15 years, I’ve seen lots of good and bad while working in production and post — from people being overly technical (only looking at the scopes and not the creative) to being difficult just for the sake of being difficult. I’ve worked on daytime and nighttime talk shows, comedies, reality television, Christmas parades, documentaries and more. All have shaped my editing skills as well as my view on the work-life balance.

Here are some tips I try to keep in mind that have helped me get past problems I’ve encountered in and out of the edit bay:

No One Cares
This one is something I constantly have to remind myself. It’s not necessarily true all the time, but it’s a good way to keep my own ego in check, especially on social media. When editing and coloring, I constantly ask myself, “Does anyone care about what I’m doing? If not, why not?” And if the answer is that they don’t, then something needs to change. I also ask myself, “Does anything about my comment or edit further the conversation or the story, or am I taking away from the story to draw attention to myself?” In other words, am I making an edit just to make an edit?

It’s an especially good thing to think about when you get trolled on Twitter by negative know-it-alls telling you why you’re wrong about working in certain NLEs. Really, who cares? After I write my response and edit it a bunch of times, I tell myself, “No one cares.” This philosophy not only saves me from feeling bad about not answering questions that no one really cares about, but it also helps improve my editing, VFX and color correction work.

Don’t be Difficult!
As someone who has worked everywhere and in all sorts of positions — from computer tech at Best Buy (before Geek Squad) and barista at Starbucks to post PA for the Academy Awards, assistant editor, editor, offline editor and online editor — I’ve seen the power of being amenable.

I am also innately a very organized person, both at work and at home, digitally and in real life — sometimes to my wife’s dismay. I also constantly repeat this mantra to my kids: “If you’re not early, you’re late.”

But sometimes I need to be reminded that it’s OK to be late, and it’s OK not to do things the technically “correct” way. The same applies to work. Someone might have a different way of doing something that’s slower than the way I’d do it, but that doesn’t mean that person is wrong. Avoiding confrontation is the best way to go. Sure, go ahead and transcode inside of Adobe Premiere Pro instead of batch transcoding in Media Encoder. If the outcome is the same and it helps avoid a fight, just let it slide. You might also learn something new by taking a back seat.

Sometimes Being Technically Perfect Isn’t Perfect
I often feel like I have a few obsessive traits: leaving on time, having a tidy desktop and doing things (I feel) correctly. One of the biggest examples is when color correcting. It is true that scopes don’t lie; they give you the honest truth. But when I hear about colorists bragging that they turn off the monitors and color using only Tektronix Double Diamond displays, waveforms and vectorscopes — my skeptical hippo eyes come out. (Google it; it’s a thing).

While scopes might get you to a technically acceptable spot in color correction, you need to have an objective view from a properly calibrated monitor. Sometimes an image with perfectly white whites and perfectly black blacks is not the most visually pleasing image. I constantly need to remind myself to take a step back and really blend the technical with the creative. That is, I sit back and imagine myself as the wide-eyed 16-year-old in the movie theater being blown away and intrigued by American Beauty.

You shouldn’t do things just because you think that is how they should be done. Take a step back and ask yourself if you, your wife, brother, uncle, mom, dad, or whoever else might like it.

Obviously, being technically correct is vital when creating things like deliverables, and that is where there might be less objectivity, but I think you understand my point. Remember to add a little objectivity into your life.

Live for Yourself, Practice and Repeat
While I constantly tell people to watch tutorials on YouTube and places like MZed.com, you also need to physically practice your craft. This idea becomes obvious when working in technically creative positions like editing.

I love watching tutorials on lighting and photography since so much can be translated over to editing and color correcting. Understanding relationships between light and motion can help guide scenes. But if all you do is watch someone tell you how light works, you might not really be absorbing the information. Putting into practice the concepts you learn is a basic but vital skill that is easy to forget. Don’t just watch other people live life, live it for yourself.

For example, a lot of people don’t understand trimming and re-linking in Media Composer. They hear about it but don’t really use these skills to their fullest unless they actively work them out. Same goes for people wanting to use a Wacom tablet instead of a mouse. It took me two weeks of putting my mouse in the closet to even get started on the Wacom tablet, but in the end, it is one of those things I can’t live without. But I had to make the choice to try it for myself and practice, practice, practice to know it.

If you dabble and watch videos on a Wacom tablet, using it once might turn you off. Using trimming once might not convince you it is great. Using roles in FCPX once might not convince you that it is necessary. Putting those skills into practice is how you will live editing life for yourself and discover what is important to you … instead of relying on other people to tell you what’s important.

Put Your Best Foot Forward
This bit of advice came to me from a senior editor on my first real professional editing job after being an assistant editor. I had submitted a rough cut and — in a very kind manner — the editor told me that it wasn’t close to ready for a rough cut title. Then we went through how I could get there. In the end, I essentially needed to spend a lot more time polishing the audio, checking for lip flap, polishing transitions and much more. Not just any time, but focused time.

Check your edit from a 30,000-foot view for things like story and continuity, but also those 10-foot view things like audio pops and interviews that sound like they are all from one take. Do all your music cues sting on the right beat? Is all your music panned for stereo and your dialogue all center-panned to cut up the middle?

These are things that take time to learn, but once you get it in your head, it will be impossible to forget … if you really want to be a good editor. Some might read this and say, “If you don’t know these workflows, you shouldn’t be an editor.” Not true! Everyone starts somewhere, but regardless of what career stage you’re in, always put your best foot forward.

Trust Your Instincts
I have always had confidence in my instincts, and I have my parents to thank for that. But I have noticed that a lot of up-and-coming production and post workers don’t want to make decisions. They also are very unsure if they should trust their first instinct. In my experience, your first instinct is usually your best instinct. Especially when editing.

Obviously there are exceptions to this rule, but generally I rely heavily on my instincts even when others might not agree. Take this with a grain of salt, but also throw that salt away and dive head first!

This notion really came to a head for me when I was designing show titles in After Effects. The producers really encouraged going above and beyond when making opening titles of a show I worked on. I decided to go into After Effects instead of staying inside of the NLE. I used crazy compositing options that I didn’t often use, tried different light leaks, inverted mattes … everything. Once I started to like something, I would jump in head first and go all the way. Usually that worked out, but even if it didn’t, everyone could see the quality of work I was putting out, and that was mainly because I trusted my instincts.

Delete and Start Over
When you are done trusting your instincts and your project just isn’t hitting home — maybe the story doesn’t quite connect, the HUD you are designing just doesn’t quite punch or the music you chose for a scene is very distracting — throw it all away and start over. One of the biggest skills I have acquired in my career thus far seems to be the ability to throw a project away and start over.

Typically, scenes can go on and on with notes, but if you’re getting nowhere, it might be time to start over if you can. Not only will you have a fresh perspective, but you will have a more intimate knowledge of the content than you had the first time you started your edit — which might lead to an alternate pathway into your story.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Red Giant’s VFX Suite plugins

By Brady Betzel

If you have ever watched After Effects tutorials, you are bound to have seen the people who make up Red Giant. There is Aharon Rabinowitz, who you might mistake for a professional voiceover talent; Seth Worley, who can combine a pithy sense of humor and over-the-top creativity seamlessly; and my latest man-crush Daniel Hashimoto, better known as “Action Movie Dad” of Action Movie Kid.

In these videos, these talented pros show off some amazing things they created using Red Giant’s plugin offerings, such as the Trapcode Suite, the Magic Bullet Suite, Universe and others.

Now, Red Giant is trying to improve your visual effects workflow even further with the new VFX Suite for Adobe After Effects (although some work in Adobe Premiere as well).

The new VFX Suite is a compositing-focused toolkit that will complement many aspects of your work, from greenscreen keying to motion graphics compositing with tools such as Video Copilot’s Element 3D. Whether you want to seamlessly composite light and atmospheric fog with fewer pre-composites, add a reflection to an object easily or even just have a better greenscreen keyer, the VFX Suite will help.

The VFX Suite includes Supercomp, Primatte Keyer 6, King Pin Tracker, Spot Clone Tracker, Optical Glow, Chromatic Displacement, Knoll Light Factory 3.1, Shadow and Reflection. The VFX Suite is priced at $999 unless you qualify for the academic discount, which means you can get it for $499.

In this review, I will go over each of the plugins within the VFX Suite. Up first will be Primatte Keyer 6.

Overall, I love Red Giant’s GUIs. They seem to be a little more intuitive, allowing me to work more “creatively” as opposed to spending time figuring out technical issues.

I asked Red Giant what makes VFX Suite so powerful, and Rabinowitz, head of marketing for Red Giant and general post production wizard, shared this: “Red Giant has been helping VFX artists solve compositing challenges for over 15 years. For VFX Suite, we looked at those challenges with fresh eyes and built new tools to solve them with new technologies. Most of these tools are built entirely from scratch. In the case of Primatte Keyer, we further enhanced the UI and sped it up dramatically with GPU acceleration. Primatte Keyer 6 becomes even more powerful when you combine the keying results with Supercomp, which quickly turns your keyed footage into beautifully comped footage.”

Primatte Keyer 6
Primatte is a chromakey/single-color keying technology used in tons of movies and television shows. I got familiar with Primatte when BorisFX included it in its Continuum suite of plugins. Once I used Primatte and learned the intricacies of extracting detail from hair and even just using their auto-analyze function, I never looked back. On occasion, Primatte needs a little help from others, like Keylight, but I can usually pull easy and tough keys all within one or two instances of Primatte.

If you haven’t used Primatte before, you essentially pick your key color by drawing a line or rectangle around the color, adjust the detail and opacity of the matte, and — boom — you’re done. With Primatte 6 you now also get Core Matte, a new feature that draws an inside mask automatically while allowing you to refine the edges — this is a real time-saver when doing hundreds of interview greenscreen keys, especially when someone decides to wear a reflective necklace or piece of jewelry that usually requires an extra mask and tracking. Primatte 6 also adds GPU optimization, gaining even more preview and rendering speed than previous versions.
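
Primatte’s actual keying math is proprietary, but if you have never used a single-color keyer, the basic idea looks roughly like the sketch below, using OpenCV as a stand-in and placeholder file names: sample the screen color, build a matte from everything near that color, then soften the edges.

```python
import cv2
import numpy as np

# Not Primatte's algorithm -- just the general shape of a single-color key.
frame = cv2.imread("greenscreen_frame.png")          # placeholder file name
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# "Pick your key color": a hue/saturation/value range around green, like
# drawing a rectangle over the screen and letting the keyer sample it.
lower = np.array([40, 60, 60])
upper = np.array([80, 255, 255])
screen = cv2.inRange(hsv, lower, upper)              # 255 wherever the screen is

matte = 255 - screen                                 # invert: foreground alpha
matte = cv2.GaussianBlur(matte, (5, 5), 0)           # soften the matte edge

cv2.imwrite("matte.png", matte)
```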

Supercomp
If you are an editor like me — who knows enough to be dangerous when compositing and working within After Effects — sometimes you just want (or need) a simpler interface without having to figure out all the expressions, layer order, effects and compositing modes to get something to look right. And if you are an Avid Media Composer user, you might have encountered the Paint Effect Tool, which is one of those one-for-all plugins. You can paint, sharpen, blur and much more from inside one tool, much like Supercomp. Think of the Supercomp interface as a Colorista or Magic Bullet Looks-type interface, where you can work with composite effects such as fog, glow, lights, matte chokers, edge blend and more inside of one interface with much less pre-composing.

The effects are all GPU-accelerated and are context-aware. Supercomp is a great tool to use with your results from the Primatte Keyer, adding in atmosphere and light wraps quickly and easily inside one plugin instead of multiple.

King Pin Tracker and Spot Clone Tracker
As an online editor, I am often tasked with sign replacements, paint-outs of crew or cameras in shots, as well as other clean-ups. If I can’t accomplish what I want with BorisFX Continuum while using Mocha inside of Media Composer or Blackmagic’s DaVinci Resolve, I will jump over to After Effects and try my hand there. I don’t practice as much corner pinning as I would like, so I often forget the intricacies of tracking in Mocha and copying Corner Pin or Transform data to After Effects. This is where the new King Pin Tracker can ease any difficulties, especially when you are corner pinning relatively simple objects but still need to keyframe positions or perform a planar track without using multiple plugins or applications.

The Spot Clone Tracker is exactly what it says it is. Much like Resolve’s Patch Replace, Spot Clone Tracker allows you to track one area while replacing it with another area from the screen. In addition, Spot Clone Tracker has options to flip vertical, flip horizontal, add noise, and adjust brightness and color values. For such a seemingly simple tool, the Spot Clone Tracker is the dark horse in this race. You’d be surprised how many clone and paint tools don’t have adjustments like flipping and flopping or brightness changes. This is a great tool for quick dead-pixel fixes and painting out GoPros when you don’t need to mask anything out. (Although there is an option to “Respect Alpha.”)

Optical Glow and Knoll Light Factory 3.1
Have you ever been in an editing session that needed police lights amplified or a nice glow on some text but the stock plugins just couldn’t get it right? Optical Glow will solve this problem. In another amazing, simple-yet-powerful Red Giant plugin, Optical Glow can be applied and gamma-adjusted for video, log and linear levels right off the bat.

From there you can pick an inner tint, outer tint and overall glow color via the Colorize tool and set the vibrance. I really love the Falloff, Highlight Rolloff and Highlights Only functions, which allow you to fine-tune the glow and control just how much of the image it affects. It’s so simple that it is hard to mess up, but the results speak for themselves and render out quicker than other glow plugins I am using.

Knoll Light Factory has been newly GPU-accelerated in Version 3.1 to decrease render times when using its more than 200 presets or when customizing your own lens flares. Optical Glow and Knoll Light Factory really complement each other.

Chromatic Displacement
Since watching an Andrew Kramer tutorial covering displacement, I’ve always wanted to make a video that showed huge seismic blasts but didn’t really want to put the time into properly making chromatic displacement. Lucky for me, Red Giant has introduced Chromatic Displacement! Whether you want to make rain drops appear on the camera lens or add a seismic blast from a phaser, Chromatic Displacement will allow you to offset your background with a glass-, mirror- or even heatwave-like appearance quickly. Simply choose the layer you want to displace from and adjust parameters such as displacement amount, spread and spread chroma, and whether you want to render using the CPU or GPU.
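
If you are curious what is conceptually happening under the hood, the rough idea can be sketched in a few lines of NumPy: each color channel gets pushed a slightly different distance, driven by the brightness of a displacement layer, which is what produces that glassy color fringe. This is only a toy illustration with made-up parameter names, nothing like the plugin’s actual implementation.

```python
import numpy as np

# Toy per-channel displacement: shift red, green and blue by different amounts,
# scaled by a displacement layer's brightness (0..1). Names are illustrative.
def chromatic_displace(image, disp_layer, amount=8.0, chroma_spread=1.5):
    h, w, _ = image.shape
    shift = disp_layer * amount                      # pixels to push, per pixel
    out = np.empty_like(image)
    for c, scale in enumerate([1.0, chroma_spread, 2.0 * chroma_spread]):
        xs = (np.arange(w)[None, :] + shift * scale).astype(int) % w
        out[..., c] = image[np.arange(h)[:, None], xs, c]
    return out

# Synthetic 1080p frame and a radial "blast" displacement layer, just to run it.
frame = np.random.rand(1080, 1920, 3)
yy, xx = np.mgrid[0:1080, 0:1920]
blast = np.clip(1 - np.hypot(yy - 540, xx - 960) / 500, 0, 1)
result = chromatic_displace(frame, blast)
print(result.shape)  # (1080, 1920, 3)
```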

Shadow and Reflection
Red Giant packs Shadow and Reflection plugins into the VFX Suite as well. The Shadow plugin not only makes it easy to create shadows in front of or behind an object based on alpha channel or brightness, but, best of all, it gives you an easy way to identify the point where the shadow should bend. The Shadow Bend option lets you identify where the bend exists, what color the bend axis should be, the type of seam and the seam size, and it even allows for motion blur.

The Reflection plugin is very similar to the Shadow plugin and produces quick and awesome reflections without any After Effects wizardry. Just like Shadow, the Reflection plugin allows you to identify a bend. Plus, you can adjust the softness of the reflection quickly and easily.

Summing Up
In the end, Red Giant always delivers great and useful plugins. VFX Suite is no different, and the only downside some might point to is the cost. While $999 is expensive, if compositing is a large portion of your business, the efficiency you gain might outweigh the cost.

Much like Shooter Suite does for online editors, Trapcode Suite does for VFX masters and Universe does for jacks of all trades, VFX Suite will take all of your ideas and help them blend seamlessly into your work.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Polaroid 320 RGB LED light for controlled environments

By Brady Betzel

If you read a lot of reviews and articles like I do, sometimes you can get overwhelmed by how many products are out there. Lighting is one of those categories where the availability of products seems never-ending. From ARRI Fresnels that can go for over $5,000 to the Kino Flo Diva-Lite at over $1,300, lights cover the entire spectrum of prices. But sometimes I don’t have the power, room or even money for these lights and need something cheaper — way cheaper. Portable lights like the Litra Torch, which are usually aimed at tiny action-cam setups, are pretty great and go for around $100, but they are small and may serve better as a hair light or a tiny spot light. For those wanting a cheap light with a larger surface area, Polaroid has come to the rescue: for $99 you can buy the Polaroid RGB LED Light.

The Polaroid 320 RGB is a multi-color RGB LED light that runs off of a rechargeable Sony-style NP battery and can be controlled by an iOS or Android app via Bluetooth. The light also comes with a carrying case, a cold-shoe swivel head adapter to mount on top of a camera, a DC adapter with international adapters, a diffuser, battery and battery charger. The case itself is actually pretty nice — it will protect the light and hold all the accessories. I left the battery to charge overnight after I used the light so I can’t tell you exactly how long it took, but I can tell you it isn’t fast. Maybe a couple of hours. However, if you had a Sony camera from a long time ago you may have some leftover batteries you can use with the Polaroid light if you don’t have your DC adapter around.

The LED light is made up of 320 LED lamps: 144 3200K LEDs, 144 5600K LEDs and 32 RGB LEDs. The light can be used in either the RGB color mode or the standard mode. To create the array of colors, Polaroid uses the 32 RGB LEDs to shine almost any color you can imagine. RGB lamps have three colors: red, green and blue, which can be turned off and on in multiple combinations to achieve almost any color. From cyan to magenta to yellow or purple, you can adjust the hue on the Polaroid RGB LED light by pushing the H/S button and turning the knob to the desired color. Oddly enough, it tells you which color you are on with a number between 0 and 199; I would have expected a 0-360 scale, since a color wheel is a circle.
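
If you want to translate that 0-199 readout into familiar color-wheel degrees, the conversion is just a proportion, assuming Polaroid’s scale is linear (something I haven’t verified).

```python
# Rough mapping from the light's 0-199 hue readout to degrees on a color wheel,
# assuming the dial is linear across the full wheel.
def polaroid_hue_to_degrees(value):
    return value / 199 * 360

print(round(polaroid_hue_to_degrees(100)))  # ~181 degrees, roughly cyan
```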

If you hit the H/S button again you can access the saturation value of the light, which can be adjusted from 0-100. There is a Bluetooth light to tell you when the app is controlling the light… it will turn blue. Next to that is the Bat button, which will tell you how much power is left if using the battery. Underneath that is the Bri, or brightness, button and the Temp button. The Temp button is used when in the “white” mode to change color temperature values from 3200K-5600K, although the LED readout only displays three digits, so you won’t be getting the full Kelvin temperature read out.

But really, the beauty of this light is controlling it through the Fi Light app, which you can find in both the App Store and the Google Play Store for Android. I have to admit, it was difficult finding the app in the Google Play Store, but if you look for the white lightbulb on a blue background by “tek-q,” you have the correct app. What’s even stranger is that you don’t need to pair the light in your phone’s Bluetooth settings; the app connects on its own, something I couldn’t get through my head for some reason. But once the light is on and the app is up, start adjusting the hue, saturation and brightness, or even mess with the different modes, like Rapid Rainbow Transition or Pulsating Red/Blue for a police-light-type effect. While I couldn’t test more than one light, there is a group settings dialogue that could presumably let you control multiple lights at once. The “Blu” light on the back will light up (appropriately, in blue) when it is connected to your phone.

Summing Up
This light isn’t the strongest, especially when used in conjunction with sunlight, but if you are photographing or filming products in a controlled environment like a garage, it will do just fine. Ideally, you would want two with a reflector, or three, to light something. That being said, for $100 this Polaroid light may just fit your needs for product lighting, or even washing a wall with color behind an interview. It definitely won’t beat out any of the high-end LED lights, but it will do the job in a smaller space with controlled lighting. And because it can mount on a camera’s cold shoe, it can even be a great run-and-gun light when working with subjects close to camera.

Check out www.polaroid.com for more products from Polaroid, or search Amazon.com for this light and many more of the company’s filmmaking-focused products.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: CyberPower PC workstation with AMD Ryzen

By Brady Betzel

With the influx of end users searching for alternatives to Mac Pros, as well as new ways to purchase workstation-level computing solutions, there is no shortage of opinions on what brands to buy and who might build it. Everyone has a cousin or neighbor that builds systems, right?

I’ve often heard people say, “I’ve never built a system or used (insert brand name here), but I know they aren’t good.” We’ve all run into people who are dubious by nature. I’m not so cynical, and when it comes to operating and computer systems, I consider myself Switzerland.

When looking for the right computer system, the main question you should ask is, “What do you need to accomplish?” Followed by, “What might you want to accomplish in the future?” I’m a video editor and colorist, so I need the system I build to work fluidly with Avid Media Composer, Blackmagic DaVinci Resolve and Adobe’s Premiere and After Effects. I also want my system to work with Maxon Cinema 4D in case I want to go a little further than Video Copilot’s Element 3D and start modeling in Cinema 4D. My main focus is video editing and color correction but I also need flexibility for other tools.

Lately, I’ve been reaching out to companies in the hopes of testing as many custom-built Windows-based PCs as possible. There have been many Mac OS-to-Windows transplants over the past few years, so I know pros are eager for options. One of the latest seismic shifts has come from the guys over at Greyscalegorilla moving away from Macs to PCs. In particular, I saw that one of the main head honchos over there, Nick Campbell (@nickvegas), went for a build complete with the 32-core Ryzen Threadripper workhorse. You can see the lineup of systems here. This really made me reassess my thoughts on AMD as a workstation-level processor, and while not everyone can afford the latest Intel i9 or AMD Threadripper processors, there are lower-end processors that will do most people just fine. This is where custom-built PC makers like CyberPowerPC, which equip machines with AMD processors, come into play.

So why go with a company like CyberPowerPC? The prices for parts are usually competitive, and the entire build isn’t much more than if you purchased the parts by themselves. Also, you deal with CyberPowerPC for warranty issues rather than with individual companies for different parts.

My Custom Build
In my testing of an AMD Ryzen 7 1700X-based system with a Samsung NVMe hard drive and 16GB of RAM, I was able to run all of the software I mentioned before. The best part was the price: the total was around $1,000! Not bad for someone editing and color correcting. Typically those machines can run anywhere from $2,000 to $10,000. Although the parts in those more expensive systems are more complex and have double to triple the number of cores, some of that is wasted. And when on a budget, you will be hard-pressed to find a better deal than CyberPowerPC. If you built a system yourself, you might get close, but you wouldn’t save much.

While this particular build isn’t going to beat out AMD Threadripper- or Intel i9-based systems, the AMD Ryzen-based systems offer a decent bang for the buck. As I mentioned, I focus on video editing and color correcting, so I tested a simple one-minute UHD (3840×2160) 23.98 H.264 export. Using Premiere along with Adobe’s Media Encoder, I built a one-minute sequence from about 30 seconds of Red UHD footage and some UHD S-Log3/S-Gamut3 footage I shot on the Sony a7 III.

I then exported it as an H.264 at a bitrate of around 10Mb/s. With only a 1D LUT on the Sony a7 III footage, the one-minute sequence took one minute and 13 seconds. With 10% resizes and a “simple” Gaussian blur added over all the clips, the sequence exported in one minute and four seconds. This is proof that the AMD GPU is being used inside of Premiere and Media Encoder. Inside Premiere, I was able to play back the full-quality sequence on a second monitor without any discernible dropped frames.
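
For context on that export target, a one-minute file at roughly 10Mb/s works out to a fairly small delivery. The quick math below covers the video stream only, ignoring audio and container overhead.

```python
# Rough file size for a one-minute export at ~10Mb/s (video stream only).
bitrate_megabits_per_s = 10
duration_s = 60

size_megabytes = bitrate_megabits_per_s * duration_s / 8
print(f"~{size_megabytes:.0f} MB")  # ~75 MB
```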

So when people tell you AMD isn’t Intel, technically they are right, but overall the AMD systems are performing at a high enough level that for the money you are saving, it might be worth it. In the end, with the right expectations and dollars, an AMD-based system like this one is amazing.

Whether you like to build your own computer or just don’t want to buy a big-brand system, custom-built PCs are a definite way to go. I might be a little partial since I am comfortable opening up my system and changing parts around, but the newer cases allow for pretty easy adjustments. For instance, I installed a Blackmagic DeckLink and four SSD drives for a RAID-0 setup inside the box. Besides wishing for some more internal drive cages, I felt it was easy to find the cables and get into the wiring that CyberPowerPC had put together. And because CyberPowerPC is more in the market for gaming, there are plenty of RGB light options, including the memory!

I was kind of against the lighting since any color casts could throw off color correction, but it was actually kind of cool and made my setup look a little more modern. It actually kind of got my creativity going.

Check out the latest AMD Ryzen processors and exciting improvements to the Radeon line of graphics cards on www.cyberpowerpc.com and www.amd.com. And, hopefully, I can get my hands on a sweet AMD Ryzen Threadripper 2990WX with 32 cores and 64 threads to really burn a hole in my render power.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Avid Media Composer Symphony 2018 v.12

By Brady Betzel

In February of 2018, we saw a seismic shift in the leadership at Avid. Chief executive officer Louis Hernandez Jr. was removed and subsequently replaced by Jeff Rosica. Once Rosica was installed, I think everyone who was worried Avid was about to be liquidated to the highest bidder breathed a sigh of temporary relief. Still unsure whether new leadership was going to right a tilting ship, I immediately wanted to see a new action plan from Avid, specifically on where Media Composer and Symphony were going.

Media Composer with Symphony

Not long afterward, I was happily reading about how Avid was taking lessons from its past transgressions and listening to its clients. I heard Avid was taking tours around the industry and listening to what customers and artists needed from it. Personally, I was asking myself if Media Composer with Symphony would ever be the finishing tool that Avid DS was. I’m happy to say, it’s starting to look that way.

It appears from the outside that Rosica is indeed the breath of fresh air Avid needed. At NAB 2019, Avid teased the next iteration of Media Composer, version 2019, with an overhauled interface and improvements such as a 32-bit float color pipeline complete with ACES color management and a way to deliver IMF packages; a new engine with distributed processing; and a whole new product called Media Composer|Enterprise, all of which will really help sell this new Media Composer. But the 2019 update is still on its way, so until then I took a deep dive into Media Composer 2018 v12, which has many features editors, assistants and even colorists have been asking for: a new Avid Titler, shape-based color correction (with the Symphony option), new multicam features and more.

Titling
As an online editor who uses Avid Media Composer with Symphony option about 60% of the time, titling is always a tricky subject. Avid has gone through some rough seas when dealing with how to fix the leaky hole known as the Avid Title Tool. The classic Avid Title Tool was basic but worked. However, if you aligned something in the Title Tool interface to Title Safe zones, it might jump around once you close the Title Tool interface. Fonts wouldn’t always stay the same when working across PC and MacOS platforms. The list goes on, and it is excruciatingly annoying.

Titler

Let’s take a look at some Avid history: In 2002, Avid tried to appease creators and introduced what was, at the time, a Windows-only titler: Avid Marquee. While Marquee was well-intentioned, it was extremely difficult to understand if you weren’t interested in 3D lighting, alignment and all sorts of motion graphics stuff that not all editors want to spend time learning. So most people didn’t use it, and if they did, it took a while for anyone taking over the project to figure out what had been done.

In December of 2014, Avid leaned on the NewBlue Titler, which would work in projects higher than 1920×1080 resolution. Unfortunately, many editors ran into very long renders at the end, and a lot bailed on it. Most decided to go out of house and create titles in Adobe Photoshop and After Effects. While this all relates to my experience, I assume others feel the same.

In Avid Media Composer 2018, the company has introduced the Avid Titler, which is labeled “Avid Titler +” in the Tools menu. It works like an effect rather than a rendered piece of media, as in the traditional Avid Title Tool, where an alpha layer and a fill layer were used. This method is similar to how NewBlue or Marquee functioned. However, the Avid Titler works by typing directly on the record monitor; adding a title is as easy as marking an in and out point and clicking on the T+ button in the timeline.

You can specify things like kerning, shadow, outlines, underlines, boxes, backgrounds and more. One thing I found peculiar was that under Face, the rotation settings rotate individual letters and not the entire word by default. I reached out to Avid and they are looking into making the entire word rotation option the default in the mini toolbar of Avid Titler. So stay tuned.

Also, you can map your fast-forward and rewind buttons to “Go To Next/Previous Event.” This lets you jump to the next edit in the timeline, and also to the next/previous keyframe when in the Effect Editor. Typically, you click on the scrub line in the record window and then use those shortcuts to jump to the next keyframe; in the Avid Titler, though, it would just start typing in the text box. Furthermore, when I wanted to jump out of Effect Editor mode and back into edit mode, I usually hit “y,” but that did not get me out of Effects Mode (Avid did mention it is working on updates to the Avid Titler that would solve this issue). The new Avid Titler definitely has some bugs and needs some improvements, and they are being addressed, but it’s a decent start toward a modern title editor.

Shape-based color correction

Color
If you want advanced color correction built into Media Composer, then you are going to want the Symphony option. Media Composer with the Symphony option allows for more detailed color correction using secondary color corrections as well as some of the newer updates, including shape-based color correction. Before Resolve and Baselight became more affordable, Symphony was the gold standard for color correction on a budget (and even not on a budget since it works so well in the same timeline the editors use). But what we are really here for is the 2018 v.12 update of Shapes.

With the Symphony option, you can now draw specific regions on the footage for your color correction to affect. It essentially works similarly to a layer-based system like Adobe Photoshop. You can draw shapes with the same familiar tools you are used to drawing with in the Paint or AniMatte tools and then just apply your brightness, saturation or hue swings in those areas only. On the color correction page you can access all of these tools on the right-hand side, including the softening, alpha view, serial mode and more.

When using the new shape-based tools, you must set the drop-down menu to “CC Effect.” From there you can add a bunch of shapes on top of each other, and they will play in realtime. If you want to lay a base correction down, you can specify it in the shape-based sidebar, then click Shape, and you can dial in the specific areas to your or your client’s taste. You can check the “Serial Mode” box to have all corrections interact with one another, or uncheck the box to keep each color correction a little more isolated — a really great option to keep in mind when correcting. Unfortunately, tracking a shape can only be done in the Effect Editor, so you need to jump out of color correction mode, track, and then go back. It’s not the end of the world, but it would be infinitely better if you could track directly inside of the color correction window. Avid could even take it further by allowing planar tracking from an app like Mocha Pro.

Shape-based color correction

The new shape-based corrector also has an alpha view mode, identified by the infinity symbol. I love this! I often find myself making mattes in the Paint tool, but that can now be done right in the color correction tool. The Symphony option is an amazing addition to Media Composer if you need to go further than simple color correction but don’t want to dive into a full color correction app like Baselight or Resolve. In fact, for many projects you won’t need much more than what Symphony can do. Maybe a +10 on the contrast, +5 on the brightness and +120 on the saturation and BAM, a finished masterpiece. Kind of kidding, but wait until you see it work.

Multicam
The final update I want to cover is multicam editing and the improvements to editing group clips. I cannot emphasize enough how much time this would have saved me as an assistant editor back in the prehistoric Media Composer days… I mean, we had dongles, and I even dabbled with the Meridian box. Literally days of grouping and regrouping could have been avoided with the Edit Group feature. But I did make a living fixing groups that were created incorrectly, so I guess this update is a Catch-22. Anyway, you can now edit groups in Media Composer by creating a group, right-clicking on that group and selecting Edit Group. The group will open in the Record Monitor as a sequence, and from there you can move, nudge and even add cameras to a previously created group. Once you are finished, you can update the group and, if you wish, refresh any sequences that used that group. One caveat: with mixed-frame-rate groups, Avid says committing the sequence might produce undesirable effects.

Editing workspace

Cost of Entry
How much does Media Composer cost these days? While you can still buy it outright, it seems a bit more practical to go monthly since you will automatically get updates, but it can still be a little tricky. Do you need PhraseFind and/or ScriptSync? Do you need the Symphony option? Do you need to access shared storage? There are multiple options depending on your needs. If you want everything, then Media Composer Ultimate for $49 per month is what you want. If you want Media Composer and just one add-on, like Symphony, it will cost $19 per month plus $199 per year for the Symphony option. If you want to test the water before jumping in, you can always try Media Composer First.

For a good breakdown of the Media Composer pricing structure, check out KeyCode Media’s page (a certified reseller). Another great link with tons of information organized into easily digestible bites is this one. Additionally, www.freddylinks.com is a great resource chock-full of everything else Avid, written by Avid technical support specialist Fredrik Liljeblad out of Sweden.

Group editing

Summing Up
In the end, I have used Media Composer with Symphony for over 15 years, and it is the most reliable nonlinear editor I have used for supporting multiple editors in a shared network environment. While Adobe Premiere Pro, Apple Final Cut Pro X and Blackmagic Resolve are offering fancy new features and collaboration modes, Avid always seems to hold stable when I need it the most. These new improvements, the UI overhaul (set to debut in May), new leadership from Rosica and the confidence of his faithful employees all seem to be paying off and getting Avid back on the track it should have always been on.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Mzed.com’s Directing Color With Ollie Kenchington

By Brady Betzel

I am constantly looking to educate myself, no matter the source — or subject. Whether it’s an eSports editor on YouTube showing how to make a transition in Adobe After Effects or Warren Eagles teaching color correction in Blackmagic’s DaVinci Resolve on FXPHD.com, I’m always beefing up my skills. I even learn from bad tutorials — they teach you what not to do!

But when you come across a truly remarkable learning experience, it is only fair to share it with the rest of the world. Last year I saw an ad for an MZed.com course called "Directing Color With Ollie Kenchington" and was immediately interested. These days you can pretty much find any technical tutorial you can dream of on YouTube, but truly professional, theory-based education series on the level of higher education are very hard to come by. Even the ones you pay for aren't always worth their price of admission, which is a huge letdown.

Ollie sharing his wisdom.

Once I gained access to MZed.com I wanted to watch every educational series they had. From lighting techniques with ASC member Shane Hurlbut to the ARRI Amira Camera Primer, there are over 150 hours of education available from industry leaders. However, I found my way to Directing Color…

I am often asked if I think people should go to college or film school. My answer? If you have the money and time, you should go to college followed by film school (or do both together, if the college offers it). Not only will you learn a craft, but you will most likely spend hundreds of hours studying and visualizing the theory behind it. For example, when someone asks me about the science behind camera lenses, I can confidently answer them thanks to a physics class on lenses and optics from California Lutheran University (yes, a shameless plug).

In my opinion, a two-, four- or even 10-year education allows me to live in the grey. I am comfortable arguing for both sides of a debate, as well as the options that are in between — the grey. I feel like my post-high school education really allowed me to recognize and thrive in the nuances of debate. It leaves me playing devil's advocate maybe a little too much, but it also lets me have civil and productive discussions with others without being demeaning or nasty, something we are actively missing these days. So if living in the grey is for you, I really think a college education supplemented by online or film school education is valuable (assuming you decide, as I did, that the debt is worth it).

However, I know that is not an option for everyone since it can be very expensive — trust me, I know. I am almost done paying off my undergraduate fees while still paying off my graduate ones, which I am still two or three classes away from finishing. That being said, Directing Color With Ollie Kenchington is the only online education series I have seen so far that is on the same level as some of my higher education classes. Not only is the content beautifully shot and color corrected, but Ollie gives confident and accessible lessons on how color can be used to draw the viewer’s attention to multiple parts of the screen.

Ollie Kenchington is a UK-based filmmaker who runs Korro Films. From the trailer of his Directing Color series, you can immediately see the beauty of Ollie’s work and know that you will be in safe hands. (You can read more about his background here.)

The course raises the bar for online education and will elevate the audience's expectations of professional insight. The first module, "Creating a Palette," covers the thinking behind creating a color palette for a small catering company. You may even want to start with the last bonus module, "Ox & Origin," to get a look at what Ollie will be creating throughout the seven modules and roughly an hour and a half of content.

While Ollie goes over "looks," the beauty of this course is that he walks through his internal thought process, including deciding on palettes based on color theory. He doesn't just choose teal and orange because they look good; he chooses his color palette based on complementary colors.

Throughout the course Ollie covers some technical knowledge, including calibrating monitors and cameras, white balancing and shooting color charts to avoid incorrect color balance in post. This is so important because if you skip these simple steps, your color correction session will be much harder. And wasting time fixing incorrect color balance takes time away from the fun of color grading. All of this is done through easily digestible modules that range from two to 20 minutes.

The modules include Creating a Palette; Perceiving Color; Calibrating Color; Color Management; Deconstructing Color 1 – 3 and the Bonus Module Ox & Origin.

Without giving away the entire content of Ollie's catalog, my favorite modules in this course are the on-set ones. Maybe it's because I am not on set that often, but I found the "thinking out loud" about colors helpful. Knowing why reds represent blood, and how they raise your heart rate a little bit, is fascinating. He even goes through practical examples of color use in films such as Whiplash.

In the final "Deconstructing Color" modules, Ollie goes into a color bay (complete with practical candle backlighting) and dives into Blackmagic's DaVinci Resolve. He takes the course full circle: because he took the time to set up proper lighting on set, even for a scene he had to rush through, he can now go into Resolve, add some lighting to different sides of someone's face and focus on other parts of his commercial.

Summing Up
I want to watch every tutorial MZed.com has to offer, from "Philip Bloom's Cinematic Masterclass" to Ollie's other course, "Mastering Color." Unfortunately, as of this review, you have to pay an additional fee to watch the "Mastering Color" series. It seems like an unfortunate trend in online education to charge a subscription fee and then charge more when an extra-special class comes along, but this class will supposedly be released to standard subscribers in due time.

MZed.com has two subscription models: MZed Pro, which is $299 for one year of streaming the standard courses, and MZed Pro Premium for $399. This includes the standard courses for one year and the ability to choose one “Premium” course.

“Philip Bloom’s Cinematic Master Class” was the Premium course I was signed up for initially, but you you can decide between this one and the “Mastering Color” course. You will not be disappointed regardless of which one you choose. Even their first course “How to Photograph Everyone” is chock full of lighting and positioning instruction that can be applied in many aspects of videography.

I was really impressed with Directing Color With Ollie Kenchington, and if the other courses are this good, MZed.com will definitely become a permanent bookmark for me.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Red Giant’s Trapcode Suite 15

By Brady Betzel

We are now comfortably into 2019 and enjoying the Chinese Year of the Pig — or at least I am! So readers, you might remember that with each new year comes a Red Giant Trapcode Suite update. And Red Giant didn’t disappoint with Trapcode Suite 15.

Every year Red Giant adds more amazing features to its already amazing particle generator and emitter toolset, Trapcode Suite, and this year is no different. Trapcode Suite 15 is keeping tools like 3D Stroke, Shine, Starglow, Sound Keys, Lux, Tao, Echospace and Horizon while significantly updating Particular, Form and Mir.

I won't be covering each plugin in this review, but you can check out what each individual plugin does on Red Giant's website.

Particular 4
The bread and butter of the Trapcode Suite has always been Particular, and Version 4 continues to be a powerhouse. The biggest differences between using a true 3D app like Maxon’s Cinema 4D or Autodesk Maya and Adobe After Effects (besides being pseudo 3D) are features like true raytraced rendering and interacting particle systems with fluid dynamics. As I alluded to, After Effects isn’t technically a 3D app, but with plugins like Particular you can create pseudo-3D particle systems that can affect and be affected by different particle emitters in your scenes. Trapcode Suite 15 and, in particular (all the pun intended), Particular 4, have evolved to another level with the latest update to include Dynamic Fluids. Dynamic Fluids essentially allows particle systems that have the fluid-physics engine enabled to interact with one another as well as create mind-blowing liquid-like simulations inside of After Effects.

What's even more impressive is that with the Particular Designer and over 335 presets, you don't need a master's degree to make impressive motion graphics. While I love to work in After Effects, I don't always have eight hours to make a fluidly dynamic particle system bounce off 3D text, or have two systems interact with each other for a text reveal. This is where Particular 4 really pays for itself. With a little research and tutorial watching, you will be up and rendering within 30 minutes.

When I was using Particular 4, I simply wanted to recreate the Dynamic Fluid interaction I had seen in one of Red Giant's promos: basically, two emitters crashing into each other in a viscous fluid, then interacting. While it isn't necessarily easy, if you have a slightly above-beginner amount of After Effects knowledge you will be able to do this. Apply the Particular plugin to a new solid and open up the Particular Designer in the Effect Controls. From there you can designate emitter type, motion, particle type, particle shadowing, particle color and dispersion types, as well as add multiple instances of emitters, adjust physics and much more.

The presets for all of these options can be accessed by clicking the “>” symbol in the upper left of the Designer interface. You can access all of the detailed settings and building “Blocks” of each of these categories by clicking the “<” in the same area. With a few hours spent watching tutorials on YouTube, you can be up and running with particle emitters and fluid dynamics. The preset emitters are pretty amazing, including my favorite, the two-emitter fluid dynamic systems that interact with one another.

Form 4
The second plugin in the Trapcode Suite 15 that has been updated is Trapcode Form 4. Form is a plugin that literally creates forms using particles that live forever in a unified 3D space, allowing for interaction. Form 4 adds the updated Designer, which makes particle grids a little more accessible and easier to construct for non-experts. Form 4 also includes the latest Fluid Dynamics update that Particular gained. The Fluid Dynamics engine really adds another level of beauty to Form projects, allowing you to create fluid-like particle grids from the 150 included presets or even your own .obj files.

My favorite settings to tinker with are Swirl and Viscosity. Using both settings in tandem can help create an ooey-gooey liquid particle grid that can interact with other Form systems to build pretty incredible scenes. To test out how .obj models worked within Form, I clicked over to www.sketchfab.com and downloaded an .obj 3D model. If you search for downloadable models that do not cost anything, you can use them in your projects under Creative Commons licensing, as long as you credit the creator. When in doubt, always read the license. (You can find more info on Creative Commons licensing here.) In this case, they make great practice models.

Anyway, Form 4 allows us to import .obj files, including animated .obj sequences as well as their textures. I found a Day of the Dead-type skull created by JMUHIST, pointed Form to the .obj as well as its included texture, added a couple of After Effects lights and a camera, and I was in business. Form has a great replicator feature (much like Element 3D). There are a ton of options, including fog distance under visibility, animation properties and even the ability to quickly add a null object linked to your model for quick alignment of other elements in the scene.

Mir 3
Up last is Trapcode Mir 3. Mir 3 is used to create 3D terrains, objects and wireframes in After Effects. In this latest update, Mir has added the ability to import .obj models and textures. Using fractal displacement mapping, you can quickly create some amazing terrains. From mountain-like peaks to alien terrains, Mir is a great supplement when using plugins like Video Copilot Element 3D to add endless tunnels or terrains to your 3D scenes quickly and easily.

And if you don't own Element 3D, you will really enjoy the particle replication system: use one 3D object, duplicate it, then twist, distort and animate multiple instances of it quickly. The best part about all of these Trapcode Suite tools is that they interact with the cameras and lighting native to After Effects, making it a unified animating experience (instead of animating separate camera and lighting rigs like in the old days). Two of my favorite features from the last update are the ability to use quad- or triangle-based polygons to texture your surfaces, which can quickly give an 8-bit or low-poly feel, and a second-pass wireframe that adds a grid-like surface to your terrain.

Summing Up
Red Giant’s Trapcode Suite 15 is amazing. If you have a previous version of the Trapcode Suite, you’re in luck: the upgrade is “only” $199. If you need to purchase the full suite, it will cost you $999. Students get a bit of a break at $499.

If you are on the fence about it, go watch Daniel Hashimoto’s Cheap Tricks: Aquaman Underwater Effects tutorial (Part 1 and Part 2). He explains how you can use all of the Red Giant Trapcode Suite effects with other plugins like Video CoPilot’s Element 3D and Red Giant’s Universe and offers up some pro tips when using www.sketchfab.com to find 3D models.

I think I even saw him using Video Copilot's FX Console, a free After Effects plugin that makes accessing plugins much faster. You may have seen his work as @ActionMovieKid on Twitter or @TheActionMovieKid on Instagram. He does some amazing VFX with his kids — he's a must follow. Red Giant made a power move getting him to make tutorials for them! Anyway, his Aquaman Underwater Effects tutorials take you step by step through each part of the Trapcode Suite 15 in an amazing way. He makes it look a little too easy, but I guess that is a combination of his VFX skills and the Trapcode Suite toolset.

If you are excited about 3D objects, particle systems and fluid dynamics you must check out Trapcode Suite 15 and its latest updates to Particular, Mir and Form.

After I finished the Trapcode Suite 15 review, Red Giant released the Trapcode Suite 15.1 update. The 15.1 update includes Text and Mask Emitters for Form and Particular 4.1, an updated Designer, Shadowlet particle-type matching, Shadowlet softness and 21 additional presets.

This is a free update that can be downloaded from the Red Giant website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

 

Review: Boris FX’s Continuum and Mocha Pro 2019

By Brady Betzel

I realize I might sound like a broken record, but if you are looking for the best plugin to help with object removals or masking, you should seriously consider the Mocha Pro plugin. And if you work inside of Avid Media Composer, you should also seriously consider Boris Continuum and/or Sapphire, which can use the power of Mocha.

As an online editor, I consistently use Continuum along with Mocha for tight blur and mask tracking. If you use After Effects, there is even a whittled-down version of Mocha built in for free. For those pros who don’t want to deal with Mocha inside of an app, it also comes as a standalone software solution where you can copy and paste tracking data between apps or even export the masks, object removals or insertions as self-contained files.

The latest releases of Continuum and Mocha Pro 2019 continue the evolution of Boris FX’s role in post production image restoration, keying and general VFX plugins, at least inside of NLEs like Media Composer and Adobe Premiere.

Mocha Pro

As an online editor I am always calling on Continuum for its great Chroma Key Studio, Flicker Fixer and blurring. Because Mocha is built into Continuum, I am able to quickly track (backwards and forwards) difficult shapes and even erase objects in ways that the built-in Media Composer tools simply can't match. But if you are lucky enough to own Mocha Pro, you also get access to some amazing tools that go beyond planar tracking — such as automated object removal, object insertion, stabilizing and much more.

Boris FX's latest updates to Continuum and Mocha Pro go even further than what I've already mentioned and have resulted in new version naming; this round we are at 2019 (think of it as Version 12). They have also created the new Application Manager, which makes it a little easier to find the latest downloads. You can find them here. This really helps when jumping between machines and you need to quickly activate and deactivate licenses.

Boris Continuum 2019
I often get offline edits with effects from a variety of plugins — lens flares, random edits, light flashes, whip transitions and many more — so I need Continuum to be compatible with offline clients. I also need to use it for image repair and compositing.

In this latest version of Continuum, Boris FX has not only kept plugins like Primatte Studio but also brought back Particle Illusion and updated Mocha and Title Studio. Overall, Continuum and Mocha Pro 2019 feel a lot snappier when applying and rendering effects, probably because of the overall GPU-acceleration improvements.

Particle Illusion has been brought back from the brink of death in Continuum 2019 as a 64-bit, keyframe-able particle emitter system that can even be tracked and masked with Mocha. This revamp of Particle Illusion brings an updated interface, realtime GPU-based particle generation, an expanded and improved emitter library (complete with motion-blur-enabled particle systems) and even a standalone app that can design systems to be used in the host app — you cannot render systems inside of the standalone app, though.

While Particle Illusion is part of the entire Continuum toolset that works with OFX apps like Blackmagic's DaVinci Resolve, Media Composer, After Effects and Premiere, it seems to work best in applications like After Effects, which can handle composites simply and naturally. Inside the Particle Illusion interface you can find all of the pre-built emitters. If you only have a handful, make sure you download the additional emitters, which you can find in the Boris FX App Manager.

       
Particle Illusion: Before and After

I had a hard time seeing my footage in a Media Composer timeline inside of Particle Illusion, but I could still pick my emitter, change specs like life and opacity, exit out and apply to my footage. I used Mocha to track some fire from Particle Illusion to a dumpster I had filmed. Once I dialed in the emitter, I launched Mocha and tracked the dumpster.

The first time I went into Mocha I didn't see the preset tracks for the emitter or the world in which the emitter lives. The second time I launched Mocha, I saw track points. From there you can track the area where you want your emitter to be placed. Once you are happy with your track, jump back to your timeline, where it should be reflected. In Media Composer I noticed that I had to go into the Mocha options and switch from Mocha Shape to no shape. Essentially, the Mocha shape acts like a matte and cuts off anything outside of it.

If you are inside of After Effects, most parameters can now be keyframed and parented (aka pick-whipped) natively in the timeline. The Particle Illusion plugin is a quick, easy and good-looking tool to add sparks, Milky Way-like star trails or even fireworks to any scene. Check out @SurfacedStudio’s tutorial on Particle Illusion to get a good sense of how it works in Adobe Premiere Pro.

Continuum Title Studio
When inside of Media Composer (prior to the latest release 2018.12), there were very few ways to create titles that were higher resolution than HD (1920×1080) — the New Blue Titler was the only other option if you wanted to stay within Media Composer.

Title Studio within Media Composer

At first, the Continuum Title Studio interface appeared to be a mildly updated Boris Red interface — and I am allergic to the Boris Red interface. Some of the icons for keyframing and the way properties are adjusted look similar and threw me off. I tried really hard to jump into Title Studio and love it, but I never got comfortable with it.

On the flip side, there are hundreds of presets that can help build quick titles that render a lot faster than New Blue Titler did. In some of the presets I noticed the text was placed outside of 16×9 title safety, which is odd since that is kind of a long-standing rule in television. In the author's defense, they are within action safety, but still.

If you need a quick way to make 4K titles, Title Studio might be what you want. The updated Title Studio includes realtime playback using the GPU instead of the CPU, new materials, new shaders and external monitoring support using Blackmagic hardware (AJA support will be coming at some point). There are some great presets, including pre-built slates, lower thirds, kinetic text and even progress bars.

If you don’t have Mocha Pro, Continuum can still access and use Mocha to track shapes and masks. Almost every plugin can access Mocha and can track objects quickly and easily.
That brings me to the newly updated Mocha, which has some extremely helpful new features, including a Magnetic Spline tool, prebuilt geometric shapes and more.

Mocha Pro 2019
If you loved the previous version of Mocha, you are really going to love Mocha Pro 2019. Not only do you get the Magnetic Lasso, pre-built geometric shapes, the Essentials interface and high-resolution display support, but BorisFX has rewritten the Remove Module code to use GPU video hardware. This increases render speeds about four to five times. In addition, there is no longer a separate Mocha VR software suite. All of the VR tools are included inside of Mocha Pro 2019.

If you are unfamiliar with what Mocha is, then I have a treat for you. Mocha is a standalone planar tracking app as well as a native plugin that works with Media Composer, Premiere and After Effects, or through OFX in Blackmagic’s Fusion, Foundry’s Nuke, Vegas Pro and Hitfilm.

Mocha tracking

In addition (and unofficially) it will work with Blackmagic DaVinci Resolve by way of importing the Mocha masks through Fusion. While I prefer to use After Effects for my work, importing Mocha masks is relatively painless. You can watch colorist Dan Harvey run through the process of importing Mocha masks to Resolve through Fusion, here.

But really, Mocha is a planar tracker, which means it tracks multiple points in a defined area and works best on flat or at least segmented surfaces — like the side of a face, ear, nose, mouth and forehead tracked separately instead of all at once. From blurs to mattes, Mocha sticks to objects like glue and can be a great asset for an online editor or colorist.

If you have read any of my plugin reviews you probably are sick of me spouting off about Mocha, saying how it is probably the best plugin ever made. But really, it is amazing — especially when incorporated with plugins like Continuum and Sapphire. Also, thanks to the latest Media Composer with Symphony option you can incorporate the new Color Correction shapes with Mocha Pro to increase the effectiveness of your secondary color corrections.

Mocha Pro Remove module

So how fast is Mocha Pro 2019's Remove Module these days? Well, it used to be a very slow process, taking lots of time to calculate an object's removal. With the latest Mocha Pro 2019 release, including improved GPU support, the render time has been cut down tremendously: in my estimation it's at least three to four times faster, and that's on the safe side. Removal jobs that take under 30 seconds in Mocha Pro 2019 would have taken four to five minutes in previous versions. It's quite a big improvement in render times.

There are a few changes in the new Mocha Pro, including interface changes and some amazing tool additions. There is a new drop-down tab that offers different workflow views once you are inside of Mocha: Essentials, Classic, Big Picture and Roto. I really wish the Essentials view was out when I first started using Mocha, because it gives you the basic tools you need to get a roto job done and nothing more.

For instance, just giving access to the track motion objects (Translation, Scale, Rotate, Skew and Perspective) with big shiny buttons helps eliminate my need to watch YouTube videos on how to navigate the Mocha interface. However, if like me you are more than just a beginner, the Classic interface is still available and is the one I reach for most often — it's literally the old interface. Big Picture hides the tools and gives you the most screen real estate for your roto work. My favorite after Classic is Roto. The Roto interface shows just the project window and the classic top toolbar. It's the best of both worlds.

Mocha Pro 2019 Essentials Interface

Beyond the interface changes are some additional tools that will speed up any roto work. This has been one of the longest-running user requests; I imagine the most requested feature Boris FX gets for Mocha is the addition of basic shapes, such as rectangles and circles. In my work, I am often drawing rectangles around license plates or circles around faces with X-splines, so why not eliminate a few clicks and have that done already? Answering my need, Mocha now has elliptical and rectangular shapes ready to go in both X-splines and B-splines with one click.

I use Continuum and Mocha hand in hand. Inside of Media Composer I will use tools like Gaussian Blur or Remover, which typically need tracking and roto shapes created. Once I apply the Continuum effect, I launch Mocha from the Effect Editor and bam, I am inside Mocha. From here I track the objects I want to affect, as well as any objects I don’t want to affect (think of it like an erase track).

Summing Up
I can save tons of time and also improve the effectiveness of my work exponentially when working in Continuum 2019 and Mocha Pro 2019. It's amazing how much more intuitive tracking is in Mocha compared to the built-in Media Composer and Symphony trackers.

In the end, I can’t say enough great things about Continuum and especially Mocha Pro. Mocha saves me tons of time in my VFX and image restoration work. From removing camera people behind the main cast in the wilderness to blurring faces and license plates, using Mocha in tandem with Continuum is a match made in post production heaven.

Rendering in Continuum and Mocha Pro 2019 is a lot faster than in previous versions, really giving me a leg up on efficiency. Time is money, right? On top of that, using Mocha Pro's magical object removal and other modules takes my image restoration work to the next level, separating me from other online editors who use standard paint and tracking tools.

In Continuum, Primatte Studio gives me a leg up on greenscreen keys with its exceptional ability to auto-analyze a scene and perform 80% of the keying work before I dial in the details. Whenever anyone asks me what tools I couldn't live without, I always, without a doubt, say Mocha.
If you want a real Mocha Pro education, you need to watch all of Mary Poplin's tutorials. You can find them on YouTube. Check out this one on how to track and replace a logo using Mocha Pro 2019 in Adobe After Effects. You can also find great videos at Borisfx.com.

Mocha point parameter tracking

I always feel like there are tons of tools inside of the Mocha Pro toolset that go unused simply because I don't know about them. One I recently learned about in a Surfaced Studio tutorial was the Quick Stabilize function. It essentially stabilizes the video around the object you are tracking, allowing you to more easily rotoscope your object while it sits still instead of moving all over the screen. It's an amazing feature that I just didn't know about.

As I was finishing up this review I saw that Boris FX came out with a training series, which I will be checking out. One thing I always wanted was a top-down set of tutorials like the ones on Mocha’s YouTube page but organized and sent along with practical footage to practice with.

You can check out Curious Turtle’s “More Than The Essentials: Mocha in After Effects” on their website where I found more Mocha training. There is even a great search parameter called Getting Started on BorisFX.com. Definitely check them out. You can never learn enough Mocha!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: iOgrapher Multi Case for mobile filmmaking

By Brady Betzel

Thanks to the amazing iPhone X, Google Pixel and Samsung Galaxy, almost everyone has a high-end video camera on their person at all times, and this is helping to spur on mobile filmmaking and vlogging.

From YouTube to Instagram to movies like Unsane (Steven Soderbergh) or Tangerine (Sean Baker) — and regardless of whether you think a $35,000 camera setup tells a story better than a $1,000 cell phone (looking at you, Apple iPhone XS Max) — mobile filmmaking is here to stay and will only get better.

iOgrapher’s latest release is the iOgrapher Multi Case, a compact mobile filmmaking mounting solution that works with today’s most popular phones. iOgrapher has typically created solutions that were tied to the mobile device being used for filmmaking, such as an iPhone, the latest Samsung Galaxy phones, iPads or even action cameras like a GoPro Hero 7 Black.

With the new iOgrapher Multi Case you can fit any mobile device that measures more than 5 ½" x 2 ¼" and less than 6 ½" by 3 ⅜". Unfortunately, you won't be fitting an iPad or a GoPro in the iOgrapher Multi Case, but don't fret! iOgrapher makes rigs for those as well. On the top of the Multi Case are two cold shoe mounts for lights, microphones or any other device, like a GoPro. To mount gear with ¼" 20 screw mounts in the cold shoes you will need a cold shoe to ¼" 20 adapter, which is available on iOgrapher's accessory page. You can also find these at Monoprice or Amazon for cheap.

And if you are looking to order more mounts you may want to order some extra cold shoe adapters that can be mounted on the handles of the iOgrapher Multi Case in the additional ¼” 20 screw mounts. The mounts on the handles are great for adding in additional lighting or microphones. I’ve even found that if you are going to be doing some behind-the-scenes filming or need another angle for your shooting, a small camera like a GoPro can be easily mounted and angled. With all this mounting you should assume that you are going to be using the iOgrapher on a sturdy tripod. Just for fun, I mounted the iOgrapher Multi Case onto a GoPro 3-Way Grip, which can also be used as a light tripod. It wasn’t exactly stable but it worked. I wouldn’t suggest using it for more than an emergency shooting situation though.

On the flip side (all pun intended), the iOgrapher can be solidly mounted vertically with the ¼" 20 screw mounts on the handles. With Instagram making headway with vertical video in Instagram Stories, iOgrapher took that idea and built it into the Multi Case, despite the grumbling from the old folks who just don't get vertical video.

Testing
I tried out both a Samsung Galaxy S8+ and an iPhone 7+, with their cases on, inside of the iOgrapher Multi Case. Both fit. The iPhone 7+ was stretching the boundaries of the Multi Case, but it did fit and worked well. Phones are inserted into the Multi Case via a spring-loaded bottom piece: you push the bottom of the device (or the left side, if you are shooting vertically) into the covered corner slots until the opposite side can be secured under the edge of the Multi Case. It's really easy.
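As a rough sanity check of that fit envelope, here is a little sketch comparing approximate published phone dimensions (caseless, and ballpark figures rather than official specs) against the Multi Case's stated minimum and maximum device size:

```python
# Rough fit check against the Multi Case's stated device envelope:
# larger than 5.5" x 2.25" and smaller than 6.5" x 3.375".
# Phone dimensions below are approximate published sizes (no case), in inches.

MIN_W, MIN_H = 5.5, 2.25
MAX_W, MAX_H = 6.5, 3.375

phones = {
    "Samsung Galaxy S8+": (6.28, 2.89),   # roughly 159.5 x 73.4 mm
    "Apple iPhone 7 Plus": (6.23, 3.07),  # roughly 158.2 x 77.9 mm
}

for name, (w, h) in phones.items():
    fits = MIN_W < w < MAX_W and MIN_H < h < MAX_H
    print(f"{name}: {w} x {h} in -> {'fits' if fits else 'does not fit'}")
```

Add a case to the iPhone 7+ and you are pushing right up against that 6 ½-inch limit, which matches what I saw in practice.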

I was initially concerned with the spring loading of the case; I wasn’t sure if the springs would be resilient enough to handle the constant pulling in and out of the phones, but the springs are high quality and held up beautifully. I even tried inserting my mobile phones tons of times and didn’t notice any issues with the springs or my phones.

Take care when inserting your phone into the Multi Case if you have a protective shield on the screen of your device. If you aren’t extra careful it can pull or snag on the cover — especially with the tight fit of a case. Just pay attention and there will be nothing to worry about. The simple beauty of the iOgrapher is that with a wider grip of your filmmaking device, you have a larger area to distribute any shaking coming from your hands, essentially helping stabilize your filmmaking without the need for a full-fledged gimbal.

If you accidentally drop your iOgrapher you may get a scratch, but for the most part they are built sturdy and can withstand punishment, whether it's from your four-year-old or from the weather. If you want to get a little fancy, you can buy affordable lights like the Litra Torch (check out my review) to attach to the cold shoe mounts, or even a Rode microphone (don't forget the TRS to TRRS adapter if you are plugging into an iPhone), and you are off and running.

Summing Up
I have been really intrigued with iOgrapher’s products since day one. They are an affordable and sturdy way to jump into filmmaking using cameras everyone carries with them every day: their phones.

Whether you are a high school student looking to get steady and professional mobile video, or a journalist looking for a quick way to make the most of your shots with just a phone, light, mic and tripod mount, the iOgrapher Multi Case will unlock your mobile filmmaking potential.

The iOgrapher Multi Case is a very durable protective case for your mobile filmmaking devices and a steal at $79.99. If you are a parent looking for an inexpensive way to tease out your child's interest in video, take a look at www.iographer.com and grab a few accessories, like a Manfrotto light and a Rode VideoMicro, to add some subtle lighting and pick up better-quality audio.

Make sure to check out the demo of the iOgrapher Multi Case from Dave Basulto, the creator of iOgrapher, which includes trying out the fit of different phones.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

You can now export ProRes on a PC with Adobe’s video apps

By Brady Betzel

Listen up, post pros! You can now natively export ProRes from a Windows 10-based PC for $20.99 per month with the latest release of Adobe's Premiere Pro, After Effects and Media Encoder.

I can’t overstate how big of a deal this is. Previously, the only way to export ProRes from a PC was to use a knock-off reverse-engineered codec that would mimic the process — creating footage that would often fail QC checks at networks — or be in possession of a high-end app like Fusion, Nuke, Nucoda or Scratch. The only other way would be to have a Cinedeck in your hands and output your files in realtime through it. But, starting today, you can export native ProRes 4444 and ProRes 422 from your Adobe Creative Cloud Suite apps like Premiere Pro, After Effects, and Media Encoder. Have you wanted to use those two or three Nvidia GTX 1080ti graphics cards that you can’t stuff into a Mac Pro? Well, now you can. No more being tied to AMD for ProRes exports.

Apple seems to be leaving its creative clients in the dust. Unless you purchase an iMac Pro or MacBook Pro, you have been stuck using a 2013 Mac Pro to export or encode your files to ProRes specifications. A lot of customers who had given Apple the benefit of the doubt, sticking around a year or two longer than they probably should have while waiting for a new Mac Pro (allegedly being released in 2019), began transitioning over to Windows-based platforms. All the while, most would keep that older Mac just to export ProRes files while using the more powerful and updated Windows PC for their daily tasks.

Well, that day is now over, and in my opinion it shows that Apple is less concerned than ever before with keeping its professional clients. That being said, I love that Apple has finally opened its ProRes codecs up to the Adobe Creative Cloud.

Let's hope it can become a system-wide feature, or at least get added to Blackmagic's Resolve and Avid's Media Composer. You can individually rent Adobe Premiere Pro or After Effects for $20.99 a month, rent the entire Adobe Creative Cloud library for $52.99 a month or, if you are a student or teacher, take advantage of the best deal around at $19.99 a month, which gives you ALL the Creative Cloud apps.

Check out Adobe’s blog about the latest Windows ProRes export features.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: DJI’s Mavic Air lightweight drone

By Brady Betzel

Since the first DJI Phantom was released in January of 2013, drones have found a place in our industry. Turn on almost any television show airing on National Geographic and you will see some sort of drone videography at work. DJI and GoPro have revolutionized how everyone films over the last decade.

Nowadays drones are expected to be a part of every cameraperson's kit. Once DJI released its second-generation flagship drone, the Mavic Pro, the quality of footage and still frame images went from prosumer-level to professional. One thing that has always been a tough sell for me with drones is the physical size of the unmanned aerial vehicles. The original DJI flagship drone, the Phantom, is a little big; you essentially need a duffle-sized backpack to carry it and its accessories. But now DJI has upped the ante with a smaller footprint — the Mavic Air.

The Mavic Air is absolutely the best drone I have ever had my hands on — from being the size of a few iPhones stacked on top of each other to recording high-quality footage that is 100% being used on television shows airing around the world. It's not only the easiest drone to use with or without a remote, but it also delivers by far the best picture I have seen from a consumer-level drone for under $1,000.

The Mavic Air is small, lightweight, and packed with amazing technology to help itself avoid slamming into the sides of buildings or trees. You can find all the nerdy tech specs here.

While there are super high-end drones flying Red Monstros around, sometimes there are restrictions that require the crew or cameraperson to downsize their equipment to only what is vital. So a drone that takes up a fraction of your carry-on luggage and will still yield 4K footage acceptable for broadcast is a win. Obviously, you won't be getting the same sensors that you will find in a Red Monstro.

Digging In
The Mavic Air has many features that set it apart from the pack. SmartCapture allows anyone to fly the drone without a remote; instead, you just use a few specific hand gestures. An updated slow-motion feature allows the Mavic Air to shoot 1080p at up to 120fps for those uber-epic sweeps in slow motion. There are multiple Quickshot modes you can find in the DJI app — like the two newest, Asteroid and Boomerang.

DJI is known for advancing drone technology and keeping their prices relatively low. One of the most advanced features DJI consistently works on is flight sensors. Flight Autonomy 2.0 and Advanced Pilot Assistance Systems are the latest advances in technology for the Mavic Air. Flight Autonomy 2.0 takes information from the seven onboard infrared sensors to create its own 3D environmental map to avoid crashing. The Advanced Pilot Assistance System (APAS), which has to be enabled, will automatically tell the Mavic Air to avoid obstacles while flying.

Taking Flight
So really, how is it to fly and work with the Mavic Air? It's very easy. The drone is ultra-portable, and the remote folds up nicely as well (nice and tight, in fact); you can then unfold it and install the newly removable joysticks for flight. You mount your phone on the bottom and connect it with one of the three cables provided. I have a Samsung Galaxy, so I used a USB-C connection. I downloaded and updated the DJI Go app, connected the USB-C cable to my phone (which is a little clumsy and could hopefully be better integrated in the future), paired the remote to the Mavic Air and was flying… that is, unless I had to update firmware. Almost every time I went to fly, one piece of equipment (if not more) needed to be updated. While it doesn't take a long time, it is annoying, especially when you have three young boys staring at you, waiting for you to fly this bad boy around. But once you get up and running, the Mavic is simple to fly.

I was most impressed with how it handled wind. The Mavic Air lives up to its name, and while it is definitely tiny, it can fight for its position in wind with the best of them. The sensors are amazing as well. You can try your hardest (unless you are in sports mode — don’t try to fly into anything as the sensors are disabled) to run into stuff and the Mavic Air stops dead in its tracks.

The Mavic Air’s filming capabilities are just as impressive as its flying capabilities. I like to set my DJI footage to D-Cinelike to get a flatter color profile in my video, allowing for a little more range in the shadows and highlights when I am color correcting. However, the stock DJI color settings are amazing. Another trick is to knock the sharpening down to medium or off and add that back in when finishing your video. The Mavic Air records using a 3-axis stabilized camera for ultra-smooth video up to 4K (UHD) at 30fps in the newly upped 100Mb/s H.264/MPEG-4 AVC recording format. Not quite the H.265 compression, but I’m sure that will come in the next version. I would love to see DJI offer a built-in neutral density filter on their drones — this would really help get that cinematic look without sacrificing highlight and shadow detail.

In terms of batteries, I was sent two, which I desperately needed; they only lasted about 20 minutes apiece. The batteries take around an hour to charge, but when you buy the Fly More Combo for $999 you also get a sweet four-battery charger to charge them all at once. Check out all the goodies you get with the Fly More Combo.

You will want to buy a decent-sized memory card, probably 128GB, although 64GB would be fine. Inserting the memory card into the Air can take a little practice; the slot and cover are a little clunky and hard to use.
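If you want to ballpark card capacity yourself, the math is straightforward at the 100Mb/s bitrate mentioned above (a rough sketch only; real-world file sizes vary with what and how you shoot):

```python
# Roughly how much footage fits on a card at the Mavic Air's 100 Mb/s
# (megabits per second) recording bitrate. Ballpark only -- actual file
# sizes vary with codec overhead and scene complexity.

bitrate_mbps = 100                      # megabits per second
mb_per_minute = bitrate_mbps * 60 / 8   # ~750 megabytes of video per minute

for card_gb in (64, 128):
    minutes = card_gb * 1000 / mb_per_minute
    print(f"{card_gb}GB card: roughly {minutes:.0f} minutes of 4K footage")
```

With batteries lasting around 20 minutes apiece, even a 64GB card will comfortably outlast a couple of flights; the 128GB just gives you headroom so you aren't offloading in the field.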

Summing Up
In the end, the DJI Mavic Air is the best drone I have used hands down. From the ultra-portable size (due to its compact folding ability) to the amazing shooting modes, you get everything you would want in a drone for under $1,000 with the Fly More Combo. The Mavic Air is just the right balance of technology and fun that will make you want to fly your drone.

Sometimes I get intimidated when flying a drone because they are so large and distracting, but not the Mavic Air — it is tiny and unassuming but packed with raw power to capture amazing images for broadcast or personal use.

While we typically don't rate our reviewed products, I will just this once. I would rate the Mavic Air a 10, and I can only hope that the next iteration embraces the Hasselblad history to stretch the Mavic Air in even more professional directions.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Apple updates FCPX, Motion and Compressor

By Brady Betzel

While Apple was busy releasing the new Mac mini last month, it was also quietly prepping some new updates. Apple has released free updates to FCPX, Motion and Compressor.

CatDV inside of FCPX

The FCPX 10.4.4 update includes Workflow Extensions, batch sharing, the Comparison Viewer, built-in video noise reduction, a timecode window and more. The Workflow Extensions are sure to take the bulk of the update cake: at launch, Apple announced that Shutterstock, Frame.io and CatDV will have extensions usable directly inside of FCPX instead of through a web browser. Frame.io looks to be the most interesting extension, with a realtime reflection of who is watching your video and what timecode they are at, a.k.a. "Presence."

Because Frame.io has been rebuilt from the ground up using Swift, its venture inside of FCPX should be extremely streamlined and fast. Internet bandwidth notwithstanding, Frame.io inside of FCPX looks to be the go-to approval system that FCPX editors will use. I am not quite sure why Apple didn't create its own approval and note-taking system, but it didn't go wrong working with Frame.io. Since many editors use this as their main approval system, FCPX users will surely love this implementation directly inside of the app.

When doing color correction, it is essential to compare your current work with either other images or the source image, and luckily for FCPX colorists you can now do this with the all new Comparison Viewer. Essentially, the Comparison Viewer will allow you to compare anything to the clip you are color grading.

One feature of this that I really love is that you have access to scopes on both the reference image and your working image. If you understand how scopes work, color matching via the parade or waveform can often be quicker than matching by eye.

Frame.io inside of FCPX

Final Cut Pro 10.4.4 has a few other updates: Batch Share, which allows you to queue a bunch of exports or projects in one step; a Timecode Window (a "why wasn't this there already" feature that is essential when editing video footage); and built-in video noise reduction with adjustable amount and sharpness. There are a few other additions like Tiny Planet, which allows you to quickly make that spherical 360-degree video look. It's not really an important technical update, but it's fun nonetheless.

Motion
With Version 5.4.2, Apple has put the advanced color correction toolset from FCPX directly inside of Motion. In addition, you can now add custom LUTs to your work. Apple has also added the Tiny Planet effect as well as a Comic filter inside Motion. Those aren't incredibly impressive, but the color correction toolkit is an essential addition to Motion and will see a lot of use.

Compressor
Compressor 4.4.2, in my opinion, is the sleeper update. Apple has finally updated Compressor to a 64-bit engine to take advantage of all of your memory and has improved overall performance with huge files, while still working with legacy 32-bit formats. Closed captions, including the SRT format, can now be burned into a video. Compressor has also added automatic configuration to apply the correct frame rate, field order and color space to your MXF and QuickTime outputs.
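If you haven't worked with SRT files before, they are just plain text: a cue number, a start and end timecode, then the caption text. Here is a minimal sketch in Python that writes one out (the dialogue is made up purely for illustration):

```python
# Minimal SRT (SubRip) caption file -- the plain-text format Compressor 4.4.2
# can now burn into a video. Each cue is a number, a start --> end timecode
# in HH:MM:SS,mmm, and the caption text, separated by blank lines.
# The dialogue below is invented purely for illustration.

srt_text = """1
00:00:01,000 --> 00:00:03,500
Welcome back to the show.

2
00:00:04,000 --> 00:00:06,000
Let's take a look at today's update.
"""

with open("captions.srt", "w", encoding="utf-8") as f:
    f.write(srt_text)
```

Compressor handles the burn-in itself; the point here is simply how easy the format is to generate or hand-edit.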

The FCPX, Motion and Compressor updates are available now for free if you have previously purchased the apps. If not, FCPX retails for $299.99, while Motion and Compressor are $49.99 each.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Puget Systems Genesis I custom workstation

By Brady Betzel

With so many companies building custom Windows-based PCs these days, what really makes for a great build? What would make me want to pay someone to build me a PC versus building it myself? In this review, I will be going through a custom-built PC sent to me for review by Puget Systems. In my opinion, beyond the physical components, Puget Systems is the cream of the crop of custom-built PCs. Over the next few paragraphs I will focus on how Puget Systems identified the right custom-built PC solution for me (specifically for post), what my experience was like before, during and after receiving the system and, finally, the specs and benchmarks of the system itself.

While quality components are definitely a high priority when building a new workstation, the big thing that sets Puget Systems apart from the rest of the custom-built PC pack is the personal and highly thorough support. I usually don't get the full customer experience when reviewing custom builds. Typically, I am sent a workstation and maybe a one-sheet to accompany the system. To Puget Systems' credit, they went from top to tail in helping me put together the system I would test. Not only did I receive a completely new, fully built and tested system, but I also talked to a customer service rep, Jeff Stubbers, who followed up with me along the way.

First, I spoke with Jeff over the phone. We talked about my price range and what I was looking to do with the system. I usually get told what I should buy — by the way, I am not a person who likes to be told what I want. I have a lot of experience not only working on high-end workstations but also building and supporting them, essentially my entire life, and I actively research the latest and greatest technology. Jeff from Puget Systems definitely took the correct approach; he started by asking which apps I use and how I use them. When using After Effects, am I doing more 3D work or simple lower thirds and titles? Do I use, and do I plan to continue using, Avid Media Composer, Adobe Premiere Pro or Blackmagic's DaVinci Resolve the most?

Essentially, my answers were that I use After Effects sparingly, but I do use it. I use Avid Media Composer professionally more than Premiere, but I see more and more Premiere projects coming my way. However, I think Resolve is the future, so I would love to tailor my system toward that. Oh and I dabble in Maxon Cinema 4D as well. So in theory, I need a system that does everything, which is kind of a tall order.

I told Jeff that I would love to stay below $10,000 but need the system to last a few years. Essentially, I was taking the angle of a freelance editor/colorist buying an above-mid-range system. After we configured the system, Jeff went on to detail the benchmarks that Puget Systems performs on an ongoing basis, why two GTX 1080 Ti cards would benefit me instead of just one, and why an Intel i9 processor would specifically benefit my work in Resolve.

After we finished on the phone, I received an email from Jeff that contained a link to a webpage that would continually update me on the details of how my workstation was being built — complete with pictures of my actual system. There are also links to some very interesting articles and benchmarks on the Puget Systems website. They perform more benchmarks pertinent to post production pros than I have seen from any other company. Usually you see a few generic Premiere or Resolve benchmarks, but nothing like Puget Systems'. Even if you don't buy a system from them, you should read their benchmarks.

While my system went through the build and ship process, I saw pictures and comments about who did what in the process over at Puget Systems. Beth was my installer. She finished and sent the system to Kyle who ran benchmarks. Kyle then sent it to Josh for quality control. Josh discovered the second GTX 1080ti was installed in a reduced bandwidth PCIe slot and would be sent back to Beth for correction. I love seeing this transparency! It not only gives me the feeling that Puget Systems is telling me the truth, but that they have nothing to hide. This really goes a long way with me. Once my system was run through a second quality control pass, it was shipped to me in four days. From start to finish, I received my system in 12 days. Not a short amount of time, but for what Puget Systems put the system through, it was worth it.

Opening the Box
I received the Genesis I workstation in a double box. A nice large box with sturdy foam corners encasing the Fractal Design case box. There was also an accessories box. Within the accessories box were a few cables and an awesome three-ring binder filled with details of my system, the same pictures of my system, including thermal imaging pictures from the website, all of the benchmarks performed on my system (real-world benchmarks like Cinebench and even processing in Adobe Premiere) and a recovery USB 3.0 drive. Something I really appreciated was that I wasn’t given all of the third-party manuals and cables I didn’t need, only what I needed. I’ve received other custom-built PCs where the company just threw all of the manuals and cables into a Ziploc and called it a day.

I immediately hooked the system up and turned it on… it was silent. Incredibly silent. The Fractal Design Define R5 Titanium case was lined with a sound-deadening material that took whatever little sound was there and made it zero.

Here are the specs of the Puget Systems Genesis I that I was sent:
– Gigabyte X299 Designare EX motherboard
– Intel Core i9 7940X 3.1GHz 14 Core 19.25MB 165W CPU
– Eight Crucial DDR4-2666 16GB RAM
– Two EVGA GeForce GTX 1080 Ti 11GB gaming video cards
– Onboard sound card
– Integrated WiFi+Bluetooth networking
– Samsung 860 Pro 512GB SATA3 2.5-inch SSD hard drive — primary drive
– Samsung 970 Pro 1TB M.2 SSD hard drive — secondary drive.
– Asus 24x DVD-RW SATA (Black) CD / DVD-ROM
– Fractal Design Define R5 titanium case
– EVGA SuperNova 1200W P2 power supply
– Noctua NH-U12DX i4 CPU cooling
– Arctic Cooling MX-2 thermal compound
– Windows 10 Pro 64-bit operating system
– Warranty: Lifetime labor and tech support, one-year parts warranty
– LibreOffice software: courtesy install
– Chrome software: courtesy install
– Adobe Creative Cloud Desktop App software: courtesy Install
– Resolve 1-3 GPU

System subtotal: $8,358.38. The price is right in my opinion, and mixed with the support and build detail it’s a bargain.

System Performance
I ran some system benchmarks and tests that I find helpful as a video editor and colorist who uses plugins and other tools on a daily basis. I am becoming a big fan of Resolve, so I knew I needed to test this system inside of Blackmagic’s Resolve 15. I used a similar sequence between Adobe Premiere and Resolve 15: a 10-minute, 23.98fps, UHD/3840×2160 sequence with mixed format footage from 4K and 8K Red, ARRI Raw UHD and ProRes4444. I added some Temporal Noise Reduction to half of the clips, including the 8K Red footage, resizes to all clips, all on top of a simple base grade.

First, I did a simple cache test by enabling the User Cache at DNxHR HQX 10-bit, written to the secondary Samsung 1TB drive. It took about four minutes and 34 seconds. From there I tried to play back the media un-cached, and I was able to play back everything except the 8K media in realtime. I was able to play back the 8K Red media at Quarter Res Good (Half Res would fluctuate between 18-20fps playback). The sequence played back well. I also wanted to test export speeds. The first test was an H.264 export without cache on the same sequence. I set the H.264 output in Resolve to 23.98fps, UHD, auto quality, no frame reordering, force highest-quality debayer/resizes and encoding profile: main. The export took 11 minutes and 57 seconds. The second test was a DNxHR HQX 10-bit QuickTime of the same sequence, which took seven minutes and 44 seconds.

To compare these numbers: I recently ran a similar test on an Intel i9-based MacBook Pro with the Blackmagic eGPU (Radeon Pro 580) attached. There, the H.264 export took 16 minutes and 21 seconds, while a ProRes 4444 export took 22 minutes and 57 seconds. While not an apples-to-apples comparison, it is still a good indication of the speed increase you can get with a desktop system and a pair of Nvidia GTX 1080 Ti graphics cards. With the impending release of Nvidia's RTX 2080 cards, you may want to consider getting those instead.
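To put those numbers in perspective, here is the quick speed-up math using the export times quoted above (a rough, not apples-to-apples comparison, since the codecs and hardware differ):

```python
# Rough speed-up of the Genesis I (dual GTX 1080 Ti) over the i9 MacBook Pro
# with Blackmagic eGPU, using the Resolve H.264 export times quoted above.
# Ballpark only -- different hardware and settings, not a controlled benchmark.

def to_seconds(minutes, seconds):
    return minutes * 60 + seconds

genesis_h264 = to_seconds(11, 57)   # Genesis I H.264 export
macbook_h264 = to_seconds(16, 21)   # MacBook Pro + eGPU H.264 export

print(f"H.264 export speed-up: {macbook_h264 / genesis_h264:.2f}x")  # ~1.37x
```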

While in Premiere, I ran similar tests with a very similar sequence. Exporting an H.264 (23.98fps, UHD, no cache used during export, VBR 10Mb/s target rate, no frame reordering) took nine minutes and 15 seconds. Going a step further, it took 47 minutes to export an H.265. Similarly, a DNxHR HQX 10-bit QuickTime export took 24 minutes.

I also ran the AJA System Test on the 1TB secondary drive (UHD, 16GB test file size, ProRes HQ). The read speed was 2951MB/sec and the write speed was 2569MB/sec. Those are some very respectable drive speeds, especially for a cache or project drive. If possible, you would probably want to add another drive for exports or for storing your raw media in order to maximize input/output speeds.

Up next was Cinebench R15: OpenGL — 153.02fps, Ref. Match 99.6%, CPU — 2905cb, CPU (single core) — 193cb and MP Ratio 15.03x. Lastly, I ran a test I recently stumbled upon: the Superposition Benchmark from Unigine. While it is more of a gaming benchmark, a lot of people use it and might glean some useful information from it. The overall score was 7653 (fps: min 45.58, avg 57.24, max 72.11; GPU temperature: min 36°C, max 85°C; GPU use: max 98%).

Summing Up
In the end, I am usually very skeptical of custom-build PC shops. Typically, I don’t see the value in the premium they charge when you can probably build the same thing yourself with parts you choose from PCpartpicker.com. However, Puget Systems is the exception — their support and build quality are top notch. From the initial phone conversation, to the up-to-the-minute photos and custom-build updates online, to the final delivery and even follow-up conversations, Puget Systems is by far the most thorough and worthwhile custom-build PC maker I have encountered.

Check out their high-end custom-built PCs, as well as tons of benchmark testing and recommendations, on their website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

HP offerings from Adobe Max 2018

By Brady Betzel

HP workstations have been a staple in the post community, especially for anyone not using a Mac or the occasional DIY/custom build from companies like Puget Systems or CyberPower PCs. The difference comes with customers who need workstation-level components and support. Typically, a workstation is run through much tougher and more stringent tests so the client can be assured of 24/7/365 up-time. HP continues to evolve and has become, in my opinion, a leader for dedicated non-Apple workflows.

At Adobe Max 2018, HP announced updated components for its Z by HP line of mobile workstations, including the awesome ZBook Studio x360, ZBook Studio, ZBook 15 and ZBook 17. I truly love HP’s mobile workstation offerings. The only issue I constantly come up against is whether I — or any freelance worker, for that matter — can justify the cost of these systems.

I always want the latest and greatest, and I feel I can get that with the updated performance options in this latest refresh of the ZBook line. They include new six-core Intel i9 processors; expanded memory of up to 32GB (or 128GB in some instances); a really interesting factory M.2 SSD RAID-1 configuration that constantly mirrors your boot drive (if one drive fails, the other takes over right where you left off); a GPU bump to the Nvidia Quadro P2000 for the ZBook Studio and Studio x360; and the anti-glare touchscreen on the x360. This is all in addition to HP’s DreamColor option, which allows for 100% Adobe RGB coverage and 600 nits of brightness. But again, this all comes at a high cost once you max out the workstation with enough RAM and GPU horsepower. There is some good news, though, for those who don’t have a corporate budget to pull from: HP has introduced a pilot program called the Z Club.

The Z Club is essentially a leasing program for HP’s Z series products. For now, HP will take 100 creators into the pilot program, which will allow you to select a bundle of Z products and accessories that fit your creative lifestyle for a monthly cost. This is exactly how you reach prosumer and freelance workers who can’t quite justify a $5,000 purchase price but can justify a $100-a-month payment. HP has touted categories of products for editors, photographers and many others. With monthly payments ranging from $100 to $250, depending on what you order, this is much more manageable for mid-range end users who need the power of a workstation but up until now couldn’t afford it.
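The math is easy to sanity-check. Below is a quick Python sketch using the round numbers above (the $5,000 purchase price and the $100-$250 monthly range); how long you would actually keep the machine is my own assumption.

```python
# Quick lease-vs-buy sanity check using the round numbers above.
# The purchase price and monthly range come from the program details;
# how long you'd keep the hardware is an assumption.

purchase_price = 5000            # rough outright cost of a maxed-out system
monthly_options = [100, 250]     # Z Club monthly range mentioned above

for monthly in monthly_options:
    months_to_match = purchase_price / monthly
    print(f"${monthly}/month reaches ${purchase_price} after "
          f"{months_to_match:.0f} months ({months_to_match / 12:.1f} years)")

# At $100/month it takes just over four years of payments to match the
# purchase price, longer than many freelancers keep a machine, which is
# what makes the leasing pitch work.
```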

So what will you get if you are accepted into the Z Club pilot program? You can choose the products you want and pay nothing for the first three months. After that you can continue with or return your products, you can switch products, and you will have access to a Z Club concierge service for questions and troubleshooting.

On the call I had with HP, they mentioned that a potential bundle for a video editor could be an HP Z series mobile workstation or desktop, along with a DreamColor display, and an external RAID storage system to top it off.

In the end, I think HP (much like Blackmagic’s Resolve in the NLE/color world) is at the front of the pack. They are listening to what creatives are saying about Apple — how that giant company is not serving its customers in an efficient and price-conscious way. Creating what is essentially a leasing program for mid- to high-range products, with support, is the future. It’s essentially Apple’s own iPhone upgrade program but with computers!

Hopefully this program takes off, and if you are lucky enough to be accepted into the pilot program, I would be curious to hear about your experience, so please reach out. With HP making strides in workstation security through initiatives like Sure Start, a privacy mode for mobile systems and military-grade (MIL-spec) testing, HP is cementing its place as a standard in the media and entertainment post industry. For those leaving Apple for a Windows-based PC, you should apply for the Z Club pilot program. Go to www.hp.com to find out more or follow along on Twitter @AdobeMax, @HP or using #AdobeMax.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Blackmagic’s eGPU and Intel i9 MacBook Pro 2018

By Brady Betzel

Blackmagic’s eGPU is worth the $699 price tag. You can buy it from Apple’s website, where it is being sold exclusively for the time being. Wait? What? You wanted some actual evidence as to why you should buy the BMD eGPU?

Ok, here you go…

MacBook Pro With Intel i9
First, I want to go over the latest Apple MacBook Pro, which was released (or really just updated) this past July. With some controversial fanfare, the 2018 MacBook Pro can now be purchased with the blazingly fast Intel i9, 2.6GHz (Turbo Boost up to 4.3GHz) six-core processor. In addition, you can add up to 32GB of 2400MHz DDR4 onboard memory, a Radeon Pro 560X GPU with 4GB of GDDR5 memory and even a 4TB SSD storage drive. It has four Thunderbolt 3 ports and, for some reason, a headphone jack. Apple is also touting its improved butterfly keyboard switches as well as its True Tone display technology. If you want to read more about that glossy info, head over to Apple’s site.

The 2018 MacBook Pro is a beast. I am a big advocate for the ability to upgrade and repair computers, so Apple’s venture to create what is essentially a leased computer ecosystem that needs to be upgraded every year or two usually puts a bad taste in my mouth.

However, the latest MacBook Pros are really amazing… and really expensive. The top-of-the-line MacBook Pro I was provided for this review would cost $6,699! Yikes! If I were buying one myself, I would purchase everything but the $2,000 upgrade from the 2TB SSD to the 4TB, and it would still cost $4,699. But I suppose that’s not a terrible price for such an intense processor (albeit not technically workstation-class).

Overall, the MacBook Pro is a workhorse that I put through its video editing and color correcting paces using three of the top four professional nonlinear editors: Adobe Premiere, Apple FCP X and Blackmagic’s Resolve 15 (the official release). More on those results in a bit, but for now, I’ll just say a few things: I love how light and thin it is. I don’t like how hot it can get. I love how fast it charges. I don’t like how fast it loses charge when doing things like transcoding or exporting clips. A 15-minute export can drain the battery over 40% while playing Spotify for eight hours will hardly drain the battery at all (maybe 20%).

Blackmagic’s eGPU with Radeon Pro 580 GPU
One of the more surprising releases from Blackmagic has been this eGPU offering. I would never have guessed they would have gone into this area, and certainly would never have guessed they would have gone with a Radeon card, but here we are.

Once you step back from the initial “why in the hell wouldn’t they let it be user-replaceable and not brand-dependent” shock, it actually makes sense. If you are a macOS user, you can probably already do a lot in terms of external GPU power. When you buy a new iMac, iMac Pro or MacBook Pro, you are expecting it to work, full stop.

However, if you are a DIT or colorist that is more mobile than that sweet million-dollar color bay you dream of, you need more. This is where the BMD eGPU falls nicely into place. You plug it in and instantly see it populate in the menu bar. In addition, the eGPU acts as a dock with four USB 3 ports, two Thunderbolt 3 ports and an HDMI port. The MacBook Pro will charge off of the eGPU as well, which eliminates the need for your charger at your docking point.

On the go, the most decked-out MacBook Pro can hold its own. So it’s no surprise that FCP X runs remarkably fast… faster than everything else. However, you have to be invested in an FCP X workflow and paradigm — and while I’m not there yet, maybe the future will prove me wrong. Recently, I saw someone on Twitter who developed an online collaboration workflow around it, so people are excited about it.

Anyway, many of the nonlinear editors I work in also play well on the MacBook Pro, even with 4K Red, ARRI and, especially, ProRes footage. Keep in mind, though, that with 2K, 4K or whatever-K raw footage, you will need to set the debayer quality to around half-res good if you want a fluid timeline. Even with the 4GB Radeon 560X, I couldn’t quite play realtime 4K footage without some sort of compromise in quality.

But with the Blackmagic eGPU, I significantly improved my playback capabilities — and not just in Resolve 15. I did try and plug the eGPU into a PC with Windows 10 I was reviewing at the same time and it was recognized, but I couldn’t get all the drivers sorted out. So it’s possible it will work in Windows, but I couldn’t get it there.

Before I get to the Resolve testing, I did some benchmarking. First I ran Cinebench R15 without the eGPU attached and got the following scores: OpenGL — 99.21fps, reference match 99.5%, CPU — 947cb, CPU (single core) — 190cb and MP ratio of 5.00x. With the eGPU attached: OpenGL — 60.26fps, reference match 99.5%, CPU — 1057cb, CPU (single core) — 186cb and MP ratio of 5.69x. Then I ran Unigine’s Valley Benchmark 1.0 without the eGPU, which got 21.3fps and a score of 890 (minimum 12.4fps/maximum 36.2fps). With the eGPU, it got 25.6fps and a score of 1073 (minimum 19.2fps/maximum 37.1fps).

Resolve 15 Test
I based all of my tests on a similar (although not exact for the different editing applications) 10-minute timeline, 23.98fps, 3840×2160, 4K and 8K RAW Red footage (R3D files) and Alexa (.ari and ProRes444XQ) UHD footage, all with edit page resizes, simple color correction and intermittent sharpening and temporal noise reduction (three frames, better, medium, 10, 10 and 5).

Playback: Without the eGPU, I couldn’t play 23.98fps 4K Red R3D files without being set to half-res. With the eGPU, I could play back at full-res in realtime (this is what I was talking about in sentence one of this review). The ARRI footage would play at full res, but only at between 1fps and 7fps. The 8K Red footage would play in realtime when set to quarter-res.

One of the most reassuring things I noticed when watching my Activity Monitor’s GPU history readout was that Resolve uses both GPUs at once. Not all of the apps did.

Resolve 15 Export Tests
In the following tests, I disabled all cache and optimized media options, including Performance Mode. (A quick percentage comparison of these results follows the raw numbers below.)

Test 1: H.264 (at 23.98fps, UHD, auto-quality, no frame reordering, force highest-quality debayer/resizes and encoding profile Main)
a. Without eGPU (Radeon Pro 560x): 22 minutes, 16 seconds
b. With BMD eGPU (Radeon Pro 580): 16 minutes and 21 seconds

Test 2: H.265 (10-bit, 23.98/UHD, auto quality, no frame reordering, force highest-quality debayer/resizes)
a. Without eGPU: stopped rendering after 10 frames
b. With BMD eGPU: same result

Test 3: ProRes 4444 at 23.98/UHD
a. Without eGPU: 27 min and 29 seconds
b. With BMD eGPU: 22 minutes and 57 seconds

Test 4: Edit page cache – Smart User Cache enabled at ProRes HQ
a. Without eGPU: 17 minutes and 28 seconds
b. With BMD eGPU: 12 minutes and 22 seconds
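Since Test 2 failed both with and without the eGPU, here is a quick Python sketch that turns the remaining with/without pairs into a percentage time savings, using the export times listed above.

```python
# Percentage of export time saved by adding the Blackmagic eGPU,
# using the Resolve 15 export times listed above (converted to seconds).

tests = {
    "Test 1: H.264":           (22 * 60 + 16, 16 * 60 + 21),
    "Test 3: ProRes 4444":     (27 * 60 + 29, 22 * 60 + 57),
    "Test 4: ProRes HQ cache": (17 * 60 + 28, 12 * 60 + 22),
}

for name, (without_egpu, with_egpu) in tests.items():
    saved = 1 - with_egpu / without_egpu
    print(f"{name}: about {saved:.0%} less export time with the eGPU")

# Works out to roughly 27%, 16% and 29% less export time, respectively.
```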

Adobe Premiere Pro v.12.1.2
I performed similar testing in Adobe Premiere Pro using a 10-minute timeline at 23.98fps, 3840×2160, 4K and 8K RAW Red footage (R3D files) and Alexa (DNxHR SQ 8-bit) UHD footage, all with Effect Control tab resizes and simple Lumetri color correction, including sharpening and intermittent denoise (16) under the HSL Secondary tab in Lumetri applied to shadows only.

In order to ensure your eGPU will be used inside of Adobe Premiere, you must use Metal as your encoder. To enable it go to File > Project Settings > General and change the renderer to Mercury Playback Engine GPU acceleration Metal — (OpenCL will only use the internal GPU for processing.)

Premiere did not handle the high-resolution media as aptly as Resolve had, but it did help a little. However, I really wanted to test the export power with the added eGPU horsepower. I almost always send my Premiere sequences to Adobe Media Encoder to do the processing, so that is where my exports were processed.

Adobe Media Encoder
Test 1: H.264 (No render used during exports: 23.98/UHD, 80Mb/s, software encoding doesn’t allow for profile setup)
a. OpenCL with no eGPU: about 140 minutes (sorry, I had to chase the kids around and couldn’t watch this snail crawl)
b. Metal with no eGPU: about 137 minutes (chased the kids around again, and couldn’t watch this snail crawl, either)
c. OpenCL with eGPU: won’t work; Metal only
d. Metal with eGPU: one hour

Test 2: H.265
a. Without eGPU: failed (interesting result)
b. With eGPU: 40 minutes

Test 3: ProRes4444
a. Without eGPU: three hours
b. With eGPU: one hour and 14 minutes

FCP X
FCP X is an interesting editing app, and it is blazing fast at handling ProRes media. As I mentioned earlier, it hasn’t been in my world too much, but that isn’t because I don’t like it; it’s because professionally I haven’t run into it. I love the idea of roles and would really love to see that play out in other NLEs. However, my results speak for themselves.

One caveat to using the eGPU in FCP X is that you must force it to work inside of the NLE. At first, I couldn’t get it to work. The Activity Monitor would show no activity on the eGPU. However, thanks to a Twitter post, James Wells (@9voltDC) sent me to this, which allows you to force FCP X to use the eGPU. It took a few tries but I did get it to work, and funny enough I saw times when all three GPUs were being used inside of FCP X, which was pretty good to see. This is one of those use-at-your-own risk things, but it worked for me and is pretty slick… if you are ok with using Terminal commands. This also allows you to force the eGPU onto other apps like Cinebench.

Anyway, here are my results with the BMD eGPU exporting from FCP X:

Test 1: H.264
a. Without eGPU: eight minutes
b. With eGPU: eight minutes and 30 seconds

Test 2: H.265: Not an option

Test 3: ProRes4444
a. Without eGPU: nine minutes
b. With eGPU: six minutes and 30 seconds

Summing Up
In the end, the Blackmagic eGPU with Radeon Pro 580 GPU is a must buy if you use your MacBook Pro with Resolve 15. There are other options out there though, like the Razer Core v2 or the Akitio Node Pro.

From this review I can tell you that the Blackmagic eGPU is silent even when processing 8K Red RAW footage (even when the MacBook Pro fans are going at full speed), and it just works. Plug it in and you are running, no settings, no drivers, no cards to install… it just runs. And sometimes when I have three little boys running around my house, I just want that peace of mind and I want things to just work like the Blackmagic eGPU.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Maxon Cinema 4D R19 — an editor’s perspective

By Brady Betzel

It’s time for my yearly review of Maxon’s Cinema 4D. Currently in Release 19, Cinema 4D comes with a good amount of under-the-hood updates. I am an editor, first and foremost, so while I dabble in Cinema 4D, I am not an expert. There are a few things in the latest release, however, that directly correlate to editors like me.

Maxon offers five versions of Cinema 4D, not including BodyPaint 3D. There is the Cinema 4D Lite, which comes free with Adobe After Effects. It is really an amazing tool for discovering the world of 3D without having to invest a bunch of money. But, if you want all the goodies that come packed into Cinema 4D you will have to pay the piper and purchase one of the other four versions. The other versions include Prime, Broadcast, Visualize and Studio.

Cinema 4D Prime is the first version that includes features like lighting, cameras and animation. Cinema 4D Broadcast includes all of Cinema 4D Prime’s features as well as the beloved MoGraph tools and the Broadcast Library, which offers pre-built objects and cameras that will work with motion graphics. Cinema 4D Visualize includes Cinema 4D Prime features as well, but is geared more toward architects and designers. It includes Sketch and Toon, as well as an architecturally focused library of objects and presets. Cinema 4D Studio includes everything in the other versions plus unlimited Team Render nodes, a hair system, a motion/object tracker and much more. If you want to see a side-by-side comparison you can check out Maxon’s website.

What’s New
As usual, there are a bunch of new updates to Cinema 4D Release 19, but I am going to focus on my top three, which relate to the workflows and processes I might use as an editor: New Media Core, Scene Reconstruction and the Spherical Camera. Obviously, there are a lot more updates — including the incredible new OpenGL Previews and the cross-platform ProRender, which adds the ability to use AMD or Nvidia graphics cards — but to keep this review under 30 pages I am focusing on the three that directly impact my work.

New Media Core
Buckle up! You can now import animated GIFs into Cinema 4D Release 19. But that is just one tiny aspect of this update. The really big addition is QuickTime-free support for MP4 video. MP4s can now be imported and used as textures, as well as exported with different compression settings, directly from within Cinema 4D’s interface — all without the need to have QuickTime installed. What is cool about this is that you no longer need to export image sequences to get your movie inside of Cinema 4D. The only slowdown will be how long it takes Cinema 4D R19 to cache your MP4 so that you can get realtime playback… if possible.

In my experience, it doesn’t take that much time, but that will be dependent on your system performance. While this is a big under-the-hood type of update, it is great for those quick exports of a scene for approval. No need to take your export into Adobe Media Encoder, or something else, to squeeze out an MP4.
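For script-minded users, the same idea can be sketched with Cinema 4D’s Python scripting: point a bitmap shader at an MP4 and assign it to a material’s color channel. This is a minimal sketch of my own (the clip path is hypothetical), not an official Maxon example, so treat it as a starting point.

```python
# Minimal sketch: load an MP4 as a texture via a bitmap shader in Cinema 4D.
# Run from the Script Manager inside Cinema 4D; the clip path is hypothetical.
import c4d

def main():
    doc = c4d.documents.GetActiveDocument()

    mat = c4d.BaseMaterial(c4d.Mmaterial)       # standard material
    mat.SetName("MP4 Texture")

    shader = c4d.BaseShader(c4d.Xbitmap)        # bitmap shader can point at a movie
    shader[c4d.BITMAPSHADER_FILENAME] = "/path/to/approval_cut.mp4"

    mat[c4d.MATERIAL_COLOR_SHADER] = shader     # drive the color channel with the clip
    mat.InsertShader(shader)                    # shader must live inside the material

    doc.InsertMaterial(mat)
    c4d.EventAdd()                              # refresh the UI

if __name__ == "__main__":
    main()
```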

Scene Reconstruction
First off, for any new Cinema 4D users out there, Scene Reconstruction is convoluted and a little thick to wade through. However, if you work with footage and want to add motion graphics work to a scene, you will want to learn this. You can check out this Cineversity.com video for an eight-minute overview.

Cinema 4D’s Scene Reconstruction works by tracking your footage to generate point clouds; then, after you go back and enable Scene Reconstruction, it creates a mesh from the scene calculation that Cinema 4D computes. In the end, depending on how compatible your footage is with scene detection (contrasting textures and good lighting will help), you get a camera view with matching scene vertices that are then fully animatable. I, unfortunately, did not have enough time to recreate a set or scene inside of Cinema 4D R19; however, it feels like Maxon is getting very close to fully automated scene reconstruction, which would be very, very interesting.

I’ve seen a lot of ideas from pros on Twitter and YouTube that really blow my mind, like 3D scanning with a prosumer camera to recreate objects inside of Cinema 4D. Scene Reconstruction could be a game-changing update, especially if it becomes more automated as it would allow base users like me to recreate a set in Cinema 4D without having to physically rebuild a set. A pretty incredible motion graphics-compositing future is really starting to emerge from Cinema 4D.

In addition, the Motion Tracker has received some updates, including manual tracking on R, G, B, or custom channel — viewed as Tracker View — and the tracker can now work with a circular tracking pattern.

Spherical Camera
The last update, and the one that seems the most incredible, is the new Spherical Camera. It’s probably because I have been testing and using a lot more 360 video, but the ability to render your scene using a spherical camera is a welcome addition. You can now create a scene, add a camera and enable spherical mapping, including equirectangular, cubic string, cubic cross or even Facebook’s 360 video 3×2 cubic format. In addition, there is now support for stereo VR as well as dome projection.

Other Updates
In addition to the three top updates I’ve covered, there are numerous other updates that are just as important, if not more so, to those who use Cinema 4D in other ways. In my opinion, the rendering updates take the cake. As mentioned before, there is support for both Nvidia and AMD GPUs, multi-GPU support, incredible viewport enhancements with physical rendering and interactive preview renders in the viewport.

Under MoGraph, there is an improved Voronoi Fracture system (the ability to destroy an object quickly), including improved performance for high polygon counts and added detailing to give fractures a more realistic look. There is also a new Sound Effector that allows for interactive MoGraph creation to the beat of the music. One final note: a new, modern modeling kernel has been introduced, which improves things like polygon reduction and levels of detail.

In the end, Cinema 4D Release 19 is a huge under-the-hood update that will please legacy users but will also attract new users with AMD-based GPUs. Moreover, Maxon seems to be slowly morphing Cinema 4D into a total 2D and 3D modeling and motion graphics powerhouse, much like the way Blackmagic’s Resolve is for colorists, video editors, VFX creators and audio mixers.

Summing Up
With updates like Scene Reconstruction and improved motion tracking, Maxon gives users like me the ability to work way above our pay grade, compositing 3D objects onto our 2D footage. If any of this sounds interesting to you and you are a paying Adobe Creative Cloud user, download and open Cinema 4D Lite along with After Effects, then run over to Cineversity and brush up on the basics. Cinema 4D Release 19 is an immensely powerful 3D application that is blurring the boundaries between 3D and 2D compositing. With Cinema 4D Release 19’s large library of objects, preset scenes and lighting setups, you can be experimenting in no time, and I didn’t even touch on the modeling and sculpting power!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: The Litra Torch for pro adventure lighting

By Brady Betzel

If you are on Instagram, you’ve definitely seen your fair share of “adventure” photography and video. Typically, it’s those GoPro-themed action-adventure shots of someone cliff diving off a million-mile-high waterfall. I definitely get jealous. Nonetheless, one thing I love about GoPro cameras is their size. They are small enough to fit in your pocket, and they will reliably produce a great image. Where those actioncams suffer is low-light performance. While it is getting better every day, you just can’t pull a reliably clean, noise-free image from a camera sensor that small. This is where actioncam lights come into play as a perfect companion, including the Litra Torch.

The Litra Torch is an 800-lumen, 1.5-by-1.5-inch magnetic light. I first started seeing the tiny-light trend on Instagram, where people were shooting slow-shutter photos at night and painting certain objects with a tiny bit of light. Check out Litra on Instagram (@litragear) to see some of the incredible images people are producing with this tiny light. I saw an action sports shooter showing off some incredible nighttime pictures using the GoPro Hero. He mentioned in the post that he was using the Litra Torch, so I immediately contacted Litra, and here I am reviewing the light. Litra sent me the Litra Paparazzi Bundle, which retails for $129.99. The bundle includes the Litra Torch along with a filter kit and cold shoe mount.

The Litra Torch has four modes, all accessible by clicking the button on top of the light: 800-lumen brightness, 450 lumens, 100 lumens and flashing. The Torch has a consistent color temperature of 5700K; essentially, the light is a crisp white, right in between blue and yellow. The rechargeable lithium-ion battery can be charged via the micro USB cable and will last 30 minutes or more depending on the brightness selected. With a backup battery attached, you could be going for hours.

Over a month with intermittent use I only charged it once. One night I had to check out something under the hood of my car and used the Litra Torch to see what I was doing. It is very bright and when I placed the light onto the car I realized it was magnetic! Holy cow. Why doesn’t GoPro put magnets into their cameras for mounting! The Torch also has two ¼-20 camera screw mounts so you can mount them just about anywhere. The construction of the Torch is amazing — it is drop-proof, waterproof and made of a highly resilient aluminum. You can feel the high quality of the components the first time you touch the Torch.

In addition to the Torch itself, the cold shoe mount and diffuser, the Paparazzi Bundle comes with the photo filter kit. The photo filter kit comes with five frames to mount the color filters onto the Torch; three sets of Rosco Tungsten 4600k filters; three sets of Rosco Tungsten 3200k filters; 1 White Diffuser filter; and one each of a red, yellow and green color filter. Essentially, they give you a cheap way to change white balance temperatures and also some awesome color filters to play around with. I can really see the benefit of having at least two if not three of the Litra Torches in your bag with the filter sets; you can easily set up a properly lit product shoot or even a headshot session with nothing more than three tiny Torch lights.

Putting It To The Test
To test out the light in action, I asked my son to set up a Lego scene for me. One hour later I had some Lego models to help me out. I always love seeing people’s Lego scenes on Instagram, so I figured this would also be a good way to show off the light and the extra color filters sent in the Paparazzi Bundle. One thing I discovered is that I would love a slide-in filter holder built onto the light; it would definitely help me avoid wasting time popping filters into frames.

All in all, this light is awesome. The only problem is I wish I had three so I could do a full three-point lighting setup. However, with some natural light and one Litra Torch I had enough to pull off some cool lighting. I really liked the Torch as a colored spotlight; you can get that blue or red shade on different objects in a scene quickly.

Summing Up
In the end, the Litra Torch is an amazing product. In the future, I would really love to see multiple white balance temperatures built into the Torch without having to use photo filters. A really exciting, but probably expensive, prospect would be building in a Bluetooth connection and multiple colors. Better yet, make this light a full-color-spectrum, app-enabled light… oh wait, they just announced the Litra Pro on Kickstarter. You should definitely check that out as well, with its advanced options and color profile.

I am spoiled by all of those at home lights, like the LIFX brand, that change to any color you want, so I’m greedy and want those in a sub-$100 light. But those are just wishes — the Litra Torch is a must-have for your toolkit in my opinion. From mounting it on top of my Canon DSLR using the cold shoe mount, to using the magnetic ability and mounting in unique places, as well as using the screw mount to attach to a tripod — the Litra Torch is a mind-melting game changer for anyone having to lug around a 100-pound light kit, which makes this new Kickstarter of the Litra Pro so enticing.

Check out their website for more info on the Torch and new Litra Pro, as well as a bunch of accessories. This is a must-have for any shooter looking to carry a tiny but powerful light anywhere, especially for summer and the outdoors!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: HP’s zBook x2 mobile workstation

By Brady Betzel

There are a lot of laptops and tablets on the market these days that can seemingly power a SpaceX Falcon 9 rocket launch and landing. If you work in media and entertainment like I do, these days you might even be asked to edit and color correct that Falcon 9 footage that could have been filmed in some insane resolution like 8K.

So how do you edit that footage on the go? You need to find the most powerful mobile solution on the market. In my mind, there are only a few that can power editing 8K footage (even if the footage is transcoded into manageable ProRes proxies). There is the Razer Blade Pro, a 4K/UHD “gaming” laptop that sports a high-end Nvidia GTX 1060 GPU and an i7 processor; Dell’s high-end Precision 7720 mobile workstation, which allows for a high-end Quadro GPU; and HP’s high-quality zBook line of mobile workstations.

For this review, I am focusing on the transforming HP zBook x2 mobile workstation, complete with an Intel Core i7 CPU, 32GB memory, Nvidia Quadro and much more.

The zBook x2 allows you to go from laptop to tablet by removing the keyboard. If you’ve ever used a Wacom Cintiq mobile tablet, you’ve likely enjoyed the matte finish of the display, as well as the ability to draw directly on screen with a stylus. Well, the zBook x2 offers a full touchscreen with a stylus-enabled matte surface compatible with HP’s own battery-less pen. The pen from HP is based on Wacom’s Electro Magnetic Resonance technology, which essentially allows for cable- and battery-free pens.

In addition, the display bezel has 12 buttons that are programmable for apps like Adobe’s Creative Cloud. For those wondering, HP partnered with Adobe when designing the x2, so you will notice that Creative Cloud comes pre-installed on the system, and the quick access buttons around the bezel are already programmed for use in Adobe’s apps. However, they don’t give you a free subscription with purchase — Hey, HP, this would be a nice touch. Just a suggestion.

Digging In
I was sent the top-of-the-line version of the zBook x2, complete with a DreamColor UHD touchscreen display. Here are the specs under the hood:

– Windows 10 64-bit
– Intel Core i7 8650 (Quad Core — 8th gen)
– 4K UHD DreamColor Touch with anti-glare
– 32GB (2×16 GB) DDR4 2133 memory
– Nvidia Quadro M620 (2GB)
– 512GB HP Z-Turbo Drive PCIe
– 70Whr fast charging battery
– Intel vPro WLAN
– Backlit Bluetooth Keyboard
– Fingerprint reader
– One- or three-year warranty, including the battery
– Two Thunderbolt 3 ports
– HDMI 1.4 port
– USB 3.0 charging port
– SD card slot
– Headset/microphone port
– External volume controls

The exterior hardware specs are as impressive as the technical specs. I’ve got to be honest: when I first received the x2, I was put off by the sharp-edged octagon design. I’m so used to either square-shaped tablets or rounded edges that the octagon-edged sides were a little strange. After using it for a month, though, I got used to how sturdy and well built this machine is. I kind of miss the octagon shape now that I’ve had to ship the x2 back to HP.

In addition, the zBook x2 I received weighed in at around 5lbs (with the bluetooth keyboard attached), which isn’t really lightweight. Part of that weight is the indestructible-feeling magnesium and aluminum casing that surrounds the x2’s internal components.

I’ve reviewed a few of these stylus-based workstations before, such as Microsoft’s Surface Pro and Wacom’s mobile Cintiq offering, and they each have their positives and negatives. One thing that consistently sticks out to me is the kickstand used to prop these machines up. When you use a stylus on a tablet you will have a height and angle you like to work at. Some tablets have a few specified heights like the Wacom offering. The Surface Pro has a somewhat limited angle, but the zBook x2 has the strongest and best working built-in stand that I have used. It is sturdy when working in apps, like Adobe Photoshop, with the stylus.

HP’s Wacom-infused stylus is very lightweight. I personally like a stylus that is a little hefty, like the Wacom Pro Pen, but don’t get me wrong, HP’s pen works well. The pen has a pressure sensitivity similar to the Wacom pens many multimedia pros are used to, with 4,096 levels, and includes tilt sensitivity. When using tablets, palm rejection is a very important feature, and the x2 has excellent palm rejection. HP’s fact sheets and website have conflicting information on whether the pen is included with the x2, but when ordering, it looks like it is bundled with your purchase. As it should be.

One final note on the build quality of HP’s zBook x2: the detachable Bluetooth keyboard is excellent. The keyboard not only acts like a full-sized keyboard, complete with numerical keypad (a favorite of mine when typing in specific timecodes), but it also folds up to protect the screen when not in use.

If you are looking to purchase the zBook x2, you are probably also comparing it to a Microsoft Surface Pro, a Wacom Cintiq mobile computer and maybe an iPad Pro. In my opinion, there is no contest: the x2 wins hands down. However, you are also going to pay a lot more for it. For instance, the x2 can be purchased with the latest 8th-gen Intel i7 processors and an Nvidia Quadro GPU built into the tablet (not into the keyboard, as on the Microsoft Surface Book systems), and it can be packed with 32GB of RAM as opposed to the 16GB in all other tablets. Most importantly, in my opinion, this system offers a color-accurate, 10-bit UHD HP DreamColor display. As I said, it is definitely the beefiest mobile workstation/tablet you will find out there, but it will cost you.

One of my favorite practices that HP is starting to standardize among its mobile workstations is quick charging, where you can charge 50% of your battery in half an hour and the rest over a few more hours. I can’t tell you how handy this is when you are running around all day and don’t have four hours to charge your computer between appointments. When running apps like Blackmagic’s Resolve 14.3 with UHD video, you can drain the battery fast — something like four hours — but being able to quickly charge back up to 50% is a lifesaver in a lot of circumstances.

In the real world, I use my mobile workstation/tablets all the time. I surf the web, listen to music, edit in Adobe Premiere Pro or color correct in Resolve. This means my systems have to have some high-end processors to keep up. The HP zBook x2 is a great addition to your workstation lineup when you need to take your work on the road and not lose any features, like the HP DreamColor display with 100% Adobe RGB color accuracy. While it’s not a truly calibrated work monitor, DreamColor displays will, at the very least, give you a common calibration among all DreamColor monitors that you can rely on for color critical jobs on the run. In addition, DreamColor displays can display different color spaces like BT. 709, DCI-P3 and more.

Putting it to the Test
To test the x2, I ran a few tests using one of the free clips that Red offers for download at http://www.red.com/sample-r3d-files. It is the Red One Mysterium clip, with a resolution of 4096×2304 running at 29.97fps. For a mobile workstation, this is a pretty hefty clip to run in Resolve or Premiere. In Premiere, the Red clip would play in realtime when dumbed down to half quality. Half quality isn’t bad to work in, but when spending $3,500 I would like to work with better-quality Red playback. Maybe the technology will be there in a year.

If you are into the whole offline/online workflow (a.k.a. proxy workflow, a.k.a. transcoding to an intraframe codec like DNxHR or ProRes), then you will be able to play back the full 4K clip when transcoding to something like DNxHR HQ. Unfortunately, I couldn’t get a 10-bit DNxHR HQX clip to play in realtime, and with the sweet 10-bit display that would have been a welcome success. To test exporting speed, I trimmed the R3D file (still raw Red) to 10 seconds and exported it as a DNxHR HQX 10-bit QuickTime (at the file’s native resolution and frame rate) and as a highly compressed H.264 at around 10Mb/s.
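Before getting to the export numbers, a quick aside on that proxy route: if you don’t want to tie up the NLE, a DNxHR proxy pass can also be run through a free tool like FFmpeg. Here is a rough sketch driving it from Python; the file names are placeholders, and you would match the codec profile and audio settings to your own project (FFmpeg is installed separately, and it won’t read R3D files, so this assumes a camera-original QuickTime).

```python
# Rough proxy-transcode sketch using FFmpeg (installed separately).
# File names are placeholders; adjust the profile and audio to taste.
import subprocess

cmd = [
    "ffmpeg",
    "-i", "A001_C001_source.mov",   # camera-original QuickTime (placeholder)
    "-c:v", "dnxhd",
    "-profile:v", "dnxhr_hq",       # DNxHR HQ, an intraframe proxy codec
    "-pix_fmt", "yuv422p",
    "-c:a", "pcm_s16le",            # uncompressed audio for easy relinking
    "A001_C001_proxy.mov",
]

subprocess.run(cmd, check=True)
```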

The DNxHR HQX 10-bit QuickTime took 1 minute and 25 seconds to export. I then added a 110% resize and a color grade to really make sure the Quadro GPU kicked in, and unfortunately the export failed. I tried multiple times with different Lumetri color grades and all of them failed, probably a sweet bug.

Next, I exported an uncolored 10Mb/s H.264 MP4 (a clip perfect for YouTube) in 2 minutes and 41 seconds. I then resized the clip to 110% and performed a color grade using the Lumetri tools inside of Premiere Pro. That MP4 exported in 1 minute and 30 seconds. This was pretty incredible and really showed just how important that Nvidia Quadro M620 with 2GB of memory is. And while things like resizing and color correcting will make sure your GPU kicks in to help, the HP zBook x2 stayed relatively quiet thanks to the active cooling fan system that pushes all of the hot air up and out of the magnesium case.

Inside of Resolve 14.3, I performed the same tests on the same Red clip. I was only able to play the Red clip at about 16fps, even at 1/16 debayer quality. Not great, but for a mobile tablet workstation maybe it’s OK, although I would expect more from a workstation. When exporting, the DNxHR HQX 10-bit QuickTime took 2 minutes, and the same clip resized to 110% and color graded also took 2 minutes. The H.264 took 2 minutes and 33 seconds without any color grading or resizing, and it also took 2 minutes and 33 seconds when resized to 110% and color graded. I had all caching and performance modes disabled when performing these tests. I would have thought Resolve would have performed better than Premiere Pro, but in this case Adobe wins.

As a bonus, I happen to have GoPro’s Fusion 360 video camera, so I ran its footage through Fusion Studio, GoPro’s stitching and exporting software. Keep in mind 360 video is a huge resource hog that takes lots of time to process. The 30-second test clip I exported in flat color, with image stabilization applied, took an hour to export. The resulting file was a 1.5GB, 4992×2496, 4:2:2 CineForm 10-bit YUV QuickTime with Ambisonic audio. That’s a big file and a long render in my opinion, although it would also take a long time on many computers.

Summing up
In the end, the HP zBook x2 is a high-end mobile workstation that doubles as a stylus-based drawing tablet designed to be used in apps like Photoshop and even video editing apps like Premiere Pro.

The x2 is profoundly sturdy with some high-end components, like the Intel i7 8th gen processor, Nvidia Quadro M620 GPU, 4K/UHD HP DreamColor touchscreen display and 32GB of RAM.

But along with these high-end components comes a high price: the setup in this review retails for around $3,500, which is not cheap. But for a system that is designed to be run 24 hours a day 365 days a year, it might be the investment you need to make.

Do you want to use the tablet at the office connected to a Thunderbolt 3 dock while also powering a 4K display? The x2 is the only mobile tablet workstation that will do this at the moment. If I had any criticisms of the HP zBook x2, they would be the high cost and the terrible speakers. HP touts the Bang & Olufsen speakers on the x2, but they are not good. My Samsung Galaxy S8+ has better speakers.

So whether you are looking to color correct on the road or have a Wacom-style tablet at the office, the HP zBook x2 is a monster that HP has certified with companies like Adobe through its Independent Software Vendor verifications to ensure your drivers and software will work as well as possible.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: HP’s ZBook Studio G4 mobile workstation

By Brady Betzel

It seems like each year around this time, I offer my thoughts on an HP mobile workstation and how it serves multimedia professionals. This time I am putting the HP ZBook Studio G4 through its paces. The ZBook Studio line of HP’s mobile workstations seems to fit right in the middle between ease of mobility, durability and power. The ZBook 14u and 15u are the budget-series mobile workstations that run Intel i5/i7 processors with AMD FirePro graphics and top out at around $1,600. The ZBook 15 and 17 are the more powerful mobile workstations in the line, with the added ability to include Intel Xeon processors, ECC memory, higher-end Nvidia Quadro graphics cards and more. But in this review, we will look at the ZBook Studio G4, which takes the best of all those models and jams it into a light and polished package.

The HP ZBook Studio G4 I was sent to test out had the following components:
– Windows 10 64 bit
– Intel Xeon 1535M (7th gen) quad-core processor – 3.10GHz with 4.2 Turbo Boost
– 4K UHD DreamColor/15.6-inch IPS screen
– 32GB ECC (2x16GB)
– Nvidia Quadro M1200 (4GB)
– 512GB HP Z Turbo Drive PCIe (MLC)
– 92Whr fast charging battery
– Intel vPro WLAN
– Backlit keyboard
– Fingerprint reader

According to the info I was sent directly from HP, the retail price is $3,510 on hp.com (US webstore). I built a very similar workstation on http://store.hp.com and was able to get the price at $3,301.65 before shipping and taxes, and $3,541.02 with taxes and free shipping. So actually pretty close.

So, besides the natural processor, memory and hard drive upgrades from previous generations, the ZBook Studio G4 has a few interesting updates, including the higher-wattage batteries with fast charge and the HP Sure Start Gen3 technology. The new fast charge is similar to the feature that some products like the GoPro Hero 5/6 cameras and Samsung Galaxy phones have, where they charge quicker than “normal.” The ZBook Studio, as well as the rest of the ZBook line, will charge 50% of your battery in around 30 minutes when in standby mode. Even when using the computer, I was able to charge the first 50% in around 30 minutes, a feature I love. After the initial 50% charge is complete, the charging will be at a normal rate, which wasn’t half bad and only took a few hours to get it to about 100%.

The battery I was sent was the larger of the two options and provided me with an eight-hour day with decent usage. When pushed using an app like Resolve I would say it lasted more like four hours. Nonetheless it lasted a while and I was happy with the result. Keep in mind the batteries are not removable, but they do have a three-year warranty, just like the rest of the mobile workstation.

When HP first told me about its Sure Start Gen 3, I thought maybe it was just a marketing gimmick, but then I experienced its power — and it’s amazing. Essentially, it is a hardware function, available only on systems with 7th-generation Intel processors, that allows the BIOS to repair itself upon identification of malware or corruption. While using the ZBook Studio G4, I was installing some software and had a hard crash (blue screen). I noticed when it restarted that the BIOS was running through the Sure Start protocol, and within minutes I was back up and running. It was reassuring and would really set my mind at ease if I were deciding between a workstation-level solution and a retail-store computing solution.

You might be asking yourself why you should buy an enterprise-level mobile workstation when you could go buy a cheaper, almost-as-powerful laptop at Best Buy or on Amazon. Technically, what really sets workstation components apart is their ability to run 24/7, 365 days a year, without downtime. This is helped by Intel Xeon processors that allow for ECC (error-correcting code) memory; essentially, bits don’t get flipped as they can with non-ECC memory. Or, for laymen like me: ECC memory prevents crashing by fixing errors itself before we see any repercussions.
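If you are curious what “fixing errors itself” looks like, the toy Python sketch below walks through the textbook Hamming(7,4) case: a few extra parity bits are stored alongside the data, and a single flipped bit can be located and flipped back. Real ECC DIMMs typically use a wider code over 64-bit words, but the principle is the same; this sketch is purely illustrative.

```python
# Toy illustration of how ECC-style codes correct a single flipped bit.
# Hamming(7,4): 4 data bits are stored with 3 parity bits; any single-bit
# error can be located and corrected. Purely illustrative.

def encode(d1, d2, d3, d4):
    p1 = d1 ^ d2 ^ d4                    # parity over codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4                    # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4                    # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]  # codeword positions 1..7

def correct(c):
    p1, p2, d1, p3, d2, d3, d4 = c
    s1 = p1 ^ d1 ^ d2 ^ d4               # re-check each parity group
    s2 = p2 ^ d1 ^ d3 ^ d4
    s3 = p3 ^ d2 ^ d3 ^ d4
    error_pos = s1 + 2 * s2 + 4 * s3     # 0 means no error detected
    if error_pos:
        c[error_pos - 1] ^= 1            # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]      # recovered data bits

stored = encode(1, 0, 1, 1)
stored[5] ^= 1                           # simulate a single bit flip in storage
print(correct(stored))                   # prints [1, 0, 1, 1]: silently repaired
```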

Another workstation-level benefit is the environmental testing that HP runs the ZBooks through to certify the equipment as military grade, also known as MIL-810G testing. Essentially, they run multiple extreme-condition tests such as high and low temperatures, salt, fog and even high-vibration testing like gunfire. Check out a more in-depth description on Wikipedia. Finally, HP prides itself on its ISV (Independent Software Vendor) verifications. ISV certification means that HP spends a lot of time working with software vendors like Adobe, Avid, Autodesk and others to ensure compatibility between their products and HP’s hardware so you don’t have to. HP even regularly releases certified drivers that help ensure compatibility.

In terms of warranty, HP gives you a three-year limited warranty. This includes on-site service within the Americas and, as mentioned earlier, it covers the battery, which is a nice bonus. Much like other warranties, it covers problems arising from faulty manufacturing, but not intentional or accidental damage. Luckily for anyone who purchases a ZBook, these systems can take a beating. Physically, the computer weighs in at around 4.6lbs and is 18mm thin. It is machined aluminum that isn’t sharp, but it can start to dig into your wrists when typing for long periods. Around the exterior you get two Thunderbolt 3 ports, an HDMI port, three USB 3.1 ports (one on the left and two on the right), an Ethernet port and a Kensington lock slot. On the right side, you also get a power port — I would love for HP to design some sort of break-away cable like the old MagSafe cables on the MacBook Pros — and there is also a headphone/mic input.

DreamColor Display
Alright, so now I’ll go through some of the post-nerd specs that you might be looking for. Up first is the HP DreamColor display, which is a color-critical viewing solution. With a couple of clicks in the Windows toolbar on the lower right, you will find a colored flower — click on that and you can immediately adjust the color space you want to view your work in: AdobeRGB, sRGB, BT.709, DCI-P3 or Native. You can even calibrate, or back up your own calibration for later use. While most colorists and editors use an external calibrated monitoring solution and don’t strictly rely on the viewing monitor as the color-critical source, using the DreamColor display will get you close to a color-critical display without purchasing additional hardware.

In addition, DreamColor displays can play back true 24fps without frame rate conversion. One of my favorite parts of DreamColor is that if you use an external DreamColor monitor through Thunderbolt 3 (not using an SDI card), you can load your color profile onto the second or third monitor and in theory they should match. The ZBook Studio G4 seems to have been built as a perfect DIT (digital imaging technician) solution for color critical work in any weather-challenged or demanding environment without you having to worry about failure.

Speed & Testing
Now let’s talk about speed and how the system did with speed tests. When running a 24TB (six 4TB drives) G-Speed Shuttle XL with Thunderbolt 3 from G-Technology in RAID-0, I was able to get write speeds of around 1450MB/s and read speeds of 960MB/s with the AJA System Test using a 4GB test file. For comparison, I ran the same test on the internal 512GB HP Z Turbo Drive, which had a write speed of 1310MB/s and a read speed of 1524MB/s. Of course, you need to keep in mind that the internal drive is a PCIe SSD, whereas the RAID is built from 7200RPM drives. Finally, I ran the standard benchmarking app Cinebench R15, which comes from Maxon, the makers of the 3D modeling app Cinema 4D. For those interested, the OpenGL test ran at 138.85fps with a Ref. Match of 99.6%, CPU 470cb and CPU (single core) 177cb, with an MP ratio of 2.65x.

I also wanted to run the ZBook through some practical and real-world tests, and I wanted to test the rendering and exporting speeds. I chose to use Blackmagic’s DaVinci Resolve 14.2 software because it is widely used and an easily accessible app for many of today’s multimedia pros. For a non-scientific yet important benchmark, I needed to see how well the ZBook G4 played back R3D files (Red camera files), as well as QuickTimes with typical codecs you would find in a professional environment, such as ProRes and DNxHD. You can find a bunch of great sample R3D files on Red’s website. The R3D I chose was 16 seconds in length, shot on a Red Epic Dragon at 120fps and UHD resolution (3840×2160). To make sure I didn’t have anything skewing the results, I decided to clear all optimized media, if there was any, delete any render cache, uncheck “Use Optimized Media If Available” and uncheck “Performance Mode” just in case that did any voodoo I wasn’t aware of.

First was a playback test, where I wanted to see at what decode quality I could play back in realtime without dropping frames while performing a slight color correction and adding a power window. For this clip, I was able to get realtime playback in a 23.98/1080p timeline when it was set to Half Resolution Good. At Half Resolution Premium, I was dropping one or two frames. At Full Resolution Premium, I was dropping five or six frames, playing back at around 17 or 18fps. Playing back at Half Resolution Good is actually great playback quality for such a high-quality R3D, with all the headroom you get when coloring a raw camera file instead of a transcode. This is also when the fans inside the ZBook really kicked in. I then exported a ProRes 4444 version of the same R3D clip from RedCine-X Pro with the LUT info from the camera baked in. I played the clip back in Resolve with a light color treatment and one power window with no frames dropped. When playing back the ProRes 4444 file, the fans stayed at a low pitch.

The second test was a simple DNxHD 10-bit export from the raw R3D. I used the DNxHD 175x codec — it took about 29 seconds, which was a little less than double realtime. I then added spatial noise reduction on my first node using the following settings: Mode: Better, Radius: Medium, Spatial Threshold (luma/chroma locked): 25. I was able to playback the timeline at around 5fps and exported the same DNxHD 175x file, but it took about 1 minute 27 seconds, about six times realtime. Doing the same DNxHD 175x export test with the ProRes4444 file, it took about 12 seconds without noise reduction and with the noise reduction about 1 minute and 16 seconds — about 4.5 times realtime. In both cases when using Noise Reduction, the fans kicked on.

Lastly, I wanted to see how Resolve would handle a simple one minute, 1080p, ProRes QuickTime in various tests. I don’t think it’s a big surprise but it played back without dropping any frames with one node of color correction, one power window and as a parallel node with a qualifier. When adding spatial noise reduction I started to get bogged down to about 6fps. The same DNxHD 175x export took about 27 seconds or a little less than half realtime. With the same spatial noise reduction as above it took about 4 minutes and 21 seconds, about 4.3 times realtime.

Summing Up
The HP ZBook Studio G4 is a lightweight and durable enterprise-level mobile workstation that packs the punch of a color-critical 4K (UHD — 3840×2160) DreamColor display, powered by an Nvidia Quadro M1200, and brought together by an Intel Xeon processor that will easily power many color, editing or other multimedia jobs. With HP’s MIL-810G certification, you have peace of mind that even with some bumps, bruises and extreme weather your workstation will work. At under 5lbs and 18mm thin with a battery that will charge 50% in 30 minutes, you can bring your professional apps like DaVinci Resolve, Adobe Premiere and Avid Media Composer anywhere and be working.

I was able to throw the ZBook and some of my Tangent Element color correction panels in a backpack and have an instant color-critical DIT solution without the need for a huge cart — all capable of color correction and transcoding. The structural design of the ZBook is an incredibly sturdy, machined-aluminum chassis that is lightweight enough to easily go anywhere quickly. My only criticisms are that I would often miss the left click of the trackpad, leaving me in a right-click scenario; that the Bang & Olufsen speakers sound a little tinny to me; and, finally, that it doesn’t have a touch bar… just kidding.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Blackmagic’s DaVinci Resolve 14 for editing

By Brady Betzel

Resolve 14 has really stepped up Blackmagic’s NLE game with many great new updates over the past few months. While I typically refer to Resolve as a high-end color correction and finishing tool, this review will focus on the Editing tab.

Over the last two years, Resolve has grown from a high-end color correction and finishing app to include a fully-capable nonlinear editor, media organizer and audio editing tool. Fairlight is not currently at the same level as Avid Pro Tools, but it is still capable, and with a price of free or at most $299 you can’t lose. For this review, I am using the $299 version, which has a few perks — higher than UHD resolutions; higher than 60 frames per second timelines; the all-important spatial and/or temporal noise reduction; many plugins like the new face tracker; multi-user collaboration; and much more. The free version will work with resolutions up to UHD at up to 60fps and still gives you access to all of the powerful base tools like Fairlight and the mighty color correction tool set.

Disclaimer: While I really will try and focus on the Editing tab, I can’t make any promises I won’t wander.

Digging In
My favorite updates to Resolve 14’s Editing tab revolve around collaboration and conforming functions, but I even appreciate some smaller updates like responsiveness while trimming and video scopes on the edit page. And don’t forget the audio waveforms being visible on the source monitor!

With these new additions, among others, I really do think that Resolve is also becoming a workable nonlinear editor much like industry standards such as Avid Media Composer, Adobe Premiere Pro and Apple Final Cut Pro X. You can work from ingest to output all within one app. When connected to a collaborative project there is now bin-locking, sharing bins and even a chat window.

Multicam works as expected, with up to 16 cameras in one split view. I couldn’t figure out how to watch all of the angles in the source monitor while playing down the sequence in the record monitor, so I did a live switch (something I love to do in Media Composer). I also couldn’t figure out how to adjust the multicam after it had been created, say, if the audio was one frame out of sync or I needed to add another angle later on. But the multicam worked and did its job by allowing me to sync by in point, out point, timecode, sound or marker. In addition, you can make the multicam a different frame rate than your timeline, which is handy.

[Editor’s Note: Blackmagic says: “There are a few ways to do that. You can right click on the multicam clip and select ‘open in timeline.’ Or you can pause over any segment of a multicam clip, click on a different angle and swap out the shots. Most importantly, you get into multicam edit mode by clicking on the drop down menu on the lower left hand corner of the source viewer and selecting Multicam mode.”]

Another addition is the Position Lock located in the middle right, above the timeline. The Position Lock keeps all of your clips locked in time in your timeline. What is really interesting about this is that it still allows you to trim and apply other effects to clips while locking the position of your clips in place. This is extremely handy when doing conforms and online passes of effects when you don’t want timing and position of clips to change. It’s a great safety net. There are some more fancy additions like re-time curves directly editable in the timeline. But what I would really love is a comprehensive overhaul of the Title Tool that would allow for direct manipulation of the text on top of the video. It would be nice to have a shortcut to use the title as a matte for other footage for some quick and fancy titling effects, but maybe that is what Fusion is for? The title tool works fine and will now give you nice crisp text even when blown up. The bezier curves really come in handy here to make animations ease in and out nicely.

If you start and finish within Resolve 14, your experience will most likely be pretty smooth. For anyone coming from another NLE — like Media Composer or Premiere — there are a few things you will have to get used to, but overall it feels like Resolve 14’s designers kept the interface familiar for those “older” editors, yet also packed it with interesting features to keep the “YouTube” editors’ interest piqued. As someone who’s partial to Media Composer, I really like that you can choose between a frame view in the timeline and a clips-only view that leaves out thumbnails and waveforms.

I noticed a little bit of a lag when editing with the thumbnail frames turned on. I also saw recently that Dave Dugdale on YouTube found an interesting solution to the possible bug. Essentially, one of the thumbnail views of the timeline was a little slower at redrawing when zooming into a close view in a sequence. Regardless, I like to work without thumbnails, and that view worked fluidly for me.

After working for about 12 minutes I realized I hadn’t saved my work and Resolve hadn’t auto-saved. This is when I remembered hearing about the new “Live Save” feature. It’s a little tricky to find: Live Save lives under the DaVinci Resolve menu > User > Auto Save and is off by default, which I really think should be changed. Turn this function on and your Resolve project will continually save, which in turn saves you from unnecessary conniptions when your project crashes and you try to find the spot that was last saved.

Coming from another NLE, the hardest thing for me to get used to in a new app was the keyboard layouts and shortcuts. Typically, trimming works similarly across apps, and overwrite edits, ripple edits, dissolves and other edit functions don’t change, but the placement of their shortcuts does. In Resolve 14, you can access the keyboard shortcut commands in the same spot as Live Save, under the Keyboard Mapping menu under User. From here you can get grounded quickly by choosing a preset that is similar to your NLE of choice — Premiere, FCP X, Media Composer — or Resolve’s default keyboard layout, which isn’t terrible. If this could be updated to work the way apps like Premiere and Avid design their keyboard layouts, it would be a lot easier to navigate; by that I mean a visual representation of a keyboard that lets you drag your shortcuts onto and off of it in realtime.

Right now, Resolve’s keyboard mapper is text-based and a little cumbersome. Overall, Resolve’s keyboard shortcuts (when in the Editing tab) are pretty standard, but it would do you well to go through basic moves like trimming, trimming the heads and tails of clips, or trimming forward or backward by a set number of frames.

Something else I discovered when trimming: when you go into actual “trim mode,” it isn’t like other NLEs where you can immediately start trimming. I had to click on the trim point with my mouse or pen before I could use keyboard shortcuts to trim. This is possibly a bug, but what I would really love is for “trim mode” to show trimming icons at the A and B sides of the nearest clips on the selected tracks. This would allow you to immediately trim using keyboard shortcuts without any mouse clicks. In my mind, the more mouse clicks I have to use to accomplish a task, the more time I waste. That leaves less time to spend on “important” stuff like story, audio, color, etc. When time equals money, every mouse click means money out of my pocket. [Note from Blackmagic: “In our trim tools you can also enter trim mode by hitting T on the keyboard. We did not put in specific trim tool icons on purpose because we have an all-in-one content sensitive trim tool that changes based on where you place the cursor. And if you prefer trimming with realtime playback, hit W for dynamic trim mode, and then click on the cut you want to trim with before hitting JKL to play the trim.”]

I have always treated Resolve as another app in my post workflow — I wasn’t able to use it all the way from start to finish. So in homage to the old way of working, a.k.a. “a round trip workflow,” I wanted to send a Media Composer sequence to Resolve by way of a linked AAF, then conform the media clips and work from there. I had a few objectives, but the main one was to make sure my clips and titles came over. Next was to see if any third-party effects would translate into Resolve from Media Composer and, finally, I wanted to conform an “updated” AAF to the original sequence using Resolve’s new “Compare with Current Timeline” command.

This was a standard 1080p, 23.98 sequence (transcoded to one mezzanine DNxHD 175x codec with 12-frame handles) with plenty of slates, titles, clips, speed ramps, and Boris Continuum Complete and Sapphire effects. Right off the bat, all of the clip-based media came over fine and in its correct time and place in the timeline. Unfortunately, the titles did not come over and were offline — none of them were recognized as titles, so they couldn’t be edited. Dissolves came over correctly; however, none of the third-party BCC or Sapphire effects came across. I didn’t really expect the third-party effects to come over, but at some point, in order to be a proper conforming application, Resolve will need to figure out a way to translate those when sending sequences from an NLE to Resolve. This is more of a grand wish, but in order to be a force as the all-in-one app for the post finishing circle, this is a puzzle that will need to be solved.

Otherwise, those who want to use alternative nonlinear editing systems will have to continue using their NLE as the editor, Resolve as a color-only solution and the NLE as their finisher. And from what I can tell, Blackmagic wants Resolve to be your last stop in the post pipeline. Obviously, if you start your edit in Resolve and use third-party OpenFX (OFX) like BCC or Sapphire, you shouldn’t have any problems.

Last on my list was to test the new Compare with Current Timeline command. In order for this option to pop up when you right-click, you must be in the Media tab with the timeline you want to compare against loaded. You then need to find the sequence you want to compare from, right-click it and click Compare with Current Timeline. Once you do, a new window will pop up with the option to view the Diff Index. The Diff Index is a text-based list of each new edit, shown next to a timeline view that visually compares the edits between the two sequences. This visual representation is where you apply those changes. There are marks identifying what has changed, and if you want to apply those changes you must right-click and hit Apply Changes. My suggestion is to duplicate your sequence before you apply changes (actually, you should be constantly duplicating your sequence as a backup as a general rule). The Compare with Current Timeline function is pretty incredible. I tested it using an AAF I had created in Media Composer and compared it against an AAF made from the same sequence but with some “creative” changes and trimmed clips — essentially a locked sequence that suddenly became unlocked while in online/color and needed to reflect the latest changes from the offline edit.

I wasn’t able to test Resolve 14 in a shared-project environment, so I couldn’t test a simultaneous update coming from another editor. But this can come in really handy for anyone who has to describe any changes made to a particular sequence or for that pesky online editor that needs to conform a new edit while not losing all their work.

I can’t wait to see the potential of this update, especially if we can get Resolve to recognize third-party effects from other NLEs. Now don’t get me wrong, I’m not oblivious to the fact that asking Resolve engineers to figure out how to recognize third-party effects in an AAF workflow is a pie-in-the-sky scenario. If it was easy, it probably would have already been done. But it is a vital feature if Blackmagic wants Resolve to be looked at like a Flame or Media Composer, but with a high-end color and audio finishing solution built in. While I’m at it, I can’t help but think that Resolve may eventually include Fusion as another tab, maybe as a paid add-on, which would help close the circle of being an all-in-one post production solution.

Summing Up
In the end, Resolve 14 has all the makings of becoming someone’s choice as a sole post workflow solution. Blackmagic has really stepped up to the plate and made a workable and fully functional NLE. And, oh yeah, not to mention it is one of the top color correction tools being used in the world.

I did this review of the editing tab using Blackmagic Design’s DaVinci Resolve 14.2. Find the latest version here. And check out our other Resolve review — this one from a color and finishing perspective.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: G-Tech’s G-Speed Shuttle XL with EV Series Adapters, Thunderbolt 3

By Brady Betzel

As video recording file sizes continue to creep up, so do our hard drive storage size, RAID and bandwidth needs. These days we are seeing a huge amount of recorded media needing to be offloaded, edited and color graded in the field. G-Technology has stepped up to bat with its 24TB G-Speed Shuttle XL with EV Series adapters, Thunderbolt 3 edition — a portable, durable, adaptable and hardware-RAID powered storage solution.

In terms of physical appearance, the G-Speed Shuttle XL is just shy of 10x7x16 inches and weighs in at around 25 pounds. It’s not exactly lightweight, but with spinning hard drives it’s what I would expect. To get a lighter RAID you would need SSD drives, but those would most likely triple the price. The case itself is actually pretty rugged; it seems like it could withstand a minimal amount of production abuse, but again, without being an SSD RAID you have some risk, since spinning disk drives are more susceptible to shock.

The exterior is made out of a hard plastic, and it would have been nice to have the rubberized feel on at least the handle of the Shuttle XL — similar to G-Tech’s Rugged line of drives — but it still feels solid. To open the Shuttle XL, there is an easy-to-access switch on the front. There is a lock switch that took me a few wiggles to work correctly, but I would have loved to see a key lock to add a little more security, since this RAID will most likely house important data. When closing the front door, the slide lock wouldn’t fully close unless I pushed hard a second time; if I didn’t do that, the door would open by itself. On the back are the two Thunderbolt 3 ports, a power cable plug and a Kensington lock slot. On the inside of this particular Shuttle XL are six 4TB 7200rpm enterprise-class HGST (Western Digital) drives configured as RAID-5 by default, plus two EV Series bay adapters.

Since I was on a Windows-based PC, I had to reformat the drives because the unit comes formatted for Mac OS by default. The EV adapters allow for quick connection of memory products like Atomos Master Caddy drives, CFast 2.0 or even Red Mini-Mags. This gives you fast connections on set for transferring and backing up your media without extra card readers. The HGST hard drives are enterprise class, which very simply means the drives are rated to run 24 hours a day, seven days a week, 365 days a year more reliably than standard hard drives. They should do as advertised, but if they don’t, there is a five-year limited warranty backing up this product. Basically, if the RAID or its drives fail due to craftsmanship errors, G-Technology will replace or repair it. Keep in mind, you are responsible for shipping the item back to them, and with the heavy weight of the RAID that may be costly if it goes bad. They will not cover accidental damage or misuse. Another caveat to G-Technology’s limited warranty is that they will not cover commercial use, so if you plan to use the G-Speed on a commercial shoot, you might not be covered. You should contact G-Technology’s support to check whether your use will be covered: 1.888.426.5214 if you are in North or South America, including Canada.

The Shuttle XL comes pre-formatted in a RAID-5 configuration for Mac OS, but it can also be formatted as RAID-0, -1, -6, -10 and -50 using the G-Speed Studio Utility. Here is a quick RAID primer in case you forget the differences (with a quick capacity sketch after the list):

– RAID-0: All drives are used as one large drive. This gives you the fastest RAID performance but no protection.
– RAID-1: Each drive is mirrored to an identical drive in the RAID, so the total amount of usable space is halved. If one drive goes out you will not lose your data, and it will be rebuilt over time once the bad drive is replaced. The speed is slower than RAID-0.
– RAID-5: Needs at least three drives and uses part of each drive to create a safety net if a drive fails. The plus side is that it is faster than RAID-1 and includes a safety net, which is why G-Technology ships this unit in this configuration. You will have about 80% of the disk space usable. The downside is that if a drive goes out, you will have degraded speed until the RAID fully rebuilds itself once the bad drive is replaced.
– RAID-6: Works similarly to RAID-5 but has two safety nets (a.k.a. parity blocks). One drawback of RAID-5 is that if a second drive goes out while the RAID is rebuilding, all data can be lost permanently. RAID-6 adds another safety net, so if two drives go out you can still rebuild your RAID. The downside is that with six drives only about two-thirds of your storage is usable.
– RAID-10: Requires at least four drives and cuts your usable capacity in half. The upside is that if a drive goes out, you will not have degraded speed during the RAID rebuilding process, which depending on the data involved can take multiple days or longer. RAID-10 essentially mirrors a striped RAID.
– RAID-50: Can be thought of as RAID-5 + 0 and needs a minimum of six drives. It’s two RAID arrays each running RAID-5, and in each of those arrays you lose one drive’s worth of usable space. The good news is that you can have a drive in each array go out and still avoid total RAID loss, unlike RAID-5 alone.
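To make those tradeoffs concrete, here is a minimal Python sketch (not G-Technology’s utility, just back-of-the-envelope math) that estimates usable capacity for each level given this unit’s six 4TB drives. The results land a little above the measured capacities in the benchmark list below because drives are sold in decimal terabytes while the operating system reports binary terabytes.

# Rough usable-capacity estimates for the RAID levels above,
# assuming the Shuttle XL's six 4TB drives. Illustrative only.

def usable_capacity(level, drives=6, size_tb=4.0):
    if level == "RAID-0":
        return drives * size_tb            # striping, no redundancy
    if level in ("RAID-1", "RAID-10"):
        return drives * size_tb / 2        # mirrored, half the space
    if level == "RAID-5":
        return (drives - 1) * size_tb      # one drive's worth of parity
    if level in ("RAID-6", "RAID-50"):
        return (drives - 2) * size_tb      # two parity drives' worth
    raise ValueError("unknown RAID level: " + level)

for level in ("RAID-0", "RAID-1", "RAID-5", "RAID-6", "RAID-10", "RAID-50"):
    print(level + ": ~" + str(usable_capacity(level)) + "TB usable")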

Those RAID formatting options are a lot to think about, and frankly, I have to look them up about every year or so. If you didn’t read all about RAID formatting, then you may want to stick with RAID-5, which gives you a nice combination of safety and speed. If you are a risk taker, back up your data regularly, or can survive a total RAID failure, then RAID-0 might be more your style. But in the end, keep in mind that no matter how good your equipment is or how high a level of RAID protection you have, you can always suffer a total RAID failure, so backups are always important.

I tested the 24TB Shuttle XL in each of the available RAID configurations on an HP ZBook Studio G4 with a Thunderbolt 3 interface using the AJA System Test utility. Each test used a 4GB test size, 3840×2160 resolution and the DNxHR 444 codec, since I typically use RAIDs when editing video or motion-based projects, which tend to have larger file sizes. As a caveat, when I lowered the test file size to 1GB the speeds increased tremendously; for instance, in RAID-0 the read/write speeds were 953MB/s/2274MB/s compared to those below. Here are my results, including the total size of the RAID:

RAID-0: 21.83TB – Read: 960 MB/s – Write: 1451 MB/s
RAID-1: 10.91TB – Read: 439 MB/s – Write: 639 MB/s
RAID-5: 18.19TB – Read: 950 MB/s – Write: 1171 MB/s ***
RAID-6: 14.55TB – Read: 683 MB/s – Write: 750 MB/s
RAID-10: 10.91TB – Read: 506 MB/s – Write: 667 MB/s
RAID-50: 14.55TB – Read: 614 MB/s – Write: 933 MB/s

***I pulled a drive while working live in RAID-5, and while the read/write speed degraded to 326MB/s read and 858MB/s write, it continued to function while the RAID rebuilt itself.

For some reference, I also ran the AJA System Test on my local hard drive and got 1606MB/s read and 1108MB/s write, which is pretty fast, so the Shuttle XL is doing well. I also wanted to test real-world copy speed, and using 50GB of CinemaDNG files, here are the results:

RAID-0: 2 mins. 7 seconds (~403 MB/s)
RAID-1: 3 mins. 35 seconds (~238 MB/s)
RAID-5: 2 mins. 49 seconds (~303 MB/s)
RAID-6: 3 mins. 6 seconds (~275 MB/s)
RAID-10: 3 mins. 17 seconds (~260 MB/s)
RAID-50: 2 mins. 45 seconds (~310 MB/s)
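Those MB/s figures in parentheses are just the payload divided by the copy time. Here is the quick math as a sketch (assuming the 50GB payload is counted as 50 x 1,024 binary megabytes), in case you want to sanity-check your own copy tests:

# Convert a timed copy of a known payload into an average MB/s figure.
# Assumes 50GB is counted as 50 x 1024 binary megabytes.

def average_throughput(payload_gb, minutes, seconds):
    return payload_gb * 1024 / (minutes * 60 + seconds)

# The RAID-0 result above: 50GB copied in 2 minutes, 7 seconds
print(round(average_throughput(50, 2, 7)))   # ~403 MB/s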

While these results are pretty self-explanatory, they are especially impressive when configured in RAID-5. If you ran the tests continuously for a day you would probably see some variation in the average, maybe a little higher than what you see here. The speeds are impressive, particularly when you consider that a drive can give out and you can still be running at pretty high bandwidth. Technically, G-Technology states that the Shuttle XL can reach a maximum transfer rate of up to 1500MB/s — which I came close to hitting. This is surprising, since typically these specs are a little like miles-per-gallon ratings on new cars, but not in this case. I really appreciate that accuracy and think it will go a long way with consumers. In terms of other apps, I wasn’t running anything other than the AJA System Test, but I also did not shut down anything in the background, so it is possible some background apps affected these transfer numbers. I wouldn’t say they had a lot of influence, though; you should see similar results.

Summing Up
In the end, the G-Technology G-Speed Shuttle XL with EV Series Adapters Thunderbolt 3 Edition is a great choice for a RAID that might need to stand up to a little abuse in the field. The 24TB version of the Shuttle XL that connects using Thunderbolt 3 has a retail price of $2,799.95 with EV adapters being sold separately for between $99.95 and $199.95. The Shuttle XL is available from 24TB all the way up to 72TB, which will cost you $7,699.95.

If you like the idea of multiple RAID options, including ones that require more than four drives, the Shuttle XL has a decent price, great build quality that should last you for years thanks to its enterprise-class hard drives, and high bandwidth — the only thing better would be to load it with SSDs, but that could cost another $10,000 and would call for another review. Check out the Shuttle XL at G-Technology’s website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Red Giant Trapcode Suite 14

By Brady Betzel

Every year we get multiple updates to Red Giant’s Adobe After Effects plug-in behemoth, Trapcode Suite. The 14th update to the Trapcode Suite is small but powerful and brings significant updates to Version 3 of Particular as well as Form. (Trapcode Form 3 is a particle system generator much like Particular, but instead of the particles living and dying, they stay alive forever as grids, 3D objects and other organic shapes.) If you have the Trapcode Suite from a previous purchase, the update will cost $199; if you are new, the suite costs $999, or $499 with an academic discount.

Particular 3 UI

There are three updates to the suite that warrant the $199 upgrade fee: Particular 3, Form 3 and the Tao 1.2 update. However, you still get the rest of the products with Trapcode Suite 14: Mir 2.1, Shine 2.0, Lux 1.4, 3D Stroke 2.6, Echospace 1.1, Starglow 1.7, Sound Keys 1.1 and Horizon 1.1.

First up is the Tao 1.2 update. Trapcode Tao allows you to create 3D geometric patterns along a path in After Effects. If you do a quick YouTube search of Tao you will find some amazing examples of what it can do. In the Tao 1.2 update Red Giant has added a Depth-of-Field tool to create realistic bokeh effects on your Tao objects. It’s a simple but insanely powerful update that really gives your Tao creations a sense of realism and beauty. To enable the new Depth-of-Field, wander over to the Rendering twirl-down menu under Tao and either select “off” or “Camera Settings.” It’s pretty simple. From there it is up to your After Effects camera skills and Tao artistry.

Trapcode Particular 3
Trapcode Particular is one of Red Giant’s flagship plugins and it’s easy to see why. Particular allows you to create complex particle animations within After Effects. From fire to smoke to star trails, it can pretty much do whatever your mind can come up with, and Version 3 has some powerful updates, including the overhauled Trapcode Particular Designer.

The updated Designer window is very reminiscent of the Magic Bullet Designer window — easy and natural to use. Here you design your particle system, including the look, speed and overall lifespan of your system. While you can also adjust all of these parameters in the Effects window dialog, the Designer gives an immediate visual representation of your particle systems that you can drag around to see how they interact with movement. In addition, you can see any presets that you want to use or create.

Particular 3

In Particular 3, you can now use OBJ objects as emitters. An OBJ is essentially a 3D object. You can use the OBJ’s faces, vertices, edges, and the volume inside the object to create your particle system.

The largest and most important update to the entire Trapcode Suite 14 is found within Particular 3, and it is the ability to add up to eight particle systems per instance of Particular. What does that mean? Well, your particle systems will now interact in a way that lets you add details such as dust or a bright core that can carry over properties from other particle systems in the same instance, adding the ability to create far more intricate systems than before.

Personally, the newly updated Designer is what allows me to dial in these details easily without twirling down tons of menus in the Effect Controls window. A specific use: if you want to duplicate your system and inherit its properties but change the blend mode and/or colors, you simply click the drop-down arrow under the system and click “duplicate.” Another great update within the multiple-particle-system update is the ability to create and load “multi-system” presets quickly and easily.

Now, with all of these particle systems mashed together, you are probably wondering, “How in the world will my system be able to handle all of these when it’s hard to even play back a system in the older Trapcode Suite?” Well, lucky for us, Trapcode Particular 3 is now GPU-accelerated via OpenGL, sometimes allowing for 4x speed increases. To access these options in the Designer window, click the cogwheel on the lower edge of the window, towards the middle. You will find the option to render using the CPU or the GPU. There are some limitations to the GPU acceleration. For instance, when using mixed blend modes you might not be able to use certain GPU acceleration types — it will not reflect the proper blend mode that you selected. Another limitation can be with sprites that are QuickTime movies; you may have to use the CPU mode.

Last but not least, Particular 3’s AUX system (a particle system within the main particle system) has been re-designed. You can now choose custom Sprites as well as keyframe many parameters that could not be keyframed before.

Form 3 UI

Trapcode Form 3
For clarification, Trapcode Particular can create particle emitters that emit particles that have a life, so basically they are born and they die. Trapcode Form is a particle system that does not have a life — it is not born and it does not die. Some practical examples are a ribbon-like background or a starfield. These particle systems can be made from 3D models and can even be dynamically driven by an audio track. And much like Particular’s updated Designer, Form 3 has an updated Designer that will help you build your particle array quickly and easily. Once done inside the Designer, you can hop out and adjust parameters in the Effects panel. If you want to use pre-built objects or images as your particles, you can load those as Sprites or Textured Polygons and animate their movement.

Another really handy update in Trapcode Form 3 is the addition of the Graphing System. This allows you to animate controls like color, size, opacity and dispersion over time.

Just like Particular, Form reacts to After Effects’ cameras and lights, completely immersing its creations into any scene that you’ve built. For someone like me, who loves After Effects and the beauty of creations from Form and Particular but who doesn’t necessarily have the time to create from scratch, there is a library of over 70 pre-built elements. Finally, Form has added a new rendering option called Shadowlet rendering, which adds light falloff to your particle grid or array.

Form 3

Summing Up
In the end, the Trapcode Suite 14 has significantly updated Trapcode Particular 3 with multiple particle systems, Trapcode Form 3 with a beautiful new Designer, and Trapcode Tao with Depth-of-Field, all for an upgrade price of $199. Some Trapcode Particular users have been asking for the ability to build and manipulate multiple particle systems together, and Red Giant has answered their wishes.

If you’ve never used the Trapcode Suite, you should also check out the rest of the mega-bundle, which includes apps like Shine, 3D Stroke, Starglow, Mir, Lux, Sound Keys, Horizon and Echospace, here. And if you want more in-depth rundowns of each of these programs, check out Harry Frank’s (@graymachine) and Chad Perkins’ tutorials on the Red Giant News website. Then immediately follow @trapcode_lab and @RedGiantNews on Twitter.

If you want to find out more about the other tools in the Trapcode Suite check out my previous two-part review of Suite 13 here on postPerspective: https://postperspective.com/review-red-giants-trapcode-suite-13-part-1 and https://postperspective.com/review-red-giant-trapcode-suite-13-part-2.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Boxx’s Apexx 4 7404 workstation

By Brady Betzel

The professional workstation market has been blown open recently, with companies like HP, Apple, Dell, Lenovo and others building systems containing i3/i5/i7/i9 and Xeon processors, and with AMD’s recent re-entry into the professional workstation market with its Ryzen line of processors.

There are more options than ever, and that’s a great thing for working pros. For this review, I’m going to take a look at the Boxx Technologies Apexx 4 7404, which the company sent me to run through its paces over a few months, and it blew me away.

The tech specs of the Apexx 4 7404 are:
– Processor: Intel i7-6950X CPU (10 cores/20 threads)
– One core is overclocked to 4.3GHz while the remaining nine cores can run at 4.1GHz
– Memory: 64GB DDR4 2400MHz
– GPUs: Nvidia Quadro P5000 (2560 CUDA cores, 16GB GDDR5X)
– Storage drive: NVMe Samsung SSD 960 (960GB)
– Operating system drive: NVMe Intel SSDPEDMW400 (375GB)
– Motherboard: ASUS X99-E WS/USB3.1

On the front of the workstation, you get two USB 3.0, two USB 2.0, audio out/mic in, and on the rear of the 7404 there are eight USB 3.0, two USB 3.1, two Gigabit Ethernet, audio out/mic in, line in, one S/PDIF out and two eSATA. Depending on the video card(s) you choose, you will have some more fun options.

This system came with a DVD-RW drive, which is a little funny these days but I suppose still necessary for some people. If you need more parts or drives there is plenty of room for all that you could ever want, both inside and out. While these are just a few of the specs, they really are the most important, in my opinion. If you purchase from Boxx all of these can be customized. Check out all of the different Boxx Apexx 4 flavors here.

Specs
Right off the bat you will notice the Intel i7-6950X CPU, which is a monster of a processor and retails for around $1,500 just by itself. With its hefty price tag, this Intel i7 lends itself to niche use cases like multimedia processing. Luckily for me (and you), that is exactly what I do. One of the key differences between a system like the Boxx workstation and ones from companies like HP is that Boxx takes advantage of the X- or K-series Intel processors and overclocks them, getting the most from your processors while still backing them with Boxx’s three-year warranty. The 7404 has one core overclocked to 4.3GHz, which can sometimes provide a speed increase for apps that don’t use multiple cores. While that doesn’t apply in a lot of cases, it doesn’t hurt to have the extra boost.

The Apexx 4 case is slender (at 6.85 inches wide) and quiet. Boxx embraces liquid cooling to keep your enterprise-class components made by companies like Samsung and Intel running smoothly. Boxx systems are built and fabricated in Texas from aircraft-grade aluminum parts and steel strengthening components.

When building your own system, you might pick a case because the price is right or it is all that is available for your components (or it is what pcpartpicker.com tells you will fit). That can mean giving up build quality and accepting potentially bad airflow. Boxx knows this and has gone beyond just purchasing other companies’ cases — they forge their own workstation case masterpieces.

Boxx’s support is based in Austin — no outsourcing — and their staff knows the apps we use, such as Autodesk, Adobe and others.

Through Its Paces
I tested the Apexx 4 7404 using Adobe Premiere Pro and Adobe Media Encoder since they are really the Swiss Army knives of the multimedia content creation world. I edited together a 10-minute UHD (3840×2160) sequence using XAVC MP4 footage I shot on a Sony a6300. I did a little color correction with the Lumetri Color tools, scaled the image up to 110% and exported the file through Media Encoder. I exported it as a 10-bit DNxHQX, UHD, QuickTime MOV.

It took seven minutes and 40 seconds to export to the OS drive (Intel) and about six minutes and 50 seconds to the internal storage drive (Samsung). Once I hit export, I finally got the engines to rev up inside the Boxx; the GPU fans seemed to kick on a little. They weren’t loud, but you could hear a light breeze start up. On my way out of Premiere, I exported an XML to give me a head start in Resolve for my next test.

My next test was to import my Premiere XML into Blackmagic’s Resolve 14 Studio and export with essentially the same edits, reproduce the color correction and apply the same scaling. It took a few minutes to get Resolve 14 up and running, but after doing a few uninstalls, installing Resolve 12.5.6 and updating my Nvidia drivers, Resolve 14 was up and running. While this isn’t a Boxx problem, I did encounter it during my testing, and since someone else might run into the same issue, I wanted to mention it.

I then imported my XML, applied a little color correction, and double checked that my 110% scaling came over in the XML (which it did), and exported using the same DNxHQX settings that I used in Premiere. Exporting from Resolve 14 to the OS drive took about six minutes and 15 seconds, running at about 41 frames per second. When exporting to the internal storage drive it took about six minutes and 11 seconds, running between 40-42 frames per second. For those keeping track of testing details, I did not cache any of the QuickTimes and turned Performance Mode off for these tests (in case Blackmagic had any sneaky things going on in that setting).

After this, I went a little further and exported the same sequence with some Spatial Noise Reduction set across the entire 10-minute timeline using these settings: Mode: Better; Radius: Medium; Spatial Threshold: 15 on both Luma and Chroma; and Blend: 0. It ran at about nine frames per second and took about 25 minutes and 25 seconds to export.
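As a rough sanity check on those render times, export duration is just the sequence’s frame count divided by the render speed. Here is a small sketch, assuming a 23.976fps timeline (the sequence’s actual frame rate isn’t stated above, so treat the output as a ballpark), which lands in the same neighborhood as the measured 25 minutes and 25 seconds for the noise-reduction pass:

# Estimate export time from sequence length, timeline fps and render fps.
# The 23.976fps timeline rate is an assumption for illustration.

def export_minutes(sequence_minutes, timeline_fps, render_fps):
    total_frames = sequence_minutes * 60 * timeline_fps
    return total_frames / render_fps / 60

# 10-minute sequence with spatial noise reduction rendering at ~9fps
print(round(export_minutes(10, 23.976, 9), 1))   # ~26.6 minutes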

Testing
Finally, I ran a few tests to get some geeky nerd specs that you can compare to other users’ experiences to see where this Boxx Apexx 4 7404 stands. Up first was the AJA System Test, which tests read and write speeds to designated disks. In addition, you can specify different codecs and file sizes to base the test on. I told the AJA System Test to run using the 10-bit Avid DNxHQX codec, a 16GB file size and a UHD frame size (3840×2160). I ran it a few times, but the average was around 2100/2680 MB/sec write and read to the OS drive and 1000/1890 MB/sec write and read to the storage drive.

To get a sense of how this system would hold up to a 3D modeling test, I ran the classic Cinebench R15 app. OpenGL was 215.34 frames per second with a 99.6% ref. match, the CPU scored 2121cb and the CPU (single core) scored 181cb with an MP ratio of 11.73x. What the test really showed me, when I Googled Cinebench scores to compare mine against, was that the Boxx Apexx 4 7404 was at the top of the heap in all categories. Specifically, it was within the top 20 for overall render speed, beaten only by systems with more cores, and placed in the top 15 for single-core speed — and the OpenGL result is pretty incredible at over 215fps.

Summing Up
In the end, the Boxx Apexx 4 7404 custom-built workstation is an incredible powerhouse for any multimedia workflow. From rendering to exporting to transcoding, the Boxx Apexx 4 7404 with dual Nvidia Quadro P5000s will chew through anything you throw at it.

But with this power comes a big price: the 7404 series starts at $7,246! The price of the one I tested lands much further north, though, at just under $14,000 — those pesky Quadros bump the price up quite a bit. But if rendering, color correcting, editing and/or transcoding is your business, Boxx will make sure you are up and running and chewing through every gigabyte of video and 3D modeling you can run through it.

If you have any problems and are not up and running, their support will get you going as fast as possible. If you need parts replaced, they will get them to you fast. Boxx’s three-year warranty, which is included with your purchase, includes next-day on-site repair for the first year, but that is a paid upgrade if you want it to continue for years two and three. But don’t worry: if you don’t upgrade your warranty, you still have two years of great support.

In my opinion, you should really plan for the extended on-site repair upgrade for all three years of your warranty — you will save time, which will make you more money. If you can afford a custom-built Boxx system, you will get a powerhouse workstation that makes working in apps like Premiere and Resolve 14 snappy and fluid.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: OWC’s USB-C dock

By Brady Betzel

Whether you have a MacBook Pro with only one USB 3.1 Gen 1 port (a.k.a. USB-C) or a desktop PC and aren’t fond of reaching around the back of your tower to plug in peripherals, you’ll need a dock. At first you might think a dock isn’t necessary, but it is. With the popularity of the USB-C connection you can use one single cable to plug in your dock and connect with many different devices, including HDMI, Mini DisplayPort, SD cards and multiple other USB connected devices.

OWC has a reputation for high-quality, Mac-focused products like external RAID storage solutions, OWC-branded SSDs, memory upgrades and even refurbished Mac OS-based systems. One of the company’s latest products is the USB-C dock, which is compatible with both Mac OS- and Windows-based computer systems. The OWC USB-C dock comes in two versions: Mini DisplayPort and HDMI. Otherwise, the ports are identical.

On the front of the dock is an SD card reader, a 3.5mm headphone/microphone combo port and a high-powered USB 3.1 Gen 1 port. On the back are three USB 3.1 Gen 1 ports (one of them another high-powered charging port), one USB 3.1 Gen 1 Type-C port, Gigabit Ethernet, a USB 3.1 Gen 1 connection for your system, an HDMI or Mini DisplayPort port, and the DC power connection. The docks come in four colors that match Apple’s MacBook Pros: space gray, silver, gold and rose gold. The Mini DisplayPort version costs $148.75, and the HDMI version costs anywhere from $127.99 to $148.75.

What I really love about the USB-C dock from OWC, aside from the abundance of ports, is the addition of high-powered charging ports. I have a Samsung Galaxy S8+ phone, which can charge at a high speed with ports like these, so having them on the dock is extra handy. Besides the S8+, other electronics like the GoPro Hero 5 Black Edition can benefit from these ports.

Where the USB-C dock will really shine is in an environment where you don’t want to carry around all your peripherals and you use a newer MacBook Pro that features USB 3.1 Gen 1 Type-C connections. Keep in mind that the dock will supply up to 60W of power for your computer in addition to the 20W for other peripherals, so if your computer needs more than 60W to charge it may charge slowly or not at all.

For us desktop users, the USB-C dock expands our connections by adding multiple USB ports, an HDMI connection and even a Gigabit network adapter, all within close reach instead of having to stretch around the back of your workstation.

The HDMI port supports connection to HDMI 1.4b-enabled displays or televisions, and a high-speed HDMI cable is required for display resolutions of 1080p or higher. Most HDMI cables these days are high-speed; you can even find AmazonBasics high-speed HDMI cables for $7.99.

As mentioned earlier, the OWC USB-C dock is compatible with both Windows and Mac OS systems, but a driver is required if using the Gigabit Ethernet port on Mac OS X system 10.10 and 10.11. You can find that driver here.

In a Windows-based environment you will not have to update the Ethernet driver, but in both Mac OS and Windows environments if you have the HDMI version, you will need to install the following firmware update.

Summing Up
Out of selfishness, I wish there was one more USB port on the back of the USB-C dock to host my four Tangent Element color correction panels, each of which has its own USB connection. Instead, I have one poking out of the front. In addition, it would be nice to have a Thunderbolt 1/2 port on the dock for my legacy Thunderbolt-connected RAIDs; instead, I will have to buy an additional adapter. Other than those two suggestions, the dock is awesome and works great. It measures just over an inch tall, 3.5 inches wide and just under 8 inches long. It weighs .9 lbs and comes with a power supply that is actually heavier than the dock, plus what I think is a way-too-short Type-C cable measuring about 20 inches. Obviously, for those using the dock with a laptop this is sufficient, but for those using this dock with a tower, something triple that length is needed.

The USB-C dock comes with a two-year limited warranty, which in simple terms means that if anything goes wrong with the product because of bad manufacturing, they will fix or replace it. They will not cover your data or shipping, so keep that in mind.

The dock’s manual features tips, like the Type-C USB 3.1 port between the traditional USB ports and the Ethernet port is for data and power only; it will not support video signals or video adapters. In addition, this dock is not compatible with Apple’s USB-C Digital AV multiport adapter or USB-C VGA multiport adapter. There are plenty of other usage notes you will want to read, so make sure that you check the manual out before you use the dock.

If you not only want a dock, but also want to update an older MacBook Pro, OWC has some great SSD and memory bundles.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Røde Mic’s Stereo VideoMic Pro Rycote

By Brady Betzel

There are a ton of microphone types out there, and when it comes to adding an external mic to a camera, you need to know what kind of mic works best in different situations. For interviews, you most likely want a microphone that picks up audio directly in front of it. If you are recording a music performance or ambience, you may want something that records in multiple directions without concentrating on any one specific location. Here I wanted to talk about Røde Microphones, which makes many of the on-camera mics you see used by YouTube content creators, podcasters and filmmakers.

The way a microphone picks up audio is typically described as a polar pattern. The most common polar patterns in microphones are Cardioid, Super Cardioid and Omnidirectional. A Cardioid microphone will record mostly audio directly in front of it; a Super Cardioid will record mostly audio in the front, but also some directly in the rear; and an Omnidirectional will pick up audio equally from all directions. To further complicate the matter, if you record in stereo you can have a pickup pattern like X-Y Cardioid.
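If you want to see why a cardioid “focuses” forward, the standard first-order polar pattern is sensitivity = a + (1 - a) x cos(angle), where a = 1 is omnidirectional and a = 0.5 is cardioid. That is the generic textbook formula, not Røde’s published specs, but a quick sketch makes the falloff obvious: a cardioid is at full sensitivity on axis, half at 90 degrees and nearly nothing directly behind.

# Generic first-order microphone polar pattern (textbook formula,
# not Røde's published specs): sensitivity = a + (1 - a) * cos(theta).
# a = 1.0 is omnidirectional, a = 0.5 is cardioid.
import math

def sensitivity(a, degrees):
    return a + (1 - a) * math.cos(math.radians(degrees))

for angle in (0, 90, 180):
    print(angle, "deg  cardioid:", round(sensitivity(0.5, angle), 2),
          " omni:", round(sensitivity(1.0, angle), 2))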

So after all that description of microphone polar patterns (you are welcome), here I’m going to focus on the Røde Stereo VideoMic Pro Rycote, which has an X-Y Cardioid polar pattern. You may be familiar with Røde’s famous VideoMic Pro, which is a great Super Cardioid, dual-mono microphone. The VideoMic Pro would be a great mic for recording a discussion outdoors, focusing on the conversation but also picking up a little of the ambience. The Stereo VideoMic Pro Rycote uses its X-Y Cardioid pattern to record a stereo scene without focusing on a smaller specific area. In plain language, the Stereo VideoMic Pro Rycote is great for recording ambience or performances. It is not the mic you want to use for an interview; that would be a job for the VideoMic Pro or, better yet, a directional mic like the Røde NTG series, which gives an intense focus on the area it is pointed at.

Capturing Ambience Outdoors
So why would someone want the Stereo VideoMic Pro Rycote? If you record scenes outdoors, like at a baseball or soccer game, where you want to generally focus your audio on an area but also catch the surrounding ambience of the crowd… or at an outdoor group performance. Something I see missing a lot in videography is ambience in timelapses and b-roll; the Stereo VideoMic Pro Rycote is exactly the on-camera mic you need to grab some great stereo ambience and instantly add life to any b-roll shot. If you record traffic b-roll with cars crossing the camera from left to right or right to left, the stereo recording will convey the traffic’s movement in a much more natural way, engaging the viewer more than a mono recording, which will not convey movement.

For all of the technical nerds out there, the Røde Stereo VideoMic Pro Rycote features a frequency range of 40Hz ~ 20,000Hz, which can be limited with the built-in high-pass filter that cuts anything below 75Hz; a matched pair of ½-inch condenser capsules in an X-Y stereo configuration; and a three-position level control (-10dB, 0dB and +20dB). It uses a single 9V battery, measures 4.5 x 3.1 x 5.3 inches, connects using a 3.5mm stereo mini-jack and weighs only .26 lbs.
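For context on what that level control actually does, decibels map to linear amplitude gain by gain = 10^(dB/20); that is the standard audio relationship, not anything Røde-specific. So the +20dB setting is a 10x boost in amplitude, which is why it helps so much with quiet in-camera preamps.

# Convert the mic's level-control settings from decibels to linear gain.
# gain = 10 ** (dB / 20) is the standard amplitude relationship.

def db_to_gain(db):
    return 10 ** (db / 20)

for db in (-10, 0, 20):
    print(str(db) + "dB -> " + str(round(db_to_gain(db), 2)) + "x amplitude")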

The Stereo VideoMic Pro Rycote is an updated version of the original Stereo VideoMic Pro, and it includes the nearly indestructible Rycote Lyre shock mount, a new condenser capsule setup, a new Kevlar-reinforced braided cable and an improved foam windscreen. Best of all, it comes with a 10-year warranty (a one-year warranty plus nine additional years when you register your microphone). If you read any online comments about Røde microphones, you will notice one common theme: great customer service. More often than not you will see someone talk about how they needed an additional piece or how their warranty ran out, but Røde still helped them out free of charge.

Putting it to the Test
To test the Røde Stereo VideoMic Pro Rycote, I tried a few different scenarios, including some outdoor scenes where I showcased stereo and mono recording. You can check out my YouTube video. I recorded everything on a Blackmagic Pocket Cinema Camera, which is notorious for having low audio recording levels. Luckily, the Stereo VideoMic Pro Rycote has the three-position level control, and I was able to boost the recording by +20dB with very little noise. Otherwise my levels would have had to be boosted when editing, and that would definitely have introduced more noise than I would have liked. While I did pick up a little bit of handling noise when recording, the Rycote Lyre shock mount limited the little movements that other mics without this mount would have picked up. In terms of noise recorded by the mic, you can hear a little bit, but nothing you couldn’t easily take out if it bothered you.

In the end, the Røde Stereo VideoMic Pro Rycote is a high-level external microphone that will add production value to any camera recording. It is priced between $279 and $300, depending on the retailer. Visit here to find a retailer near you, one that will make sure you can take advantage of the tremendous warranty and customer service Røde offers. I also noticed that many retailers are including the Dead Kitten wind muff free with purchase, for when you are recording in a windy area.

When you listen to the difference between recording stereo with the Røde Stereo VideoMic Pro Rycote vs. recording mono on a mic like the Røde VideoMic Go, you can really feel your b-roll opening up. It adds a great level of depth to what would ordinarily be straight up the middle audio with no sense of left or right panning.

The Røde Stereo VideoMic Pro Rycote has incredible recording and build quality that will add depth to any footage you film. If you record outdoor b-roll, performances or any other non-interview-type footage, the Røde Stereo VideoMic Pro Rycote is going to be a vital piece of equipment to have in your bag.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Fangs Film Gear’s Wolf Packs and Panther lens bags

By Brady Betzel

Summertime is the perfect time to make sure you have the right gear bags to throw your cameras, memory cards and lenses in. Fangs Film Gear is a brand from the company Release the Hounds Studios. They also have other products, like Ground Control color correction LUTs, Wave Brigade royalty-free sound effects and ambience, and a podcast called Video Dogpound.

I found out about them when I was watching some music videos for inspiration and slowly fell down a rabbit hole that led me to a great organization called Heart Support. It’s basically a group that lends a helping hand to people having a hard time. I found co-owner Casey Faris’ YouTube page, where he has some awesome and easily digestible video editing and color correction tutorials — mainly on Blackmagic Design Resolve. You can find his co-owner Dan Bernard’s YouTube page here. They were promoting some of their LUTs from Ground Control and also some gear bags, which I’m now reviewing.

Fangs Film Gear Panther Lens Bags
These are black, weather-resistant drawstring lens bags lined with lens-quality microfiber cloth. They come in three sizes: small for $19.99, medium for $22.99 and large for $24.99. You can also buy one of each for $64.99. Once you touch them, you will immediately feel the durability on the outside and the softness a lens demands on the inside. When using these bags you will always have a great lens cloth nearby.

Not only have I been toting around my lenses in these bags, but they’ve made great GoPro carrying bags, since the GoPro’s lens is constantly exposed. The small bag works with a compact DSLR lens and is perfect for something like the Nifty Fifty Canon 50mm; it’s about the size of an iPhone 7 Plus or my Samsung Galaxy S8+. Even the Blackmagic Pocket Cinema Camera fits well — the bag measures 5x7 inches when lying flat. The medium bag measures 6x8 inches and is good for multiple GoPros or a medium-sized lens like my Micro Four Thirds Lumix 14-140. The large measures 6.5x9.25 inches and is obviously great for a longer lens, but in a rush I sometimes keep my Blackmagic Pocket Cinema Camera in it with the 14-140 lens attached. All the bags are weather resistant, meaning you can splash some water on them and it won’t get through. However, since they are drawstring bags, water can still get in through the top.

If you are looking for a GoPro-specific gear bag they also carry something called The Viper, a GoPro-focused sling bag. And if you are a DJI Mavic owner, they sell a two-pack of Panther bags that will fit the remote and Mavic — it looks essentially like the small and large Panther bags.

Fangs Film Gear Tactical Production Organizers
These are called Wolf Packs but I like to call them sweet dad bags. Not only do they have a practical production purpose, but they are great for dads who have to carry baby stuff around but want a little more stylish look.

So first the production purpose of the Wolf Packs. It’s really genius and simple: one side is green for your charged batteries or unused memory cards, and the red side is for depleted batteries and used memory cards. No more worrying about which cards have been used, or having to try and label a bunch of MicroSD cards with some gaffers tape. Now for the dad use of the Wolf Packs — green for the clean diapers and red for the used diapers! If you’ve ever used cloth diapers you may be a little more familiar with this technique.

The Wolf Packs are ultra durable and haven’t shed a stitch since I’ve used them in production scenarios, and even Disneyland dad scenarios. The zippers are extremely sturdy, but what impressed me the most were the included carabiner and carabiner grommet on the Wolf Packs themselves. The grommet is very high quality and won’t rip. The carabiner itself isn’t of rock climbing grade but will do for almost any situation you will need it for. The clip makes these bags easy to attach anywhere but specifically my backpack.

Inside the Wolf Pack is a durable fabric that isn’t the same as the Panther lens bags, so do not clean your lenses with these! The pockets are made to stand up to the abuse of throwing batteries in and out all day long. I would love to see one of these with the microfiber lining like the Panther bags, but I also see the benefit of keeping the two separate. The Wolf Packs break down like this: the small is 6.5x5.5 inches for $29.99, the medium is 8.25x7 inches for $34.99 and the large is 9x9 inches for $39.99. You can also purchase all three for $99.99.

Summing Up
I’ve definitely put these bags through the wringer over an extended period of time to make sure they hold up. I am particularly concerned about things like zippers, stitching and cinches, so just a month or so of testing won’t give you a great sample. Over multiple months I’ve taken the Panther lens bags and Wolf Packs hiking in the Simi Valley mountains — running into rattlesnakes along the way — with GoPros, batteries, memory cards, lenses, Blackmagic Pocket Cinema Cameras and much more.

I even lightly dropped some of the bags with GoPros and BMPCCs in them into water and found no damage. I’ve really come to love the lens bags, especially when I need a quick lens cleaning and I know I always have one with me. The Wolf Packs are something I constantly keep with me: great for shoots where I need to change out batteries and memory cards, but also great for kid snacks, chapstick and sunscreen. Without hesitation I would order these again; the fabric and stitching are top-notch. I had my wife, who really likes to sew and make clothing, take a look at them, and she was really impressed with the Wolf Packs… so much so that one is now missing.

Check them out at their website www.fangsfilmgear.com, Twitter @FangsFilmGear and their main company site. Finally, if you are interested in some positivity you should check out www.heartsupport.com.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Sony’s a6300 E-mount camera

By Brady Betzel

It’s fair to say that the still and motion camera market isn’t boring. Canon and Nikon have been the huge players in the market, especially a few years ago when Canon introduced the landscape-changing 7D and full-frame 5D cameras. The 5D was the magic camera for the filmmaking community. Over the years, other companies have been breaking the 5D mold, like Blackmagic with its Pocket Cinema Camera, but once the filmmaking community started lusting for higher frame rates at greater than 1920x1080 resolution, along with a Log or Log-like color space, the field began to really open up.

It seems that’s when Sony started taking the prosumer camera market seriously and doubled down on the Sony a6000 mirrorless E-mount camera, which eventually led to the 4K (technically UHD) recording-capable a6300 and a6500.

Once people started seeing the images and video that the APS-C-based a6300 produced, mixed with the awesome low-light capabilities and wide dynamic range using picture profiles like S-Log2/3, Sony had a bona fide hit on its hands. And if you still want a bigger sensor than the a6300’s crop sensor, you have the full-frame A7SII and A7RII (and, hopefully, soon the A7R/S III).

In this review, I am going to cover the Sony a6300 and explain why it’s a good value for anyone looking to make some great content, or even just have top-notch 4K home videos. The image fidelity that comes from the Sony a6300 is truly incredible. It’s a little hard to quantify for me, but I think that the Sony a6300 has a look from the sensor that is superbly unique to a handheld camera. The Sony a6300 delivers a top-notch product for around $949.99 (not including lenses) or $1049.99, which includes a 16-50mm lens… but more on pricing later.

Technically, the Sony a6300 is a mirrorless camera with an interchangeable E-mount lens system. Since this is an APS-C crop-sensor camera, it is not full frame. The sensor will record images at 24.2 megapixels and up to UHD (3840x2160) resolution when recording video, and since I work in video, I am focusing on that aspect of the a6300. It records in the Sony-created xvYCC color space — essentially an extended-gamut color space that allows for more saturation but is compatible with the existing YCC color space. Short answer: more saturation. It accepts Sony Memory Stick Duo or SD memory cards to record on, but do some research on your memory card, as not all will allow for UHD recording at the full 100Mb/s. In movie mode you have an ISO range of 100-25600, which really shines at the high end when filming in low light.

In terms of video recording formats, the Sony a6300 stays in the family with XAVC S, AVCHD and MP4, all of which are 8-bit out of the camera. Keep in mind that if you edit a lot of footage in XAVC- or AVCHD-based codecs, your computer will need to be on the higher end and/or you will want to create proxy media to edit with before finishing and color correcting. The XAVC and AVCHD codecs allow pretty good quality video to be recorded, but they really stress editing systems because of the way interframe codecs work. If you notice your system can’t play back your clips, it might be time to think about transcoding them to a more edit-friendly codec like ProRes, Cineform or DNxHR.
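If your system chokes on the XAVC S files, one common approach (a sketch, assuming ffmpeg is installed and on your PATH, and with hypothetical folder names) is to batch-transcode the camera’s MP4s into an intraframe proxy codec like ProRes Proxy before editing, then relink or reconform to the camera originals at the end.

# Batch-transcode hypothetical a6300 XAVC S .MP4 clips into ProRes Proxy
# QuickTimes for smoother editing. Assumes ffmpeg is on your PATH.
import pathlib
import subprocess

source = pathlib.Path("a6300_card/clips")   # hypothetical source folder
dest = pathlib.Path("proxies")
dest.mkdir(exist_ok=True)

for clip in sorted(source.glob("*.MP4")):
    out = dest / (clip.stem + ".mov")
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "prores_ks", "-profile:v", "0",   # 0 = ProRes Proxy
        "-c:a", "pcm_s16le",                      # uncompressed audio
        str(out),
    ], check=True)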

When recording in XAVC S 4K/UHD (3840×2160) you can shoot in multiple frame rates and bitrates — 24p @ 100/60Mbps; 30p @ 100/60Mbps; and in XAVC S HD (1920×1080), 60p/30p/24p @ 50Mbps and 120p @ 100/50Mbps, as well as many other options — but for this review those are the ones that really matter. The real beauty of the Sony a6300 is the ability to shoot in a Log color space, which in very basic terms gives you a grayish, flat-looking image that allows for advanced color correction in post production because there is more information to pull out of the shadows and highlights (aka dynamic range).
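
If you want a feel for why Log footage looks so flat, here is a toy Python example. To be clear, this is a generic logarithmic encode for illustration only — it is not Sony’s actual S-Log2/S-Log3 math — but it shows how a log curve lifts shadow values so more of the sensor’s range survives an 8-bit recording.

```python
# Toy illustration of why Log footage looks "flat": a logarithmic encode lifts
# shadows and compresses highlights so more dynamic range fits in 8 bits.
# This is a generic curve for illustration, NOT Sony's actual S-Log2/S-Log3 math.
import math

def generic_log_encode(linear, a=50.0):
    """Map linear scene light (0..1) to an encoded video value (0..1)."""
    return math.log1p(a * linear) / math.log1p(a)

for stops_below_white in (6, 4, 2, 0):          # scene values in stops under clip
    linear = 1.0 / (2 ** stops_below_white)
    encoded = generic_log_encode(linear)
    print(f"{stops_below_white} stops down: linear {linear:.3f} -> encoded {encoded:.3f}")
# Deep shadows that would crush toward black under a standard gamma get pushed
# well up the code-value range, which is why the ungraded image looks washed out.
```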

S-Log 2 split screen.

To enable the Log color spaces, find the Picture Profile menu under menu five and select PP7, PP8 or PP9. This is where you will find the Gamma menu, with S-Log2 or S-Log3 enabled by default. There are more options, but the next one that concerns a lot of people is the Color Mode, which can be changed to S-Gamut, S-Gamut3.Cine, S-Gamut3 and more. These are a little tricky, and my best piece of advice is to try each combination in different lighting environments, like sunset, a bright blue sky with gradations and low light, to see which works best for each situation. I noticed I got a good amount of noise in S-Log3/S-Gamut3.Cine, but I really tried to push the low light in that mode. I fixed the excessive shadow noise in color correction using Red Giant’s Magic Bullet Suite Denoiser — read my review.

I noticed S-Log2 left me a little more detail in the highlights, while S-Log3 gave a little more detail in the shadows; that may have just been my experience, but that is what I noticed. In addition, when shooting in S-Log2/3 I noticed some macro-blocking/banding in shots that had color gradations, like a blue sky turning into white or even very bright lights — this will look like square digital artifacts or bands arcing across the gradient. I even saw a dead pixel flash when shooting some really low-light footage. The real test is to watch this footage on a huge TV or an output monitor above 32 inches, because that is when you will really start to see the noise and banding that is present. I did some testing with noise removal, and with a little bit of noise-removal elbow grease you can get a great picture. Overall, I am very impressed with the Log-type images I was able to pull out of the a6300 and how well they held up in color correction. Typically, a camera that can pull this type of image would cost $5,000-$6,000 or more, plus lenses, so the a6300 is a steal.

After all that S-Log talk you might be asking, “What if I just want to shoot great video and not worry about Logs and Gamuts?” Well, you can set the Picture Profile to 1-6 and get a great image with little to no color correction needed. Specifically, Picture Profile 1 is really the automatic setting to use; it is described by Sony as being the “Movie Gamma,” which basically means your video will look good.

For more descriptions on the Picture Profiles of the Sony a6300, check out their help guide. You will need to test out all of the Picture Profiles though as they all have different characteristics, such as more detail in the shadows but less accurate color in the highlights. Just something to take a few hours and test out.

More Cool Stuff
The internal microphone on the a6300 is OK, but it probably shouldn’t be used as your primary audio source. I would suggest something like the Røde VideoMic Pro. Unfortunately, without being able to monitor your audio over headphones, you will definitely need to test your external microphone ahead of time to check whether you need a preamp, or whether something like the VideoMic Pro’s +20dB boost will be enough or too loud.

One thing that really stuck out to me was how fast the autofocus is on the Sony a6300. I am used to a Canon EOS Rebel T2i, and the Sony a6300 is lightning fast, almost instantaneous. It really impressed me. I was visiting Disneyland while I had the a6300 and was taking some stills and video around the park. I went to take a picture of my son, but the a6300 caught a bubble in the autofocus and very clearly took a picture of that bubble instead. It was accidentally incredible.

In addition to the camera, Sony let me borrow a few lenses when I tested out the a6300, including the 50mm f1.8 ($249.99), E 35mm f1.8 ($449.99) and E PZ 16-50mm f3.5-5.6 ($349.99). While the 35mm and 50mm are great, I felt that the 16-50mm zoom lens did the job for me overall. In low-light situations it definitely helped to have the f1.8 prime lenses in my bag, but during daylight, and even dusk, the zoom lens was great. However, when taking portraits or footage where I wanted a nice bokeh background, the prime lenses were what I reached for.

If I were going to buy this camera for myself, I would weigh the idea of spending a little more money and grabbing a really nice lens, whether it be a prime or a zoom. The only problem with that is most of the upgraded lenses are for full-frame cameras, which brings me to my next point: Would I just go all the way to a full-frame Sony A7RII or A7SII? In my mind, if I have enough money to get a full-frame camera, I do it. The quality, in my eyes, is far superior. However, you are going to be paying an extra $1,000 to $2,500, depending on the lenses and whether you buy new or used. So a middle ground might be to buy full-frame lenses, like the G Master series, for the a6300. This way, when you find the right Sony body you don’t have to upgrade lenses, since the full-frame lenses will work on the a6300. Keep in mind you will have a crop factor of 1.5, which means a full-frame 50mm lens will frame like a 75mm lens on this body. That might be more confusing than helpful, but it is a constant fight for Sony a6300 owners after they see what the Sony cameras can do. Another option is to take a look at Craigslist or eBay and see if anyone is selling a used a6300, A7SII or A7RII. I did a cursory search when writing this article and found a Sony a6300 with four lenses and extra accessories for $1,300, and another a6300 for sale with one lens for $700, so there are options for used cameras at a great price.
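
If the crop-factor math trips you up, here is a tiny Python sketch of the field-of-view equivalence; the focal lengths are just examples.

```python
# Field-of-view equivalence for full-frame lenses on the a6300's APS-C sensor.
# The focal lengths below are just example primes, not a shopping list.
CROP_FACTOR = 1.5  # Sony APS-C

for focal_length_mm in (16, 35, 50, 85):
    equivalent = focal_length_mm * CROP_FACTOR
    print(f"A full-frame {focal_length_mm}mm lens frames like a {equivalent:.0f}mm lens here")
# A 50mm prime frames like a 75mm would on full frame, which is why "normal"
# primes start to feel like short portrait lenses on this body.
```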

So what didn’t I like about the Sony a6300? There is no headphone jack to monitor your audio. That’s a big one, but one possible workaround is the micro-HDMI port: if you use an external monitoring solution, like an Atomos or SmallHD monitor, you will be able to use its audio monitoring. Also, I just can’t get used to Sony’s menu and button setup. Maybe it’s because I’ve been used to Canon’s menu, button and wheel setup for a while, but Sony’s layout throws me off. I feel like I have to go into one or two extra menus before I get to the settings I want.

Summing Up
The bottom line is that the Sony a6300 is an incredible UHD-capable camera that can be purchased with a lens for around $1,000. It lacks things like proper audio monitoring, but it gives you great control over your color correction when filming in S-Log2 or 3, and with a little noise reduction you will have clean low-light footage in the palm of your hand.

There is a newer version of this camera in the a6500, which has the following upgrades over the a6300 — 5-axis image stabilization, touchscreen LCD (can swipe to change focus on an object or touch to set focus) and improved menu system. The a6500 costs $1,399.99 for the body only. The image stabilization is what really sells the a6500 since you can now use any lens (with adapters) that you want while still benefitting from image stabilization. Either way, the a6300 is the best bet to get a great UHD-capable camera at a great price, especially if you can find someone selling a used one with a bunch of lenses and batteries. The video that comes from the Sony line of cameras is unmistakable, and will add a level of professionalism to anyone’s videography arsenal.

You can see my Sony a6300 Slow Motion SLog 2/SLog 3 test as well as my UHD tests on YouTube.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

My top workstation accessories

By Brady Betzel

As a working video editor, I’m at my desk and on my computer all day. So when I get home, I want my personal workstation to feel as powerful as possible, and having the right tools to support that experience is paramount.

I’m talking workstation accessories. I’ve put together a short list based on my personal experience. Some are well known, while some are slightly under the radar. Either way, they all make my editing life easier and more productive.

They make my home-based workstation feel like a full-fledged professional edit suite.

Wacom Intuos Pro Medium
In my work as an offline editor, I started to have some wrist pain when I used a mouse in conjunction with my keyboard. That is when I decided to jump head first into using a Wacom tablet. Within two weeks, all of my pain went away, and I felt I had way more control over drawing objects and shapes. I specifically noticed more precision when drawing lines and shapes with Bezier handles inside apps like Adobe’s After Effects and Photoshop.

In addition, you can program minimal macros on the express keys on the side of the tablet. While the newest Wacom Intuos Pro Medium tablet costs a cool $349.95, it will pay for itself with increased efficiency and, in my experience, less wrist pain.

Genelec 8010A Studio Monitors
One workstation accessory that will blow you away is a great set of studio monitors. Genelec is known for making some great studio monitors, and the 8010A is a set I wish I could get. These monitors are small — around 8 inches tall by 4 inches deep and 4 inches wide — but they put out some serious power at 96dB.

Don’t be fooled by their small appearance; they are a great complement to any serious video and audio power user. They connect via XLR, so you may need to get some converters if you are going straight out of your workstation without running through a mixer. These speakers are priced at $295 each; they aren’t cheap, but they are another important accessory that will further turn your bay into a professional suite.

Tangent Element & Blackmagic Resolve Color Correction Panels
If you work in color correction, or aspire to color correct, color correction panels are a must. They not only make it easier for you to work in apps like Blackmagic DaVinci Resolve, but they free your mind from worrying about where certain things are and let your fingers do the talking. It is incredibly liberating to use color correction panels when doing a color grade — it feels like you have another arm you can use to work.

The entire set of Tangent Element Panels costs over $3,300, but if you are just getting started, the Tangent Element Tk (just the trackballs) can be had for a little over $1,100. What’s nice about the Tangent panels is that they work with multiple apps, including Adobe Premiere, FilmLight Baselight, etc. But if you know you are only going to be using Resolve, the Resolve Micro or Mini panels are a great deal at under $1,000 and $3,000, respectively.

Logitech G13 Advanced Gameboard
This one might sound a bit odd at first, but once you do some research you will see that many professional editors use these types of pads to program macros for multiple button pushes or common tasks. Essentially, this is a macro pad that has 25 programmable keys as well as a thumb-controlled joystick. It’s a really intriguing piece of hardware that might be able to take the place of your mouse in conjunction with your keyboard. It is competitively priced at only $79.99 and, with a little Internet research on liftgammagain.com, you can even find forums full of users’ custom mappings.

Logickeyboard Backlit Keyboard
Obviously, the keyboard is one of the most used workstation accessories. One difficulty is trying to work with one in a dark room. Well, Logickeyboard has a dimmable backlit keyboard series for apps like Resolve and Avid Media Composer.

In addition to being backlit, they also have two powered USB 2.0 ports that really come in handy. These retail for around $140, so they are a little pricey for a keyboard but, take it from me, they will really polish that edit suite.

OWC USB-C Dock
With ports on Mac-based systems being stripped away, a good USB-C dock is a great extension to have in your edit suite. OWC offers a Mini-DisplayPort or HDMI-equipped version in the colors that match your MacBook Pro, if you have one.

In addition, you get five USB 3.1-compatible ports — including a high-powered charging port and a USB Type-C port — a Gigabit Ethernet port, a front-facing SD card reader, a combo audio in/out port and a Mini-DisplayPort or HDMI port. These retail for under $150.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

 

Review: The GoPro Karma drone

By Brady Betzel

It seems like every week there is a remarkable update to drone technology or the introduction of a completely new drone. From DJI, GoPro, Yuneec or even Parrot, there are a lot of drones to choose from.

I’ve reviewed the DJI Phantom 4 drone and it was awesome. There were a few issues I had with the Phantom 4, like wanting a higher data rate for the footage and a smaller form factor — DJI answered both of those requests with the DJI Mavic Pro and their more recent small drone Spark. So what would GoPro’s Karma drone offer that DJI could not? To be honest, I wasn’t sure GoPro could rise up to DJI’s level. But the GoPro Karma is actually pretty different from the DJI Phantom.

Last year, GoPro sent me up to Squaw Valley in Northern California for the unveiling of the GoPro Hero 5 and Karma Drone. I’ve written about the Hero 5 line of cameras on this site and I still think the Hero 5 is a top-notch camera. If you follow tech news you probably saw that GoPro had to recall the first version of the Karma Drone due to the power from the battery disconnecting mid-flight. So that wasn’t good, but GoPro found a solution by adding a latch to the battery compartment to keep it in place. So here I am with the new and improved GoPro Karma Drone.

The GoPro Karma drone can be purchased in a few different configurations: $1,099 for the entire package, including the Hero 5 Black camera, Karma grip and Karma drone; $799 for the Karma grip and Karma drone (no camera); and $599 for the Flight Kit, which includes just the drone (you have to supply the Karma grip and camera). You can buy it here. If you have a Hero 4 you can purchase that camera harness for $29.99.

Unpacking the Box
When you buy the complete Karma with Hero 5 Black kit, you get the drone itself, a slick carrying case that doubles as a backpack, a charger, a battery, an all-in-one controller (no need for a phone), the Karma grip and the Hero 5 Black Edition. When I opened the box, I immediately charged the batteries on each component: the camera, the Karma grip, the controller and the Karma drone battery. It’s a lot of things to charge, so make sure you have enough outlets. The charger that comes with the kit will charge a battery as well as a USB-C-connected device, like the Karma drone controller or the Hero 5 Black Edition itself.

My advice is to let everything charge overnight if you can contain yourself. If you can’t, then let everything charge for an hour or so. You should at least be able to get a few minutes of flight time. If nothing else get that Karma controller plugged in and run through the built-in Flight Simulator and Learn to Fly apps; they will at least get you comfortable with the Karma and how it operates.

You do not need the Karma drone powered on like you do with the DJI Phantom to access the flight simulator, so you can pretty much start practicing immediately. After you master the flight simulator, do some research and check your local drone laws. One day you might have to register your drone, or you might not. It’s a constantly changing landscape of drone laws right now and you don’t want to get into trouble or accidentally hurt someone, so checking out the FAA website is a good place to start your research.

After you have your entire GoPro package charged: insert the Karma drone battery completely into the drone body; insert just the stabilizer from the Karma grip into the drone body and lock it into place (you can pack the Karma grip handle for later); spin on the propellers and tighten them with the supplied tool; unfold the landing gear and legs; press the power button on the drone; and press the power button on the remote — now you are flying. One of the first things I noticed when putting the Karma drone together was that I definitely liked the way the Phantom 4’s propellers connected to the drone more than the way the Karma’s attach.

Before leaving the ground, I got into the routine of setting my Hero 5 video settings (when I remembered). Personally, I think the video-settings sweet spot on the Hero 5 Black Edition is a resolution of 2.7K at a frame rate of 60fps to allow for smooth slow motion when editing (slow-motion drone footage, when done right, always seems to make people say “wow”). In terms of ProTune, I set the appropriate white balance and knocked down EV compensation to -1 when the sun is out or the clouds are bright. Knocking the EV down helps retain detail in bright whites. Think of it like built-in sunglasses (or digital ND filters). And 2.7K at 60fps seems to be a pretty happy medium in terms of quality vs. storage space on the Hero 5.
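
If the 2.7K/60fps recommendation seems arbitrary, the slow-motion math behind it is simple. Here is a small Python sketch — the timeline frame rates are just common examples, and the EV note at the end is basic exposure math, not a GoPro spec.

```python
# Slow-motion factor you get by conforming 60fps footage to a slower timeline,
# plus a reminder of what -1 EV does to exposure. Timeline rates are just examples.
capture_fps = 60

for timeline_fps in (23.98, 29.97):
    slowdown = capture_fps / timeline_fps
    print(f"60fps conformed to {timeline_fps}p is about {slowdown:.2f}x slow motion")

# -1 EV compensation halves the exposure (one stop): the "built-in sunglasses" above.
ev = -1
exposure_multiplier = 2 ** ev
print(f"{ev} EV = {exposure_multiplier:.2f}x the metered exposure")
```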

Keep in mind that the data rate of your video will stay between 50Mb/s and 60Mb/s no matter what resolution you use on the Hero 5. Logically, that means 4K will stretch that data rate thinner, leaving you with a bigger image but fewer bits of detail per pixel. Hopefully, GoPro will ramp up their data rates and look at another codec, like H.265 or maybe a new Cineform codec, in their next release; everyone would really appreciate the extra image detail and color. And while I’m at it with suggestions, I wouldn’t mind seeing some 10-bit 4:2:2 recording — I know that is wishful thinking.
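
Here is a rough Python illustration of why a fixed data-rate budget spreads thinner at 4K. The resolutions are nominal and the bits-per-pixel figure is only a crude yardstick, since H.264 doesn’t spend bits evenly across a frame, but it makes the trade-off obvious.

```python
# Crude bits-per-pixel comparison at a fixed ~60 Mb/s budget.
# H.264 doesn't allocate bits uniformly, so treat this strictly as a yardstick.
DATA_RATE_MBPS = 60
FPS = 30

resolutions = {"1080p": (1920, 1080), "2.7K": (2704, 1520), "4K": (3840, 2160)}

for name, (width, height) in resolutions.items():
    bits_per_frame = DATA_RATE_MBPS * 1e6 / FPS
    bits_per_pixel = bits_per_frame / (width * height)
    print(f"{name}: {bits_per_pixel:.2f} compressed bits per pixel")
# The 4K frame gets roughly a quarter of the per-pixel budget of 1080p,
# which is the "bigger image but less detail per pixel" trade-off above.
```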

Like most drones, the Karma batteries didn’t last all day. They were lasting between 18 and 24 minutes, depending on wind conditions. The heavier the wind gusts, the more your Karma will try to compensate to stay level, which will drain your batteries fast. Mine started to get between 13 and 14 minutes with medium wind gusts. A second battery is definitely worth it.

The Remote
Arguably my favorite part of the Karma drone, besides the actual drone, is the remote. The screen could be a little brighter outside but it looks good: it is a touchscreen, and it is very comfortable to hold. I never really liked the way the DJI Phantom remote felt, but the GoPro Karma remote feels awesome. In my opinion, the GoPro Karma drone remote is the best drone remote I’ve used. Besides controlling the settings of your GoPro camera from the remote, you can run the flight simulator and access any maps you have downloaded as well as the automatic flight settings.

You get four auto shot paths: Dronie, Cable Cam, Reveal and Orbit. Dronie starts off either close to the operator and flies up and out, or the reverse. Cable Cam has you set two points and will fly between those points. Reveal starts with the camera pointed down and slowly pans up to reveal the horizon. Orbit will circle an object you pre-determine. With all of these paths, you set the start and end points as well as the speed and distance they travel. Once you tell these auto shot paths to begin, you can more easily control other parts of the drone, such as camera movement and orientation, as well as speed. They are awesome to play with and make great opening or closing shots for a movie.

When you are running low on battery, the Karma will automatically return to the home base where you took off. This is something you need to keep in mind when flying, because the GoPro Karma does not have collision avoidance; if you hit return to home, or it does it on its own, it could fly straight into power lines or something like a tree… and that will not go well. But when you are ready to fly back to your home base, or onto the top of your Karma carrying case (which makes a great launch pad), you can hit “Return to You” or “Return to Launch.” If you’ve walked away and you want your Karma to come to where the remote is, “Return to You” is what you want to hit. Again, this is when you need to be aware of what obstacles are in the Karma’s path.

On one of my outings, I went hiking about a half mile from where I live in the hills of Simi Valley, California. It was a few months back, when the hills were lush green from the recent rain, but it was warm. I had the Karma backpack on and was walking up a narrow path for about 15 minutes, chanting, “Please no snakes, please no snakes” in my head. Well, lo and behold, Mr. Snake popped his head out from the side of the trail and said hi. It was probably a rattlesnake, and it wasn’t mad, but we have tons of those in the hills around Ventura County. Nonetheless, I got out of there without any footage. If I’d had my wits about me, I probably could have gotten a decent shot with the Karma Grip… nope. Let’s be real. I was out of there faster than the Flash.

I did discover that if you leave your Hero 5 in the Karma stabilizer while plugged into the drone or the Karma Grip, it will drain your Hero 5 battery. So take it out while it’s in storage. I really don’t have much to criticize in the Karma drone, but my wish list would include proximity sensors for collision avoidance, a higher data rate for the Hero line of cameras, which may come in their next release of the Fusion camera, and possibly a smaller form factor.

Summing Up
In the end, you won’t really understand how fun drones are to fly until you get your hands on one. Drone filmmaking is not easy; it takes time to get beautiful shots. Think about it: there are camera operators who make a good living getting great shots. So don’t beat yourself up if it takes a few outings just to get comfortable flying a drone around while keeping everyone and everything safe.

However, once you get past the initial paranoia when flying a drone, you can get some unique shots that you may never have thought were possible. In my opinion, the GoPro Karma is the easiest and overall best drone to use. While it may not have all of the collision avoidance that the Phantom or Mavic have, it has an ease of use that is unrivaled.

The controller is so easy to use. My wife, who doesn’t really care about drones and would rather sew, was able to pick it up and fly within 20 minutes. That definitely wasn’t possible on the Phantom. In addition, being able to pull out the Karma stabilizer and attach it to the Karma grip within minutes is a game changer for someone running around the beach or hiking in the mountains.

If you are already a fan of the GoPro products, the Karma drone and grip are definite items to add to your shopping cart. GoPro even sent me the Karma Grip extension cable to play with. You can use it to stash the grip handle away from the stabilizer and then use a chest mount, or even the mount on the strap of the GoPro Seeker backpack, bringing stabilization everywhere.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Zylight’s IS3 LED lights

By Brady Betzel

I see a lot of footage from all over the world captured on all sorts of cameras and shot in good and bad lighting conditions. Besides camera types and lenses, proper lighting is consistently an area that needs the most attention.

If you troll around YouTube, you will see all sorts of lighting tutorials (some awful, but some outstanding) — some offer rundowns on what lighting you can get for your budget, from clamp-style garage lights with LED bulbs that can be purchased at your local Lowe’s, to a standard three-piece lighting kit, to the ever-trendy Kino Flo lights. There are so many choices that it’s hard to know what you should be looking at, or even why you are choosing something like LED over tungsten or fluorescent.

In this review, I am going to go over the Zylight IS3/c LED light. The “c” in IS3/c stands for the Chimera softbox, which can be purchased with the light.

Recently, I have really been interested in lighting, and a few months back Zylight sent me the IS3/c to try out. Admittedly, I am not a world-famous DP or photographer with extensive experience in lighting. I know my way around a mid-level lighting setup and can get my way through a decent-looking three-light setup, so my apologies if I don’t touch on the difference between daylight and tungsten footcandle output. Not that footcandles aren’t an interesting subject, but they can take a while to figure out and are probably best left to a good Lynda.com tutorial — or, better yet, a physics class on optics and lighting like the one I took in college.

Diving In
The Zylight IS3/c comes with the light head itself, a yoke bar with a 5/8-inch baby pin adapter, some knobs and washers, an AC adapter and hanging pouch, a safety cable, a guide and the Chimera softbox (if you purchased the IS3/c package). Before reading the manual, which would have been the proper thing to do, I immediately opened the box and plugged in the light. It lit up the whole interior of my house at night — think Christmas Vacation when Clark plugged in the Christmas lights (good movie). I saw, in one second, how I could immediately paint a wall (or all of my walls) with the IS3.

The beauty of LED lights is that they are typically lightweight and some can reproduce any color you can dream of while staying cool to the touch. So I wanted to see if I could paint a 15-foot wall chromakey green. With little effort I switched into color mode by flipping the rocker switch on the back of the light, turned the Hue knob until I hit green, and adjusted the saturation to 100% to try and literally paint my wall green with light. It was pretty incredible and dead simple.

The IS3 has a 90-degree beam angle on center and a 120-degree beam angle total (I found multiple specs on this, like 95/115 degrees, so this is approximate), has a maximum power consumption of 220 watts, can be purchased in black or white and is made in the USA. The IS3 has two presets for white light and two presets for color. In white mode, the IS3 can output any color temperature between 2500K and 10,000K, adjustable in 50K steps. Because LEDs are known for giving off a green tint, there is an adjustment knob to lower or raise the green. There is also a dimmer knob that allows for dimming with little color shift. In color mode, there are three adjustments: hue, saturation and dimming.

One of the big features of the IS3, and Zylight lights in general, is the built-in wireless transmitter that can talk to the Zylink bridge and the Zylink iOS app. You can link multiple lights together and control them simultaneously. With the iOS app you can set hue values and even color presets like crossfade, strobe, police and flame. You can run the Zylight from either the AC adapter or a rechargeable battery. The outside of the light is built sturdy, with a rubberized front and a metal back that doubles as a heat sink. In addition to the Zylink wireless connection, you can use the DMX connection to connect to and control the Zylight.

In the end, the Zylight IS3/c is the soft light as well as wall wash light that I’ve been dreaming of. I was even thinking I could use the IS3 as Christmas lights. I could get a couple IS3s to paint the house red and green.

The Zylight is as easy to configure as any light I have ever used; unfortunately, the price doesn’t match its ease of use. It’s pricey. The IS3/c is currently listed on Adorama.com for $2,699, and the IS3 alone is $2,389. But you get what you pay for — it’s a professional light that will run 50,000 hours without needing calibration, weighs 11 pounds and measures 18.5” x 10.75” x 1.9” — and you will most likely never need to replace it.

If you run a stage show and need to control multiple lights with multiple color combinations quickly, the Zylink wireless bridge and iOS app may be just for you.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Red Giant Magic Bullet Suite 13, Part 2

By Brady Betzel

In Part 1 of my review of Red Giant’s Magic Bullet Suite 13, I went through Magic Bullet Looks 4 as well as Colorista IV. Both are color correction and grading plug-ins that are compatible with Adobe’s After Effects and Premiere Pro and Apple’s Final Cut Pro X and Motion.

For this review, I will cover the rest of the Magic Bullet Suite 13, including Denoiser III, Mojo II, Cosmo II, Renoiser and Magic Bullet Film. Denoiser III is my definite favorite, by the way.

Denoiser III
In my opinion, Denoiser III is one of the standouts in the Magic Bullet Suite 13 plug-in package because of its magical ability to remove noise quickly and easily. Noise reduction is typically a very long render because of the sheer complexity that is involved in the process. Denoiser III has been rewritten and now adds near realtime playback in most cases; the better the discrete graphics card you have, the better your playback will be. You can check out graphics card compatibility here.

Denoiser 3 – Looks and Denoiser applied.

The options are limited and I believe we are better off for it. A lot of denoise plug-ins have an abundance of options when, in reality, unless you are an online editor nerd like me or a colorist, you probably don’t have time to mess around with the different noise removal options and render each time. Denoiser III has five options: Reduce Noise, Smooth Colors, Preserve Detail, Sharpen Amount and Sharpen Radius.

When removing noise from footage, remember that the more you crank up your noise reduction, the longer your render time will be, and you will also start to lose detail in your image. Occasionally, you will denoise and sharpen an image and think that it looks a little too cleaned up. This is when you will want to jump into Magic Bullet Renoiser or apply your own film grain or noise to the footage. Another tip is to always place Denoiser III first in your effects chain. You can even put Magic Bullet Looks 4 after Denoiser III, apply your look in Looks and then jump back into Denoiser III to adjust your noise after the correction.

When testing in Premiere Pro, I imported some Sony a6300 S-Log3/S-Gamut3 footage I had lying around. I had filmed a close-up of my wife sewing, with just the light of the sewing machine to light the scene, and I shot it in slow motion at 1920×1080, 23.98fps. I exported 15 seconds of the “raw” footage from Premiere Pro via Adobe Media Encoder as a 1920×1080, 50Mb/s H.264, which took 33 seconds. With Denoiser III applied, it took 46 seconds, roughly a 40% increase in time. With Denoiser III and Magic Bullet Looks 4 applied, the export took the same 46 seconds. After I was done exporting, I saw that all of the H.264 exports with Denoiser III applied were corrupt and unusable in a traditional sense. In another sense, they were pretty awesome. Either way, be sure to check out the link to graphics card compatibility earlier in this review — it should be strictly adhered to. In my head, I thought, “Yeah right, I know you say you need an Nvidia or AMD GPU, but I’m sure my Intel…” Nope. Didn’t work. So check out the compatibility before jumping in head first like me. Aside from that one hiccup, the screen grabs I took will show you how well Denoiser III works. I tried Denoiser III on a much higher-end system with a discrete Nvidia Quadro card, and it worked just as described.

Denoiser III either comes bundled in the Magic Bullet Suite 13 for $899 or can be purchased separately for $199.

Renoiser
As you have read, I just pulled a lot of noise out with Denoiser III, but I definitely want to put back just a touch of texture and noise with Renoiser. A common problem when removing noise from footage is that you are left with an overly processed look from the smoothing. Typically, you can somewhat fix that with a little sharpening and/or by adding back some sort of film grain.

Denoiser and Renoiser

Magic Bullet Renoiser will allow you to add GPU-accelerated film-style grain and/or digital-style noise to your footage in Premiere Pro, After Effects, Final Cut Pro X, Motion, Blackmagic’s Resolve, Edius and HitFilm. You can find specific version compatibility info for these apps with Renoiser here.

Typically, to add noise or grain back to your footage you would purchase stock film grain from somewhere like www.rampantdesigntools.com or gorillagrain.com. Both companies have some great grain offerings, so you should check them out, but stock grain is applied over the footage with a blending mode, like overlay. What Renoiser does is apply the grain and noise as part of your moving image; it’s not an overlay. If you zoom into your image after applying one of the Renoiser presets you will see the actual picture being pseudo-recreated with the grain and/or noise.

Under the Renoiser plug-in you get some quick sharpening options that are handy, even though it’s not a sharpening plug-in. There are 16 presets, including 8mm and 16mm, as well as grain settings like size, color channels, monochrome options and even Tonal Range adjustments to dial in your highlight, midtone and shadow grain work. While this plug-in is GPU-accelerated, it does work with Intel GPUs, so it did not give the same errors on export that I got with Denoiser III.

For my tests, I liked the preset Big Kahuna for the sunset shot I had. This was shot on the Sony a6300 in S-Log3/S-Gamut3, but this time with a UHD 3840×2160 clip in a 1920×1080 timeline, lending itself to a good amount of noise in the shadows. First I added Denoiser III, then Colorista IV to do some balancing and add some saturation and then I finished it off with Renoiser.

I really cranked up Renoiser to show off its work, but adding noise is usually an exercise in subtlety. Typically, you add noise to give an overall cohesion to your film or, conversely, a deliberate disruption. For my footage, even though I cranked Renoiser way up, it really didn’t overdo it, which is nice; it seems like Red Giant allows you to go all out without destroying your footage with too much artificial grain or noise.

While watching Stu Maschwitz’s (@5tu) Renoiser tutorial I picked up a great tip that I had never thought about — when sending your project through a second compression service, like YouTube, you may notice some of your footage can get artifacting such as banding (rings in things like sunsets or gradients). Stu suggests that because your footage may be too smooth, the compressor can cheat a bit and not fully process your footage, leaving certain areas with banding. A workaround can be to add a light amount of noise to your footage to ensure that the compressor processes your footage completely. In Renoiser, there are a couple of presets like Light Noise, Image Vitamins, and Compression Proofing that might help in getting past those issues.
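
If you want to see Stu’s trick in its rawest form, here is a tiny NumPy sketch of the idea: add a whisper of noise to a frame before encoding so the compressor can’t over-smooth a gradient into bands. The noise strength is an arbitrary example of mine, not a Renoiser preset value.

```python
# Sketch of "compression proofing": add a tiny amount of noise to a frame so a
# downstream encoder doesn't smooth gradients into visible banding.
# The noise strength here is an arbitrary example, not a Renoiser setting.
import numpy as np

def add_light_noise(frame_8bit, strength=1.5, seed=None):
    """frame_8bit: HxWx3 uint8 array. Returns a lightly dithered copy."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, strength, frame_8bit.shape)
    noisy = frame_8bit.astype(np.float32) + noise
    return np.clip(noisy, 0, 255).astype(np.uint8)

# Example: a synthetic sky gradient that would band badly after heavy compression.
gradient = np.tile(np.linspace(60, 200, 1920, dtype=np.uint8), (1080, 1))
frame = np.dstack([gradient, gradient, gradient])
dithered = add_light_noise(frame, strength=1.5, seed=42)
```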

To test the rendering/exporting power of Renoiser, I made a 30-second, 50Mb/s H.264 QuickTime from my UHD media, downscaled to HD through Media Encoder. With Magic Bullet Colorista IV and Renoiser applied to the 30-second clip, it took three minutes and four seconds to export. Without Renoiser, but with Colorista IV, it took one minute and 30 seconds — so Renoiser roughly doubled the export time. Keep in mind I am on a slower Intel integrated GPU, so if you have something like an Nvidia GTX 1080, your results will probably be significantly better. However, adding and removing noise is an intensive process, so that is definitely something to remember.

At first I wasn’t sure if I would be pumped on using Renoiser, since there are so many options out there for adding noise to footage, but I have been convinced. The quality of noise generation and options to personalize it are outstanding. Using Denoiser III, Looks 4 or Colorista IV and Renoiser seems to be a great combo when finishing your project. In the future, I would like to see some more options in the preset category, but with the 16 presets there now, and even an option for a custom preset of your own, you have plenty to choose from.

Mojo II
Out of all the Magic Bullet Suite 13 plug-ins, Mojo II is the one that will provide an instantly recognizable look with one click. Mojo II basically pulls the orange-and-blue trick while adding a nice contrast curve. A common trend in color correction is to cool off the shadows with blue and warm up the skin tones/midtones with orange; this is a very popular look from Michael Bay films, hence the preset “Optimus” inside the Mojo II presets.

Mojo II

To begin, you need to specify whether you have footage that is video, flat, Log or Log Pro. Essentially, Log Pro is footage shot with high-end cameras like the Alexa. But you’ll need to experiment, because these are essentially a starting LUT and you have control over what looks best. For instance, I brought in some more Sony a6300 S-Log3/S-Gamut3 footage, and at first I thought regular old Log would do it, but Log Pro was actually the right fit. There are 15 presets in Mojo II, including Optimus, Light, Mojo and War. Applying a preset really seems like the best way to start in Mojo II.

Since you probably aren’t going to dive too heavily into manual correction, the presets do most of the heavy lifting; from there, there are 13 adjustment options in Mojo II. A few important ones are: Mojo, which essentially lets you customize the amount of orange and blue that goes into a shot; Punch It, which is contrast; Bleach It; Fade It; and Corrections, which allows for exposure adjustments and other important footage-correcting options.

To test my export speed, I used my handy YouTube-friendly preset: 1920×1080 50Mb/s H.264. Exporting a 30-second clip took 30 seconds without Mojo II and one minute and one second with Mojo II applied. So, like the other plug-ins, Mojo II took me about double realtime for the render. At $99, Mojo II is the fastest way to take your footage and give it that orange and blue Hollywood-style look. I was pleasantly surprised at how well the plug-in worked. Immediately, I thought of how someone not familiar with color correction could apply Magic Bullet Suite Mojo II and have a great color grade without the hassle of diving deep into color correction tools. Even if you need to do a little cleanup, there are options like the Skin Tone overlay to get your skin colors right.

Cosmo II
One of the most underused color correction methods is skin correction. A lot of people will color correct for a wrong skin tone, or color cast in a shot, but most will not do beauty work. Why? Because it’s not so easy. A lot of times you have to pull a color key of the skin you want to correct, do a light blur, re-sharpen, re-noise and hope the talent doesn’t move their head too much or you will be tracking as well. With Red Giant’s Cosmo II you can easily select a skin tone to balance, remove lines or even attempt to remove blemishes. Skin correction is a very tough skill to master; there is a delicate balance between overly corrected and not corrected enough.

Cosmo II

With Cosmo II, you can select the skin tone of the subject you want to correct with the eyedropper and adjust how far outside of your color selection you want Cosmo to go with the tolerance and offset controls. Further down the effects menu are two other categories of options: Skin Smoothing and Skin Color. Typically, in skin correction you might see someone go overboard with the softening (or blurring). One way Red Giant combats bad skin correction is with adjustments like Preserve Detail, Preserve Contrast and even Restore Noise. Used in concert, they can achieve some great wrinkle removal while allowing some of the authentic contours of the skin to stay intact via Preserve Detail.

Under the Skin Color menu you can fix things like blotchy colors with Skin Yellow/Pink and Skin Color Unify. Much like the other Red Giant Magic Bullet Suite 13 plugins, you can also enable the “Show Skin Overlay,” which throws an orange grid over your skin tone selection to help guide you towards a proper skin color. Nothing is better, however, than the human eye when using a properly calibrated monitor, so don’t forget to take a step back and actually digest the adjustments you are making.

Magic Bullet Film
Last in the Magic Bullet Suite 13 package is Magic Bullet Film — a set of negative stocks and print stocks that help you emulate the look of actual film. First you choose your type of video: Video, Flat, or Log. Then you can choose a Negative Stock and Print Stock. While I am a post nerd, I definitely do not have every print and negative stock committed to memory, so cycling through the options is helpful. I had some footage I shot at Disneyland California Adventure that was captured inside of a room with tons of crazy lights and screens, but it seemed to be a great shot to test Magic Bullet Film on. I applied the negative stock Prolochrome P4400 and the print stock Fujifilm 3521XD. My shot had some nice greens and blues in it and these seemed to complement it well. I was really impressed with how the footage looked with Magic Bullet Film applied; it gave a really over-the-top teal look. After you apply the look you can dial-in some specifics like color temperature, exposure, contrast, skin tone, grain and even a vignette.

Magic Bullet Film

Another interesting adjustment is the Vintage/Modern slider, which, when boosted, adds a contrasty blue and yellow look. When lowered, it adds a brown, washed-out look.

Summing Up
When I finally finished this two-part, extra-long review, my appreciation for “set-it-and-forget-it” plug-ins had grown considerably. Magic Bullet Suite 13 is a phenomenal set of color correction plug-ins that allow you to do as much or as little as you want to your footage while always ending up with a great end product.

You really can’t put a value on a truly great colorist — they put a certain shine on video that sometimes can’t even be put into words. But with that responsibility and skill comes a heavy price tag; for the rest of us, Magic Bullet Suite 13 can still get you a great look. Find out more on Red Giant’s website — the entire suite runs $899, but you can purchase each plug-in separately.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

My NAB 2017 top five

By Brady Betzel

So once again, I didn’t go to NAB. I know, I should go, but to be honest I get caught up in my day job and my family, so usually I forget about NAB until the week before and by that time it’s too late to pull off. I’m hoping to go next year, like really hoping I make plans.

Whether you were there or not, I was paying close attention to the announcements of new products, and even updates to older products. Let’s be real: other than doing some face-to-face networking, you can really get the same, if not more, info by lurking online. Below are five announcements that really got my attention.

Blackmagic’s DaVinci Resolve 14
Blackmagic decided this year’s Resolve update, from 12.5.5 to 14, was so good that they skipped 13. There was a significant drop in the DaVinci Resolve Studio price, from $999 to $299, while features were added that many of the top NLE/color correction players are lacking.

The beauty of Resolve is that it is first and foremost an industry-proven color correction powerhouse, one that is used on many of the top movies and television shows in the industry.

They are also expanding their footprint laterally to encompass professional audio as well as professional video. In Resolve 14, Blackmagic has added a Fairlight audio page to allow for a much more Pro Tools-like editing experience within the same Resolve app we have all grown extremely excited about. In my mind, that means that at a professional facility, or in your own garage, you can have an editor/colorist sitting with a re-recording engineer to review a movie or a show with the client at the same time.

The Fairlight page within Resolve 14.

As long as you have two separate workstations, the colorist and the audio mixer can address notes on the same sequence inside of Resolve 14 thanks to the newly updated collaboration enhancements. The audio mixer or colorist could then refresh their sequence to pick up any changes the other had made and see them reflected immediately.

I haven’t gotten my hands on this update in a proper environment to test the collaboration functionality, but the timeline comparison and review features seem like a godsend to anyone who does any sort of conform work. It is the beginning of Blackmagic’s run at Avid Media Composer’s lock on the industry, which is built on its sequence and project sharing.

On Twitter, Blackmagic’s director of DaVinci software engineering, Rohit Gupta answered my question about whether EDLs and AAFs will fall in line with the timeline review. He said it will work “irrespective of how you create the timeline. So it will work with EDL/AAF too.”

Clip, sequence and bin locking are the future for collaborative workflow inside of Resolve. I would love to see how someone uses these features in a large collaborative environment of 10 or more editors, sound editors and colorists. How does Resolve 14 handle multiple sequence updates and multiple people knocking on a bin? How does Resolve work on something like an Avid Nexis?

Moving on, while I’m not an audio guy I do realize that Fairlight is a big player in the pro audio industry, maybe not as sizable a footprint as Avid Pro Tools in the United States, but it still has its place. So Blackmagic inserting Fairlight technology, including hardware compatibility, into Resolve 14 is remarkable.

The Resolve 14 update seems to have been focused on everything but the color correction tools. Except for the supposed major speed boost and options like face tracking, Blackmagic is putting all its eggs into the general NLE basket. It doesn’t bother me that much to be honest, and I think Blackmagic is picking up where a few other NLE players are leaving off. I just hope they don’t spread Resolve so thin that it loses its core audience. But again, with the price of Resolve 14 Studio coming in at $299 it’s becoming the major player in the post nonlinear editor, color correction, and now audio finishing market.

Keep in mind, Resolve 14 is technically still in beta so you will most likely run into bugs, probably mostly under the Fairlight tab, so be careful if you plan on using this version in time-critical environments.

You can find all of Blackmagic’s NAB 2017 updates at www.blackmagicdesign.com, including a new ATEM Studio Pro HD switcher, UltraStudio HD Mini with Thunderbolt 3 and even a remote Bluetooth camera control app for the Ursa Mini Pro.

SmallHD Focus
There was a lot of buzz online about SmallHD’s Focus monitor. It’s an HDMI-based external touchscreen monitor that is supposedly two to three times brighter than your DSLR’s monitor. People online were commenting about how bright the monitor actually was and about the $499 price tag. It looks like it will be released in June, and I can’t wait to see it.

In addition to being a bright external monitor it has a built-in waveform, false color, focus assist, 3D LUTs, Pixel Zoom and many more features. I really like the feature that offers auxiliary power out to power your camera with the Focus’ Sony L Series battery. You can check it out here.

Atomos Sumo
Another external monitor that was being talked about was the 1,200-nit Atomos 19-inch Sumo, a self-proclaimed “on-set and in-studio 4Kp60 HDR 19-inch monitor-recorder.” It boasts some heavy specs, like the ability to record 4K 12bit Raw and 10-bit ProRes/DNxHR — plus it’s 19 inches!

What’s really smart is that it can double as an HDR grading monitor back in the edit suite. It will map color formats Log, PQ and HLG with its AtomHDR engine. Technically, it supports Sony SLog2/SLog3, Canon CLog/CLog 2, Arri Log C, Panasonic Vlog, JVC JLog, Red LogFilm Log formats and Sony SGamut/SGamut3/SGamut3.cine, Canon Cinema, BT2020, DCI P3, DCI P3+, Panasonic V Gamut and Arri Alexa Wide Gamut color gamuts. While the Sumo will record in 4K, it’s important to note that the monitor is actually a 10-bit, 1920×1080 resolution monitor with SDI and HDMI inputs and outputs.

The Atomos Sumo is available for pre-order now for $2,495. Get the complete list of specs here.

Avid Everywhere
This year, Avid Media Composer editors saw a roadmap for future updates, like an updated Title Tool that supports resolutions above HD (finally!), an advanced color correction mode and Avid Everywhere, based on the MediaCentral platform.

If you’ve ever seen an app like Avid Media Composer work through the cloud, you will probably agree how amazing it is. If you haven’t, essentially you log in to Media Composer via a web browser or a lightweight on-machine app, and all of the heavy processing runs on the server you are logging in to. The beauty of this is that you can essentially log in wherever you want and edit. Since the hard work is being done on the other end, you can log in using a laptop or even a tablet with decent Internet speed and edit high-resolution media. Here comes that editing-on-the-beach job I’ve been wanting. You can check out all of the Avid Everywhere updates here.

In addition, Avid announced Media Composer First, a free version of Media Composer. They also released an updated I/O box, the DNxIQ, which essentially adds Thunderbolt 3 along with live cross-conversion.

Sony a9
With all eyes on Sony to reveal the most anticipated full-frame cameras in prosumer history — the a7RIII and a7SIII — we were all surprised when they unveiled the 24.2MP a9 instead. The a9 is Sony’s answer to heavyweights Canon and Nikon, whose professional full-frame cameras have run the market for years.

With a pretty amazing blackout-free continuous shooting ability, alongside an Ethernet port and dual SD card slots, the a9 is a beauty. While I am not a huge fan of Sony’s menu setup, I am really interested to see the footage and images that come out on the web; there is something great about Sony’s images and video in my eyes. Beyond my personal impressions, there is also five-axis in-body stabilization, UHD (3840×2160) video recording across the entire width of the sensor and even Super 35 recording. Check out more info here.

In the end, NAB 2017 was a little lackluster in terms of barn-burner hardware and software releases; however, I feel that Blackmagic took the cake with the DaVinci Resolve 14 release. Keep in mind, Blackmagic is also releasing updates to products like the Ursa Mini Pro, a new HyperDeck Studio Mini and updates to the ultra-competitive Blackmagic Video Assist, adding ever-valuable scopes.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Red Giant Magic Bullet Suite: Looks 4 and Colorista IV

By Brady Betzel

Color grading is an art unto itself. A dedicated colorist can make your footage look so good, your response upon seeing it will likely be, “I had no idea it could look like this.”

Unfortunately, as an editor you don’t have the opportunity to spend 10 hours a day honing the craft of color correction. You don’t sit in front of high-end color correction panels while surrounded by thousands of dollars worth of equipment whose reason for being is taking everyday footage and pulling peoples’ minds inside out.

Even with Blackmagic’s free version of DaVinci Resolve out in the wild, color correction is a skill that takes a lot of time to hone. Don’t fool yourself, one, two or three years doesn’t really even scratch the surface of the dedication you need to dive into the dark art of color correction, and most color correction artists are more than willing to tell you that. Hopefully, that doesn’t discourage you on your journey to becoming a color master, because it is an awesome career in my opinion. I mean how many people get to play with what are essentially digital crayons all day and get paid to do it?

For those who don’t have hundreds of hours to learn the magic of apps like Resolve, Pablo Rio and Baselight, or even color correction inside of an NLE like Avid Symphony or Adobe Premiere for that matter — you still have the ability to create stunning footage with plug-ins like Red Giant’s Magic Bullet Suite.

The Magic Bullet Suite is a set of color correction and video finishing plug-ins that work in multiple multimedia apps like Adobe’s Premiere and After Effects, and some even work inside of Final Cut Pro X, Motion, Resolve and Avid Media Composer/Symphony.

If you’ve been around editing for a while, you’ve probably heard of Magic Bullet Looks; it’s one of the most common color correction plug-ins that people use to quickly and easily color correct and grade their footage. The full contents of Magic Bullet Suite 13 include Magic Bullet Looks, Magic Bullet Colorista IV, Magic Bullet Film, Magic Bullet Cosmo II, Magic Bullet Denoiser III, Magic Bullet Mojo II and Magic Bullet Renoiser. While all of these Magic Bullet plug-ins can be purchased separately, they are available as a suite for $899, as well as at an academic price of $449. There is also an upgrade price of $299 if you are a previous-version user.

Magic Bullet Suite 13 has been overhauled, with one of the biggest additions being OpenGL and OpenCL support, allowing incredible speed gains. In this review, I am going to provide a plug-in-by-plug-in review, so you can see if $899 is an investment you should make. Up first is the heavy hitter of the suite: Magic Bullet Looks.

Magic Bullet Looks
At $399, Magic Bullet Looks 4 is just one piece of the Magic Bullet Suite 13 set, but it’s probably the most well known. Looks is a color correction plug-in that works with most major nonlinear editing apps, including Premiere Pro CC, After Effects CC, Final Cut Pro X 10.2.3 and up, Apple Motion 5.2.3 and up, Magix Vegas Pro 14, Avid Media Composer 8.5-8.7, Blackmagic DaVinci Resolve 12.5, Edius 8.2 and Hit Film Pro 2017.

Essentially, Magic Bullet Looks 4 is a color correction plug-in in which you typically start from a preset correction or grade that Looks has built in. Think of it like a set of over 200 color grading LUTs (Look-Up Tables) that can be finessed, changed and layered — from there you can add vignetting, video noise or even one of my favorites: chromatic aberration.

There are a few new updates in Looks 4 that make this version the one to purchase. Version 3 had GPU acceleration, but Version 4 adds OpenCL and OpenGL compatibility for better realtime playback of color-graded footage; the Renoiser tool, which can help add video noise or film grain back into denoised footage; and my favorite technical feature, one I hope other apps adopt — the ability to resize the color scopes and even zoom into the Hue/Saturation scope.

While using Magic Bullet Looks, I discovered just how easy Red Giant made it to add a “look” to footage, add some grain and a vignette and then export. While the Lumetri Color tools in Premiere Pro CC were a great addition, they left some things to be desired, and in my opinion, Red Giant Magic Bullet Looks picks up where Premiere’s Lumetri Color tools left off. To apply Magic Bullet Looks 4, you can find it in the Effects drop-down under Video Effects > Magic Bullet > Looks, scrub over to the Effect Controls window and click on “Edit Look.” From there you will be launched into the Magic Bullet Looks plugin GUI. Before you get started it might be handy to open up the Magic Bullet Looks user guide, especially the keyboard shortcuts.

To try out Magic Bullet Looks, I had some Sony a6300 footage lying around waiting to be color corrected. Once inside of the Magic Bullet Looks’ GUI I saw another new feature called the Source tool, which quickly allows you to specify if you are working with Log, flat or video footage — basically, a quick LUT to get you to a starting point — nothing ground breaking, but definitely handy. From there you can open the “Looks” slideout and choose from the hundreds of preset looks. I immediately found “Color Play” and chose “Skydance,” a trippy, ultra-saturated preset with a couple of color gradients, a preset color grade using Colorista (which I will cover later in this space) and some curve adjustments.

If you want to check the values against a scope, you can click on the “Scopes” slideout. If you are on a small monitor, you may have to close the “Looks” slideout to see the scopes. From here you can check out your footage on a scalable RGB Parade, a Slice Graph (which displays color values from one line in your image), zoomable Hue/Saturation and Hue/Lightness scopes, Memory Colors (really interesting and worth a read) and a Skin Tone Overlay, which adds lines over your image where it believes true skin tone colors are coming through.

To apply and begin customizing a look, add a preset by double-clicking it. It will then apply itself to your clip or adjustment layer and, at the same time, lay out the specific tools used into what Red Giant calls the “Tool Chain,” the row of tools along the bottom of the GUI. The Tool Chain is important because it is the order of operations: each tool processes the output of the tools to its left. For example, if you double-click the Print Bleach Bypass tool (which is also awesome and gives a shiny, silvery polish), it will naturally be placed at the end of the Tool Chain, where it affects everything that came before it. If you want to get tricky, Magic Bullet Looks lets you disrupt that order by Alt+dragging (Option+dragging on a Mac) a tool to a different spot in the Tool Chain. This can be a great method for building a unique look, essentially breaking the normal order of operations to get a new perspective.
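
To picture why that order matters, here is a tiny hypothetical Python sketch; the two “tools” are invented stand-ins, not actual Looks effects, but they show how each step operating on the previous step’s output changes the final result when reordered:

    # Invented stand-ins for Tool Chain effects; each takes and returns a single value.
    def lift_shadows(v):
        return v + 20          # crude lift

    def bleach_bypass(v):
        return int(v * 0.7)    # crude contrast/desaturation stand-in

    def run_chain(value, chain):
        for tool in chain:     # each tool works on the output of the tools before it
            value = tool(value)
        return value

    print(run_chain(100, [lift_shadows, bleach_bypass]))  # 84
    print(run_chain(100, [bleach_bypass, lift_shadows]))  # 90 -- same tools, different order, different result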

Once I completed my quick look build, I clicked the check mark at the bottom of the window and was back in Premiere Pro CC 2017, playing my Sony a6300 footage in realtime with the Magic Bullet look applied at 100 percent strength and no slowdowns. To be clear, I am not running a super-fast machine. In fact, it’s essentially a powerful tablet with an Intel i7 3.1GHz processor, 8GB of RAM and an Intel Iris GPU, so playing back this look in realtime is pretty amazing.

For a test, I trimmed my clip down to one minute in length and added Magic Bullet Looks’ Color Play preset “Skydance,” which, as I mentioned earlier, adds chromatic aberration inside of the plug-in. I then exported it as a 1080p H.264 QuickTime at around 10 to 12Mb/s, which is basically a highly compressed QuickTime for YouTube, Instagram or Twitter. It took about four minutes and 18 seconds with the look applied, and one minute without the look applied. So it took quite a bit longer to export with the look, but that can be expected with a heavy color grade. Obviously, with a fast system and a GPU like an Nvidia GTX 1080, you will chew through this type of export.
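
As an aside, if you’re wondering what that 10-12Mb/s works out to on disk, the back-of-the-envelope math is simple; these numbers are just my test settings, so treat the result as a ballpark:

    # Rough file size for a one-minute clip at roughly 10-12Mb/s (megabits per second).
    bitrate_mbps = 12                          # upper end of my export setting
    duration_s = 60                            # one-minute test clip
    size_mb = bitrate_mbps * duration_s / 8    # divide by 8 to convert megabits to megabytes
    print(f"~{size_mb:.0f} MB")                # ~90MB, before audio and container overhead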

In the end, Magic Bullet Looks 4 is a great paint-by-numbers way of color correcting and grading, but with the ability to heavily modify everything being used to create that look. I really love it. As someone who color corrects most of his footage the “old” way, with wheels and such, it’s a breath of fresh air to jump into a plug-in that gives you a pretty great result, with the same ability to dial in your look that a colorist may be used to, in half the time.

One thing you will notice as I review the rest of the plug-ins in Magic Bullet Suite 13 is that Colorista, Renoiser and Mojo II are also included with Magic Bullet Looks 4, but only as tools used inside of the Looks plug-in. When you purchase the entire Magic Bullet Suite 13, you get those as standalone plug-ins, which I actually like a lot better than working inside of the Looks plug-in. It’s something to consider if you can’t shell out the full $899.

Colorista IV
Colorista IV is a color correction and grading plug-in that is similar in function to Adobe’s Lumetri Color tools, but surpasses them. It is compatible with Premiere Pro CC, After Effects CC, Final Cut Pro X 10.2.3 and up, and Motion 5.2.3 and up. Inside of Premiere, Colorista IV offers a much more intuitive workflow for color correction and grading than Lumetri. But I really think Colorista shines in apps like After Effects, where the built-in color correction options aren’t as robust.

Colorista IV consists of a three-way color corrector with the standard Shadows, Midtones and Highlights adjustments, new Temperature and Tint controls to help fix white balance issues, exposure compensation, Highlight Recovery, Pop, Hue-Saturation-Lightness wheels and many more adjustments. Right off the bat, you can now specify whether your footage is video (basically Rec.709) or Log. When you specify Log, Colorista will work a little differently and a little better for your footage than if you used the video color mode. While that is not an uncommon feature in color correction apps, it is an important update to the Colorista toolset.

Another update is Guided Color Correction, which walks you through color balancing an image in seven steps. I have to say, it’s not too bad. After about five different guided color balances, I came to the same conclusion each time: the result is a little too contrasty, but it’s not a bad starting point. Think of it like a guided auto-correct. It even shows before-and-afters of your image while you make adjustments in the Guided section. In fact, if you are learning to color correct, this is a great way to understand the basic first steps of a correction.

After you run through the guided correction, you can fine-tune anything you did in that process, including using Colorista’s Key Mode. The Key Mode of Colorista is a simplified version of the secondary keying found in color correcting apps. If you want to isolate a skin tone or a specific color that is too saturated, you can enable Key Mode, select the color you want to adjust using the color selectors or the HSL Cube, and adjust your selection. From here, Colorista gives you a few options: “Apply” the key, “Cutout” the key, or “Show Key,” which turns the image into a black-and-white matte for more advanced adjustments that reach beyond the scope of this review but can be fun and extremely helpful. You can also select the “Show Skin Overlay” checkbox, which overlays a checkerboard-like pattern on the parts of your image where “proper” skin tone colors are coming through; it’s pretty useful when doing beauty work and keys in Colorista.
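
If the idea of a secondary key is new to you, here is a conceptual Python sketch of what something like “Show Key” is doing; the hue and saturation thresholds are invented for illustration and this is not Colorista’s actual keyer:

    # Conceptual secondary-key sketch: build a black-and-white matte from a hue range.
    import colorsys

    def in_key(pixel, hue_min=0.02, hue_max=0.10, min_sat=0.25):
        r, g, b = (c / 255.0 for c in pixel)
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        return hue_min <= h <= hue_max and s >= min_sat

    def show_key(image):
        # "Show Key"-style output: white where the key matches, black everywhere else.
        return [[255 if in_key(p) else 0 for p in row] for row in image]

    frame = [[(210, 150, 120), (40, 60, 200)]]  # a warm, skin-ish pixel and a blue pixel
    print(show_key(frame))                      # [[255, 0]]

Once you have a matte like that, options such as “Apply” and “Cutout” essentially decide which side of the matte your adjustment affects.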

The last two categories in Colorista IV are Structure & Lighting and Tone Curve & LUT. Structure allows for quick adjustments to the shadows, highlights and pop (basically sharpness), as well as adding a vignette. The Tone Curve is a multipoint curve, much like the curves in every other color correcting app, and at the bottom is where you can load your own LUT or choose a specific technical LUT, such as one for Sony’s S-Log2 or S-Log3.
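
For what it’s worth, a tone curve is conceptually just a remap of input levels to output levels. Here is a minimal piecewise-linear sketch in Python with made-up control points (real curve tools typically use smoother splines):

    # Minimal tone-curve sketch: (input, output) control points forming a gentle S-curve.
    points = [(0, 0), (64, 48), (192, 210), (255, 255)]

    def tone_curve(v, pts=points):
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if x0 <= v <= x1:
                t = (v - x0) / (x1 - x0)            # linear interpolation between points
                return round(y0 + t * (y1 - y0))
        return v

    print([tone_curve(v) for v in (32, 128, 230)])  # [24, 129, 237]: shadows down, highlights up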

Summing Up
In the end, Magic Bullet Looks and Colorista IV are plug-ins that can be super simple or very meticulous, depending on your mood or skill level. While almost everything in these plug-ins can be achieved without a plug-in, Colorista IV and Looks give you a simple and straightforward interface for accomplishing great color balance, correction and grading.

One of my favorite features inside the updated Colorista IV is the new panel, which can be opened from the menu bar: Window > Extensions > Magic Bullet Colorista IV. You can keep this panel open and instantly begin color correcting a clip without having to drag the effect onto every clip you want to correct; it will automatically apply itself to whichever layer is selected. If you are interested in color correction and want to ease into the complexity, Magic Bullet Looks 4 and Colorista IV are a great way to learn.

In the next Magic Bullet review, I will cover the rest of the plug-ins that make up the Suite 13 set, including Magic Bullet Denoiser III, which has been revamped and rivals Neat Video (an industry-standard noise reduction plug-in), as well as Cosmo II, Renoiser and Film. If you plan to buy all of these together, keep an eye on Red Giant’s site, where you can occasionally find everything at a discount.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Blackmagic’s DaVinci Resolve Mini Panel

By Brady Betzel

If you’ve never used a color correction panel like the Tangent Element, Tangent Ripple, Avid Artist Color, or been fortunate enough to touch the super high-end FilmLight Blackboard 2, Blackmagic Advanced Panel or the Nucoda Precision Control Panel, then you don’t know what you are missing.

If you can, reach out to someone at a post house and sit at a real color correction console; it might change your career path. I’ve talked about it before, but the first time I sat in a “real” (a.k.a. expensive) color correction/editing bay I knew that I was on the right career path.

Color correction can be done without color correction panels, but think of it like typing with one hand (maybe even one finger): sure, it can be done, but you are definitely missing out on the creative benefits of fluidity and efficiency.

In terms of affordable external color correction panels, Tangent makes the Ripple, Wave and Element panel sets that range from $350 to over $3,300, but work with pretty much every color correction app I can think of (even Avid if you use the Baselight plug-in). Avid offers the Artist Color panel, which also works with many apps, including Avid Media Composer, and costs about $1,300. Beyond those two, you have the super high-end panels that I mentioned earlier; they range from $12,000 to $29,999.

Blackmagic recently added two new offerings to their pool of color correction panel hardware: the DaVinci Resolve Micro Panel and DaVinci Resolve Mini Panel. The Micro is similar in size and functionality to the Avid Artist panel, and the Mini is similar to the center part of most high-end color correction panels.

One important caveat to keep in mind is that these panels only work with Blackmagic’s Resolve, and Resolve must be updated to at least version 12.5.5 for them to function. They connect to your computer via USB-C, and the Mini Panel can also connect over Ethernet.

I received the Resolve Mini Panel to try out for a couple of weeks, and I immediately loved it. If you’ve been lucky enough to use a high-end color correction panel like Blackmagic’s Advanced Panel, then you will understand just how great it feels to control Resolve with hardware. In my opinion, using hardware panels eliminates almost 90 percent of the stumbling that comes with driving color correction software from a keyboard and mouse. The Resolve Mini Panel is as close as you are going to get to a professional-level color correction hardware panel without spending $30,000.

Digging In
Out of the box, the panel feels hefty but not too heavy. It’s solid enough to sit on a desk without walking around while you are using it. Of course, because I am basically a kid, I had to press all the buttons and turn all the dials before I plugged it in. They feel great; they’re the best-feeling wheels and trackballs I’ve used on a panel anywhere near this price. The knobs and buttons feel fine. I’m not hating on them, but I think I like the way the Tangent buttons depress better. Either way, that is definitely subjective. The metal rings and hefty trackballs are definitely on the level of the high-end color correction panels you see in pro color bays.

Without regurgitating Blackmagic’s press release in full, I want to go over what I think really shines on this panel. I love the two five-inch LCD panels just above the main rings and trackballs. Below the LCDs and above the row of 12 knobs are eight more knobs that interact with the LCDs. Above the LCDs are eight soft buttons and a bunch of buttons that help you navigate around the node tree and jump into different modes, like qualifiers and tracking.

Something I really loved when working with the Mini Panel was adding points on a curve and adjusting those individual points; this is one of the best features of the Mini Panel, in my opinion. Little shortcuts, like adding a node plus a circle window in one key press, are great. Directly above the trackballs and rings are RGB, All and Level buttons that reset their respective parameters for the Lift, Gamma and Gain changes you’ve made. Above those are buttons like Log, Offset and Viewer, which give you a quick way to jump into Log mode, Offset mode and full-screen Viewer mode.

The Resolve manual states that the user buttons and FX buttons will be enabled in future releases, which gets me excited about what else could be coming down the pike. NAB, maybe?

Of course, there can be improvements; it is, after all, a Version 1 product. But everything considered, Blackmagic really hit it out of the park. To see what some pros think needs to be changed or altered, head over to the holy grail of color correction forums: Lift Gamma Gain. You’ll even notice some Blackmagic folks sniffing around, answering questions and hinting at what is coming in updates. In addition, Blackmagic has its own forum, where an interesting post popped up titled “DaVinci Mini Panel Suggestion Box.” That thread is another good one to hang around in.

Wishlist/Suggestions
When I would exit Resolve, the LCDs didn’t dim or go into a screen-saver mode like some other panels I’ve used, and there isn’t a dimmer for the brightness of the LCD screens and backlit buttons. In the future, I would love the ability to dim or completely shut off the panel when I am in other apps or presenting to a client and don’t want it glowing. The backlit keys aren’t terribly bright, though, so it’s not a huge deal.

While in the forums, I noticed posts about the panel’s inability to do the NLE style of transport control: double-tapping fast forward to go faster. A wheel might also be a nice transport addition for scrubbing. Among the node shortcut buttons, I couldn’t find an easy way to delete a node or add an outside node directly from the panel. On other panels, I love moving shapes/windows around using the trackballs but, unfortunately, here you can only move and adjust the windows with knobs, which isn’t terrible but is definitely less natural than using the trackballs. Lastly, I kind of miss the ability to set and load memories from a panel; with the Mini Panel we don’t have that option... yet. Maybe it will come in an update, since there are buttons with numbers on them, but who knows.

Mini and Micro Panel
Technically, the Mini Panel is the Micro Panel with the addition of the top LCDs and buttons. It also adds the ability to connect not just by USB-C but also via Ethernet. If you’re connecting via Ethernet, there has been some talk of power over Ethernet (PoE) compatibility, which powers the panel without the need for a power cable. Some folks have had less success with standard PoE but have had success using PoE+ equipment, which is something to keep in mind.
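
For context, the standard PoE and PoE+ specs budget very different amounts of power at the device end, which is likely why the beefier PoE+ gear fares better. Blackmagic doesn’t publish the panel’s exact power draw as far as I know, so take these approximate spec numbers as background rather than gospel:

    # Approximate power available to a powered device under the two common PoE specs.
    poe_watts = {"PoE (802.3af)": 12.95, "PoE+ (802.3at)": 25.5}
    for standard, watts in poe_watts.items():
        print(f"{standard}: ~{watts}W available at the device")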

Both the Micro and Mini Panels have the standard three trackballs and rings, 12 control knobs and 18 keys hard coded for specific tasks and transport controls. In addition, the Mini Panel has two 5-inch screens, eight additional soft buttons, eight additional soft knobs and 30 additional hard-coded buttons that focus on node navigation and general mode navigation.

Both the Micro and Mini Panels are powered via USB-C, but the Mini Panel also adds the PoE option mentioned earlier, as well as a 4-pin XLR DC power connection. Something to note: when I received the Mini Panel, I thought I might be missing a power cable from the box because I had a test unit, but upon more forum reading I found that you do not get a power cable with the Mini Panel. While Blackmagic ships a USB 3.0-to-USB-C adapter cable with the Mini and Micro Panels, they do not ship a power cable, which is an unfortunate and odd oversight. But since the panels are affordable, I guess it’s not that big of a deal. Plus, if you are a post nerd like me, you probably have a few 5-15-to-C13 power cables lying around the house.

I can’t shake the feeling that Blackmagic is going to be adding some additional external panels to piece together something like the Advanced Panel set-up (much like how the Tangent Element panel set can be purchased). Things like an external memory bank or an X-Keys type set-up seem not too far off for Blackmagic. I would even love to be able to turn the LCD screens into scopes if possible, and even hook up an Ultrascope via the panel so I don’t have to purchase additional hardware. Either way, the Mini Panel gets me real excited about the path Blackmagic is carving for their Resolve users.

Summing Up
In the end, if you are a professional colorist looking for a semi-portable panel and haven’t committed to the Tangent Element ecosystem yet, the Resolve Mini Panel is for you... and your credit card. The Mini Panel is as close to a high-end color correction panel as I have seen at a wallet-friendly retail price of $2,995. It is very solid and doesn’t feel like a stand-in for a full-sized panel; it can hold its own.

One thing I worried about when I began writing this review was whether tying myself down to one piece of software was a good idea. When you invest in the Mini Panel, you are wholeheartedly dedicating yourself to DaVinci Resolve, and I think that is a safe bet.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.