
Review: NewBlueFX’s ColorFast 2 for editors

By Brady Betzel

Basic color correction is rapidly becoming a skill that is expected of an editor, or even an assistant editor. If you have had the luxury of working with a colorist and/or an online editor, you have probably seen them use apps such as Blackmagic Resolve, Avid Symphony, FilmLight's Baselight or other color grading tools. These systems have so many layers of intricacy that, without years of experience in color correction, most editors never get far past the basics.

If you are an editor looking to do basic color correction, slight secondary correction and, maybe, even a creative grade, you probably want to stay inside of your NLE, whether it’s Adobe Premiere, Apple FCPX, Avid Media Composer, Magix Vegas, or even After Effects. This is where NewBlueFX’s latest color correction and grading plug-in comes into play.

Featuring over 60 different looks (sometimes referred to as creative LUTs or preset color grades), skin tone isolation and the ability to isolate regions of an image for the video scopes to analyze, NewBlueFX's ColorFast 2 is a modest color correction plug-in without the overwhelming toolset of a full-fledged grading application.

The Details
ColorFast 2 costs $99 and works in apps like Vegas Pro 10+, Resolve 11+, Premiere CS6/6.5/CC, After Effects 5+, FCPX, Media Composer/Symphony 6+ and Grass Valley Edius 7 and 8. If you are using apps like Resolve you probably would only use ColorFast 2 for its preset looks since you already have access to all of the color correction tools included in the plug-in — unless you like the region isolating feature for the video scopes, something I find really intriguing.

ColorFast2 RGB Scope and the Lumetri RGB scope.

Most people reading this review will probably want to know why they should buy ColorFast 2 when Premiere Pro has a lot of these features built into their Lumetri color correction tools. To be honest, there are only a few things that ColorFast 2 has that Premiere, or other apps for that matter, don’t have: region-controlled video scopes, skin color isolating and NewBlueFX’s color presets. You should really check out NewBlueFX’s product page for ColorFast 2 to see some more examples of the color presets and download a trial for yourself.

Right off the bat, I felt that stacking ColorFast 2 after the Lumetri color correction tools in the effects panel in Premiere is the proper order of operations. If you are familiar with LUTs and how the chain of command works, you probably have experimented with color correcting before and after the LUT is applied.

Typically, a LUT gives the colorist a good starting point to grade from, but these days you may see creative LUTs. If the creative LUT doesn't quite look right, you will want to add color correction first in the chain and then the LUT. This is how I would work with ColorFast 2 and the Lumetri color correction tools: you will be correcting the footage to work with your creative LUT instead of correcting the LUT, which most of the time will give you inadequate results. Long story short: stack your ColorFast 2 effect after the Lumetri tools in the effects window and then fine-tune the Basic Correction settings with your ColorFast 2 preset to get a great color grade.
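If it helps to see that order of operations spelled out, here is a toy Python sketch of the correct-first, LUT-second chain. The correction and "creative LUT" here are stand-ins I made up for illustration — nothing from ColorFast 2 or Lumetri.

```python
import numpy as np

def apply_basic_correction(frame, exposure=1.1, lift=-0.02):
    """Hypothetical stand-in for a basic correction: a little gain plus a little lift."""
    return np.clip(frame * exposure + lift, 0.0, 1.0)

def apply_creative_lut(frame, lut):
    """Apply a toy 1D 'creative LUT' sampled at 256 points per channel."""
    indices = (frame * 255).astype(np.uint8)
    return lut[indices]

# A gentle S-curve standing in for a creative look/preset.
x = np.linspace(0.0, 1.0, 256)
s_curve_lut = 3 * x**2 - 2 * x**3

frame = np.random.rand(1080, 1920, 3)  # stand-in for a video frame, 0-1 float RGB
# Correct first, then apply the look — not the other way around.
graded = apply_creative_lut(apply_basic_correction(frame), s_curve_lut)
```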

The ColorFast2 waveform with isolated scope region.

Video Scopes
I was excited to check out the video scopes inside of ColorFast 2, so I jumped to the bottom where the Region Scopes twirl-down menu is. Under that is the Video Scopes menu, which contains Vectorscope (Classic), Vectorscope (Color), Vectorscope (Sat), RGB Parade, Waveform and Histogram. The real beauty is that NewBlueFX gives you the ability to isolate a square region of your footage to be output through the video scope. This lets you pinpoint your correction a little easier, and I really love this feature… but I also noticed that when you have both the Lumetri video scopes and the ColorFast 2 scopes up, there is a discrepancy in values. I tended to like the Lumetri video scopes a little better. In fact, they go all the way up to 100, where the ColorFast 2 scopes only go up to 80 — this could very well be a bug in the compatibility between ColorFast 2 and the new Adobe Premiere CC 2015.4.
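To picture what a region-isolated scope is doing under the hood, here is a rough numpy sketch — my own illustration, not NewBlueFX's implementation — that builds a luma waveform from just a cropped rectangle of a frame:

```python
import numpy as np

def region_waveform(frame, x0, y0, x1, y1, levels=256):
    """Luma waveform computed only from the selected rectangle of the frame.
    Rows = luma level, columns = image column, values = pixel counts."""
    region = frame[y0:y1, x0:x1]
    luma = (0.2126 * region[..., 0] + 0.7152 * region[..., 1]
            + 0.0722 * region[..., 2])                       # Rec. 709 luma weights
    bins = np.clip((luma * (levels - 1)).astype(int), 0, levels - 1)
    waveform = np.zeros((levels, region.shape[1]), dtype=int)
    for col in range(region.shape[1]):
        waveform[:, col] = np.bincount(bins[:, col], minlength=levels)
    return waveform

frame = np.random.rand(1080, 1920, 3)                 # stand-in frame, 0-1 float RGB
wave = region_waveform(frame, 800, 400, 1120, 680)    # scope only this rectangle
```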

One issue I found with the ColorFast 2 scopes was that I couldn't move the actual scope around or have more than one on at a time. While the region selection is an awesome feature, being able to see your full image is sometimes more important, which is why I would probably stick to the NLE's built-in scopes.

Primary, Secondary, Output Correction Menus
Going back to the top of the ColorFast 2 Effect Editor menus, up first is the Primary Correction twirl-down menu. Here you can quickly white-balance your footage with an eyedropper, even keyframe it. In addition, you can adjust the White Strength, White Tweak (fine-tune control of the white color), Hue, Saturation, Exposure, Brightness and Film Gamma. A problem I encountered was that if you do a primary color correct on your image and then choose a color preset, all of your primary work gets reset, which is a real bummer if you want to correct and then grade your footage. So, if you want to work in ColorFast 2 in a more traditional way, where you color correct then color grade, you may want to do it in two separate effects. Moreover, you may want to primary color correct inside of the Lumetri tools then stack the ColorFast 2 on top.

Secondaries menu.

Next up is the Secondary Correction twirl-down menu, which gets you into the real meat and potatoes of the plug-in. There is a helpful “Show Mask” drop down that will allow you to isolate and view Highlights, Midtones, Shadows, Skin Color Mask and a Shape Mask. Inside each of these you can adjust Tint, Saturation, overall Level, and even enable and disable this secondary if you want. Further down in the secondary menu you can adjust the High, Mid and Shadow thresholds (basically transitions from high to mid or mid to shadow), and even the blending and spread.
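For readers who like to see the idea in code, here is a minimal sketch of how luma-range masks with soft transitions can be built. The threshold and spread numbers are made up for the example, not ColorFast 2's actual values.

```python
import numpy as np

def smoothstep(edge0, edge1, x):
    """Smooth 0-to-1 ramp between two luma thresholds (the 'blending/spread')."""
    t = np.clip((x - edge0) / (edge1 - edge0), 0.0, 1.0)
    return t * t * (3 - 2 * t)

def luma_range_masks(luma, shadow_thresh=0.33, high_thresh=0.66, spread=0.1):
    """Shadow/midtone/highlight masks with soft transitions between the ranges."""
    highlights = smoothstep(high_thresh - spread, high_thresh + spread, luma)
    shadows = 1.0 - smoothstep(shadow_thresh - spread, shadow_thresh + spread, luma)
    midtones = np.clip(1.0 - highlights - shadows, 0.0, 1.0)
    return shadows, midtones, highlights

luma = np.random.rand(1080, 1920)                 # stand-in luma plane, 0-1 float
shadows, mids, highs = luma_range_masks(luma)     # each mask is 0-1 per pixel
```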

While still in the secondary twirl-down menu you can jump into the Skin Mask, which will quickly help you identify skin color, soften imperfections and even help keep skin color fidelity while adjusting the rest of your image.

The last menu is the Output Correction twirl-down. Here you can do a widespread correction that lands after the fine-tuning. You can adjust overall Saturation, Exposure and Brightness.

Summing Up
In the end, I think ColorFast 2 is best suited for people who want a quick color grade by applying a preset look but who also want a little ability to fine-tune that look. ColorFast 2 has some pretty good-looking presets like Vintage, Fallout, Gotham and even some black and white presets like B&W Ink. It’s even more fun to go and purposely change your white balance to something crazy, like a deep purple, for interesting grades. You should definitely try NewBlueFX’s ColorFast 2 if you are looking for some additional creative grade looks while still being able to individually tweak the output.

Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: DJI’s Phantom 4 drone

By Brady Betzel

I’ve been trying to get my hands on a professional drone to review for a few years now. My wife even got me a drone from a local store that was a ton of fun to play with but was hard to master.  For years, I’ve been working on television shows that use drone footage and capture incredible imagery, but it always seemed out of reach for me as an editor. Finally, after much persistence (or pestering, depending on who you ask), DJI agreed to send me the Phantom 4 to test out, and boy is it awesome!

By now you’ve probably made your way through the ubiquitous reviews, including the endless supply of YouTube reviews, but on the small chance you are reading this without much prior drone knowledge and you work in production or post production, I have some ideas for you.

When reading this review, think about how you could take a drone, run outside and maybe grab some b-roll for something you are working on. If you create opening titles or sizzle reels, you could grab some great aerial shots or fast-paced shots to use as transitions. The possibilities are really endless, as long as you get your video picture settings dialed in.

Before I started as an online editor (which, for those who don’t know, focuses on the technical side of editing — color correction, grading, transcoding, outputting, exporting, basically anything that ends in “-porting” or “-linking”), I worked my way through being a post coordinator and post production supervisor, all the way to offline editor. One thing I noticed on many of the non-union live-to-tape shows (like late-night comedy or talk shows) is that the editor has a lot of freedom to be creative and can push the envelope a little.

Maybe the editor needs some b-roll for an edit that isn’t in the system, so as the post supervisor you might run out and shoot it yourself. Why not with a drone? If you need a quick aerial of a house from directly above, you might be able to get away with footage from your own drone, saving the project money while showing some talent that may get you more jobs in the future!

I really love the idea of people acquiring as much knowledge across different job positions as possible. Whether you work in craft services or are an executive producer, if you can do things like operate a camera, hold a boom mic or fly a drone, you will probably make a lasting impression and be known as someone who is hungry to work and to create a great end product, regardless of your position.

Fly Legally!
Not to be a total wet blanket and put a huge wrench in your drone flying, but there are some laws that recently have been passed (more like clarified) to standardize drone use between hobbyists and commercial fliers (basically someone who wants to make money from their footage). You should definitely check out the Federal Aviation Administration’s Getting Started page for more info.

If you are flying your drone for fun, and as long as it has the weight and footprint of the DJI Phantom 4 — it weighs about 3lbs and measures about 14 inches diagonally without propellers, which can add a couple of inches — there is minimal work that you need to do. However, if you are planning on making money from your drone footage, there are many steps you must take, including taking an official test. There is a lot you need to know that is beyond the scope of this review, so definitely check out the FAA link above for more.

Easy to Use
Since I hadn’t flown a professional drone before I had nothing to compare it to, but I can tell you that I picked up the Phantom 4 and was flying it within five minutes. It really is that easy to get up and running.

Step 1, charge your remote and battery; Step 2, plug in your phone or tablet via USB to the remote; Step 3, attach propellers; Step 4, fly! You should probably boot up your Phantom before you go outside to make sure it is functional and to update your firmware. As a side note, I’m not sure if I was up and running so quickly because the Phantom 4 I was loaned for review had been charged and used before, or if it was really that easy.

For this review, I really wanted to see how easy it was to get shots like wide sweeping pans and tilts or tracking shots, and it was relatively easy. Obviously, you will need to practice your camera work with the Phantom 4 to get shots that aren’t boring and have substance, but it’s pretty simple. I brought the Phantom 4 to an open field where I had tons and tons of space. I turned it on by pressing the power button once and then holding it down until it powered on. I had forgotten to download the DJI Go app to my iPhone 6, so after downloading it, I connected the USB-to-Lightning cable from my iPhone 6 to the controller. While the iPhone 6 worked great, you have minimal screen real estate with so many controls available, so I would suggest using an iPad if you can, or an iPhone 6 or 7 Plus. I tried using an iPad mini, but had trouble getting the Phantom 4 and the iPad to connect, so I stuck with the iPhone.

Once my propellers were spinning, I flew it straight up into the sky and felt like a little kid with my first remote control car, except that the handling and precision the Phantom 4 offers are exceptional. You can even take your hands off the joysticks and the Phantom 4 will hover. I noticed that once I got the Phantom 4 high in the air, I could hear it battle the winds. It really stuck to its position even with some decent-strength gusts.

Collision Avoidance
When I took the Phantom 4 out for a second time, I wanted to test out its upgraded collision avoidance system. I also wanted to test out my camera moves. The collision avoidance was awesome! Not only does it sense the ground beneath it, but objects in front of it. I started flying toward a basketball hoop and it caught it in its sights and maneuvered to the right. Then, with just one prior flight, I noticed I was really getting the hang of long shots while tilting and panning the camera — a real testament to how easy it is to control.

Keep in mind that the DJI Go app has a built-in flight simulator to help you get your moves and techniques down before you go outside. Unfortunately, you have to be connected to your drone while using the flight simulator, but still it’s pretty handy for practicing — something you should definitely use before you fly, even if your pride is telling you not to.

Tech Specs
Beyond my pure joy at flying the Phantom 4, there are some fancy tech specs that you should know about. For my money, the DJI Phantom 4 really shows its worth in its camera: a 4K-capable 1/2.3-inch CMOS image sensor, an ISO range of 100-3,200 for video (100-1,600 for photos) and a shutter speed between eight seconds and 1/8000 of a second.

There are many different recording modes, including 4096×2160 (true 4K resolution) at 24/25 progressive frames per second, 3840×2160 (UHD) at 24/25/30p, 2704×1520 (2.7K) at 24/25/30p, 1920×1080 (HD) at 24/25/30/48/50/60/120p and 1280×720, for some reason, at 24/25/30/48/50/60p. All these resolutions are recorded at a max bit rate of 60Mbps, which is decent, but really should be higher in my opinion (probably more in the 100Mbps range).
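As a quick bit of back-of-the-envelope math, 60Mbps works out to roughly 450MB per minute of footage, which is worth keeping in mind when you're shopping for MicroSD cards:

```python
# Back-of-the-envelope storage math for a 60Mbps recording ceiling.
def megabytes_per_minute(bitrate_mbps):
    return bitrate_mbps / 8 * 60          # megabits/s -> megabytes/s -> MB per minute

def minutes_per_card(bitrate_mbps, card_gb):
    return card_gb * 1000 / megabytes_per_minute(bitrate_mbps)

print(megabytes_per_minute(60))           # 450.0 MB per minute
print(round(minutes_per_card(60, 64)))    # ~142 minutes on a 64GB card
```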

In terms of image quality, the Phantom 4 is amazing for being a flying ship that captures video. However, it isn’t going to exactly match cameras like the Sony a7S II, Panasonic GH4 or Blackmagic Cinema Cameras. The Phantom 4 definitely rivals the GoPro Hero5 Black in video quality, or at least gives it a good run for its money. The only problem is that the camera isn’t removable from the gimbal on the Phantom 4. I would really like a removable camera on the Phantom 4, much like the new GoPro Karma drone with its connection to the Karma Grip.

After flying the Phantom 4 a few times, I began to realize how finicky and important the picture and video profile settings are. The first time I recorded video I simply hit record. I was in Vivid mode, presumably at the baseline Saturation, Sharpening and Contrast of 0,0,0. It looked great at first glance, and for anyone who just wants to pick up the Phantom 4 and shoot, you should probably leave it there or maybe knock the sharpness down to -1. If you plan on color correcting later or adding a creative LUT on top of your footage, then you are going to want a flatter image.

I thought the D-Log setting would be the way to go, as that should give you the flattest image in terms of saturation and exposure to pull the most life out of your image. Unfortunately, I found out that is not the case. I tried many variations of Saturation, Sharpness and Contrast from 0,0,0 to -3,-3,-3 and wasn’t really happy with any of them. After running through the usable color profiles (I’m omitting black and white and any other filters like that because you should really just go ahead and apply those looks while color correcting or editing since all NLEs have an easy way to add them), I found that D-Cinelike and None were the profiles I should really stay in, and I started to like Sharpness: -1, Contrast -2, and Saturation -2.

Before I go on about D-Cinelike and None, I think anyone buying a drone should consider ND filters (short for neutral density filters). When shooting outdoors you will get a lot of contrasting light values, such as dark shadows and blown-out highlights. Instead of having to pick your favorite, you can knock the exposure down externally with an ND filter, which lets you keep your shutter speed and ISO values at more appropriate levels.

Without ND filters, you are going to have to ramp up the shutter speed on your Phantom 4 to properly expose your image at a low ISO, such as 100, leaving your footage looking a little choppier and less cinematic (I hate using the word cinematic to describe this, but essentially cinematic = motion blur in this instance).
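If you want a starting point for the math, the common 180-degree shutter rule of thumb (shutter speed of roughly one over twice the frame rate) plus a quick stops calculation will get you in the ballpark. The numbers below are just an example, not DJI guidance:

```python
import math

def target_shutter(frame_rate):
    """180-degree shutter rule of thumb: shutter speed of about 1/(2 x frame rate)."""
    return 1.0 / (2 * frame_rate)

def nd_stops_needed(bright_shutter, target):
    """Stops of ND needed to slow the shutter from what bright daylight forces
    (e.g. 1/1600s) down to the target (e.g. 1/48s for 24p)."""
    return math.log2(target / bright_shutter)

print(target_shutter(24))                              # ~0.0208s, i.e. roughly 1/48s
print(round(nd_stops_needed(1 / 1600, 1 / 48), 1))     # ~5.1 stops, around an ND32
```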

If this sounds interesting to you, you should Google shutter speed techniques and rules, but be careful. It is a deep rabbit hole. From my simple research, I found ND filters ranging anywhere from $20 to $99 or more depending on quality and where you buy them. Polar Pro looks to make some sweet ones, including the Vivid Collection in their Cinema Series of polarized ND filters at $99 for a three-pack — another rabbit hole, be careful not to get G.A.S., Gear Acquisition Syndrome.

Moving on… D-Cinelike and None are flat color profile shooting modes that allow for decent color grading in post production, but with less of the midtone muddiness that D-Log seemed to produce for me. D-Cinelike seemed to warm up the shot a little with more orange and yellow tints and possibly less shadow detail. In None, I felt like I got the flattest color profile possible, which allowed for the best color correction and grading scenario with the Phantom 4 footage. Don’t forget to dial in your custom picture profile settings. Personally, I liked the picture best when I knocked Sharpness down to -1 or -2. Contrast and Saturation could also be knocked down a little, but this is something you should test when you buy a Phantom 4, since it is definitely a matter of personal taste.

If you go on YouTube and search Phantom 4 color settings you will find a lot of videos. You should probably sort by upload date and watch the more recent videos that might take into account firmware updates. I really liked watching Bill Nichols’ YouTube channel BillNicholsTV. He has a bunch of great and practical reviews.

You should still try out the Phantom 4’s D-Log mode. Hopefully, it works for you better than it did for me. If you use Blackmagic Resolve, you can check out DJI’s D-Log to sRGB LUT instructions and find the LUT under the software downloads here.

While I didn’t want to get too deep into the technical side of the Phantom 4, I did fall down the picture profile settings abyss and still want to highlight some automated flight modes that the Phantom 4 excels at. Some of the new features that separate the Phantom 4 from previous Phantom models include Active Track, TapFly, Obstacle Sensing System, Sport Mode, easier-to-use push and release propellers, up to 28-minute battery life (although I only got between 20-22 minutes with the Phantom 4 automatically returning to home when the battery was running low), improved camera with less chromatic aberration, and much more.

New Features That Editors Will Like
I now want to touch on the upgraded features that would get me, as an editor, interested in the Phantom 4. Active Track is an amazing feature that can track objects specified through the DJI Go app. You simply click the object or person you want to track and bam! The Phantom 4 will follow them from what DJI calls a “safe distance,” and it really is.

TapFly is another great feature that will help pilots who aren’t as comfortable flying in tight spaces to fly in a straight line. Simply tap the remote icon on your phone or tablet, tap TapFly, click on a visual point you want the Phantom 4 to fly to, and it will basically move into autopilot. You still have control over the camera and even the Phantom 4 itself, but it’s basically a coached flying system.

Again, there are a lot of technical specs I didn’t go into too much detail on, but if you want more info you can find it on DJI’s Phantom 4 page. For some simple and short videos check out: http://www.dji.com/edu/edu_videos or download the DJI Go app.

Summing Up
In the end, I really, really, really loved flying the Phantom 4! One of the easiest parts was installing the propellers — an easy turn and lock. If you find yourself getting frustrated when filming or flying the Phantom 4, remember that it takes people many hours to get good at shooting with a camera, let alone a drone with a camera and gimbal to control all at once. I spent many nights watching YouTubers’ reviews wondering why I couldn’t get a great picture out of the D-Log setting until I found Casey Faris’ video on the Mavic Pro, which described the same problem I was having with the Phantom 4. With some more testing, I was able to fail and succeed my way through the different picture profiles.

When reviewing products, I try to break them, and I did that with the Phantom 4. Really. I accidentally crashed it while in Sport mode, and only one of the propeller caps flew off in that yard sale — a real testament to the sturdy construction of the Phantom 4.

Once back online, I tried to fly it into a tree but the Obstacle Sensing System and the Forward Vision System prevented the Phantom 4 from crashing. It’s like an extra layer of insurance.

I really like how the Phantom 4 has very advanced controls and features, but is also “dummy” proof. If you’re editing a project that begs for a tracking shot of a car that just isn’t in the dailies, you can grab a Phantom 4 and run out and film something. Even if it doesn’t make it into the final edit, it will give the producers and director a greater sense of what you are trying to convey. You could really help sell your vision, and your future job prospects.

I haven’t been able to get my hands on the recently announced Mavic Pro foldable drone from DJI, but I was able to get the recently announced GoPro Karma (you can see some of my in-flight footage on my YouTube page).

In my opinion, I really don’t think these drones compare to one another, so I won’t really be going into a “tit for tat” comparison, but with so much drone competition it is an exciting time in the UAV world.

One thing I did notice when I went out to test the Phantom 4 was how many people were ready to become FAA/police authorities and tell you that you can’t fly. It was almost laughable. In fact, every time I think about it I laugh. The moral of the story is to keep in mind, before making a purchase like this, that if you live in a city you probably live within five miles of an airport, helipad, etc., and technically you can’t fly your drone. It is a conversation starter whether or not you want it to be.

Definitely check out the FAA’s website to get the rules on where and when you can fly drones, otherwise you might have an awesome grey box in your room with nowhere to fly. On the flip side, I’ve been reading people’s comments on forums, and if you are a hobbyist flyer, have registered your drone and want to fly, you can contact your local airport and let them know you want to fly at a certain altitude or below, and they usually will say it’s fine. Those aren’t my words but a summation of what I have been reading — of course do your own research please!

My only criticism is that the Phantom 4’s 60Mbps data rate isn’t high enough to get the best quality footage from your drone. If you’ve been paying attention to the news lately, or to my Twitter (@allbetzroff), you may have seen DJI’s latest reveal of the Phantom 4 Pro, Pro+ and Inspire 2, which can film at a much better data rate of 100Mbps. Maybe this is a simple firmware update to the Phantom 4 (but it’s probably not). Nonetheless, 60Mbps is acceptable for 1920×1080 or maybe 2.7K video (2704×1524, 16×9 aspect ratio) or below, but once you get up into the higher frame sizes, you can really see the video footage break down. If you zoom into the footage, the compression becomes noticeable and the color fidelity begins to fade.
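To put some rough numbers on that, here is the bits-per-pixel math at UHD 30p for 60Mbps versus 100Mbps. It's a crude measure, but it shows how much more data each frame gets at the higher rate:

```python
# Crude bits-per-pixel comparison at UHD 30p: 60Mbps vs. 100Mbps.
def bits_per_pixel(bitrate_mbps, width, height, fps):
    return bitrate_mbps * 1_000_000 / (width * height * fps)

print(round(bits_per_pixel(60, 3840, 2160, 30), 3))    # ~0.241 bits per pixel
print(round(bits_per_pixel(100, 3840, 2160, 30), 3))   # ~0.402 bits per pixel
```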

While writing this review, the DJI Phantom 4 retailed for $1,199 on the DJI online store without any accessories, and more like $1,399 with two extra batteries and an external battery charger. I even just found a refurbished Phantom 4 on DJI’s site for $899. The Phantom 4 Pro starts at $1,499 and the Phantom 4 Pro+ at $1,799. Oh yeah, don’t forget a few 64GB MicroSD cards at $20-$35 a piece. A pretty expensive investment if you ask me, but if you find yourself being a major gear nerd like me, or an editor needing to shoot your own footage, the DJI Phantom 4 is a must-have. Once you fly the Phantom 4 you will be hooked.

Watch some of the video I shot with the Phantom 4 on my YouTube Channel:

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration. Follow him on Twitter @allbetzroff.

Review: The HP Z1G3 All-in-One workstation

By Brady Betzel

I’ll admit it. I’ve always been impressed with HP’s workstations — from the Z840 to the ZBook mobile workstation and now the HP Z1G3 All-in-One. Yes, I know, the HP line of workstations is not cheap. In fact, you can save quite a bit of money building your own system, but you will probably have tons of headaches unless you are very confident in your computer-building skills — and don’t mind standing in the return line at Fry’s Electronics.

HP spends tons of time and money on ISV certifications for their workstations. ISV certification stands for Independent Software Vendor certification. In plain English it means that HP spends a lot of time and money making sure the hardware inside of your workstation works with the software you use. For an industry pro that means apps like Adobe’s Premiere Pro and After Effects, Avid Media Composer, Autodesk products like 3DS Max and many others.

For this review, I tested apps like Avid Media Composer, FilmLight’s Baselight for Media Composer color correction plug-in, Adobe Premiere Pro, Adobe Media Encoder and Adobe After Effects, as well as Blackmagic’s Resolve 12.5.2, which chewed through basic color correction. In terms of testing time, I typically keep a review system for a couple of months, but with this workstation I really wanted to test it as thoroughly as possible — I’ve had it for three months and counting, and I’ve been running it through all the appropriate paces.

I always love to review workstations like the HP Z1G3 because of the raw power they possess. While HP sent me one of the top-of-the-line Z1G3 configurations, which retails for a list price of $3,486, they have a pretty reasonable starting price of $1,349. From Intel i3, i5 and i7 configurations all the way up to the almighty Intel Xeon, the HP Z1G3 can be customized to fit into your workflow, whether you just need to check your email or color correct video from your GoPro.

Here are the specs that make up the HP Z1G3 All-in-One workstation I received:

● 23.6-inch UHD/4K non-glare and non-touch display (3840×2160)
● Intel Xeon E3-1270 v5 CPU, 3.6GHz (4 Cores / 8 Threads)
● 64GB DDR4 SODIMM 2133MHz (4 x 16GB)
● Nvidia Quadro M2000M graphics (4GB)
● Two Z Turbo drives (512GB, PCIe M.2)
● Wireless keyboard and mouse
● Two Thunderbolt 3/USB 3.1 ports
● USB charging port
● Media card reader
● DisplayPort out

As I mentioned earlier, I tested the Z1G3 with many different apps, but recently I’ve been diving deeper into color correction, and luckily for my testing this fits right in. A few of the most strenuous real-world tests for computer systems are running 3D modeling apps like Maxon Cinema 4D and color correction suites like Resolve. Of course, apps like After Effects are great tests as well, but stacking nodes on nodes on nodes in Resolve will really tax your CPU as well as your GPU.

One thing that can really set apart high-end systems like the Z1G3 is the delay when using a precision color correction panel like Tangent’s Element or Ripple. Sometimes you will move one of the color wheel balls and half a second later the color wheel moves on screen. I tried adding a few clips and nodes to the timeline, and when using the panels I noticed no discernible delay (at least no more than I would expect). While this isn’t a scientific test, it is crucial for folks looking to plug in external devices.

For more scientific tests I stuck to apps like Maxon’s Cinebench, AJA’s System Test and Blackmagic’s Disk Speed Test. In Cinebench, the Z1G3 ranked at the top of the list when compared to similar systems. In AJA’s System Test I tested the read/write speed of the non-OS drive (basically the editing or cache drive). It sustained around 1520MB/s read and 1490MB/s write. I say around because I couldn’t get the AJA app to display the entire read/write numbers because of the high-resolution scaling in Windows; I tried scaling down to 1920×1080, but no luck. In Blackmagic’s Disk Speed Test, I was running at 1560MB/s read and 1497.3MB/s write. The drive I ran this test on is HP’s version of the M.2 PCIe SSD powered by Samsung, more affectionately known by HP as a Z Turbo drive. The only thing better at the moment would be a bunch of these drives arranged in a RAID-0 configuration. Luckily, you can do that through the Thunderbolt 3 port with some spare SSDs you have lying around.
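If you ever want a quick-and-dirty sanity check of a scratch drive without any of those apps, a few lines of Python will give you a rough sequential write number. It's nowhere near as rigorous as AJA or Blackmagic's tools, and the path below is just an example:

```python
import os
import time

def rough_write_speed(path, total_mb=1024, chunk_mb=64):
    """Very rough sequential write test: write a temp file in big chunks,
    time it and report MB/s, then delete the file."""
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(total_mb // chunk_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())               # make sure the data actually hits the disk
    elapsed = time.perf_counter() - start
    os.remove(path)
    return total_mb / elapsed

print(round(rough_write_speed("D:/speedtest.tmp")), "MB/s")   # path is just an example
```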

Almost daily I ran Premiere Pro CC, Media Encoder and Resolve Studio 12.5.2. I was really happy with the performance in Premiere. When working with QuickTimes in inter-frame codecs like H.264 and AVCHD (non-edit-friendly codecs), I was able to work without too much stuttering in the timeline. When I used intra-frame codecs like ProRes HQ from a Blackmagic Pocket Cinema Camera, Premiere worked great. I even jumped into Adobe’s Lumetri color tools while using Tangent’s Ripple external color correction panel and it worked with little discernible delay. I did notice that Premiere had a little more delay when using the external color correction panel than Media Composer and Resolve, but that seemed to be more of a software problem than a workstation problem.

One of my favorite parts about using a system with an Nvidia graphics card, especially a Quadro card like the M2000M, is the ability to encode multiple versions of a file at once. Once I was done editing some timelapses in Premiere, I exported using Media Encoder. I would apply three presets I made: one square 600×600 H.264 for Instagram, one 3840×2160 H.264 for YouTube and an Animated GIF at 480×360 for Twitter. Once I told Media Encoder to encode, it ran all three exports concurrently — a really awesome feature. With the Nvidia Quadro card installed, it really sped along the export.
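Media Encoder handles all of this from its queue, but if you're curious what a rough command-line equivalent looks like, here is a sketch using ffmpeg and Python's thread pool. The preset settings are loose approximations of mine, and the file names are made up:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

SRC = "timelapse.mp4"   # hypothetical source file

# Loose approximations of the three delivery presets mentioned above.
jobs = [
    ["ffmpeg", "-y", "-i", SRC, "-vf", "scale=600:600", "-c:v", "libx264", "instagram.mp4"],
    ["ffmpeg", "-y", "-i", SRC, "-vf", "scale=3840:2160", "-c:v", "libx264", "youtube.mp4"],
    ["ffmpeg", "-y", "-i", SRC, "-vf", "fps=15,scale=480:360", "twitter.gif"],
]

# Kick off all three exports at once, roughly what Media Encoder's queue was doing.
with ThreadPoolExecutor(max_workers=len(jobs)) as pool:
    for job in pool.map(lambda cmd: subprocess.run(cmd, check=True), jobs):
        print("finished:", job.args[-1])
```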

Media Composer
Another app I wanted to test was Media Composer 8.6.3. Overall, Media Composer ran great except for its handling of the high-resolution display. As I’ve said in previous reviews, this isn’t really the fault of HP, but of the software manufacturers who haven’t updated their interfaces to adapt to the latest UHD displays. I had filmed a little hike I took with my five-year-old. I gave him a GoPro while I had my own. Once we got the footage back home, I imported it into Media Composer, grouped the footage and edited it using the multi-cam edit workflow.

Simply put, the multi-camera split was on the left and the clip I had in the sequence was playing simultaneously on the right. Before I grouped the footage into a multi-group, I transcoded the H.264s into DNxHD 175, an intra-frame, edit-friendly codec. The transcode was nearly realtime, so it took 60 minutes to transcode a 60-minute H.264 — not bad. In the end, I was able to edit the two-camera multi-group at 1920×1080 resolution with only minor hiccups. Occasionally, I would get caught in fast forward for a few extra seconds when J-K-L editing, but nothing that made me want to throw my keyboard or mouse against the wall.

Once done editing, I installed the FilmLight color correction plug-in for Media Composer. I had a really awesome experience coloring using Baselight in Media Composer on the Z1G3. I didn’t have any slowdowns, and the relationship between using the color correction panel and Baselight was smooth.

Resolve
The last app I tested with HP’s Z1G3 All-in-One workstation was Blackmagic’s Resolve 12.5.2. Much like my other tests, I concentrated on color correction with the Tangent Ripple and Element-Vs iOS app. I had four or five nodes going in the color page before I started to see a slowdown. I was using the native H.264 and ProRes HQ files from the cameras, so I didn’t make it easy for Resolve, but it still worked. Once I added a little sharpening to my clips, the HP Z1G3 really started to kick into gear: I heard the faint hum of fans, which up until this point hadn’t kicked in. This is also where the system started to slow down and become sluggish.

Summing Up
The Z1G3 is one of my favorite workstations, period. A while ago, I reviewed the previous All-in-One workstation from HP, the Z1G2, and at the time it was my favorite. One of my few complaints was that, while it was easy to fix, it was very heavy and bulky. When I opened the Z1G3 box, I immediately noticed how much lighter and streamlined the design was. It almost felt like they took away 50 percent of the bulk, which is something I really appreciate. I can tell that one of the main focuses with the Z1G3 was minimizing its footprint and weight, while increasing the power. HP really knocked it out of the park.

One of the only things I wish were different on the Z1G3 I tested is the graphics card. While the Nvidia Quadro M2000M is a great graphics card, it is a “mobile” version of a Quadro, which has 128 fewer CUDA cores and 26GB/s less bandwidth than its desktop equivalent, the M2000. I would love the option of a full-sized Quadro instead of the mobile version, but I also understand that power consumption would go up, as would the form factor, so maybe I give HP a pass here.

In the end, I know everyone reading this review is saying to themselves, “I love my iMac, so why would I want the HP Z1G3?” If you are a die-hard Apple user, or you just saw the new Microsoft Surface Studio announcement, then it might be a hard sell, but I love both Windows- and Mac OS-based systems, and the Z1G3 is awesome. What’s even more awesome is that it is easily upgradeable. I took off the back cover, and with a simple switch I could have added a 2.5-inch hard drive or two in under a minute. If you are looking for a new, powerful workstation that stands up to Resolve and Premiere Pro CC, the HP Z1G3 is for you.


Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Red Giant’s Universe 2

By Brady Betzel

Throughout 2016, we have seen some interesting acquisitions in the world of post production software and hardware — Razer bought THX, Blackmagic bought Ultimatte and Fairlight and Boris FX bought GenArts, to name a few. We’ve also seen a tremendous consolidation of jobs. Editors are now being tasked as final audio mixers, final motion graphics creators, final colorists and much more.

Personally, I love doing more than just editing, so knowing tools like Adobe After Effects and DaVinci Resolve, in addition to Avid Media Composer, has really helped me become not only an editor but someone who can jump into After Effects or Resolve and do good work.

Unfortunately for some people, it is the nature of the post beast to know everything. Plug-ins play a gigantic part in balancing my workload, available time and the quality of the final product. If I didn’t have plug-ins like Imagineer’s Mocha Pro, Boris’s Continuum Complete, GenArts’ Sapphire and Red Giant’s Universe 2, I would be forced to turn down work because the time it would take to create a finished piece would outweigh the fee I would be able to charge a client.

A while back, I reviewed Red Giant’s Universe when it was in version 1 (check it out here). In the beginning, Universe allowed for lifetime, annual and free memberships. It seems the belt has tightened a little for Red Giant, as Universe 2 is now $99 a year or $20 a month, with a 14-day free trial. No permanent free version or lifetime memberships are offered (if you downloaded the free Universe before June 28, you will still be able to access those free plug-ins in the Legacy group). Moreover, they have doubled the monthly fee from $10 to $20 — definitely trying to get everyone onto the annual subscription train.

Personally, I think this resulted from too much focus on the broad Universe, trying to jam in as many plug-ins/transitions/effects as possible and not working on specific plug-ins within Universe. I actually like the renewed focus of Red Giant toward a richer toolset as opposed to a full toolset.

Digging In
Okay, enough of my anecdotal narrative and on to some technical awesomeness. Red Giant’s Universe 2 is a vast plug-in collection that is compatible with Adobe’s Premiere Pro and After Effects CS6-CC 2015.3; Apple Final Cut Pro X 10.0.9 and later; Apple Motion 5.0.7 and later; Vegas 12 and 13; DaVinci Resolve 11.1 and later; and HitFilm 3 and 4 Pro. You must have a compatible GPU installed, as Universe does not have a CPU fallback for unsupported machines. Basically, you need a GPU with 2GB of memory or more — and don’t forget about Intel, as their graphics support has improved a lot lately. For more info on OS compatibility and specific GPU requirements, check out Red Giant’s compatibility page.

Universe 2 is loaded with great plug-ins that, once you dig in, you will want to use all the time. For instance, I really like the ease of use of Universe’s RGB Separation and Chromatic Glow. If you want a full rundown of each and every effect you should download the Universe 2 trial and check this out. In this review I am only going to go over some of the newly added plug-ins — HUD Components,  Line, Logo Motion and Color Stripe — but remember there are a ton more.

I will be bouncing around different apps like Premiere Pro and After Effects. Initially, I wanted to see how well Universe 2 worked inside of Blackmagic’s DaVinci Resolve 12.5.2. Resolve gave me a little trouble at first: it crashed once I clicked on OpenFX in the Color page. I rebooted completely and got an error message saying that OpenFX had been disabled. I did a little research (and by research I mean I typed ”Disabled OpenFX Resolve” into Google) and stumbled on a post on Blackmagic’s forum that suggested deleting “C:\ProgramData\Blackmagic Design\Davinci Resolve\Support\OFXPluginCache.xml” might fix it. Once I deleted that and relaunched Resolve, I clicked on the OpenFX tab in the Color page, waited 10 minutes, and it started working. From that point on it loaded fast. So, barring the Resolve installation hiccup, there were no problems installing in Premiere and After Effects.
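For anyone who hits the same wall, the fix really is just deleting that one cache file (with Resolve closed) so it gets rebuilt on the next launch. Something like this, using the path quoted above:

```python
from pathlib import Path

# The cache path quoted above; quit Resolve before deleting, then relaunch so the
# OpenFX cache gets rebuilt (the first launch after this can take a while).
cache = Path(r"C:\ProgramData\Blackmagic Design\Davinci Resolve\Support\OFXPluginCache.xml")

if cache.exists():
    cache.unlink()
    print("Deleted", cache)
else:
    print("No OFX plug-in cache found at", cache)
```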

Once installed, you will notice that Universe has a few folders inside of your plug-in’s drop down: Universe Blur, Universe Distort, Universe Generators, Universe Glow, Universe Legacy, Universe Motion Graphics, Universe Stylize and Universe Utilities. You may recognize some of these if you have used an earlier version of Universe, but something you will not recognize is that each Universe plug-in now has a “uni.” prefix.

I am still not sure whether I like this or hate this. On one hand, it’s easy to search for if you know exactly what you want in apps like Premiere. On the other hand, it runs counter to what I am used to as a grouchy old editor. In the end, I decided to run my tests in After Effects and Premiere. Resolve is great, but for tracking a HUD in 3D space I was more comfortable in After Effects.

HUD Components
First up is HUD Components, located under the Universe Motion Graphics folder and labeled “uni.HUD Components.” What used to take many Video Copilot tutorials and many inspirational views of HUD/UI master Jayse Hansen’s (@jayse_) work now takes me minutes thanks to the new HUD Components. Obviously, making anything on the level of a master like Jayse Hansen will take hundreds of hours and thousands of attempts, but still — with Red Giant’s HUD Components you can make those sci-fi in-helmet elements quickly.

When you apply HUD Components to a solid layer in After Effects you can immediately see the start of your HUD. To see what the composite over my footage would look like, I went to change the blend mode to Add, which is listed under “Composite Settings.” From there you can see some awesome pre-built looks under the Choose a Preset button. The pre-built elements are all good starting points, but I would definitely dive further into customizing, maybe layer multiple HUDs over each other with different Blend Modes, for example.

Diving further into HUD Components, there are four separate “Elements” that you can customize, each with different images, animations, colors, clone types and much more. One thing to remember is that the transformation settings and order of operations work from the top down. For instance, if you change the rotation on element one, it will affect each element under it, which is kind of handy if you ask me. Once you get the hang of how HUD Components works, it is really easy to make some unique UI components. I really like to use the uni.Point Zoom effect (listed under Universe Glow in Effects & Presets); it gives you a sort of projector-like effect with your HUD component.

There are so many ways to use and apply HUD Components in everyday work, from building dynamic lower thirds with all of the animatable arcs, clones and rotations to building sci-fi elements, applying Holomatrix to it and even Glitch to create awesome motion graphics elements with multiple levels of detail and color. I did try using HUD Components in Resolve when tracking a 3D object but couldn’t quite get the look I wanted, so I ditched it and used After Effects.

Line
Second up is the Line plug-in. While drawing lines along a path in After Effects isn’t necessarily hard, it’s kind of annoying — think having to make custom map graphics to and from different places daily. Line takes the hard work out of making line effects to and from different points. This plug-in also contains the prefix uni. and is located under Universe Motion Graphics labeled uni.Line.

This plug-in is very simple to use and animate. I quickly found a map, applied uni.Line, placed my beginning and end points, animated the line using two keyframes under “Draw On” and bam! I had an instant travel-vlog-style graphic that showed me going from California to Australia in under three minutes (yes, I know three minutes seems a little fast to travel to Australia, but that’s really how long it took, render and all). Under the Effect Controls you can find preset looks, beginning and ending shape options like circles or arrows, line types, segmented lines and curve types. You can even move the peak of the curve under the bezier style option.

Logo Motion
Third is Logo Motion, located under Universe Motion Graphics titled uni.LogoMotion. In a nutshell you can take a pre-built logo (or anything for that matter), pre-compose it, throw the uni.LogoMotion effect on top, apply a preset reveal, tweak your logo animation, dynamically adjust the length of your pre-comp — which directly affects the logo’s wipe on and off — and, finally, render.

This is another plug-in that makes my life as an editor who dabbles in motion graphics really easy. Red Giant even included some lower third animation presets that help create dynamic lower third movements. You can select from some of the pre-built looks, add some motion while the logo is “idle,” adjust things like rotation, opacity and blur under the start and end properties, and even add motion blur. The new preset browser in Universe 2 really helps with plug-ins like Logo Motion where you can audition animations easily before applying them. You can quickly add some life to any logo or object with one or two clicks; if you want to get detailed you can dial in the idle animation and/or transition settings.

Color Stripe
Fourth is Color Stripe, a transition that uses color layers to wipe across and reveal another layer. This one is a pretty niche use case, but it is still worth mentioning. In After Effects, transitions are generally a little cumbersome; I found the Universe 2 transitions infinitely easier to use in NLEs like Adobe Premiere. From the always-popular swish pan to exposure blur, there are some transitions you might use once and some you might use a bunch. Color Stripe is a transition you probably won’t want to use too often, but when you do need it, it will be right at your fingertips. You can choose from different color schemes like analogous or tetradic, or even create a custom scheme to match your project.
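If you're wondering what "analogous" and "tetradic" actually mean, they're just fixed hue offsets around the color wheel. Here is a toy generator (my own illustration, not Color Stripe's code) that spits out those schemes from a base hue:

```python
import colorsys

def scheme(base_hue_deg, kind="analogous"):
    """Generate a color scheme from a base hue using standard color-wheel offsets.
    Returns a list of 0-1 RGB tuples."""
    offsets = {"analogous": (0, -30, 30), "tetradic": (0, 90, 180, 270)}[kind]
    return [colorsys.hsv_to_rgb(((base_hue_deg + o) % 360) / 360.0, 0.8, 0.9)
            for o in offsets]

print(scheme(200, "analogous"))   # a blue plus its two neighbors on the wheel
print(scheme(200, "tetradic"))    # four hues spaced 90 degrees apart
```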

In the end, Universe 2 has some effects that become essential once you begin using them, like uni.Unmult, uni.RGB Separation and the awesome uni.Chromatic Glow. The new ones are great too; I really like the ease of use of uni.HUD Components. Since these effects are GPU accelerated, you might be surprised at how fast and fluid they are in your project, without slowdowns. For anyone who likes apps like After Effects but can’t afford to spend hours dialing in the perfect UI and HUD, Universe 2 is perfect for you. Check out all of the latest Red Giant Universe 2 tools here.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Tangent Ripple color correction panel

By Brady Betzel

Lately, it feels like a lot of the specializations in post production are becoming generalized and given to the “editor.” One of the hats that the editor now wears is that of color corrector — I’m not saying we are tasked with color grading an entire film, but we are asked to make things warmer or cooler or to add saturation.

With the standard Wacom tablet, keyboard and/or mouse combo, color correcting can get a little tedious — in Adobe Premiere, Blackmagic Resolve or Avid Media Composer/Symphony — without specialized color correction panels like the Baselight Blackboard, Resolve Advanced, Nucoda Precision, Avid Artist Color or even Tangent’s Element. In addition, those specialized panels run from $1,000 per piece to upwards of $30,000, leaving many people to fend for themselves with a mouse.

While color correcting with a mouse isn’t always horrible, once you use a proper color correction panel, you will always feel like you are missing a vital tool. But don’t worry! Tangent has released a new color correction panel that is not only affordable and compatible with many of today’s popular coloring and nonlinear editing apps, but is also extremely portable: the Tangent Ripple.

For this review I am covering how the Tangent Ripple works inside of Premiere Pro CC 2015.3, Filmlight’s Baselight Media Composer/Symphony plug-in and Resolve 12.5.

One thing I always found intimidating about color correction and grading apps like Resolve was the abundance of options to correct or grade an image. The Tangent Ripple represents the very basic first steps in the color correction pipeline: color balancing using lift, gamma, gain (or shadows, midtones and highlights) and exposure/contrast correction. I am way over-simplifying these first few steps but these are what the Ripple specializes in.
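For the curious, the math behind lift, gamma and gain is simpler than it looks. As a rough illustration — not Tangent's or any NLE's actual implementation — here is the ASC CDL-style slope/offset/power formula, which maps loosely onto the gain, lift and gamma controls a panel like the Ripple drives (the numbers are arbitrary examples):

```python
import numpy as np

def cdl_style_adjust(channel, slope=1.0, offset=0.0, power=1.0):
    """ASC CDL-style per-channel math: out = clamp(in * slope + offset) ** power.
    Slope, offset and power map loosely onto gain, lift and gamma."""
    return np.clip(channel * slope + offset, 0.0, 1.0) ** power

pixel = np.array([0.42, 0.38, 0.35])                         # a slightly warm gray
balanced = cdl_style_adjust(pixel, slope=1.05, offset=-0.02, power=0.95)
```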

You’ve probably heard of the Tangent Element Panels, which go way beyond the basics — if you start to love grading with the Tangent Ripple or the Element-VS app, the Element set should be your next step. It retails for around $3,500, or a little below as a set (you can purchase the Element panels individually for cheaper, but the set is worth it). The Tangent Ripple retails for only $350.

Basic Color Correction
If you are an offline editor who wants to add life to your footage quickly, basic color correction is where you will be concentrating, and the Ripple is a tool you need to purchase. Whether you color correct your footage for cuts that go to a network executive, or you are the editor and finisher on a project and want to give your footage the finishing touch, you should check out what a little contrast, saturation and exposure correction can do.

You can find some great basic color correcting tutorials on YouTube, Lynda.com and color correction-focused sites like MixingLight.com. On YouTube, Casey Faris has some quick and succinct color correction tutorials; check him out here. Ripple Training also has some quick Resolve-focused tips posted somewhat weekly by Alexis Van Hurkman.

When you open the Tangent Ripple box you get an instruction manual, the Ripple, three track balls and some carrying pouches to keep it all protected. The Ripple has a five-foot USB cable hardwired into it, but the track balls are separate and do not lock into place. If you were to ask a Ripple user to tell you the serial number on the bottom of the Ripple, most likely they will turn it over, dropping all the trackballs. Obviously, this could wreck the trackballs and/or injure someone, so don’t do it, but you get my point.

The Ripple itself is very simple in layout: three trackballs, three dials above the trackballs, “A” and “B” buttons and revert buttons next to the dials. That is it! If you are looking for more than that, you should take a look at the Element panels.

After you plug the Ripple into an open USB port, you should probably download the Tangent Hub software. This will also install the Tangent Mapper, which allows you to customize your buttons in apps like Premiere Pro. Unfortunately, Resolve and the Media Composer Baselight plug-in do not allow for customization, but when you install the software you get a nice HUD that shows what each Ripple button and knob does in the software you are using.

If you are like me and your first intro into the wonderful world of color correction in an NLE was Avid Symphony, you might have also encountered the Avid Artist Color panel, which is very similar in functionality: three balls and a couple of knobs. Unfortunately, I found that the Artist Color never really worked like it should within Symphony. Here is a bit of interesting news: while you can’t use the Ripple in the native Symphony color corrector, you can use external panels in the Baselight Avid plug-in! Finally a solution! It is really, really responsive to the Tangent Ripple too! The Ripple really does work great inside of a Media Composer plug-in.

The Ripple was very responsive, much more than what I’ve experienced with the Avid Artist Color panel. As I mentioned earlier, the Ripple will accomplish the basics of color correcting — you can fix color balance issues and adjust exposure. It does a few things well, and that is it. To my surprise, when I added a shape (a mask used in color correction) in Baselight, I was able to adjust the size, points and position of the shape using the Ripple. In the curves dialogue I was able to add, move and adjust points. Not only does Baselight change the game for powerful, in-Avid color correction, but it is a tool like the Ripple that puts color correction within any editor’s grasp. I was really shocked at how well it worked.

When using the Ripple in Resolve you get what Resolve wants to give you. The Ripple is great for basic corrections inside of Resolve, but if you want to dive further into the awesomeness of color correction, you are going to want to invest in the Tangent Element panels.

With the Ripple inside of Resolve, you get the basic lift, gamma and gain controls along with the color wheels, a bypass button and reset buttons for each control. The “A” button doesn’t do anything, which is kind of funny to me. Unlike the Baselight Avid plug-in, you cannot adjust shapes, or do much else with the Ripple panel other than the basics.

Element-Vs
Another option that took me by surprise was Tangent’s iOS and Android app, Element-Vs. I expected this app to really underwhelm me, but I was wrong. Element-Vs acts as an extension of your Ripple and is based on the Tangent Element panels. But keep in mind, it’s still an app, and there is nothing comparable to the tactile feeling and response you get from a panel like the Ripple or Element. Nonetheless, I did use the Element-Vs app on an iPad mini and it was surprisingly great.

It is a bit high-priced for an app, coming in at around $100, but I got a really great response when cycling through the different Element “panels,” leading me to think that the Ripple and Element-Vs combo is a real contender for the prosumer colorist. At a total of $450 ($350 for the Ripple and $100 for the Element-Vs app), you are playing in the same ballpark as a colorist who has a $3,000-plus set of panels.

As I said earlier, the Element panels have a great tactile feel and feedback that, at the moment, is hard to compare to an app, but this combo isn’t as shabby as I thought it would be. A welcome surprise was that the installation and connection were pretty simple too.

Premiere Pro
The last app I wanted to test was Premiere Pro CC. Recently, Adobe added external color panel support in version 2015.3 or above. In fact, Premiere has the most functionality and map-ability out of all the apps I tested — it was an eye-opening experience for me. When I first started using the Lumetri color correction tools inside of Premiere I was a little bewildered and lost as the set-up was different from what I was used to in other color correction apps.

I stuck to basic color corrections inside of Premiere, and would export an XML or flat QuickTime file to do more work inside of Resolve. Using the Ripple with Premiere changed how I felt about the Lumetri color correction features. When you open Premiere Pro CC 2015.3 along with the Tangent Mapper, the top row of tabs opens up. You can customize not only the standard functions of the Ripple within each Lumetri panel, like Basic, Creative, Curves, Color Wheels, HSL Secondaries and Vignette, but you can also create an alternate set of functions when you press the “A” button.

In my opinion, the best button press for the Ripple is the “B” button, which cycles you through the Lumetri panels. In the Vignette panel, for example, the Ripple gives you options like Vignette Amount, Vignette Midpoint, Feather and Vignette Roundness.

As a side note, one complaint I have about the Ripple is that there isn’t a dedicated “bypass” button. I know that each app has different button designations and that Tangent wants to keep the Ripple as simple as possible, but many people constantly toggle the bypass function.

Not all hope is lost, however. Inside of Premiere, if you hold the “A” button for alternate mapping and hit the “B” button, you will toggle the bypass off and on. While editing in Premiere, I used the Ripple to do color adjustments even when the Lumetri panel wasn’t on screen. I could cycle through the different Lumetri tabs, make adjustments and continue to edit using keyboard functions fast — an awesome feature both Tangent and Adobe should be promoting more, in my opinion.

It seems Tangent worked very closely with Adobe when creating the Ripple. Maybe it is just a coincidence, but it really feels like this is the Adobe Premiere Pro CC Tangent Ripple. Of course, you can also use the Element-Vs app in conjunction with the Ripple, but in Premiere I would say you don’t need it. The Ripple takes care of almost everything for you.

One drawback I noticed when using the Ripple and Element-Vs inside of Premiere Pro was a small delay when compared to using these inside of Resolve and Baselight’s Media Composer plug-in. Not a huge delay, but a slight hesitation — nothing that would make me not buy the Ripple, but something you should know.

Summing Up
Overall, I really love the Ripple color correction panel from Tangent. At $350, there is nothing better. The Ripple feels like it was created for editors looking to dive deep into Premiere’s Lumetri color controls and allows you to be more creative because of it.

Physically, the Ripple has a lighter, more plastic feel than its Element Tk panel brother, but it still works great. If you need something light and compact, the Ripple is a great addition to your Starbucks-based color correction setup.

I do wish there was a little more space between the trackballs and the rotary dials. When using the dials, I kept nudging the trackballs and sometimes I didn’t even realize what had happened. However, since the Ripple is made to be compact, lightweight, mobile and priced to beat every other panel on the market, I can forgive this.

It feels like Tangent worked really hard to make the Ripple feel like a natural extension of your keyboard. I know I sound like a broken record, but saving time makes me money, and the Tangent Ripple color correction panel saves me time. If you are an editor who has to color correct and grade dailies, an assistant editor looking to up their color correction game or just an all-around post production ninja who dabbles in different areas of expertise, the Tangent Ripple is the next tool you need to buy.


Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

GoPro intros Karma foldable drone, Hero5 with voice-controlled recording

By Brady Betzel

“Hey, GoPro, start recording!” That’s right, voice-controlled recording is here. Does this mean pros can finally start all their GoPros at the same time? More on this in a bit…

I’m one of the lucky few journalists/reviewers who have been brought out to Squaw Valley, California, to hear about GoPro’s latest products first hand — oh, and I got to play with them as well.

So, the long-awaited GoPro Karma drone is finally here, but it's not your ordinary drone. It is small and foldable so it can fit in a backpack, and the three-axis camera stabilizer can be attached to the included Karma grip, so you can grab the drone before it lands and then carry it or mount it. This is huge! Worked out correctly, you can now fake a gigantic jib swing with a GoPro, or even create some ultra-long shots. One of the best parts is that the controller is a videogame-style remote that doesn't require you to use your phone or tablet! Thank you, GoPro! No, really, thank you.

The Karma is priced at $799, the Karma plus Session is $999, and the Karma plus Hero5 Black is $1,099. And it’s available one day before my birthday next month — hint, hint, nudge, nudge — October 23.

To the Cloud! GoPro Plus and Quik Apps
So you might have been wondering how GoPro intends to build a constant revenue stream. Well, it seems like they are banking on the new GoPro Plus cloud-based subscription service. While your new Hero5 is charging, it can auto-upload photos and videos via a computer or phone. In addition, you will be able to access, edit and share, all from GoPro Plus. For us editing nerds, this is the hot topic because we want to edit everything from anywhere.

My question is this: If everyone gets on the GoPro Plus train, are they prepared for the storage and bandwidth requirements? Time will tell. In addition to being able to upload to the cloud with your GoPro Plus subscription, you will have a large music library at your disposal, 20 percent off accessories from GoPro.com, exclusive GoPro Apparel and Premium Support.

The GoPro Plus subscription breaks down to $4.99 per month and is available in the US on October 2 — it will be in more markets in January 2017.

Quik App is GoPro’s ambitious attempt at creating an autonomous editing platform. I am really excited about this (even though it basically eliminates the need for an editor — more on this later). While many of you may be hearing about Quik for the first time, it actually has been around for a bit. If you haven’t tried it yet, now is the time. One of the most difficult parts of a GoPro’s end-to-end workflow is the importing, editing and exporting. Now, with GoPro Plus and Quik you will be automatically uploading your Hero5 footage while charging so you can be editing quickly (or Quik-ly. Ha! Sorry, I had to.)

Hero5 Black and Hero5 Session
It’s funny that the Hero5 Black and Session are last on my list. I guess I am kind of putting what got GoPro to the dance last, but last doesn’t in any way mean least!

Hero5 Black

Available on October 2, the Hero5 Black is $399, and includes the following:
● Two-inch touch display with simplified controls
● Up to 4K video at 30fps
● Auto-upload to GoPro Plus while charging
● Voice Control with support for seven languages, with more to come
● Simplified one-button control
● Waterproof, without housing, to 33 feet
● Compatible with existing mounts, including Karma
● Stereo audio recording
● Video Stabilization built-in
● Fish-eye-free wide-angle video
● RAW and WDR (wide dynamic range) photo modes
● GPS built-in!

Hero5 Session is $299 and offers these features:
● Same small design
● Up to 4K at 30fps
● 10 Megapixel photos
● Auto upload to GoPro Plus while charging
● Voice Control support for seven languages with more to come
● Simplified one-button control
● Waterproof, without housing, to 33 feet
● Compatible with existing mounts, including Karma
● Video Stabilization built in
● Fish-eye-free wide-angle video

Summing Up
GoPro has made power moves. They not only took the original action camera — the Hero — to the next level with upgrades like image stabilization, waterproofing without a housing and simplified controls in the Hero5 Black and Hero5 Session, they also added 4K recording at 30fps and stereo audio recording with Advanced Wind Noise Reduction.

Not only did they upgrade their cameras, GoPro is also attempting to revolutionize the drone market with the Karma. This foldable, compact drone, whose three-axis gimbal can be held by the included Karma handle, has the potential to bring the limelight back to GoPro and steal some thunder from competitors like DJI.

Hero5 Session

Remember that drone teaser video that everyone thought was fake!? Here it is, just in case. It looks like it was real, and with some pre-planning you can recreate those awesome shots. What's even more awesome is that later this year GoPro will be launching the Quik Key, a micro-USB card reader that plugs into your phone for transferring your videos and photos, as well as REMO — a voice-activated remote control for the Hero5 (think Apple TV, but for your camera: "GoPro, record video").

Besides the incredible multimedia products GoPro creates, I really love the family feeling and camaraderie within the GoPro company and the athletes they bring in to show off their tools. On the shuttle from the airport to Squaw Valley, I sat with some mega-pro athletes/content creators like Colin, and they were just as excited as I was.

It was kind of funny because the people who are usually in the projects I edit were next to me geeking out. GoPro has created this amazing, self-contained ecosphere of content creators and content manipulators who are fanboys and fangirls. The energy around the GoPro Karma and Hero5 announcement is incredible, and they've created their own ultra-positive culture. I wish I could bottle it up and give it out to everyone reading this news.

Check out some video I shot here.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Avid Media Composer 8.5 and 8.6

By Brady Betzel

It seems that nonlinear editing systems like Adobe Premiere, Apple FCP X, Lightworks, Vegas and Blackmagic Resolve are being updated almost weekly. At first, I was overjoyed with the frequent updates. It got to the point where I would see a new codec released on a Monday and by Friday you could edit with it (maybe a slight exaggeration, but pretty close to the truth). Unfortunately, this didn't always mean the updates would work.

One thing that I have learned over the last decade is that reliable software is worth its weight in gold, and one NLE that has always been reliable in my work is Avid Media Composer. While Media Composer isn’t updated weekly, it has been picking up steam and has really given its competitors a run for their money.

With Avid Media Composer’s latest updates, including 8.5 and all the way through 8.6.1, we are seeing the true progression of THE gold standard in nonlinear editing software. From the changes that editors have been requesting for years, like the ability to add a new track to the timeline by simply dragging a clip, all the way to selecting all clips with the same source clip color in the timeline (an online editor’s dream — or maybe just mine), Media Composer is definitely heading in the right direction. Once they fix options, such as the Title Tool, I am sure many others will be in the same boat I am. Even with Adobe’s latest update news of Team Projects, I think Avid’s project sharing will remain on top, but don’t get me wrong, I love the competition and believe it’s healthy for the industry in general.

Digging In
So how great are the latest updates in Media Composer? Well, I am going to touch on a few that really make our lives as editors easier and more efficient, including the new Source Browser, custom-sized project creation with the Preset Manager, Audio Channel Grouping, grouping clips by audio waveform and many more.

For simplicity’s sake I won’t be pointing out which update contained exactly what, so let’s just assume that you and I are both talking about 8.6.1. Even though 8.6.2 was released, it was subsequently pulled down because of a bad installer and replaced by 8.6.3. Long story short, I did this review right before 8.6.3 was released so I am sticking to 8.6.1. You can find the read me file for any 8.6.3 related bug fixes and feature updates, including Realtime EQ and Audio Suite Effects.

Source Browser
Let’s take a look at the new Source Browser first. If you have worked in Premiere Pro before then you are basically familiar with what the Source Browser does. Simply put, it’s a live access point from within Media Composer where you can either link to media (think AMA) or import media (the traditional way). The Source Browser is great because you can always leave it open if you want, or close it and reopen it whenever you want. One thing I found funny is that there was not a default shortcut to open the Source Browser — you have to manually map it.

Nonetheless, it’s a fast way to load media into your Source Monitor without bringing the media into a bin. It even has a Favorites tab to keep all of the media you access on a regular basis in the same place — a good spot for transition effects, graphics, sound effects and even music cues that you find yourself using a lot. The Source Browser can be found under the Tools menu. While I’ve seen some complaints about the menu reorganization and the new Source Browser, I like what Avid has done. The updated layout and optimized menu items seem to be a good fit for me, it will just take a little time to get used to.

Up next is my favorite update to Media Composer since I discovered the Select Right and Select Left commands without Filler and how to properly use the extend edit function: selecting clips in the timeline based on source color. If you’ve ever had to separate clips onto different video and audio tracks, you will understand the monotony and pain that a lot of assistant editors and conforming editors have to go through. Let’s say you have stock footage mixed in with over two hours of shot footage and you want to quickly raise all of the clips onto their own video layer. Previously, you would have to move each clip individually using the segment tool (or shift + click a bunch of clips), but now you can select every clip with the same source color at once.

Color Spaces

First, you should enable Source Color in your timeline (or at least I recommend you do), but you don't have to for this to work. Second, use either the red or yellow segment tool, or alt (option) + click from left to right over the clip with the color you want to select throughout the timeline. Once the clip is selected, right-click on it and, under the Select menu, click Clips with the Same Source Color. Every clip with that same color will be selected, and you can Shift + Ctrl drag the clips to a new track. Make this a shortcut and holy cow — imagine the time you will save!

Immediately, I think of trouble shots that might need a specific color correction or image restoration applied to them, like a dead pixel that appears throughout a sequence. In the bin, color the trouble clips one color, select them all in the timeline, and bam, you are ready to go, quickly and easily. This update is a game changer for me. Under the Select menu you will see a few other options, like Offline Clips, Select Clips with No Source Color, Select Clips with Same Local Color and even Reverse Selection.

Audio
Now let’s jump into the audio updates. First off is the nesting of audio effects. I mean come on! How many times have I wanted to apply a Vari-Fi effect at the end of a music cue and add D-Verb on top of it?! Now I can create all sorts of slow down promo/sizzle reel madness that a mixer will hate me for without locking myself into a decision!

I tried this a few times expecting my Media Composer to crash, but it worked like a champ. Previously, as a workaround, I had to mixdown the Vari-Fi audio (locking me into that audio with no easy way of going back) and apply the D-Verb to the audio mixdown. This isn’t the cleanest workflow but it guaranteed my Vari-Fi would make it into the mix. Now I guess I will have to trust the mixer to not strip my audio effects off of the AAF we send them.

Digging a bit further into the audio updates for Media Composer 8.5 and 8.6, I found the ability to add up to 64 tracks of audio and, more specifically, 64 voices. Those 64 voices can be laid out in combinations such as 64 mono tracks, 32 stereo tracks, ten 5.1 tracks plus four mono tracks, or even eight 7.1 tracks; as the quick math below shows, each of those combinations adds up to the same 64 voices.
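To make the track-versus-voice distinction concrete, here is a quick bit of arithmetic in Python. This is my own illustration of the counting, not anything Avid ships:

```python
# Voices consumed by each track format (my own illustration, not an Avid tool)
VOICES = {"mono": 1, "stereo": 2, "5.1": 6, "7.1": 8}

def total_voices(layout):
    """layout: dict mapping track format -> number of tracks."""
    return sum(VOICES[fmt] * count for fmt, count in layout.items())

# Each of the combinations mentioned above hits the same 64-voice ceiling
print(total_voices({"mono": 64}))             # 64
print(total_voices({"stereo": 32}))           # 64
print(total_voices({"5.1": 10, "mono": 4}))   # 64
print(total_voices({"7.1": 8}))               # 64
```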

Nested Audio

Let’s be honest — from one editor to another — do we really need to use all 64 tracks of audio? I urge you to use this sparingly, and only if you have to. No one wants to be scrolling through 64 tracks of audio. I am hesitant to totally embrace this, because while it is an incredible update to Media Composer, it allows for editors to be sloppy, and nobody has time for that. Also, older versions of Media Composer won’t be able to open your sequence as they are not backwards compatible with this.

My second favorite update in Media Composer is Audio Groups. I am a pretty organized (a.k.a. obsessive-compulsive) editor, and with my audio I typically lay out voiceover and ADR on tracks 1-2, dialogue on 3-6, sound effects on 7-12 and music on 13-16.

These layouts have to stay fluid, but I find they fit my screen real estate well. They keep my audio edit as tidy as possible, although now, with 64 tracks, I can obviously expand. But one thing that always sucked was having to mute each track individually or all at once. Now, in the Audio Mixer, you can easily create groups of audio tracks that can be enabled and disabled with one click instead of individually selecting each audio track. For instance, I can group all of my music tracks together and toggle them off and on with one checkbox. In the Audio Mixer there is a small arrow on the upper left that you twirl down; select the audio tracks you want to group, such as tracks 13-16 for music, right-click, click Create New Group, name it and there you go — audio track selection glory.

Audio Ducking

Last in the audio updates is Audio Ducking. When I think of Audio Ducking I think of having a track of voiceover or ADR over the top of a music bed. Typically, I would go through and either add audio keyframes where I need to lower the music bed or create add edits, lower the audio in the mixer, apply a dissolve between edits and repeat throughout the segment.

Avid has really stepped up its game with Audio Ducking, because now I can specify which of my dialogue tracks I want Avid to analyze when deciding how to duck the music bed. You can even twirl down the advanced settings and adjust threshold and hold time for the dialogue tracks, as well as attenuation and ramp time for the music bed tracks. I tried it and it worked. I won't go as far as to say you should use it instead of doing your own music edits, but it is an interesting feature that may help a few people.
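If you have never dealt with ducking outside an NLE, the concept maps to a fairly simple gain envelope. The sketch below is my own rough illustration of how threshold, hold, attenuation and ramp controls interact; it is not Avid's implementation, just the general idea in NumPy:

```python
# Conceptual sketch of sidechain-style audio ducking. NOT Avid's
# implementation -- just an illustration of threshold/hold/attenuation/ramp.
import numpy as np

def ducking_gain(dialog, sr, threshold_db=-30.0, attenuation_db=-12.0,
                 hold_s=0.5, ramp_s=0.25, frame_s=0.02):
    """Return a per-sample linear gain curve to apply to the music bed."""
    frame = max(1, int(sr * frame_s))
    n_frames = int(np.ceil(len(dialog) / frame))
    # Per-frame dialogue level in dBFS
    levels = np.array([
        20 * np.log10(np.sqrt(np.mean(dialog[i*frame:(i+1)*frame] ** 2)) + 1e-12)
        for i in range(n_frames)
    ])
    active = levels > threshold_db            # frames where dialogue is present
    # Hold: keep ducking for hold_s after the dialogue drops below threshold
    hold_frames = int(hold_s / frame_s)
    for i in np.flatnonzero(active):
        active[i:i + hold_frames + 1] = True
    # Target gain per frame, smoothed over the ramp time (moving average)
    target = np.where(active, 10 ** (attenuation_db / 20.0), 1.0)
    ramp_frames = max(1, int(ramp_s / frame_s))
    kernel = np.ones(ramp_frames) / ramp_frames
    smoothed = np.convolve(target, kernel, mode="same")
    # Expand back to a per-sample gain curve
    return np.repeat(smoothed, frame)[:len(dialog)]
```

Multiply the returned gain curve into the music bed and you get roughly the same keyframe shape you would otherwise draw by hand.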

Wait, There’s More
There are a few straggling updates I didn't touch on that you will want to check out. Avid has added support for HDR color spaces, such as RGB DCI-P3, RGB 2020 and many more. Once I get my hands on some sweet HDR footage (and equipment to monitor it), I will dabble in that space.

Also, you can now group footage by audio waveform. While grouping by audio waveform is an awesome addition, especially if you previously used Red Giant's PluralEyes and felt left out when they discontinued AAF support, it lacks a few adjustments that I find absolutely necessary when working with hours upon hours of footage. For instance, I would love to be able to manually sync clips whose audio isn't loud enough for Avid to discern properly and create a group from them. Even more, for all grouping I would really love to be able to adjust a group after it has been created. If I could alter a group inside of a sequence and have the change immediately reflected in the group itself, I — along with about one million other editors and assistant editors — would jump for joy.

Lastly, the Effects tab has been improved with Quick Find searchability. Type in the effect you are looking for and it will pop up. This is another game-changing feature for me.

Summing Up
For a while there I thought Avid was satisfied to stay the course in 1080p land, but luckily that isn't the case. They have added resolution independence and custom project resolutions, and while adding features to Media Composer — like the Source Browser and the ever-improving FrameFlex — they have kept their project sharing and rock-solid media management at the top level.

Even after all of these updates I mentioned, there are still some features I would love to see. Those include built-in composite modes in the Effect Palette; editable groups; an improved Title Tool that works in 4K and higher resolutions without going to a third party for support; updated Symphony color correction tools; smart mixdowns that can inherit alpha channels; the ability to disable video layers but still see all the layers above and below; and many more.

If I had to use just one word to describe Media Composer, I would say reliable. I love reliability more than fancy features. And now that you've heard my Media Composer review, you can commence trolling on Twitter @allbetzroff.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Updates to Adobe Creative Cloud include project sharing, more

By Brady Betzel

Adobe has announced team project sharing!! You read that right — the next Adobe Creative Cloud update, to be released later this year, will have the one thing I've always said kept Adobe from cutting into Avid's stronghold with episodic TV and film editors.

While "one thing" is a bit of hyperbole, Team Projects will be much more than simple project sharing within Adobe Premiere Pro. Team Projects, in its initial stage, will also work with Adobe After Effects, but not with Adobe Audition… at least not in the initial release. Technically speaking, sharing projects within Creative Cloud seems like it will follow a check-in/check-out workflow, allowing you to approve another person's updates to override yours, or vice versa.

During a virtual press demo, I was shown how the Team Projects will work. I asked if it would work “offline,” meaning without Internet connection. Adobe’s representative said that Team Projects will work with intermittent Internet disconnections, but not fully offline. I asked this because many companies do not allow their NLEs or their storage to be attached to any Internet-facing network connections. So if this is important to you, you may need to do a little more research once we actually can get our hands on this release.

My next question was whether Team Projects would be a paid service. The Adobe rep said they aren't talking about the business side of this update yet. I took this as an immediate yes, which is fine, but officially they have no comment on pricing or payment structure, or whether it will cost extra at all.

Immediately after I asked my last question, I realized that this will definitely tie in with the Creative Cloud service, which likely means a monthly fee. Then I wondered: where exactly will my projects live? In the cloud? I know the media can live locally on something like an Avid ISIS or Nexis, but will the projects be shared over the Internet? Will we be able to share individual sequences and/or bins, or just entire projects? There are so many questions and so many possibilities in my mind. It really could change the multi-editor NLE paradigm if Adobe can manage it properly. No pressure, Adobe.

Other Updates
Some other Premiere Pro updates include:
● Improved caption and subtitling tools
● Updated Lumetri Color tools, including a much-needed improvement to the HSL secondaries color picker
● Automatic recognition of VR/360 video and the type of mapping it needs
● An improved virtual reality workflow
● Destination publishing that now includes Behance (no Instagram export option?)
● Improved Live Text Templates, including a simplified workflow that lets you share Live Text Templates with other users (it will even sync fonts from Typekit if they aren't present) and without the need for an After Effects license
● Native DNxHD and DNxHR QuickTime export support
● Audio effects from Adobe Audition
● Global FX mute to toggle all video effects in a sequence on and off
● And, best of all, a visual keyboard to map shortcuts!
Finally, another prayer for Premiere Pro has been answered. Unfortunately, After Effects users will have to wait for a visual keyboard for shortcut assignment (bummer).

After Effects has some amazing updates in addition to Project Sharing, including a new 3D render engine! Wow! I know this has been an issue for anybody trying to do 3D inside of After Effects via Cineware. Most people will purchase VideoCopilot’s Element 3D to get around this, but for those that want to work directly with Maxon’s Cinema 4D, this may be the update that alleviates some of your 3D disdain via Cineware. They even made mention that you do not need a GPU for this to work well. Oh, how I would love for this to come to fruition. Finally, there’s a new video preview architecture for faster playback that will hopefully allow for a much more fluid and dynamic playback experience.

After Effects C4D Render

Adobe Character Animator has some updates too. If you haven't played with Character Animator, you need to download it now and just watch the simple tutorials that come with the app — you will be amazed, or at least your kids will be. If you haven't seen how The Simpsons used Character Animator, you should check it out with a YouTube search. It is pretty sweet. In terms of incoming updates, there will be faster and easier puppet creation, an improved round-trip workflow between Photoshop and Illustrator, and the ability to use grouped keyboard triggers.

Summing Up
In the end, the future is still looking up for the Adobe Creative Cloud video products, like Premiere Pro and After Effects. If there is one thing to jump out of your skin over in the forthcoming update it is Team Projects. If Team Projects works and works well, the NLE tide may be shifting. That is a big if though because there have been some issues with previous updates — like media management within Premiere Pro — that have yet to be completely ironed out.

Like I said, if Adobe does this right it will be game-changing for them in the shared editing environment. In my opinion, Adobe is beginning to get its head above water in the video department. I would love to see these latest updates come in guns blazing and working. From the demo I saw it looks promising, but really there is only one way to find out: hands-on experience.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Mocha Pro 5 plug-in for Media Composer

By Brady Betzel

A common theme among editors and colorists these days is where to draw the line when asked to do more than their job description. Some editors want to stick to cutting, while others find it exciting to push their creative boundaries. Personally, the more options I have in my toolkit, the more employable I become.

If you are one of those editors who doesn't want to learn how to track or roto better, you might want to rethink your stance. If you need proof, hop on over to YouTube and search for tracking and rotoscoping tutorials. I bet you will find someone younger and hungrier than you doing things beyond your imagination. In the end, tools like Boris FX/Imagineer Systems' Mocha Pro are life-altering for an editor, directly affecting how much time you can save and how much money you can make.

Whether you are obscuring a face with a blur or painting an errant drone out of the sky, Mocha Pro will turn hours of work into minutes. Mocha Pro 5, the latest update to the Mocha family, is a 2D planar tracker that tracks surfaces and planes instead of points. In simple terms, think of Mocha as being able to track objects with flat (or flat-ish) faces, like a billboard or the hood of a car, as opposed to following a single point the way the tracker in Avid Media Composer does.
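If you want a feel for what "tracking a plane" means under the hood, the sketch below estimates a homography for a flat region between two frames using OpenCV. This is only a conceptual illustration of planar tracking in general; it is not Mocha's algorithm, and the function and parameter choices are my own assumptions:

```python
# A minimal sketch of what a planar tracker does conceptually: estimate a
# homography for a flat surface between frames. Not Mocha's algorithm --
# just an illustration of plane tracking versus point tracking.
import cv2
import numpy as np

def track_plane(prev_gray, next_gray, corners):
    """corners: 4x2 float32 array outlining the planar region in prev_gray."""
    # Find good features inside the planar region and follow them optically
    mask = np.zeros(prev_gray.shape, dtype=np.uint8)
    cv2.fillConvexPoly(mask, corners.astype(np.int32), 255)
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7, mask=mask)
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, pts, None)
    good_old = pts[status.flatten() == 1]
    good_new = new_pts[status.flatten() == 1]
    # A homography maps the whole plane, not just one point
    H, _ = cv2.findHomography(good_old, good_new, cv2.RANSAC, 3.0)
    # Move the region's corners along with the plane
    return cv2.perspectiveTransform(corners.reshape(-1, 1, 2), H).reshape(-1, 2)
```

The key difference from a point tracker is that the homography carries all four corners (and everything on the plane) forward at once, which is what makes inserts and screen replacements practical.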

Mocha Pro can be used as a standalone tool for $1,495 or as a standalone app plus multi-host plug-ins for $1,995. It can also be purchased as a plug-in only for Avid for $695, Adobe for $695, OFX (Nuke and Fusion at the moment) for $695 or multi-host for $995. There are also upgrade pricing options if you are coming from Mocha Pro 4. For software and hardware specs click here.

While Mocha Pro 5 isn’t cheap, there are many new features that justify the price of admission. One of the best improvements is the GPU acceleration. If you have a somewhat modern OpenCL-compatible graphics card (any of the current offerings from Nvidia, AMD and Intel, for example) then you will see a big increase in tracking speed.

In Mocha Pro 5 you can select GPU rendering under the Mocha Preferences menu > GPU tab. Mocha can automatically select what it thinks is the best OpenCL device, you can identify one yourself, or you can select none and go raw processor only.
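If you're curious which OpenCL devices your workstation actually exposes, which is the same pool Mocha is choosing from, a quick check with the third-party pyopencl package (my own suggestion for illustration, not something Mocha ships) looks like this:

```python
# Enumerate OpenCL platforms/devices with pyopencl (assumes the package and
# an OpenCL runtime are installed) -- not part of Mocha, just a handy check.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        # e.g. "NVIDIA CUDA: Quadro M2000M"
        print(f"{platform.name}: {device.name}")
```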

The Plug-In Option
The new Mocha Pro 5 plug-in option is the other heavy hitter in this update. The plug-in option allows the editor or VFX artist to work natively in their favorite host app, such as Adobe After Effects or Avid Media Composer. They launch Mocha Pro, work with the native file formats supported in the host app and render within that app's world, all while still having access to the power of the Mocha Pro toolset.

While this is game-changing for anyone, it’s especially game-changing for Media Composer editors who are tired of trying to get the point tracker to work or who don’t have access to Boris FX BCC 10 — or maybe they want to go beyond what the Mocha tracker does through BCC 10.

You can now use your layer-based timeline, apply Mocha Pro to the clip in question and fully composite your roto, track or removal work within the Avid timeline. You can also export the data for heavier compositing in apps like Nuke or After Effects, even if you are working through Media Composer. Really incredible!

I have seen my productivity increase by hours using Mocha Pro. Instead of fiddling with the Paint Tool in Media Composer combined with the Point Tracker, I can get a rock-solid track in Mocha and an even cleaner matte than I would have if I stayed within Media Composer.

When working on freelance projects, I often get asked to do what some might consider VFX work — this typically doesn't mean much more money. Because of tools like Mocha Pro 5, I don't have to turn down those jobs because of time constraints.

Many clients appreciate my ability to tackle mild visual effects problems and become return clients because they can have one person do the work instead of many. And, let’s be honest, who wants to deal with coordinating projects and files between multiple people? I know I don’t if I can avoid it.

The Mocha Pro 5 features I tested for this review were the Lens Correction, Stabilize, Insert and Remove modules, using the Mocha Pro 5 plug-in for Media Composer.

While I have the utmost respect for awesome rotoscope artists, I don't have a desire to be one, so I will leave that for you to check out on your own. There are some great tutorials on the topics I'm not covering, such as rotoscoping and 3D camera solves, here.

Real-World Tests
First off is Lens Correction. To test Mocha Pro 5’s lens correction ability I found some old GoPro footage, which is notorious for having a huge fish-eye effect because of its wide angle. (As a side note, Boris FX’s Continuum Complete has an easy Lens Correction fix built into its BCC v10 plug-in suite under Image Restoration.) But what if you want to keep the lens distortion, composite something into your scene and add the lens distortion to your insert shot? This is where Mocha Pro 5 comes into play.

After you apply the Mocha Pro 5 plug-in to your video layer in Media Composer, you can launch the app from the Effects Panel. From inside of Mocha, jump to the Lens tab, click Locate Lines, select lines in your scene that should be straight, choose the type of distortion you have (for simple scenes, probably the 1-Parameter setting), hit Calibrate, and then distort or un-distort. If you are applying the distortion to an insert shot, you track the section of the shot you want to insert onto, use the Surface Align tool to identify the surface, import the shot you want to insert, then go back to the Insert menu on the original clip you want to composite onto and identify the imported clip. Boom! Magic! Next, quit out of Mocha Pro 5 and save, check off the Render box in the Effects Panel and select Insert: Composite from the drop-down. After that long-winded explanation, you should have a shot properly tracked with the lens distortion applied.
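For context on what is happening mathematically, lens correction comes down to estimating distortion coefficients and remapping pixels. Here is a small OpenCV illustration of that idea using a made-up camera matrix and a single radial term, roughly analogous to a 1-Parameter model. This is my own sketch, not what Mocha does internally, and the file names are hypothetical:

```python
# Conceptual lens-correction sketch with OpenCV. The camera matrix and the
# k1 coefficient are invented for illustration; this is not Mocha's math.
import cv2
import numpy as np

frame = cv2.imread("gopro_frame.png")            # hypothetical frame grab
h, w = frame.shape[:2]

# Assumed pinhole camera matrix and one strong radial term (k1), loosely
# analogous to a simple 1-Parameter distortion model.
K = np.array([[w * 0.7, 0, w / 2],
              [0, w * 0.7, h / 2],
              [0, 0, 1]], dtype=np.float64)
dist = np.array([-0.3, 0.0, 0.0, 0.0], dtype=np.float64)  # k1, k2, p1, p2

undistorted = cv2.undistort(frame, K, dist)
cv2.imwrite("gopro_frame_undistorted.png", undistorted)
```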

For a great, quick walkthrough, check out Dan Harvey's tutorial on BorisFX.com.

As a side note, when working in Media Composer, any effect that you apply to your clip underneath the Mocha Pro 5 plug-in will come through when working in Mocha Pro. Using a ProRes HQ QuickTime I had filmed that was flat in color, I color balanced and corrected the QuickTime in Media Composer and then wanted to do some work in Mocha Pro 5. It was a welcome surprise that the color came through; it really made tracking easier.

Up next I wanted to test out the stabilization inside of Mocha Pro 5. Media Composer’s Fluid Stabilizer is pretty good when it works, so naturally I wanted to see how the Mocha Pro Stabilize held up. While you can Stabilize or Region Stabilize within Media Composer (fairly well, I might add), there are limitations that Mocha Pro 5 really helps with. Once I dove into stabilizing inside of Mocha Pro 5 for Avid, I really began to like the process and methodology.

To get a succinct and great tutorial check out the famous Mary Poplin (Twitter @MaryPoplin) — she has a million great Mocha tutorials that can be found on the BorisFX.com or ImagineerSystems.com websites or on their YouTube page.

Simply, once inside of Mocha Pro you will track the ground plane, align your surface and grid and tell it what type of motion to stabilize for. You can even center, zoom and crop your footage to fill the frame. It’s pretty incredible. With a little time, you can turn a bumpy shot that you filmed out of the side of your car window into one that feels like it was filmed using a professional crane, or at least something close to that.

Finally, I wanted to touch on the Remove Module, and it is straight-up voodoo. Even after using it multiple times, I am still blown away at how well this works. Without going into too much detail, you define what is the background, foreground and what object you want to remove by drawing shapes and tracking them individually. One very important point that I often forget in Mocha is that layer order matters and will make a gigantic difference in your tracking success.

Mocha reads your layers from top to bottom, with top being closest to camera and bottom being furthest from camera, meaning the background should be your bottom layer. To achieve a successful remove, you need to track the background area that the object you want removed lies on top of, and make sure the garbage matte surrounding your removal object covers the entire object. Also, make sure the matte covering your background covers the entire area your removal object travels through.

Then you need to track the object you want to remove… the entire object. Typically, your garbage mattes don't need to be ultra-tight — just enough to cover the entire object you want removed. Without going too deep into clean-plate techniques, if you find your object doesn't move enough for Mocha Pro to remove it, you may need to export a frame of your video from Mocha Pro: click Create Clean Plate and paint out the object in an app like Photoshop. Once it's painted out, you can import the plate back into Mocha Pro and easily see your results. I tried my hand at removing a shoe from a scene, and while I had a few crashes along the way, in the end I achieved a successful remove. Check out my object removal using Mocha Pro 5 for Media Composer, which I filmed using the DJI Phantom 4 (my review of that will be coming out soon).

Summing Up
While Mocha Pro 5 is fast for the magic it helps you create, it does need some time to render; my two-to-three-second clip needed a few minutes to process the removal. For those wondering if my system was up to par, I was running Avid Media Composer and Mocha Pro 5 on an HP Z1 G3 workstation loaded with Intel Xeon E3 processors and an Nvidia Quadro M2000M graphics card, so the system is awesome (you will see a review of that in the future as well).

In the end, if you need to track, rotoscope, remove or insert footage, Mocha Pro 5 is worth every penny. Imagineer's Mocha Pro is one of those apps you must have in your post toolbox if you do any kind of tracking, object removal or inserts. It will shave hours off of your workload, directly increasing your productivity (and hopefully your paycheck).

If you are tired of using Media Composer’s built-in point tracker and still need to accomplish tracking, object removal, stabilization and more, Mocha Pro 5 is your savior.

Something I really love about the direct integration of Mocha Pro 5 inside of tools like Media Composer is the continued ability to export your tracking data to pass between apps. Say you are an overzealous offline editor who rules at Mocha Pro tracking in Media Composer but also wants to pass that shape data along to Nuke or After Effects for some heavy compositing and/or further refinement — you can still export all of the data necessary. Awesome!

It will take a few tries to understand just how a planar tracker like Mocha Pro 5 works, but with the awesome tutorials from Mary Poplin, Martin Brennand and many others on Imagineer’s YouTube page, you should be up and running in a matter of hours.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: FxFactory’s AudioDenoise, EchoRemover, Xsend Motion

By Brady Betzel

Over the last few years, I've really enjoyed working on Windows-based PCs, but one of the drawbacks is the inability to use FCP X, Motion and any of their plug-ins. Recently, I opened my MacBook Pro once again because I found three plug-ins from FxFactory's partners Automatic Duck and CrumplePop that I really wanted to check out inside of FCP X: AudioDenoise, EchoRemover and Xsend Motion. I've reviewed FxFactory plug-ins such as Nodes 2 before, and they are pretty incredible.

FxFactory is a Mac OS-only app that manages all of FxFactory's plug-ins. You can buy, update, install, read info and even watch tutorials inside of the app! The FxFactory app is a very well-organized place to centrally locate all of your FxFactory plug-ins, as well as learn what each of them does without much legwork. There are a few tabs inside, including one that shows all of your purchased plug-ins, so if, like me, you forget about some of your plug-ins, you can see them all on one page.

CrumplePop EchoRemover
First up is CrumplePop EchoRemover, an audio plug-in that sells for $99 and, as its name implies, removes echo from your audio. If you've worked with Red Giant's Universe plug-ins, you'll want to check out CrumplePop.

EchoRemover has a very simple approach with minimal input needed to do its magic in either FCP X or Adobe Premiere as long as they are on the Mac OS. It has three options to fine-tune the echo removal: Strength, Release and Bass Reduction. Strength covers how aggressively the echo is removed (think of it as opacity if you are a video person); Release describes how fast the cutoff is at the end of words or sounds; and Bass Reduction can help get rid of extra bass that might be present.

To test it out, I used a clip I recorded using the Rode VideoMic Go along with the Rode SC4 TRS-to-TRRS converter, with help from my iOgrapher. I figured the iPhone 6 would probably be the lowest common denominator for audio recording, and the reality is that a lot of television shows use the iPhone as a quick way to get an emergency soundbite into an edit. You can check out my test clips on my YouTube page, where I placed the unaffected audio before the audio with EchoRemover and AudioDenoise applied.

I recorded a short clip in my garage to allow for as much echo and background noise as possible. I was able to get a good amount of echo when I stood next to my metal garage door. Once inside of FCP X, I dragged the clip to a new timeline and applied EchoRemover straight away. Now, you'll need to remember that this plug-in is made to be "drag and drop," meaning you won't really need to make any adjustments (although small tweaks are possible). I was very impressed with the result of the echo removal. I dropped it on and didn't touch the parameters at all.

I probably should have touched the Release and Strength a little, but for this example I wanted to leave it — straight out of the FxFactory/CrumplePop box. I guess I could ask for more parameters to adjust, but I really love the simplicity of these plug-ins. As a video editor, it lets me concentrate more on the story and less on the awful technical difficulties that can happen.

AudioDenoise
Up next is CrumplePop’s AudioDenoise, which sells for $99. As its name implies, its goal is to remove background noise from your audio. AudioDenoise works in Adobe Premiere as well as FCP X, but let’s stick with FCP X for the moment.

AudioDenoise is found under the CrumplePop plug-in heading, and using it is as simple as parking your playhead over a section that contains a good sample of the background noise you are looking to eliminate — although when I tested it, I ran footage through AudioDenoise without any regard for what audio was playing and it still worked.

You can then adjust the Strength and Profile in the Effects tab of FCP X. Much like EchoRemover, AudioDenoise is a drag-and-drop plug-in. I tested AudioDenoise by recording a clip in my garage like before, but this time with as much background noise as possible (without waking up the kids): I started our dryer and began talking. You can listen to my demo on YouTube (it is after the EchoRemover demo). Just like EchoRemover, AudioDenoise worked great without any fiddling with the effect parameters. I was thoroughly impressed.
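For anyone curious how this family of tools works in broad strokes, a common approach is spectral subtraction: sample the noise, then subtract its spectrum from the rest of the clip. The NumPy sketch below is only my rough illustration of that general technique, an assumption on my part and not CrumplePop's actual algorithm:

```python
# Very rough spectral-subtraction sketch. This is an illustration of the
# general noise-reduction idea, NOT CrumplePop's algorithm.
import numpy as np

def denoise(signal, noise_sample, frame=1024, hop=512):
    """signal/noise_sample: 1-D float arrays; returns a denoised signal."""
    window = np.sqrt(np.hanning(frame))   # sqrt-Hann for windowed overlap-add
    # Average magnitude spectrum of the noise-only sample (the "Profile")
    noise_frames = [noise_sample[i:i + frame] * window
                    for i in range(0, len(noise_sample) - frame + 1, hop)]
    noise_mag = np.mean([np.abs(np.fft.rfft(f)) for f in noise_frames], axis=0)

    out = np.zeros(len(signal))
    for i in range(0, len(signal) - frame + 1, hop):
        spec = np.fft.rfft(signal[i:i + frame] * window)
        # Subtract the noise magnitude, keep the original phase
        mag = np.maximum(np.abs(spec) - noise_mag, 0.0)
        cleaned = np.fft.irfft(mag * np.exp(1j * np.angle(spec)), n=frame)
        out[i:i + frame] += cleaned * window                 # overlap-add
    return out
```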

Xsend Motion
Last, but not least in this FxFactory FCP X-focused review, is Xsend Motion. If you’ve ever heard of Automatic Duck (if you are over 25 years old you probably had to use it when getting yourself out of some sticky FCP 7 to Avid Media Composer circumstances that some crazy person put you in) then you probably already trust this plug-in, because the creators of Automatic Duck created Xsend Motion.

Simply put, for $99 Xsend Motion converts your FCP X timeline into a Motion project, complete with some simple effects like position, scale and blending modes, plus a few third-party plug-ins — you can find a more detailed list of compatible third-party plug-ins here. Think of it like a fancy AAF transfer engine, or more like how Premiere can send clips from the timeline to Adobe After Effects.

From FCP X you can send either your entire timeline or just a section over to Motion. For the entire timeline, go to the Share Project menu and click Xsend Motion. This opens the Xsend Motion app, where you can tell Xsend Motion where to place the FCPXML, whether or not to create layer groups and where to save the Motion project you are creating. From there, Xsend Motion will launch your new project inside of Motion to be edited. If you only want to send a certain section of your timeline to Motion, you will need to create a compound clip (think of a submaster, if you are familiar with Avid Media Composer). Click the newly created compound clip and select File > Export XML. You will then open that XML inside of Xsend Motion, select your settings and click Continue — much like sending the entire timeline to Motion.

Once you have made your magic motion graphics inside of Motion you will most likely need to get this back into FCP X. You have two options: export your new Motion project as an FCP X preset/template/generator or export a QuickTime for use in FCP X. My advice is to export a QuickTime as opposed to an FCP X preset/generator as it won’t require re-rendering, but you will need to decide this on a case-by-case basis.
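As a side note, the FCPXML that Xsend Motion reads is plain XML, so you can peek inside an export with nothing but Python's standard library. The file name and the "asset-clip" element and attribute names below are my assumptions about a typical FCP X export, so adjust them for whatever your version actually writes:

```python
# Peek inside an FCPXML export with the standard library. The element and
# attribute names here are assumptions about a typical export, not a spec.
import xml.etree.ElementTree as ET

tree = ET.parse("MyCompoundClip.fcpxml")   # hypothetical export from FCP X
root = tree.getroot()

# List every clip reference the XML carries, with its name, offset and duration
for clip in root.iter("asset-clip"):
    print(clip.get("name"), clip.get("offset"), clip.get("duration"))
```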

Summing Up
In the end, I was really impressed with CrumplePop’s EchoRemover, AudioDenoise and Xsend Motion. At $99 a piece, some might consider AudioDenoise and EchoRemover a little expensive, but if you value your time and ability to improve your audio production quickly and easily, then $99 is a great price. They really give the editor the ability to focus on the content of the edit rather than fixing subpar audio recording.

Xsend Motion furthers that ability to focus on the content of your edit by letting you send multiple layers of video from FCP X to Motion without breaking each layer into separate QuickTimes. This plug-in seems so necessary it's a wonder it isn't already built into FCP X!

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

 

Review: Rampant Design Tools FCP X plug-ins

By Brady Betzel

Over the past few weeks, I've been diving into FCP X and testing out some new tools that work within the editing system, including Rampant Design Tools' FCP X plug-ins. Rampant Design helps you add value to your project with video enhancements like light leaks, glitch effects, transitions, moving matte overlays and much more.

To back up a second, I have been reviewing Sean and Stefanie Mullen’s Rampant Design products for years, and they regularly create products that directly make me money in my freelance editing. That is the truth, plain and simple. If you make content that needs to grab someone quickly, like a sizzle reel, promos or anything for that matter, Rampant Design tools will help. They will add instant pop to your footage as well as save you so much time that you will be able to concentrate on the story you are trying to convey.

Check out Sean and Stefanie’s Facebook and YouTube show: www.RampantLive.com, a weekly live show that covers topics ranging from dealing with clients all the way to managing stress. It also includes interviews with industry pros like @FilmRiot’s Ryan Connolly (@ryan_connolly), Kevin P. McAuliffe (@KPMcAuliffe) and even me. They not only sell products, but they contribute to the entire post ecosphere by creating actionable and engaging content.

Digging In
OK, enough background; on to the products at hand. Some of the latest releases in the Rampant Design Tools catalog are the FCP X plug-ins and Edit Essentials. Edit Essentials is a pack of 220 ProRes QuickTime light leaks, film effects, flares and more, all shot on Red cameras in 4K resolution.

While I am not going too deep into Edit Essentials in this review, you can check them out here. Edit Essentials contains many elements similar to the FCP X plug-ins, but they are QuickTime-based — with a simple composite mode switch to something like Add or Hard Light, you can be off and running in Adobe Premiere.

The Rampant Design Tool’s FCP X plug-ins are individual plug-ins that can be adjusted inside of FCP X’s Effects Panel. The downloadable FCP X plug-ins are Hard Light Overlays, Soft Light Overlays, Gradient Overlays, Style Mattes, Film Effects, Film Leaks, Flash Transitions and Glitch Transitions, all of which priced at $29 per set, a very reasonable price considering these are extremely customizable inside of FCP X.

I tried a few of these and created a quick demo on my YouTube channel. I used the Hard and Soft Light Overlays, Flash Transitions, Style Mattes, and Film Leaks combined using different compositing modes (which thankfully FCP X has built in).

What I really liked about using the FCP X plug-ins from Rampant was the ease of use and the customization of each effect. Inside the different effects, such as the overlays and flash transitions, you can customize hue, saturation, rotation and much more. If you like the way a certain light leak feels but you need it to be red instead of green, you can easily change the hue in the Effects panel. With many of the effects plug-ins you can preview the effect inside the Effects panel, or even preview the effect on the actual footage in your timeline, and then click and drag it onto your clip.

There is a trick I like to try with these effects: combining different compositing effects on top of each other. With the Overlays, I'll apply a Hard Light composite mode to the clip with the Soft Light Overlay on it and place another video clip on a track beneath it. You can achieve a really unique look that is helpful when building sizzle reels and main title sequences. If you combine that with the Style Mattes, like I did, you can achieve some high-level, professional-looking results. It really can give you an edge.
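If you have ever wondered what a composite mode is actually doing, it's just per-pixel math. Below is the standard Hard Light formula in a short NumPy sketch, purely as a conceptual illustration and not how FCP X implements it under the hood:

```python
# Standard Hard Light blend formula (values normalized to 0..1). A conceptual
# illustration only -- not FCP X's internal implementation.
import numpy as np

def hard_light(base, blend):
    """base = underlying clip, blend = overlay; both float arrays in 0..1."""
    return np.where(blend < 0.5,
                    2.0 * base * blend,                        # darken where overlay is dark
                    1.0 - 2.0 * (1.0 - base) * (1.0 - blend))  # screen where overlay is bright

# Example: a mid-gray overlay (0.5) leaves the base image essentially unchanged
base = np.array([0.2, 0.5, 0.8])
print(hard_light(base, np.full(3, 0.5)))   # -> [0.2, 0.5, 0.8]
```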

I’ve told this story before, but at one point I was an assistant editor looking to make the jump to editor, and I remember seeing the first iterations of Rampant Design Tools (which I still have on DVD by the way), ordered them and now I honestly believe they helped me get a promotion to editor.

One set of FCP X plug-ins that I mentioned earlier is the Style Mattes. I really love using these for fast-paced, more modern videos that need a shape-based look. If you troll YouTube for inspiration like I do, you will most likely stumble on a 15-year-old kid doing a tutorial (that will probably make you jealous) on how to use composite modes and how to work diamond-shaped patterns with displacement mapping from some drawing made in Adobe Illustrator and After Effects. If you want that look without all of the After Effects hassle, Style Mattes are where you need to be.

In my demo video, I used the Style Mattes X plug-in in FCP X along with Hard Light composite modes to give a sharp, blown-out look. Rampant Design has tons of variations on matte shapes and motions that you can expand on by resizing, cropping, re-timing or any other way you want to affect your footage. Style Mattes can also be used as transitional elements between clips or layers of clips. This may take a little tricky transition-building on your part, but if you customize a wipe to the timing of the style mattes, you can quickly make your own unique shape-based transitions.

Glitch Transitions
The last plug-ins I will touch on are the Glitch Transitions. Sure, you can throw glitches over a clip; I mean, who hasn't at this point? But what really gets me is the ability to change the displacement values in the Effects palette. You can instantly modify and create amazing glitch effects with a few button clicks, all starting with Rampant Design Tools' awesome base — and if you screw it up, go ahead and reset it.

The Glitch Transitions allow the editor to truly customize each transition if they feel so inclined, or leave the plug-in alone and walk away. I would suggest that you customize your Glitch Transitions if possible, even if it is a little time remapping, doubling up on transitions to get a layered look, or even just modifying the saturation a little. A little customization goes a long way, and without the Rampant’s FCP X plug-ins it would take a few more steps to make such a personalized effect.

If there is one criticism I have of the FCP X plug-ins, it's the installation process. Installing the FCP X plug-ins is a little cumbersome. Rampant does include a very detailed video and PDF with your download that explains, step by step with arrows, how to install them, but a simple installer would go a long way in my book. For an awesome set of plug-ins priced well below many others out there but that rivals any of them, I can let a little drag-and-drop installation slide — I'm probably just lazy.

Summing Up
In the end, I always seem to love what the team over at Rampant Design Tools comes up with. Sean and Stefanie output so many high-end products that it gets hard for me to keep up! I even heard through Rampant Live that they are about to start work on a blood pack, which got me excited thinking about the possibilities when you combine that with their Monster Toolkit.

I have never been disappointed with Rampant Design Tools, and I continue to be impressed not only by their incredible products but by how the Mullens approach selling them. Check out their www.RampantLive.com show and you will understand.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Polaroid’s GoPro 3-way stabilizer and 350 LED light

By Brady Betzel

We’ve all seen the trend: technology has started to take off exponentially and give everyday people access to professional-level equipment and software for semi-affordable prices. It also allows professionals to capture images in a way they might not have considered in the past.

For example, you can buy a GoPro Hero4 and create your own stunning timelapses in 4K, or even add dramatic lighting to your independent Etsy-style boutique iPhone photos for a few hundred bucks. Something I find infinitely valuable is the ability to watch other people’s successes, failures and instruction on YouTube for free. What I’m trying to say is that with a few affordable pieces of production gear, you can take average looking footage or stills to the next level.

A few months ago I saw a press release for a 3-axis GoPro gimbal and a high-powered, dimmable LED light made by, of all companies, Polaroid! I was immediately interested, mainly because I chase my sons around with a GoPro and get shaky footage that makes my professional video editor brain cringe — don’t even get me started on the lighting!
Polaroid was nice enough to send over some sample products that I have started to fall in love with — not only for the technology they pack but for the price they sell for.

Stabilizing Your GoPro
Up first in this two product review is the Polaroid Handheld 3-Axis Electronic Gimbal Stabilizer for GoPro Hero 3/3+/4 Action cameras. For months I had seen examples of gimbal stabilizers for the GoPro, but was always left second-guessing an over-$300 price tag for an accessory that was basically the same price as the main camera itself. Then I saw that Polaroid’s stabilizer cost $180 (with free shipping on Amazon) and realized that this technology wasn’t out of my price range anymore.

I opened the slick packaging and charged the three proprietary batteries for an hour, and was pleasantly surprised that I was then ready to fly around. There are two options for strapping your camera to the stabilizer: with the GoPro LCD Bacpac attached or without it. It's a little clunky to outfit the stabilizer for a GoPro without the Bacpac (i.e., the GoPro Hero4 with built-in LCD), but once you are thumb-screwed into place, your camera isn't going anywhere. One thing I learned is that you must start the stabilizer on a level surface, like a tabletop. So stand it on its handle on a level surface, push the power button on the handle and let the stabilizer balance itself for a second or two.

Putting the GoPro on the stabilizer wasn't as easy as I thought it should have been, but it only took 10 minutes, so it's not all that life-altering. In addition, strapping the GoPro to the stabilizer is a semi-permanent thing, as it involves thumbscrews. On the bottom of the stabilizer there is a threaded ¼-inch mount that can be used to attach it to a tripod or monopod. I even tried it out on a tripod, using two of the legs as my fulcrum points to create a pseudo jib and get some really long, smooth tilts with the GoPro, and it was pretty fun.

I had both a GoPro Hero3 and a Hero4 lying around, so I tried both, and they fit snugly. One complaint I had was that the ring used to secure the GoPro to the stabilizer goes around the lens and feels really tight — I had to twist and turn to get it on, which left me feeling like I might rip my lens off — but it is definitely secure once it is screwed on. Don't get me wrong, I love this thing, and I would be hard-pressed to find anything in the GoPro Hero3/4 stabilizer category that is so low priced.

To test it out I told my son Atticus to get on his bike and ride. I ran after him with little to no training other than spinning around the garage a little bit. You can check it out on YouTube.

I started off walking but then picked up speed and ran a little (if you call that running). With GoPro videos, you definitely get better quality footage with good lighting and as little bouncing around as possible. The more stable your footage the less work the compression has to do, which basically means better detail and color fidelity in your video.

It doesn’t take long to get a handle on the 3-axis electronic stabilizer; it just takes a little practice time and patience to get the moves and footage you want. You’ll find out which ways the gimbal will and, more importantly, will not go when it starts to shake and go a little crazy (not limited to just this Polaroid stabilizer). This is a great accessory for anyone using GoPros in their work, and at under 10 inches long it can fit in a backpack when you go hiking! It gives you a real smooth and epic feel to your otherwise shaky action footage, and like I said, for $180 it’s a great price.

Dimmable Super Bright LED
Up next is the Polaroid 350 High-Powered Variable Dimmable Super Bright LED Light with Barn Door. After I played with the GoPro gimbal I saw that Polaroid also made lighting equipment that didn’t cost an arm and a leg and could really pump up my iPhoneography. What really got me thinking were the cold-shoe adapters on top of my iPhone iOgrapher. The iOgrapher is a way to help stabilize your iPhone footage while adding options like tripod mounts, cold-shoe adapters for accessories and handles to smooth out your shaky footage. So why not use the iOgrapher mounts with the Polaroid 350 to make some stunning footage?

If you’ve ever shot video or stills with your iPhone you know that you can get some incredible images. With the right lighting and stabilization, you can achieve looks that rival many professionally shot television shows airing today.

Earlier I mentioned an Etsy-like photo, and I really meant that you could bring your photos and video up to a pro level with just the addition of lighting. My wife recently got a lot of people asking her to make hair bows and bowties, so she opened an Etsy store. She had been taking photos with her iPhone 6s+ and they were good, but when we created a DIY lightbox (from instructions we found on Pinterest) and added the Polaroid 350 light her pictures really started to get that professional feel.

It was truly amazing how much the quality, and most of all the color fidelity, improved just from adding a little light. You can check out her Etsy store.

The Polaroid 350 is very easy to use. Inside the box you will find the physical light head, two lithium-ion type batteries (think older Sony camcorder batteries), a dual battery charger, swivel head mounting adapter, barn door with diffusion filter, carrying case, AC & DC adapter, UK- and EU-style plug adapters and a little manual. I let the batteries charge overnight, put them both on the back of the light and began lighting my wife’s bows and bowties.

I immediately noticed some lines in my wife’s pictures; it looked like something out of “Stranger Things” was going on. So I quickly ran out of the house… well, maybe not, but we added the diffusion and noticed a drastic reduction in the line pattern. Later, when we created the DIY lightbox with some tissue paper as diffusion, the pattern disappeared entirely. We played around with the rotary knob that smoothly changes the color temperature from a warm 3200K up to a cool 5600K, and the flicker-free LED brightness that goes from 10% up to 100%. All in all, the Polaroid light was an awesome way to add a level of professionalism to our iPhone footage and stills for $160.

There are a few other versions of this light offered by Polaroid, like one with an LCD monitor on the back that shows the exact color temperature and remaining battery life, but it costs an extra $20. There are a few versions under the $160 price tag, but a dual-battery light is really key when you are doing a lot of work and don’t want to stop a shoot to swap out batteries. I do wish I had two more lights to create a three-point lighting setup!

Summing Up
In the end, I was really impressed with both of the Polaroid products I tested, the Polaroid 350 LED light and the Polaroid 3-axis electronic gimbal stabilizer for GoPro. I was even more impressed with the prices!

The stabilizer really adds a smooth high-level production value to the GoPro. It is simple to set up, easy to use and with practice you will get some amazing GoPro shots. The Polaroid 350 LED light with variable color temperature and brightness along with dual battery ports really makes a shot jump off the screen. If you’ve never used external lighting in your shots before, now is the time.

Pick yourself up a Polaroid 350 LED Light — it is a portable, dimmable and lightweight LED light that will relieve the low-light strain on your camera and give you more flexibility in post production (your colorist will thank you).

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

SIGGRAPH: Maxon Cinema 4D updates to R18

By Brady Betzel

During SIGGRAPH 2016, Maxon announced R18, the latest update to Cinema 4D. The new release is scheduled to ship this September. While I am planning on doing a full review of R18 once it becomes available, I got a preview of the update from Maxon US president/CEO Paul Babb and Rick Barrett, VP of operations for Maxon US and a staple of Maxon’s Cineversity tutorials. Once you hear Barrett’s voice you will know who I am talking about; he’s given a lot of us some great tips and an awesome entry point into working in Cinema 4D.

My three favorite updates based on my preview are the Voronoi Fracture Object, Object Motion Tracking and the Thin Film Shader (plus a bonus: the OpenGL viewport now previews Reflections, Ambient Occlusion and Displacement Mapping).

The Voronoi Fracture Object works in conjunction with dynamics and allows you to quickly break apart a wall or even procedurally slice and dice vegetables, as Babb and Barrett showed, using spline or polygon shapes.

Building on Cinema 4D’s existing Motion Tracking, Object Motion Tracking allows the user to track models and other 3D objects into real-world footage with less round-tripping between Adobe After Effects and Cinema 4D via Cineware. In Maxon’s example, they used puff balls purchased from Jo-Ann Fabric and Craft Store as track points, measured the physical distance between them, tracked the objects in Cinema 4D R18, entered the distance between the puff balls and boom! A sweet Transformer-like helmet was tracked to the actor’s head movement, needing only minor adjustments.

While there are many other big updates, I was oddly entranced by the Thin Film Shader. If you ever have trouble building materials with that oil-slick type of glisten or a bubble’s rainbow-like translucence, Cinema 4D R18 is your friend.

I can’t wait to see some of the presentations that Cinema 4D and the team from GreyScaleGorilla.com have in store, along with other 3D artists. Check out their lineup, and follow them on Twitter @gsg3d. With so many updates like the enhancements to the OpenGL viewport, it will be a long wait until Cinema 4D R18 is released to the public. Check out www.maxon.com for their updated website, Cineversity’s Cinema 4D R18 highlights video, and follow them on Twitter @maxon3d.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: The Producer’s Playbook: Real People on Camera

By Brady Betzel

Yes, this is a book review! While I typically write reviews on post production hardware and software, occasionally I will see something off the typical NLE or plug-in grid that piques my interest, such as GoPro gimbals, lighting equipment and even books.

One day I was lurking around Twitter when I saw a tweet about a newly released book focusing on filming people from a producer’s perspective — from historical re-enactments to on-camera interviews. As an editor I have seen my fair share of raw interview footage. Sometimes I catch myself thinking, “Why didn’t you ask [insert question here]?” or “Why would you ask that?” I didn’t understand the psychology or reasoning behind the interviewer and their questions. This is why Amy Delouise’s “The Producer’s Playbook: Real People on Camera” got my attention.

Here is Amy interviewing a child for a project.

“The Producer’s Playbook: Real People on Camera” is a fast, engaging and self-reflective read that took me from the technical tips of interviewing to caring for the crew and talent to historical re-enactments to straight-to-camera talking with everyday real people. Within the first few pages I got the sense that Amy has been working in the trenches as a producer for many years. It’s not easy to find a producer that has the ability to draw the best out of people in interviews while still being strict enough to abide by a tight schedule and budget.

The introduction explains that Amy is well-versed in production and post, with over 400 productions under her belt. She is also currently a Lynda.com instructor, with courses like “Video Editing: Moving from Production to Post” and “The Art of Video Interviews.”

I originally started following Amy on Twitter at @brandbuzz when I saw a posting for the hashtag #GalsNGear, a movement to promote amazing women in the production and post production world. Check out their panel at this past NAB.

Digging In
“The Producer’s Playbook: Real People on Camera” is divided into 12 chapters followed by a few supplemental items, such as the Producer’s Pre-Production, Production and Post Production checklists. The 12 chapters contain 177 pages that can be read at a medium pace in a few hours. In this review I will touch on a few chapters that I particularly liked while not spoiling all of Amy’s book in one short review.

The chapter titles range from Getting To Know Your Subjects to Editing Workflow Strategies, all of which give a new perspective on editing, compiling and distilling information needed to tell a complete and well-rounded story.

Amy directing.

Up first is Chapter 9, Beyond the Soundbite. I really love this chapter because it helps an editor like me go beyond just asking the interviewee a question and dive into what motivates the story. Amy goes through the technical process of interviewing, including the beginning, middle and end of the interview process, but also adds important anecdotes, such as “When the End Isn’t Really the End.”

She reveals many of her personal techniques for helping people feel comfortable during an interview and for becoming their friend. One technique is to keep the camera rolling even after you are “done” interviewing; you may capture some extra interaction that furthers the story beyond what you had storyboarded in pre-production.

Amy also gives the advice to “encourage, guide and reveal” throughout your interview. I love this advice because it can be applied toward things like client/producer interactions in the edit bay as well as a formal on-camera interview. For instance, if you are creating a motion graphic and the person giving you direction can’t quite get across their idea the way you would like, it is your job as the artist to “encourage, guide and reveal” the true essence of the client’s idea. Of course, this is easier said than done, but a great mantra to repeat in your mind.

Another compelling chapter is Chapter 10: Challenging Interviews and Subjects. I personally identified with this chapter because of the care that interviewers like Amy must use when dealing with delicate subjects, like mothers of children who have died of SIDS, or other tragic and hard-to-discuss topics. Amy writes, “And after all, telling the stories of real people is our great privilege, and we can’t forget it.”

Oftentimes I find myself forgetting that the person on the other side of the lens is an actual human being; sometimes I get caught up in my job and treat the footage as just that, a piece of footage. It’s a great reminder to take a step back and remember that most of us got into production and post to tell stories and, hopefully, treat everyone with dignity and respect. It’s a lost art these days, and I hope everyone reading this review and Amy’s book can remind themselves of that fact. It is definitely a privilege to shape story, not a right.

In the end I really was inspired and motivated by reading Amy Delouise’s “The Producer’s Playbook: Real People on Camera.” Editors, production assistants, sound mixers, students, teachers, and anyone else involved in storytelling or wanting to learn the right way to conduct an on-camera interview can benefit from this book.

While on the surface this is a book that can give you insightful practical tips on interviews, such as lighting, camera movements on sliders, budgets and schedules (with full color pictures), I understood that Amy rose to such an esteemed level by caring deeply for the people in her storytelling craft and not forgetting the human elements that drive her story.

So while you should definitely take note of things like her Sample Mini Doc Budget on page 192 in the Appendix, there are also vital concepts here that aren’t always at the forefront of a producer’s or director’s mind.

You can purchase “The Producer’s Playbook: Real People on Camera” here, and you can get a 20% discount by entering the code FLR40.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Red Giant Trapcode Suite 13, Part 2

By Brady Betzel

In my recent Red Giant Trapcode Suite 13 for After Effects review, Part 1, I touched on updates to Particular, Shine, Lux and Starglow. In this installment, I am going to blaze through the remaining seven plug-ins that make up the Trapcode Suite: Form, Mir, Tao, 3D Stroke, Echospace, Sound Keys and Horizon. While Particular is the most well-known plug-in in the suite, the following seven are all incredibly useful and can help make you money.

Form 2.1
Trapcode Form 2.1 is best described as a particle system, much like Particular, but with particles that live forever and are used in forms like cubes. If you’ve used Element 3D by Video CoPilot you probably know that you can load objects from Maxon Cinema 4D into your Adobe After Effects projects pretty easily and, for all intents and purposes, quickly. Form allows you to load these 3D OBJ files and alter them inside of After Effects.

When you load the OBJ file, Form applies particles at each vertex. The more vertices you have in your 3D object, the more detail you will have in your Form. It is really a cool way to create a techy kind of look for a HUD (heads-up display) or a sweet motion graphics piece that needs that futuristic pointillism-type look. The original function of Form was to create particle grids that could be exploded or tightly wound and that would live on forever, as opposed to Particular, which creates particle systems with a birth and a death.

A simple way to think of how Form works is to imagine the ability to take simple text and transform it into “particles” to create a sandy explosion, or turn everyday objects into particles that live forever. From Grids to Strings and Spheres to Sprites, with enough practice you can create some of the most stunning backgrounds or motion graphics wizardry inside of Trapcode Form, all of which is affected by After Effects lights and cameras in 3D space.

I was really surprised at how smoothly Trapcode Form runs. I am running a tablet with an Intel i7 processor, and I was able to get very reasonable performance, even with my camera’s depth of field turned on.

Mir 2.0
Trapcode Mir is an extremely useful plug-in for those wanting to create futuristic terrains or modern triangulated environments with tunnels and valleys. Mir is versatile and can go from creating smooth ocean floors to spiky mountain tops to extreme wireframe structures. The newest updates in Mir 2.0 include:
– the ability to add a spiral to the Mir landscape mesh you create (think galaxy)
– seamless looping under the fractal menu
– the ability to choose between triangles and quads for your surfaces
– the really cool ability to add a second-pass wireframe on top of your surface for that futuristic grid look
– texture sampling from smooth gradients to solid colors
– control of the maximums and minimums under z-range (basically allowing for easier peaks and valleys)
– multi, smoothridge, multi-smoothridge and regular fractals for differing displacements on your textures
– improved VRAM management for speedy processing

These days GIFs are all the rage, so I am really impressed with the seamless loop option. It might seem ridiculous but if you’ve seen what is popular on social media you will know it’s emojis and GIFs. If you want to prep your seamless loop, check out this quick video from Trapcode creator Peder Norrby (@trapcode_lab).

Simply put: create your beginning and end keyframes, find the seamless loop options under the Fractal category, step back one frame from your end loop point and mark the end of your work area there, then go to the loop point (one frame past the end of your work area) and click Set End Keyframe. From there Trapcode Mir will fill in the rest of the details and create a seamless loop, ready to be exported as a GIF and blasted on Twitter. It’s really that easy.

If you are looking for an animated GIF export setting, try exporting through Adobe Media Encoder and searching “GIF” in the presets. You will find an “Animated GIF” preset, which I resized to something more appropriate like 1280×720 but that still came out at 49MB — way over the 5MB Twitter upload limit. I tried a few times, first with 50% quality at 640×360, which got me to 13.7MB. I even changed the quality down to 5% in Media Encoder, but I kept getting 13.7MB until I brought the size down to 320×180. That got me just under 4MB, which is perfect! If you do a lot of GIF work, an easy way to compress them is to use http://ezgif.com/optimize and to fiddle with their optimization settings to get under 5MB. It’s quick and it all lives online.

As with all Trapcode Suite plug-ins (or anything, for that matter), the only way to get good is to experiment and allow yourself to fail or succeed. This holds true for Mir. I was making garbage one minute, and with a couple of changes the next, I created motion graphics that made me see the potential of the plug-in and how I could actually make content that people would be blown away by.

3D Stroke
One plug-in that isn’t new, but leads into the next one, is Trapcode 3D Stroke. 3D Stroke takes the built-in After Effects Stroke plug-in to a new level. Traditional Stroke is an 8-bit plug-in, while Trapcode 3D Stroke can run in the color-burning 32-bits-per-channel mode. If you want to add a stroke along a path that interacts with your comp cameras in 3D space, Trapcode 3D Stroke is what you want. From creating masks of your text and applying a sweet 3D Stroke to them, to intricate 3D paths that zoom in between objects with an HDR-like glow, 3D Stroke is one of those tools to have in your After Effects toolbox.

When using it I really fell in love with the repeater. Much like Element 3D’s particle arrays, the repeater generates multiple instances of your paths or text paths to create some interesting and infinitely adjustable objects.

Tao
Trapcode Tao is new to the Trapcode Suite of plug-ins. Tao gives us the ability to create 3D geometry along a path, and boy did people immediately fall in love with this tool when it was released. You can find tons of examples and tutorials of Tao from experts like VinhSon Nguyen, better known as @CreativeDojo on Twitter. Check out his tutorial on Vimeo, too. Tao is a tricky beast, and one way I learned about it in-depth was to download Peder Norrby’s project files over at http://www.trapcode.com and dissect them as best I could.

If you remember Trapcode 3D Stroke from earlier, you know that it allows us to create awesome glows and strokes along paths in 3D space. Trapcode Tao operates in much the same way as 3D Stroke except that it uses particles like Mir to create organic flowing forms in 3D space that interact with After Effects’ cameras and lights.

Trapcode Tao is about as close as you can get to modeling 3D geometry inside of After Effects at realtime speeds with image-based lighting. The only other way to achieve this is with Video CoPilot’s Element 3D or by using Cinema 4D via Cineware, which is sometimes a painstaking process.

Horizon 1.1
Another product that surprised me was Trapcode Horizon 1.1. In the age of virtual reality and 360 video, you can never have too many ways to make your own worlds to pan cameras around in. With a quick spherical map search on Google, I found all the equirectangular maps I could handle. Once inside of After Effects, you import and resize your map to your comp size, add a new solid and camera, throw Horizon on top of your solid, then under Image Map > Layer choose the layer containing your spherical image, and BAM! You have a 360 world. You can then add elements like Trapcode Particular, 3D Stroke or Tao and pan and zoom around to make some pretty great opening titles, or even make your own B-roll!

Echospace 1.1
Trapcode Echospace 1.1 is a powerful part of the Trapcode Suite 13 plug-in library. It is one of those plug-ins where you watch the tutorials and wonder why people don’t talk about it more. In simple terms, Echospace replicates layers and creates interdependent parenting links back to the original layer, allowing you to create complex repeated-element animations and layouts. In essence, it feels more like a complex script than a plug-in.

If you want to create some offset animation of multiple shape layers in three-dimensional space, Echospace is your tool. It’s a little hard to use, and if you don’t Shy the replicated layers and nulls it can be intimidating. When you create the repeated layers, Echospace automatically sets them to Shy, as long as you enable Shy layers in your toolbar. A great Harry Frank (@graymachine) tutorial/Red Giant TV Live episode can be found on the Red Giant website: http://www.redgiant.com/tutorial/red-giant-tv-live-episode-8-motion-graphics-with-trapcode-echospace.

Sound Keys 1.3
The last plug-in in the massive Trapcode Suite v13 library is Sound Keys 1.3. Sound Keys analyzes audio files and can draw keyframes based on their rhythm. One reason I left this until the end of my review is that you can attach any of the parameters from the other Trapcode Suite 13 plug-ins to the outputs of the Sound Keys 1.3 keyframes via a pick whip. If I just lost you by saying pick whip, snap back into it.

If you learn one thing in the After Effects expressions world, it’s that you can attach one parameter to another by alt+clicking (option+clicking on a Mac) the stopwatch of the parameter you want to be driven, then dragging the curly-looking pick whip icon over the driving parameter. So in the Sound Keys case, you can attach the scale of an object to the rhythm of a bass drum.
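
To make the pick whip concrete, here is a minimal sketch of the kind of expression it writes for you, in this case driving a layer’s Scale from a Sound Keys output. The layer name “Audio,” the effect instance name “Sound Keys” and the output name “Output 1” are assumptions; match them to whatever your project actually uses.

    // Expression on a 2D layer's Scale property (alt/option-click the stopwatch first).
    // "Audio" is the layer holding the Sound Keys effect -- an assumed name.
    amp = thisComp.layer("Audio").effect("Sound Keys")("Output 1");
    [100 + amp, 100 + amp]  // resting scale of 100%, pumped up by the audio keyframes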

What I really liked about Sound Keys is that not only can it drive a dynamic piece of motion graphics, but you can also use the audio meters it draws to visualize the audio. You see this a lot in lyric music videos or YouTube videos that play music only but still want a touch of visual flair, and with Sound Keys 1.3 you can change the visual representation of the audio, including color, quantization (the little dots you see on audio meters) and size.

Easily isolate an audio frequency with the onscreen controls, find the effect you want driven by the audio, and pick whip your way to a dynamic motion graphic. If I were the graphic designer I wish I was, I would take Sound Keys and something like Particular or Tao and create some stunning work. I bet I could even make some money making lyric videos… one day.

Summing Up
In the end, the Trapcode Suite v13 is an epic and monumental release. The total cost as a package is $999, and while it is a significantly higher cost than After Effects, let me tell you: it has the ability to make you way more money with some time and effort. Even with just an hour or so a day I feel like my Trapcode game would go to the next level.

For those that have the Trapcode Suite and want to upgrade for $199, there are some huge benefits to the v13 update including Trapcode Tao, GPU performance upgrades across the board, and even things like the second pass wireframe for Mir.

If you are a student, you can grab Trapcode Suite 13 for $499 with a little verification legwork. If you are worried about your system running the Trapcode Suite efficiently, you can check the technical requirements here. I was working on an Intel i7 tablet with 8GB of memory and an Intel Iris 6100 graphics processor, and I found everything to be very speedy given those limitations. Tao was the only plug-in that wouldn’t display correctly, which makes sense once you read the GPU requirements here.

If I were you and had a cool $999 burning a hole in my After Effects wallet, I would pick up Trapcode Suite 13 immediately.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Red Giant’s Trapcode Suite 13, Part 1

By Brady Betzel

Have you ever watched a commercial on YouTube and wondered how in the world these companies have the budget for the VFX and motion graphics work featured? Well, many don’t, but they do have access to talented artists armed with affordable tools that deliver pricey-looking results. Most motion graphics creators have a toolbox full of goodies that help them build great-looking products. Whether it’s preset transitions, graphic overlays or plug-ins, there are ways to incorporate high production value without the million-dollar price tag.

One of those tools that many Adobe After Effects motion graphics artists have in their toolbox is Red Giant’s Trapcode Suite, which is currently in version 13. While it isn’t cheap, if you are focused on that style of motion graphics, it can definitely pay for itself after just a few jobs. Inside the suite are magical plug-ins like the famous Trapcode Particular, Trapcode Form, Trapcode Mir, Trapcode Tao, Trapcode Shine, Trapcode Lux, Trapcode 3D Stroke, Trapcode Echospace, Trapcode Starglow, Trapcode Sound Keys and Trapcode Horizon. Holy cow, that is a lot.

The complete Trapcode Suite 13 works with After Effects (CS6 through CC 2015 officially, including the latest 2015.3 update; just make sure to download the update installer from Red Giant, since it might not appear in your Red Giant Link updater), and a few of the plug-ins (Shine, 3D Stroke and Starglow) will also work in Adobe Premiere (same version compatibility as After Effects). A good resource to get your feet wet is the Red Giant tutorial page, where you can find a lot of info and in-depth tutorials from the likes of the master Harry Frank (@graymachine) and Chad Perkins (@chad_perkins).

That being said, if you have no idea what the Trapcode Suite entails, buckle up. It is one of the most useful but intricate plug-in collections you will see, with a $999 price tag to match ($199 if you are upgrading). Of course, you can pick and choose the products you want, such as Shine for $99 or even Particular for $399, but the entire suite is worth the investment.

As an editor, I spend the majority of my time inside a nonlinear editor like Adobe Premiere or Avid Media Composer/Symphony — probably 80 percent if I had to estimate; the other 20 percent is divided between color correction solutions and VFX/graphics packages like After Effects, Blackmagic Resolve and others. Because I don’t get a lot of time to play around creatively, I really need to know the suite I am working in and be as efficient as possible. For instance, products like Mocha Pro, Keylight in After Effects and Red Giant’s Trapcode Suite 13 are enhancers that help me be as efficient as I can be as an editor without sacrificing quality for time.

In the latest Trapcode Suite 13 update, Trapcode Particular 2.5 seems to have been updated the most while Trapcode Tao is a new addition to the suite, and the rest were given modest enhancements as well. I will try to touch on each of the products so this will be a two-part review.

Particular
Trapcode Particular is one of the plug-ins that most After Effects nerds/aficionados/experts have encountered. If you have been a little wary of and intimidated by Particular because of its complexity, now is the time to dive into Red Giant’s incredible particle-building system. In the 2.5 update, Red Giant added the Effects Builder, which seems to resemble the Magic Bullet Looks builder a little, and I love that. Like I said earlier, I don’t typically have eight hours to creatively throw darts at a particle system in hopes of creating a solar system fly-through.

Luckily, the new Effects Builder allows you to easily create your particle system and be emitting (or exploding) in minutes. While it isn’t “easy,” per se, to create a particle system like those featured on Trapcode creator Peder Norrby’s (@trapcode_lab) website, the Effects Builder, along with some tutorial watching (mixed with some patience and love), will send you down a Trapcode rabbit hole that will allow you to create some of the most stunning artwork I’ve seen created in After Effects. Don’t give up if you find it overwhelming, because this is one of those plug-ins that will make you money if you can grasp it. One thing I did notice was that the Effects Builder interface was tiny and did not scale with the resolution I was using on my system (2560×1440), while After Effects itself appeared fine.

If you are an experienced user of Trapcode Particular you should be happier with the updated graphing system that lets you set size and opacity over the life of your particle by directly drawing points on your graph, smoothing, deleting and even randomizing. I really loved using this graph. I immediately saw results that mimic using color curves against an RGB Parade and Waveform on a color scope. Particular has also bumped its particle count up from 20 to 30 million, which will matter to someone creating fireworks back plates for the Fourth of July, I’m sure.

Shine
Second on my Trapcode Suite 13 hit list is Trapcode Shine, which might not be the most obviously glamorous update to many people, but it still has its merits. The largest update is the ability to easily attach Shine to After Effects light sources. Before, you would have had to do some fancy footwork that most editors don’t have the time or interest in doing; now, as long as your light is named “Shine,” with proper spelling and capitalization, it controls the light rays produced by the Shine plug-in.

One thing that most After Effects users know to be a staple is the use of Fractal Noise. Whether you are trying to replicate light rays with realistic, organic movement or building a fancy text reveal where a Fractal Noise mask is your transition, Fractal Noise is a must-use effect. Trapcode Shine now has Fractal Noise built into the plug-in, including 3D fractal noise that creates a type of parallax within your light ray work. Simply put, parallax is the way the foreground moves in relation to the background. Think of a camera on a slider: as it moves from left to right, your foreground stays in relatively the same position while the background moves much more — this is your parallax.

One thing you will always do when applying Fractal Noise is animate the Evolution to add realism. Plus, adding “*time” to the expression to multiply the evolution factor is an easy way to keep the fractal noise moving along its path. Shine has an “Evolution Speed” setting under the Fractal Noise heading that lets you adjust the evolution without any expressions at all (I love this!). Being able to quickly add fractal noise to your light rays really improves my efficiency when a client asks for “that fancy text with those light rays poking through,” but wants to pay exactly zero dollars and zero cents.
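
As a quick illustration of that “*time” trick, this is the sort of one-liner you would drop on Fractal Noise’s Evolution property by alt/option-clicking its stopwatch (the multiplier of 200 is just an assumed value; higher numbers evolve the noise faster):

    // Evolution expression: spin the evolution angle continuously over time.
    // 200 degrees per second is an assumption -- tune it to taste.
    time * 200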

Lux and Starglow
Trapcode Lux and Starglow are two other light-focused plug-ins that can add that subtle or dramatic detail to your work, setting you apart from the rest of the general motion graphics population. Lux is a fast and easy way to add volumetric drama to point and spot lights. Much like the other plug-ins, you apply Lux to a new solid, adjust the specific parameters for the spot or point lights in your composition and, my favorite part, tell Lux whether it should affect any light or only lights named “Lux,” “Front” or “Back.”

Simply put, instead of just seeing the light emanating from an After Effects light source, you will now see the physical light source itself when Lux is added. Lux really shows its power when you need to add a light source to something like the afterburner on a jet or the tip of a comet-like fireball. Adding physical light points so easily really opened up my way of thinking. It’s a relatively small feature, but it’s like knowing how to do something while also knowing it would take four hours to accomplish, so because of diminishing returns you just move along. Now I can do that same thing in little to no time and add that finishing touch easily. This makes me more money and makes the client more confident.

Trapcode Starglow is a small-yet-powerful plug-in that gives a life-like glow to bright objects. Think of the star or cross-hatch streaks that can appear on stars or street lights in TV shows and movies. Presets are included throughout the Trapcode Suite, and Starglow is no different, with 49 of them, each containing different ray lengths, colors, ray directions and more — all of which are the starting points I like to use when figuring out just what type of Starglow I want to go with.

So far, I’ve covered four of the 11 plug-ins contained in the Trapcode Suite 13, all of which are amazing and full of ideas that will undoubtedly elevate your work to a higher level. Something I have noticed over the last few years is a lot of amazing work coming from After Effects users; most of it, though, has the scent of a preset and/or a tutorial that someone watched, duplicated and exported. One tip that will help you step past that ordinary look is to double- and triple-stack effects (in particular the same effect) to add varying levels of depth, color and detail that you couldn’t get with just one instance of a plug-in.

In Part 2 of my Red Giant Trapcode Suite 13 Review, I will tackle the rest of this behemoth plug-in set: Trapcode Form, Trapcode Mir, Trapcode Tao, Trapcode 3D Stroke, Trapcode Echospace, Trapcode Sound Keys, and Trapcode Horizon.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Promise Technology’s Pegasus2 R2+ RAID

By Brady Betzel

Every day I see dozens of different hard drives — from serious RAIDs, like the Avid Nexis (formerly Isis), all the way down to a single SSD over Thunderbolt. My favorite drives seem to be the ones that connect easily, don’t have huge power supply bricks and offer RAID options such as RAID-0/RAID-1. If you’ve been to an Apple Store lately, then you’ve probably run into Promise Technology’s Pegasus2 R2+ line of products. The Pegasus2 R2+ is also featured under the storage tab on www.apple.com. I bring that up because if your product is on that page, you are a serious contender.

The Pegasus line of products from Promise is often thought of as high-end and high-quality. I’ve never heard anyone say anything bad about Pegasus. From their eight-bay R8 systems all the way down to the R2, there are options to satisfy any hardware-based RAID need you have, whether RAID-0, 1, 5 or 6. Lucky for me, I was sent the R2+ to review. I was immediately happy that it was a hardware-controlled RAID as opposed to a software-controlled RAID.

Software RAIDs rely 100 percent on the host system to control the data structure, but I like my RAID to control itself. The Pegasus2 R2+ is a two-drive, hot-swappable RAID loaded with two 7200RPM, 3TB Toshiba hard drives. In addition, there is a third bay — the Media Bay — on the top that can be loaded with different pods. You can choose from an SSD reader, a CF/SD card reader or even an additional 1TB hard drive; it ships with the CF/SD card reader. Keep in mind that these pods only work when connected via Thunderbolt 2 — under USB 3.0 they will not work. Something cool: when you pop out the interchangeable pods, they can connect via USB 3.0 separately from the RAID case.

In terms of looks, the Pegasus2 R2+ has a nice black finish, which will go well with any recent Mac Pros you might have lying around. It has a medium-to-small footprint — picture two medium-sized books stacked on top of each other (5.3 x 7.3 x 9.8 inches). It weighs about 13.5 pounds and while I did stuff it in my backpack and carry it around, you know it’s in there. The power cord is nice. I detest the power bricks that typically accompany RAID drives, laptops and anything that sucks a good amount of power. To my delight, Promise has incorporated the actual power supply inside of the RAID, leaving a simple power cable to attach. Thank You! Other than that you have either a USB 3.0 cable or a Thunderbolt 2 cable included in the box.

Running Tests
Out of the box, I plugged in the RAID and it spun to life. For this review, I found a Mac Pro running a 2.7GHz 12-core Xeon E5 with 64GB of DDR3 and an AMD FirePro D700 graphics card, so there should be very little bogging down the transfer pipes when running my tests. I decided to use the AJA System Test for disk-speed testing. I started with the drive in RAID-0 (optimized for speed, with both drives striped together and no safety net) because that is how it ships.

Over Thunderbolt 2, I got around 390MB/sec read and 370MB/sec write speeds. Over USB 3.0, still configured in RAID-0, it ran at about 386MB/sec read/write. When I switched the RAID over to RAID-1 (made for safety, so if one drive is damaged you will most likely be able to rebuild your data when you replace the damaged drive), I definitely saw the expected slowdown. Over Thunderbolt 2 and USB 3.0, I was getting around 180MB/sec write and 196MB/sec read. Don’t forget, the 6TB of capacity you get in RAID-0 becomes 3TB when configured in RAID-1.

On the front of the R2+ you have two lights that let you know the drive is plugged in via Thunderbolt 2 or USB 3.0. This actually came in handy, as I was looking to see how I plugged the drive in. Cool!

One thing I was very happy with was how simple the Promise Technology RAID configuration tool is to use. Not only does it give you stats like drive temperature, drive health and even fan speed, it also lets you format the drives and designate RAID configurations. This alone would make me think of Promise first when deciding on a RAID to buy.

As a final test, I left my Pegasus2 R2+ configured in RAID-1 and pulled a drive out while transferring media to the RAID. The status light on the front changed from a bright blue to amber and began to blink. Inside the Pegasus2 RAID configuration tool, an amber exclamation point appeared next to the RAID status, as expected. I left the drive alone so it could rebuild itself. Two hours later it was still running, so I left it alone overnight. I didn’t accurately time the rebuild, but by the time I came home the next night it was complete. I only had a few hundred gigabytes worth of data on it, but in the end it came back to life. Hooray!

General Thoughts
In the end, I really love the sleek black exterior, the lack of a huge power brick and the RAID configuration software. The additional Media Pods are a cool idea, too. I like having a Thunderbolt 2 CF/SD card reader (or better yet an SSD reader — think Red Mag) always ready to go, especially with the black-cylinder Mac Pro, which has no card readers built in.

I would really love to have seen what this could do when loaded with SSDs, but since this review is about what comes with the Pegasus2 R2+, that’s what I tested.

Promise Technology has been around a long time and is known for offering very reliable storage solutions. Keep in mind that the R2+ ships with the CF/SD card reader, but the other pods can be purchased separately, although I couldn’t find anyone selling them online. When I was writing this review, I saw the retail price of the Pegasus2 R2+ range from $749 to a little over $800. You get a two-year limited warranty, which covers all parts except the fan and power supply; those are only covered for one year (kind of a bummer). When returning the product for warranty work, you can opt to be sent a loaner, but a credit card is required in case you don’t return it (in which case you will be charged the retail price of the loaner). You can also opt to send yours in and wait for it to be replaced. Take note that you need a copy of the original receipt and the boxes for a return.

Summing Up
I really love the stability and elegance of the Pegasus line of RAID systems, and the Pegasus2 R2+ lives up to the beauty and the name. If you are a small company or a one-person band transferring, transcoding and editing media without the need for SSD speeds or a Thunderbolt 3 connection, this is the sleek RAID for you.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Brady was recently nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Divergent Media’s EditReady 1.4 and ScopeBox 3.5

Affordable transcoding and monitoring solutions

By Brady Betzel

It’s been almost two years since I first reviewed Divergent Media’s video transcoder EditReady, version 1.0.2… and I was thoroughly impressed with its speed and ease of use. The only thing that left me wanting more was the fact that it was a Mac-only product.

While Divergent still hasn’t made Windows versions of their apps, I did recently see a tweet from EditReady, ScopeBox and ClipWrap developer Mike Woodworth (@vexed) in early May that made me think it might be on their radar. The tweet said, “Every time a hot, new GPU comes out, I get pushed a tiny bit more toward building a PC.” Because I was so excited at the possibility of it coming out on PC, we reached out to Woodworth, who said this: “As a small company, to date we’ve been focused on making high-quality Mac software. We do occasionally get requests to port ScopeBox to Windows, but we don’t have active plans to do so. Most users build out a standalone system to run ScopeBox, and we’ve worked hard to maximize our performance on entry level Mac hardware such as the Mac Mini. We are taking a close look at demand for Windows and if that begins to grow in a substantive way, we’ll shift our roadmap.” Oh, well (sigh).

For this review — conducted on my old-ish MacBook Pro — I am covering two of Divergent Media’s latest releases: the Mac-based EditReady 1.4 and ScopeBox 3.5. EditReady is a video transcoder and ScopeBox is a software video scope solution. To this day, EditReady is the fastest media encoder I have ever used on a Mac. And as far as ScopeBox is concerned, I’ve been looking to try it out for a while, and now is the perfect time since EditReady works with ScopeBox via ScopeLink. You can see the technical results of any compression or LUT you are applying in EditReady through ScopeLink.

Here is a quick tip to get ScopeBox talking to EditReady: in EditReady you need to be previewing your file by hitting Command + 3 or going to the Clip menu and clicking Open Preview. Playing in the EditReady window will not transmit the signal to ScopeBox.

EditReady
In this latest update to EditReady (v1.4), we get the ability to run our video through ScopeBox via ScopeLink. ScopeLink, which has been around for a bit, allows ScopeBox to process video through apps like Apple FCP X, Adobe’s Premiere Pro, SpeedGrade and After Effects, and now EditReady. What’s cool about this is that if you need to do a quick quality control check of your video, looking for illegal color values, and don’t have time or access to a hardware scope like a Tektronix, ScopeBox will work quickly and easily with EditReady on the same computer.

So now, in addition to using the video scopes, you can batch convert a bunch of clips to an intraframe editing-friendly codec like DNxHR, burn in a LUT and preview it through ScopeBox to see where it hits on your waveform or RGB parade. Keep in mind these are two separate software apps and both need to be purchased for this to work.

If you were in post production about five years ago, you were probably all about the app ClipWrap, especially when it came to rewrapping NLE-unfriendly formats like AVCHD into QuickTime. EditReady has adopted the ClipWrap rewrapping functionality as well as transcoding. As an added bonus, EditReady automatically joins spanned files like GoPro, AVCHD, MXF (camera MXF, not Op1a) and HDV.

Something I really love in EditReady is the ability to take high-frame-rate media and conform it to the frame rate you want to edit in. It won’t add or remove frames, but it will adjust the speed accordingly — for example, 59.94fps footage conformed to 23.976fps plays back at 23.976/59.94, or 40 percent of its original speed. In the past this was a bit of a hassle to get working right, but now it’s easy with EditReady. Metadata is another strong suit of EditReady. When using clips from cameras like the GoPro, some NLEs won’t properly read the timecode track. Within the metadata browser in EditReady you can assign timecode to each file. This really helps when making proxy files to be used in an offline/online workflow.

If you want some technical speed-test results, check out my previous review of EditReady — the same speeds are still present. I transcoded a 6:30 ProRes (standard) QuickTime to ProRes Proxy in just under realtime. While this might not seem like much, I’m running a MacBook Pro from 2009 with 8GB of RAM, the last model to ship with an Nvidia card, so that’s a great speed for this system. One day I’ll get that new MacBook Pro. Hint, hint, Apple. Just kidding… kind of.

ScopeBox
If you’ve worked in color correction and have learned and used hardware color scopes you quickly realize how important they are. Unfortunately, scopes like those by Tektronix are not cheap. So what do you do? You could rely on the scopes inside your DaVinci Resolve, Premiere Pro or Avid Media Composer/Symphony, but those can get bogged down quickly. Another solution is to take that old Mac Mini, MacBook or tower and get your signal from your editing/coloring system into the ScopeBox system. This might require not only ScopeBox but also a Thunderbolt capture device like Blackmagic’s UltraStudio Mini https://www.blackmagicdesign.com/products/ultrastudiothunderbolt — do your research though because there may be some chroma subsampling issues like not getting a signal higher than 4:2:2. But I digress.

In short, you can use an old system as your scope — it’s really a great way to run scopes without bogging down your current system. Of course, you can also use the same system you edit/color on if you use apps like FCP X, Premiere Pro or After Effects. Basically, if the app you use outputs via QuickTime or Adobe Transmit, ScopeBox should work.

ScopeBox contains many of the usual meters, such as RGB Parade, YUV Parade, Waveform, Luma Histogram, RGB Histogram and Vectorscope, plus some bonuses like a timecode display and even the HML Balance palette, which displays three distinct vectorscopes focusing on highs, mids and lows. It also contains audio and surround meters for audio reference. One measurement tool that ScopeBox does not contain, and that many people (including myself) seem to love, is the Double Diamond gamut display, which is proprietary to Tektronix, but those scopes also cost a pretty penny.

The Vectorscope does have the ability to zoom, but only to 1.875x and 2x strength. In addition, you can change the graticule style of the vectorscope to Hue Vectors (a style created and popularized by Alexis Van Hurkman, @hurkman on Twitter). In terms of quality control, there are many settings you can configure when running video through ScopeBox, like Audio Peak and Chroma Excursion; you can even export an FCPX XML of any QC (quality control) flags to add locators to a sequence — pretty awesome!

Not only is ScopeBox literally a scope, but it can also capture live video and encode it using codecs like ProRes and DNxHD for later viewing. If you are on set running your picture through ScopeBox, you can enable some great functions like Luma, Focus and Chroma Zebra striping, giving you an idea of what is overexposed, underexposed or even out of focus.

Summing Up
Practically speaking, ScopeBox worked great, even on my old MacBook Pro. I used it with both EditReady and Premiere Pro without problems.

Keep in mind that ScopeBox is a high-level application that is running in parallel with your other high-end applications such as Premiere Pro, so having the best system you can, with tons of memory and a high end graphics card, will be your best bet to run this setup successfully.

In the end, ScopeBox is a great app that many colorists use, and now that it works cleanly with EditReady you have a great combo of transcoding and monitoring solutions for under $200. Whether you run ScopeBox straight out of Adobe Premiere Pro on the same computer or go outboard to another system, you can color correct and grade with the same confidence as with high-priced hardware scopes.

I have loved EditReady since the day it came out — if only it would find its way over to the Windows dark side, I would truly be content. And with Apple ending QuickTime support for Windows, maybe now is Divergent Media’s time to strike. It is consistently the fastest and simplest way to batch transcode GoPro media, MXF, AVCHD, M2T and any other QuickTime MOV files you have.

Separately, EditReady costs $49.99 and ScopeBox costs $99.99. Together you can buy them both for $119.99 — a true steal for the functionality deep inside these apps.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Brady was recently nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Top 5: Efficiency tips for your health and editing environment

By Brady Betzel

Sometimes in the edit bay, I find myself feeling sluggish because I haven’t moved from my chair in four or eight or more hours. Usually, I can fix this by working out for a half hour before I leave for work, and I try to get in some kettlebell swings and battle rope maneuvers along with bodyweight stuff like push-ups and pull-ups.

With two kids, I sometimes feel guilt about not being home every second I can, and this very often leads me astray and causes me to forget to do a few little things to keep my mind right.

With that in mind, I’d like to offer up my top five tips for enhancing efficiency when you’re stuck in a chair all day.

1. Move Around
The number one thing an editor can do to invite laziness and stagnation is to simply not move. Sitting in your chair all day — drinking coffee and not water — staring at pixels for 12 hours will not get that mind in gear to edit creatively. If possible, take a five-minute walk around the block. If that’s not possible, do push-ups — you have the equipment with you at all times. I try to hit my age as a goal; for example, I will try to do 33 push-ups within an hour. Even doing this once a day will dramatically help you out.

If you are searching for some exercise tips I suggest checking out www.onnit.com/academy, specifically https://www.onnit.com/academy/training/bodyweight, which focuses on bodyweight exercises. It’s free and is updated regularly with fun and unique workouts.

2. Meditate, Pray, Zone Out… Whatever
Give yourself five minutes of peace and quiet. No podcasts, no Pantera, no Taylor Swift — just sit in a quiet room with all of your monitors powered off, if possible, and clear your head. Sometimes, if I can’t stop my mind from working, I will try to focus on little things like breathing at a consistent pace or how I can be nicer to people and myself.

3. Drink Good Coffee AND Lots of Water
If you believe in drinking coffee like I do, find yourself a good batch of coffee and brew it in something nice like a French press or an AeroPress. My number one rule when downing espresso and coffee is to not forget to drink tons of water too, otherwise I will get angry and dehydrated. This is one I constantly have to remember.

4. Keep Your Area Clean
I find that editors come in two forms: messy and obsessive-compulsive. I know it’s hard to always be tidy, but who wants to see a messy edit bay or desk in the office? It makes my skin crawl when I have to wade through other people’s junk just to get to my Wacom pen or a fader on the mixer. I know that when my area is clean my mind is usually more focused.

5. Force Yourself to Be Pleasant
I often find myself in a dark room, plugging away at keyframes and bezier curves, and forgetting to smile. It’s crazy what can happen if you force yourself to smile, and it is contagious. Try it out; even if people laugh at you and ask what’s up with the happy face, you just made them think twice about being happy. It will really make your day better.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Brady was recently nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: HP’s zBook 17 G3 mobile workstation

By Brady Betzel

Desktop workstations have long been considered the highest of the high end and the fastest of the fast. From the Windows-driven HP Z820 powerhouse to Apple’s ubiquitous Mac Pro,  multimedia pros, video editors, VFX editors, sound engineers and others are constantly looking for ways to speed up their workflow.

Whether you feel that OS X is more stable than Windows 10, or you love the ability to use Nvidia’s Quadro line of graphics cards, one thing that pros need is a reliable system that can process monster DNxHR, ProRes 4444, even DPX files, and crunch them down to YouTube-sized gems and Twitter-sized GIFs in as little time as possible.

What if you need the ability to render a 4K composition in Adobe After Effects while simultaneously editing in Adobe Premiere on an airplane or train? You have a few options: Dell makes some pretty high-end mobile workstations, and Apple makes an outdated MacBook Pro that might hold up. What other options are there? Well, how about HP’s latest line — the HP zBook Generation 3? I’m focusing on the 17-inch for this review.

One of the fringe benefits of buying a workstation targeted at post pros is that it has been tested with apps like Adobe’s Creative Cloud, Avid Media Composer and Autodesk’s suite of apps — better known as ISV certification (ISV = independent software vendor). HP and select software vendors spend tons of time making sure the apps most likely to be used by high-end zBook users are strenuously tested. Most of the time this means increased efficiency.

For example, if you want CUDA-enabled renders, you can choose a graphics solution like the Nvidia Quadro M5000M with 8GB of RAM and 1,536 CUDA cores instead of the AMD FirePro W6150M with 4GB of RAM, a choice you get because HP spent time testing the highest-end graphics cards that could be placed in this system.

Here is a rundown of the specs in the zBook G3 I tested:
– Processor: Intel Xeon CPU E3-1535M v5 — four cores, eight threads, 2.9 GHz
– Memory: 32GB DDR4, 2133MHz
– NVMe SSD drive: NVMe Samsung MZVPV512 – 512GB
– Graphics card 1: Intel HD Graphics P530, 1GB
– Graphics card 2: Nvidia Quadro M5000M, 8GB
– Screen: 17.3-inch diagonal FHD UWVA IPS anti-glare LED-backlit (1920×1080)
– Audio: Bang & Olufsen HD audio
– Built-In Battery: HP Long Life 6-cell 96 WHr Li-ion prismatic
– External Ports: four USB 3, Gigabit RJ-45, SD media, smart card reader, microphone/headphone port, two Thunderbolt 3, HDMI, VGA, power and security cable slot.
– Full-size spill resistant keyboard with numeric keypad
– Operating system: Windows 10
– Warranty: 3/3/3 – three years parts, labor and on-site (limited restrictions apply)

What Do I Really Think?
Some initial takeaways after using the zBook G3: it features very sturdy construction, it offers lightning-quick speed and connections, and it has amazing battery life considering the power the zBook G3 harnesses. Obviously, the battery drains faster when really pushing the zBook G3 with power-hungry apps such as Maxon’s Cinema 4D or Adobe’s After Effects, Premiere or Media Encoder, but the now built-in battery is the longest lasting I have experienced in a mobile workhorse.

I recently took this mobile workstation to San Francisco for the GoPro Developer Program announcement, and it lasted all day. Lasting all day is nice because the power supply is not small, and it is not light. I wish I had left it at home, but I was scared I would run out of battery power. When talking with the HP crew during this review process, they stressed how much they had improved the battery life even though the machine’s speed and power were increased, and they were not lying. Like I said, apps like Adobe Media Encoder will drain your battery faster, but I could still get two to three hours while transcoding in Media Encoder, which is pretty great.

Stress Test
With powerful workstations like the HP zBook G3, I like to run Cinebench, a render and speed stress test made by Maxon and a standard benchmark in many reviews. I had some interesting results: in the OpenGL test it placed fifth, bested by desktop graphics cards like the AMD Radeon HD 5770, Nvidia GTX 460 and Nvidia Quadro 4000, as well as the mobile Nvidia Quadro K4000M. The Intel Xeon CPU E3-1535M v5 also tested fifth, topped by three Intel i7s and one Xeon — all desktop processors. Surprisingly, when tested for single-core CPU performance it ranked second, topped only by the Intel i7-4770K.

Practical Test
As an editor with a lot of experience in the prep and delivery of footage and final products, when I hear workstation I think of an encoding and transcoding beast. A typical task in my daily work is to transcode hour-long episodic QuickTimes from codecs like ProRes or DNxHD to something like an H.264 or an MP4. My first test was to compress a two-hour DNxHD 175 QuickTime to the YouTube 1080p setting in Adobe’s Media Encoder, which is a 1920×1080, 16Mbps MP4 — decent quality balanced with a low file size. It took 80 minutes (about 2/3 realtime), which is pretty good considering I’m working on a mobile workstation. On a high-end desktop workstation like the Mac Pro or z840, I might get that down to about 1/4 realtime, or 30-40 minutes.

My next test was to transcode a 44-minute DNxHD QuickTime to the YouTube 1080p setting in Adobe’s Media Encoder. This file took 33 minutes to transcode, roughly ¾ of realtime. I also tried compressing a 50-minute ProRes HQ QuickTime to the YouTube 1080p MP4 setting, and it took around 40 minutes. So all in all, you are getting a little faster than realtime; if you need it to be faster you should probably be compressing on a desktop workstation.
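If the realtime math reads backwards at first, here is the arithmetic I’m using, as a quick Python sketch. The durations are just the ones from my tests above; a ratio under 1.0 means the encode finished faster than the clip plays.

```python
# Illustrative only: the realtime ratios quoted above, computed from the
# test durations mentioned in this review.

def realtime_ratio(source_minutes, encode_minutes):
    """How long the encode took relative to the clip's running time (1.0 = realtime)."""
    return encode_minutes / source_minutes

tests = {
    "2-hour DNxHD 175 -> YouTube 1080p": (120, 80),
    "44-min DNxHD -> YouTube 1080p": (44, 33),
    "50-min ProRes HQ -> YouTube 1080p": (50, 40),
}

for name, (src_min, enc_min) in tests.items():
    ratio = realtime_ratio(src_min, enc_min)
    # e.g. 0.67 means the encode finished in about two-thirds of the clip's length
    print(f"{name}: {enc_min} min for {src_min} min of media = {ratio:.2f}x realtime")
```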

Other Observations
I really came to appreciate the large IPS screen, which is very bright and very clear. One thing I notice as I get older is that I need larger screens (yuck, I think I just fainted… definitely getting old). On mobile workstations it’s hard to get a large screen that is also easy to view for multiple hours, but this HP matte screen is great.

Another thing I really like is the branded speakers. Most laptops have half-decent speakers at best, but the zBook comes with Bang & Olufsen speakers that sound far better than other laptop speakers I’ve heard. I definitely plugged in headphones, but in a pinch these were more than good. I particularly liked the full-sized keyboard with numeric keypad (any editor who has to enter timecode knows how important the numeric keypad is).

In the End
I love HP’s line of z series workstations, from the super-high-end z840 to this zBook G3. If you are looking to transcode a 44-minute QuickTime in under 15 minutes, you are going to need a system like the HP z840 with 64GB of RAM and an SSD under the hood.

If you need power similar to the z840 but in a mobile powerhouse, the zBook G3 is for you. With peripherals like the HP Thunderbolt 3 dock, you can keep your Thunderbolt 3 RAID, the display ports for your UHD/4K monitors and even more USB 3 ports stationary at home, without having to hook up and unhook your peripherals every time you get home from the office. The 200W dock costs $249 and the 150W dock is $229 (for the 17-inch G3 you will need the 200W version). The power supply that charges the zBook G3 is not small, so using the dock as a charging station and peripheral connector is definitely the way to go.

One issue I had with the zBook has to do with HP ditching the Thunderbolt 1/2 connectors. It’s kind of funny to see a VGA port next to HDMI and Thunderbolt 3 ports with no Thunderbolt 2 connection; at the very least, I would have hoped HP would include an adapter with the zBook. I asked HP about this, and they said other companies were already tackling Thunderbolt 1/2-to-3 converters. While it’s not a huge issue, it’s interesting to see them ditch a relatively new interface like Thunderbolt 2 (which was in the zBook G2) when I know their customers have recently invested in Thunderbolt 2 devices and there is no easy way to connect them to this zBook G3 other than buying a $100 adapter on top of the price of the mobile workstation. Obviously I am nitpicking, but it stood out to me.

Moving on, the zBook G3 is one of the most solid mobile workstations I have touched. It’s not light, but it’s not meant to be. HP has other options for users looking for a Windows-based PC that rivals the MacBook Air. The zBook isn’t as powerful as its stationary workstation line, but it won’t let you down if you need something to encode QuickTimes on the go or create proxies for your Blackmagic Resolve 12.5 or Avid Media Composer 8.5 projects. It will even run Cinema 4D without skipping a beat.

If you have the money, the zBook G3 is at the top of my list for a workstation that fits in a backpack, lasts upwards of five hours on battery life, and can chew up and spit out media files.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Brady was recently nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Simple tips that will help you work more efficiently

By Brady Betzel

Recently, I was asked to share some best practices surrounding the editing process… little things that can make doing your job that much easier and more efficient.

Get Comfortable With Your Equipment
Whether you are using a Wacom tablet, Razer mouse, Premiere Pro keyboard, Palette controls or Tangent Element panels, knowing how they work will make you money. If you are on a salaried job and you are fast and efficient, you will (at a decent workplace, most likely) be able to leave early when the job is done. When I first got my Wacom tablet I spent some time just using the hot keys on the side and discovering how I could use them to my benefit. Sometimes I would set up macros on them just to see how far I could go.

Learn Something New Every Day
If time allows, I try to watch one tutorial a day on YouTube, Lynda.com or another place that can make me smarter. Whether I am learning audio tips, After Effects scripts, Avid Effects tips or something unrelated to video and editing, I always gain something.

Even if the tutorial is taught by an eight-year-old on an iPad — if it looks better than anything I’ve ever done, I’m seeing a new viewpoint or discovering a tip I’ve never seen before — you never know where inspiration will come from. So keep on learning… it will not only make you smarter, you will probably work faster too.

Get in Some Exercise
While I try to work out before I go to work a few days a week, it isn’t always possible. I try to get at least a few sets of push-ups in during my workday. This helps get my blood going. An easy game to play is to try to hit your age in push-ups in an hour. While it won’t get you in CrossFit box-jumping shape, it will get your blood circulating and your mind thinking clearer.

Learn What Someone Else’s Job Entails
When I do have spare time, I like to watch other people do their jobs. On my way up the professional ladder, I always learned from watching people I admired, whether it was a producer, editor or production assistant. Lately, I like to watch the guys and gals in the machine rooms. Just the other day, I learned how ISDNs were patched and what codecs were used in transmission. While it doesn’t relate directly to my job, it keeps my mind thinking about different things and finding new perspectives on my own work.

Set Yourself up for Success
This is a terrible cliché, but it really has staying power. There is value in being prepared. For example, when I was a kid, my dad always taught my sister and me to be aware of the closest exit, no matter where we were — one of the perks of growing up in earthquake-prone Southern California.

At home, I learned to keep my play area clean, so when I needed to I could sit down and use it without having to wade through a mess. As a side note, this might also have led to my being super obsessive-compulsive about a clean workspace, and to my need for a color-organized closet (sorry to my wife), but it will only help your efficiency if you can just sit down and work.

Find your exit, or your path to working fast and efficiently, whether it’s a tidy desktop on your computer, a literal clean desk where you work or a bin with all of your preset plug-ins at the ready for when you need them. It can’t hurt to be prepared.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Brady was recently nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Fayteq’s FayIN Planar Tracker

A toolset that simplifies simple tracks and match moves.

By Brady Betzel

Tracking has become a hobby/obsession of mine lately, mainly because it’s not easy to do well. Personally, I’m a fan of Imagineer’s Mocha Pro, but for some Mocha is either too hard, too time consuming or too pricey. Luckily for them, there is a tool made for those looking to do simple planar tracking and image inserts: FayIN from Fayteq (@fayteq). Simply put, FayIN is a planar tracking plug-in for Adobe After Effects that allows the user to insert objects onto surfaces or computer screens easily and relatively quickly.

As many of you know, if you dive deep into products like Mocha, SynthEyes, Element 3D or Cineware, it can take hours of tutorials and practice to understand how they work. I was expecting FayIN to take at least a day to learn, but to my surprise it only took about an hour of watching all of their tutorials and another 30 minutes to figure out how FayIN worked. After I got used to it, I was thinking to myself, “Is that it?” As in, it’s simple and it does one thing. So when reading this review, or trying FayIN for yourself with the 30-day free trial, keep in mind that this plug-in isn’t proclaiming to be a Mocha replacement, but rather a toolset that simplifies simple tracks and match moves.

FayIN is a planar tracker that builds its own 3D world to align a camera for proper placement and corner pinning. In my testing, I found that FayIN shines when the footage is free of quick pans and extreme zooms and is as simple a shot as possible. Think of a computer screen or a picture on a wall that you want to replace.
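If you are curious what “planar tracking” actually means under the hood, here is a toy sketch of the core idea: follow features on a flat surface from one frame to the next, then solve a homography that carries a quad (the corner pin) into the new frame. This is a generic OpenCV illustration with placeholder file names and quad coordinates, not FayIN’s actual code.

```python
# Toy planar-tracking sketch: estimate a frame-to-frame homography and move
# a corner-pin quad along with it. Frame file names are placeholders.
import cv2
import numpy as np

prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Pick trackable features (ideally inside the flat region you care about).
pts_prev = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01, minDistance=7)

# Follow them into the next frame with pyramidal Lucas-Kanade optical flow.
pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, pts_prev, None)
good_prev = pts_prev[status.flatten() == 1]
good_curr = pts_curr[status.flatten() == 1]

# A plane-to-plane mapping between frames is a homography; RANSAC rejects outliers.
H, _ = cv2.findHomography(good_prev, good_curr, cv2.RANSAC, 3.0)

# Carry the four corners of the insert region into the new frame.
quad = np.float32([[100, 100], [500, 100], [500, 400], [100, 400]]).reshape(-1, 1, 2)
quad_next = cv2.perspectiveTransform(quad, H)
print(quad_next.reshape(-1, 2))  # corner-pin positions for the next frame
```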

How to Track a Shot
In After Effects, you will need to load the footage you want to track into a new comp. Keep in mind that you can either set your in and out points for the track in the FayIN tool panel, or trim your footage, put it into a pre-comp and apply the FayIN plug-in to that. From there you will see your video in the FayIN tool panel; click on it and add a new track. A new dialog box named Track Properties will open, and this is where you either create a rectangular bounding box around the area you want to track or use the brush-like tool to paint over the surface you want to track. You will also choose whether the area you want to track is a static image (a picture of a house) or a moving object (a picture of a moving truck). There are some more advanced options, but for the most part you probably won’t be using them. Then just click to start the track. You will see a progress bar in the Effect Properties panel, as well as in the FayIN tool panel.

Once complete, FayIN will add a placeholder in your comp, labeled “FayIN Placeholder,” that is colored a nice blue. In your comp viewer you should see a slightly see-through blue object that, in theory, will match what you are tracking. From there, in the FayIN tool panel, you can right click on the FayIN Placeholder and click “Exchange Insert Footage.” It will ask if you want to reformat the footage to the aspect ratio of the track or if you want it to fill the object regardless of aspect ratio (which will look stretched, most likely). I would go with the first option, in most cases, and finally your object will be placed in FayIN’s track. Hopefully, it locks into place and all is well; if you use simple footage you may have to do some slight rotation or z-space fixing, but you should be in business.

If you do need to do some tweaking you are going to want to click on the footage you tracked in your comp timeline. Then in the Effect Controls panel find the track that corresponds to the item you want to fix, click “Active” and you should be ready to adjust your rotation, x-, y- or z-space, and even see a report with hints at how you can make your track more successful. The main corrections you will probably have to make are aligning your tracking plane in z-space and adjusting the proper rotation. Luckily, FayIN will tell you that in the report!

I do have to say that the ability to tweak things like size and rotation at any time without affecting keyframes is a nice feature. If you want to really get down to the nitty-gritty, you will want to watch some of the advanced tutorials Fayteq has created. They are typically around five to seven minutes, so not too long. Check them out here.

Testing
I tested FayIN using some low-light video I shot with my iPhone 6, mostly because I was lazy and I figured that would be the footage that would be hardest to track with all the noise and low level of detail. To my surprise, barring the panning, FayIN performed great, and I was able to replace a photo on my wall in a matter of minutes. You can check out my example on YouTube.

I also tried some footage where a model rolled her head from left to right, looking away from the camera and eventually toward it. I wanted to insert some of Rampant Design Tools’ (@rampantdesign) Monster Toolkit Eyes onto the model. It was pretty tricky to get them to stick properly. Also, keep in mind that you cannot add an erase mask on top of the inserted object; you would need to duplicate the footage and create a custom mask. I wanted to erase the insert where her eyelids opened up to reveal the monster-like eyes, but I had to do it the old-fashioned way.

So What Do I Think?
In the end, I think FayIN is a great planar tracker if used for simple tracks. I had some problems tracking anything with quick pans, zooms or even objects that rotated into frame. I understand that tracking isn’t easy and takes a lot of work most of the time, but I also think that if you are buying a plug-in whose sole purpose is to track easily and simply, it should do just that. While FayIN does do what it is supposed to, and does it very easily, it doesn’t work in all instances. Furthermore, FayIN does not do anything that you can’t do with After Effects’ built-in 3D tracker and/or the version of Mocha that comes bundled with After Effects — but it does do simple replacements very easily.

Summing Up
In the end, if you do a lot of simple sign, monitor or flat surface inserts and don’t want to reinvent the tracking wheel every time, then FayIN is the tool you want. It is moderately priced at around $330, but if you buy before NAB you can get it for around $250. At the very least you should download the free 30-day trial and see for yourself, because FayIN is definitely extremely easy to use if you follow the well-done tutorials.

I feel like the future of FayIN is going to be bright; they seem to have built this plug-in with the future in mind. They have been teasing FayOUT, which promises the ability to easily do object removal, for at least a year now. So who knows? It seems like 3D integration is only a matter of time.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim-Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

Review: Maxon’s Cinema 4D R17

By Brady Betzel

Over the years, I have seen Maxon take Cinema 4D from something that only lived on the periphery of my workflow to an active tool alongside apps such as Adobe After Effects and Adobe Premiere and Avid’s Media Composer/Symphony.

What I have seen happen is the simplification of workflow and capabilities within Cinema 4D’s releases. This brings us to the latest release: Cinema 4D R17. It not only builds on the previous R16 release, with improved Motion Tracking and Shaders, but Maxon also continues to add new functionality with things like the Take System, Color Chooser and Variation Shader.

Variation Shader

Because I work in television, I previously thought I only needed Cinema 4D when creating titles — I couldn’t quite get the gravitas I was looking for in apps like Media Composer’s Title Tool, After Effects or even Photoshop (think raytracing, great reflections and the ambient occlusion that Cinema 4D always conveyed to me). These days I am searching out tutorials and examples of new concepts and getting close to committing to designing one thing a day, much like @beeple or @gsg3d’s daily renders.

Doing a daily render is a great way to get really good at a tool like Cinema 4D. It feels like Maxon is shaping a tool that, much like After Effects, is becoming usable by almost anyone who can get their hands on it — which is a lot of people, especially if you subscribe to Adobe’s Creative Cloud with After Effects, because Cinema 4D Lite/Cineware is included.

Since I am no EJ Hassenfratz (@eyedesyn), I won’t be covering the minute details of Cinema 4D R17, but I do want to write about a few of my favorite updates in hopes that you’ll get excited and jump into the sculpting, modeling or compositing world inside of Cinema 4D R17.

The Take System
If you’ve ever had to render out multiple versions of the same scene, you will want to pay attention to the new Take System in Cinema 4D R17. Typically, you would build multiple projects with duplicated scenes to create different versions of the same scene. Maybe you are modeling a kitchen and want to create five different animations, each with its own materials for the cabinet faces as well as unique camera moves. Thanks to Cinema 4D R17’s Take System, you can create different takes within the same project, saving tons of time and file size.

Take System

Under the Objects tab you will see a new Takes tab. From there you can generate new takes, enable Auto Take (much like auto keyframing, it saves each unique take’s actions) and perform other take-specific functions like overrides. The Take System uses a hierarchical structure that allows child takes to inherit the properties of their parents. At the top is the main take, and any changes made there affect all of the children underneath.

Say you want your kitchen to have the same floor but different cabinet-face materials. You would first create your scene as you want it to look overall, then in the Take menu add a take for each version you want, name it appropriately for easy navigation later, enable Auto Take, change any attributes for that specific take, save and render!

In the Render Settings under Save > File… you can choose from the drop-down menu how you want your takes named upon export. There are a bunch of presets in there that will get you started. Technically, Maxon refers to this update as Variable Path and File Names, or “Tokens.”

This is a gigantic addition to Cinema 4D’s powerful toolset that should bring a sigh of relief to anyone who has had to export multiple versions of the same scene. Now, instead of multiple projects, you can save all of your versions in one place.
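For anyone who scripts their pipeline, the same idea is reachable from Python. Below is a minimal sketch, assuming the R17+ Python Take System API (TakeData via doc.GetTakeData()), meant to run from Cinema 4D’s Script Manager; the take names and the override step are placeholders, not a definitive recipe.

```python
# Minimal sketch of creating child takes under the main take.
# Assumes the Cinema 4D R17+ Python SDK; run from the Script Manager.
import c4d

def main():
    doc = c4d.documents.GetActiveDocument()
    take_data = doc.GetTakeData()          # the document's Take System
    if take_data is None:
        return

    main_take = take_data.GetMainTake()    # everything inherits from this take

    # One child take per cabinet-face variation (names are placeholders).
    for name in ["Cabinets_Oak", "Cabinets_White", "Cabinets_Walnut"]:
        take = take_data.AddTake(name, main_take, None)
        # ...enable Auto Take / add the material override for this take here...

    take_data.SetCurrentTake(main_take)
    c4d.EventAdd()                          # refresh the UI

if __name__ == "__main__":
    main()
```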

Pen and Spline Tools
One of the hardest things to wrap my head around when I was diving into the world of 3D was how someone actually draws in 3D space. What the hell is a Z-Axis anyways? I barely know x and y! Well, after Googling what a Z-Axis is, you will also understand that technically, with a modern-day computing set-up, you can’t literally draw in 3D space without some special hardware.

Pen Tool

However, in Cinema 4D you can draw on one plane (i.e., the Front view), then place that shape inside of a Lathe and bam! — you have drawn in 3D space, complete with x, y and z dimensions. So while that is a super-basic example, the new Pen Tool and Spline Tool options allow someone with little to no 3D experience to jump into Cinema 4D R17 and immediately get modeling.

For example, if you grab the Pen tool and draw some sort of geometry and then want to cut a hole in it, you can grab a new circle, place it where you want it to intersect the beautiful object you just drew, highlight the object that will do the cutting (if you use Spline Subtract), then hold Control on Windows or Command on Mac and click on the object you want to cut from. Then go into the Pen/Spline menu and click Spline Subtract, Spline Union, Spline And, Spline Or or Spline Intersect. You now have a permanent and much more efficient way to alter your geometry. Try it yourself; it’s a lot easier than reading about it.

I used this to create some — I’ll call them unique — shapes and was able to make intersection cuts easily and painlessly.

I also like the Spline Smooth tools. You’ve drawn your spline but want to add some flair — click on the Spline Smooth tools and, under the options, check off exactly what you want your brush to do to your spline (think of Spline Smooth like the Liquify tool in Photoshop, where you can bulge, flatten or even spiral your work). Under the options you can choose Smooth, Flatten, Random, Pull, Spiral, Inflate and Project. The Spiral function is a great way to give some unique wave-like looks to your geometry.

Color Chooser
Another update to Cinema 4D R17 that I really love is the updated Color Chooser. While in theory it’s a small update, it’s a huge one for me. I really like to use different color harmonies when doing anything related to color and color correction. In Cinema 4D R17 you can choose from RGB, HSV or Kelvin color modes. In RGB there are presets to help guide you in making harmonious color choices: Monochromatic, Analogous, Complementary, Tetrad, Semi-Complementary and Equiangular. If you don’t have much experience in color theory, it might be a good time to run to your local library and find a book; it will really help you make conscious and appropriate color choices when you need to.

Besides the updated color-theory-based layouts, you can import your own image and create a custom color swatch that can be saved. In addition, a personal favorite is the Color Mixer: you can choose two colors and use a slider to find a mix of the two. A lot of great experimentation can happen here.
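Under the hood, the Color Mixer idea is essentially a linear blend between two colors. Here is a tiny generic sketch of that blend (my own illustration, not Maxon’s implementation):

```python
# Linear blend between two RGB colors, the basic idea behind a color mixer slider.
def mix_colors(color_a, color_b, amount):
    """Blend two RGB colors; amount 0.0 returns color_a, 1.0 returns color_b."""
    return tuple((1.0 - amount) * a + amount * b for a, b in zip(color_a, color_b))

# Halfway between a warm orange and a teal:
print(mix_colors((1.0, 0.5, 0.1), (0.0, 0.6, 0.7), 0.5))
```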

Lens Distortion
When compositing objects, text or whatever you can think of into a scene, it can get frustrating to deal with footage that has extreme lens curvature. In Cinema 4D R17 you can easily create a Lens Profile that can then be applied as either a shader or a post effect to your final render.

To do this you need to build a Lens Profile by going to the Tools menu, clicking Lens Distortion and then loading the image you want to use as reference. From there you need to tell Cinema 4D R17 what it should consider a straight line — like a sidewalk, which in theory should be horizontally straight, or a light pole, which should be vertically straight.

Lens Distortion

To do this, click Add N-Point Line and line it up against your “straight” object (you can add multiple points as necessary to follow changes in line angle), choose a lens distortion model that you think should be close to your lens type (3D Standard Classic is a good one to start with), click Auto Solve and then save your profile to apply when you render your scene. To load the profile at render time, go to Render Settings > Effects > Lens Distortion and load it from there.
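If you are wondering what a lens distortion model actually is, most of them boil down to a radial polynomial applied around the image center; the N-Point Line tool is essentially measuring how far your “straight” lines bow so the solver can estimate those coefficients. Here is a generic one-parameter sketch of that idea (my own illustration, not Cinema 4D’s actual 3D Standard Classic math):

```python
# Generic one-parameter radial distortion model.
import numpy as np

def distort(points, k1, center=(0.5, 0.5)):
    """Push normalized image points outward/inward by a radial factor.

    points: (N, 2) array of x, y in 0..1
    k1: distortion coefficient (positive = barrel, negative = pincushion)
    """
    pts = np.asarray(points, dtype=float) - center
    r2 = np.sum(pts**2, axis=1, keepdims=True)   # squared distance from center
    return pts * (1.0 + k1 * r2) + center

# A "straight" horizontal line of points bows once distortion is applied,
# which is what the profile-building step measures in reverse.
line = np.column_stack([np.linspace(0.1, 0.9, 5), np.full(5, 0.2)])
print(distort(line, k1=0.15))
```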

Book Generator
I love that Maxon includes some shiny bells and whistles in their updates. Whether it is the staircase from R16 or Grow Grass, I always love updates that make me say, “Wow, that’s cool.” Whether or not I use them a lot is another story.

In Cinema 4D R17, the Book Generator is the “wow” factor for me. Obviously it has a very niche use, but it’s still really cool. In your Content Browser just search for Book Generator and throw it into your scene. To make the books land on a shelf you need to first create the shelves, make them editable, then click “Add Selection as One Group” or, if you want to control them individually, “Add Selection as Separate Groups.” Afterwards, under the Book Generator object you can click on the Selection, which represents the actual books. Under User Data you can customize things like overall book size, type of books or magazines, randomness, textures and bookends, and even make the books lean on each other if they are spaced out.

Book Generator

It’s pretty sweet once you understand how it works. If you want different pre-made textures for your magazines or books you can search for “book” in the Content Browser. They have many different kinds of textures including one for the inside pages.

Summing Up
I detailed just a few great updates to Maxon’s Cinema 4D R17, but there are tons more. The ability to import SketchUp files directly into Cinema 4D R17 is very handy, and the keyframe-handling updates and the Variation Shader open up even more possibilities.

If you aren’t ready to drop the $3,695 on the Cinema 4D R17 Studio edition, $2,295 on the Visualize edition, $1,695 on the Broadcast edition or $995 on the Prime edition, make sure you check out the version that comes with Adobe After Effects CC (Cineware/Cinema 4D Light). While it won’t be as robust as the other versions, it will give you a taste of what is possible and may even spark your imagination to try something new like modeling! Check out the different versions here: http://www.maxon.net/products/general-information/general-information/product-comparison.html.

Keep in mind that if you are new to the world of 3D modeling or Cinema 4D and want some great learning resources, you should check out Sean Frangella on YouTube (https://www.youtube.com/user/seanfrangella, @seanfrangella), www.greyscalegorilla.com/blog, Cineversity and www.motionworks.net (@motionworks). Cineversity even used my alma mater, California Lutheran University, in their tutorials!

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim-Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

Boris FX’s BCC 10 for Avid Media Composer

By Brady Betzel

I love plug-ins — Video CoPilot’s Element 3D, Red Giant’s Universe, Neat Video’s Noise Reduction and many more. There are some pros who like to pretend that they are too good for plug-ins or consider the use of plug-ins a crutch, but not this guy. I love them! Plug-ins make my job easier and more efficient.

Time is money and when you are doing something by hand that can be done faster by using a plug-in, you are wasting time and money — your client’s money and your time, which could be spent with family and friends.

I don’t always love the products I review, but I do love this one, so prepare yourself — I’m going to rave over Boris FX’s latest update to their BCC collection: Boris Continuum Complete v10. It even had an update to v10.01 last week, offering improved 4K handling, overall render speed improvements and — an important one for me — optimization of Avid project size when BCC AVX filters are applied.

Mocha Within Avid
The BCC v10 update is the biggest and most complete update to BCC that I have seen. I say that because in 2014 Boris FX acquired Imagineer Systems, the maker of the magical Mocha planar tracking software. When I first heard this news I almost jumped out of my skin, mainly because tracking inside of Avid’s Media Composer is lacking. And while Media Composer’s point tracker is appropriate for some circumstances, one thing it does not have is backwards tracking. Luckily for us Avid users, Mocha is now integrated into Avid via the BCC 10 highway… streamers and confetti should pop out of your computer after reading that sentence.

So what does Mocha mean for the everyday editor? Well, it allows for a much tighter tracker inside of Media Composer. Furthermore, if you use effects like Gaussian blur or the new BCC Beauty Studio, you can apply the Mocha tracking data inside of each effect in the effects editor. For example, if you are editing an interview featuring a person with less-than-perfect skin and the producer or director wants to fix that, you can… and pretty quickly. Yes, I know there are ways to do this for free using some sweet luminance mattes and maybe a slight blur on certain color channels, but, let’s be real, that might take hours, not to mention trying to track the facial movements, as well as erasing the teeth and eyes from the aforementioned stack of effects.

Once you apply Beauty Studio you can launch Mocha from within the effects editor inside of Media Composer, track the entire head shape with X Splines (or B Splines), track the eyes and possibly mouth to create your subtract (erase) layers, CTRL + Q or Command + Q on a Mac to quit Mocha, see your settings applied in Media Composer and, magically, you have a subject with smooth and appealing skin.

You should take that last paragraph with a grain of salt, because while it is “easy” to accomplish this with the help of Mocha and BCC 10, there is a moderate learning curve, and sometimes there will be a large render time involved. My suggestion is to watch and read everything Mary Poplin does — she is on Twitter @MaryPoplin and on Imagineer Systems’ website with some excellent video tutorials. One tip: even if you think a certain video is too long or might not be exactly what you are looking to learn, Mary always finds a way to drop a bazillion tips into every video.

Another personal favorite tutorial creator is Kevin P. McAuliffe; he makes all sorts of great videos, but one I saw recently was how to easily make a scrolling credit bed in Media Composer with the help of the new BCC 10 Title Studio. You can also follow him on Twitter @KPMcAuliffe.

While BCC makes it fast and relatively easy to do things like key a greenscreen or make someone beautiful, it all comes at a price, and usually that price is rendering. I typically only render things I can’t view in realtime or really need to see play out in realtime, otherwise I will save my renders for when I am sleeping.

Other Updates
So what else is new and what is still great in BCC 10 besides the amazing Mocha integration you ask? I will quickly go over my favorites in the next few paragraphs.

Under the “Still Great” category is Chroma Key Studio. While this was released in BCC 9, I can’t say enough about it. If you’ve used SpectraMatte to death and can’t quite get a great key, you need to throw on BCC Chroma Key Studio, which is under the BCC Key and Blend heading. In the Effects Editor, change the view to source, sample the green and, usually, you are halfway home. I will dial in the Density and Matte Cleanup settings first — think Clip Black and Clip White from Keylight inside of After Effects — then mess with the Light Wrap and Matte Choker settings until I get it where I want it.

When I have a particularly poorly lit or uneven greenscreen behind the subject, I sometimes use the Pre-Key Cleanup to help even out the greenscreen color to sample from. I may even jump into my secondary color correctors in Symphony — isolate the green I want to smooth out, widen my input vector’s hue width to capture all of the offending greens I can, dial in a few other settings and dial my output vector to taste. From there you are usually good to go, but if you are still having trouble with motion blur or green creeping into those dreaded fingers waving, you can jump into BCC Image Restoration and apply BCC Noise Reduction on your base layer. Be careful though because this will add a tremendous amount of render time, and you will definitely need an overnight render.
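If you are curious what “sample the green, then clip black and clip white” is doing mathematically, here is a bare-bones sketch of that idea. It is not BCC’s algorithm, and the frame and key color are stand-ins; it just shows the matte-from-color-distance concept the terminology describes.

```python
# Bare-bones chroma matte: distance from the sampled key color, then clamp
# the low end (clip black) and high end (clip white) of the matte.
import numpy as np

def chroma_matte(frame, key_rgb, clip_black=0.15, clip_white=0.45):
    """frame: float32 image (H, W, 3) in 0..1; returns a matte (H, W) in 0..1."""
    dist = np.linalg.norm(frame - np.asarray(key_rgb, dtype=np.float32), axis=-1)
    dist /= np.sqrt(3.0)                        # normalize max RGB distance to 1
    matte = (dist - clip_black) / max(clip_white - clip_black, 1e-6)
    return np.clip(matte, 0.0, 1.0)             # 0 = keyed out, 1 = keep

frame = np.random.rand(4, 4, 3).astype(np.float32)   # stand-in for real footage
matte = chroma_matte(frame, key_rgb=(0.1, 0.8, 0.2))
print(matte)
```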

Under the “What’s New” heading, I really love Boris FX’s BCC Remover, located under Image Restoration; it’s basically a clone tool with the added ability to use Mocha for your tracking and mattes. I use this constantly.

It’s as simple as watching a few Mary Poplin tutorials on how to use Mocha’s X or B splines to draw your masks. Then track, using the Uber key to adjust your track without adjusting your keyframes — or individually adjust your keyframes if you want — then quit, save out of Mocha and, finally, adjust your clone settings inside of Media Composer’s Effect Editor. You can choose from a few different fill types like auto-fill or clone. I have actually had success by just using BCC’s auto-fill with no additional adjustments necessary.

One thing that is not as obvious as I would like is that when you use Mocha and want to feather your mask, you need to twirl down Pixel Chooser Mask/Mocha to find it.

Video Glitch

I also love the BCC Light Leaks and Video Glitch plug-ins. You can use these for transitions or just throw them over your footage to give it some instant flair. If you’ve read my previous reviews, you know how much I love Rampant Design Tools. They offer some great high-quality tools, such as light leaks and grunge. If you are a light leak or grunge enthusiast and can’t find the right color or flow, then BCC 10’s Light Leaks or Video Glitch are for you. You can immediately add the BCC Light Leaks effect (located in BCC Lights) or Video Glitch (under BCC Stylize), click on Show FX Browser in the effect editor and sample different looks over the footage in your timeline.

Another warning here: while you can preview the presets over the footage in your timeline, it does have to “cache,” so the first time your clip plays it will be slow and stuttery (think After Effects “realtime” rendering).

Share, Share, That’s Fair
A very cool thing in the new Boris FX Browser is the ability to view presets other people have made or sent you via email. This is cool, so stay with me. If you work in a networked editing environment, such as through an ISIS, then you most likely have a bunch of editors making all sorts of effects. If you’re an assistant editor, polishing editor, finishing editor, online editor or whatever title leads you to be in charge of a look of a show, the ability to share plug-ins and presets is critical.

BCC 10 has the ability to easily share and preview presets from different systems. In fact, you could have a folder of BCC presets on the ISIS that can either be copied locally or kept on the network drive to be shared by all. My suggestion would be to copy locally if you can and have someone update those presets on each system when needed, but what do I know? Anyway, you can find the presets on the system level under Program Files > Boris FX, Inc > BCC Presets 10 AVX.
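If you go the copy-locally route, the sync itself is trivial to script. Here is a minimal sketch of that idea; both paths below are placeholders for whatever your facility actually uses, not official locations.

```python
# Minimal "copy the shared preset folder over the local one" sketch.
import shutil
from pathlib import Path

SHARED = Path(r"\\ISIS\Workspaces\Presets\BCC Presets 10 AVX")      # placeholder path
LOCAL = Path(r"C:\Program Files\Boris FX, Inc\BCC Presets 10 AVX")  # placeholder path

def sync_presets(src: Path, dst: Path) -> None:
    """Replace the local preset folder with the shared copy."""
    if dst.exists():
        shutil.rmtree(dst)       # remove the stale local copy first
    shutil.copytree(src, dst)

# sync_presets(SHARED, LOCAL)  # run with admin rights on each edit bay
```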

Beauty Studio
The last plug-in I want to talk about is the already-mentioned BCC Beauty Studio, located in BCC Image Restoration. Remember earlier when I mentioned the interview subject with less-than-perfect skin who could use a little touch-up?

I especially like to use this in conjunction with Mocha to track facial movement and eliminate as much of the “beauty studio look” as I can by confining the effect to the face or problem area. This plug-in does work with the presets but, again, let’s be real: you should never accept a preset as the final version of your work — we can have the philosophical discussion of whether a preset is technically your work on Twitter if you want. Tweet me @allbetzroff.

Also, every person’s skin texture, scene lighting and even color temperature can change drastically between set-ups, so one preset might not work for another set-up. Basically, what I’m saying is you will need to learn this tool, and to do so I recommend searching through the tutorials and watching something like this. Do a little noodling and elevate your skills.

Over the years I’ve noticed that editors typically get to choose between two sets of plug-ins when working in Avid: GenArts’ Sapphire or Boris FX’s BCC — unless you are super lucky and get to have both. At this moment I really love what Boris FX has to offer, which is a high-end tracking solution, not to mention all the other features.

In overall value, it’s very hard to beat BCC 10 for Avid, Adobe, or any OFX-supported app like Blackmagic DaVinci Resolve. (Currently, Boris FX BCC 9 is compatible with Resolve, but 10 is supposed to be released any day now).

The multi-host license — good for Adobe, Avid, FCP X and OFX supported platforms like Resolve and Sony’s Vegas Pro — will cost $1,995 for the full install package, and $695 for just the upgrade from v9. If you want to rent the multi-host version it will cost $595/year. The individual app licenses look like this:
Avid: $1,695 full / $595 upgrade
Adobe: $995 full / $295 upgrade
FCPX/Motion: $695 full / $195 upgrade
OFX (Resolve): $695 full
OFX (Sony): $695 full / $195 upgrade


Title Studio

Summing Up
It’s hard to cover Boris FX’s BCC 10 in just 1,000 words, but to sum up, I love it! So much so that I would recommend it to everyone out there working in Media Composer and Symphony.

Heck, I didn’t even cover the awesome feature of importing Maxon Cinema 4D models into the BCC Title Studio. Of course, you need to spend some time to figure out the intricacies of Mocha tracking as well as what each parameter does inside of the Chroma Key Studio, but luckily you have a great set of tutorials on the BorisFX.com website to get you up to speed.

If I had one feature request, it would be the ability to easily take any Mocha work you did in one BCC plug-in, such as a Gaussian blur, and apply it to another, such as BCC Remover or BCC Composite. At the moment you need to export/copy the data from Mocha and load it in the plug-in you want to use it in. That solution works, but it would be nice to have a seamless way to move your tracking data.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim-Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

‘Star Wars: The Force Awakens’ editors weigh in on the cut

J.J. Abrams called on trusted, long-time collaborators for his newest film.

By Brady Betzel

Star Wars fever! Who’s got it? From where I sit, almost everyone — from the long-time fan to the newbies discovering the franchise for the first time. I fit somewhere in the middle. Up until recently I hadn’t seen the original three Star Wars, only the prequels, but thanks to my four-year-old son’s interest in the new action figures we got to experience the first three together, followed quickly by a viewing of the new Star Wars: The Force Awakens.

When my family walked out of the theater, my son asked when the next one was coming out, and my wife wanted to see it again! So when Randi Altman approached me to interview the editors of The Force Awakens, I jumped out of my chair, screaming, yes! As a working editor, mostly cutting TV fare, I was very interested in their story and process.


I spoke to editors Mary Jo Markey and Maryann Brandon — who joined our call halfway through — as well as associate editor Julian Smirke, who offered up a great Avid shortcut for any editor or assistant editor who deals with conforming different types of audio. Matt Evans, another associate editor, wasn’t available for the interview. Both Markey and Brandon have worked with Force Awakens director J.J. Abrams before on his TV series and films, such as the Star Trek and Mission Impossible franchises.

When talking to the editors, I was taken with an overwhelming and poignant feeling that enveloped me; it was a sense of comfort and family that extended beyond our phone call to the set and production process. The best way to explain it is with this example: I asked the editors how they would approach director Abrams if they had ideas on re-shoots or story points, and they all responded the same way: J.J. listens to everyone! If he had signed off on a scene but the editors felt something was missing, they still felt empowered to go in and address their concerns. This is a director who truly values the input of his editors.

Without any further ado here are highlights from our conversation.

Did you do anything special to prepare for editing this film as opposed to other films you have worked on with J.J.? Or did you walk in and say, “Let’s go”?
Mary Jo: I don’t think you can really cut in an authentic way while trying to imitate something else; we still cut the way we cut. Stylistically the only thing we did keep were the soft wipes between scenes. For me, I really don’t try and impose a style on material; the material kind of tells you what to do in a way.


Mary Jo Markey

How do you decide who edits what? Do you divide the movie scene by scene or pick and choose what you want?
Mary Jo: We divided up the script according to page count and did it in very large pieces so we had a good run at a section. That way when J.J. was editing with us he didn’t have to be bouncing back and forth between rooms. I cut the beginning and end of the film and Maryann cut the large middle section. We also watched the dailies together and talked about the movie incessantly.

While cutting we would go to each other’s bays and talk about things we were surprised by and what we weren’t sure was working. We would ask questions like, “Why do you think he did it that way?” or, “What do you think that performance is getting at?” If we are really confused about something, which doesn’t happen much, we have a direct line to J.J. on set. But for the most part we do our own thing and come together after they are done shooting.

How close were you behind shooting when watching dailies?
Mary Jo: We were a day behind shooting.

If something wasn’t working, were you able to re-evaluate and re-do?
Mary Jo: Actually, that did happen. It happened in part because Harrison Ford broke his ankle on set, which we all felt really bad about. We shot all we could without him on set, but unfortunately, he still wasn’t able to come back, so we took about a three-week hiatus. During that time, J.J. was able to sit back and see what he already had and how it was working. There were two things re-conceived during that period: Rey and Finn’s initial meeting in the trader’s tent and Harrison and Chewie’s first interaction with Rey and Finn.


Did you ever look at your sequence and have to “kill your darlings,” if you know what I mean?
Mary Jo: It was more like killing beats that we liked — there were some little jokes that got lifted. We are all really committed, J.J. in particular, to getting our films down to two hours if possible, so if things weren’t working or weren’t absolutely necessary they were lifted out of our sequence. We all have had to lose something that we really liked, but we always knew we needed to be clear-eyed about what we needed. I don’t think there is anything not in the film that I regret not being there.

Maryann: There were a few whole scenes that we took out. So while they were fun scenes, they didn’t really advance the story or take the characters where we wanted them to go. There were times when we cut a little extra moment or a little extra joke, but there were scenes that J.J. thought weren’t working and we thought were, but in the end we all needed to agree a scene was working. Sometimes we would even dive back into a scene that J.J. had signed off on because we still thought we could do better.

So you have no problems going back into a scene and re-presenting it to him even if he already “approved” it?
Mary Jo: No, he takes it very seriously if we don’t think a scene is working, even if he has approved it. We have a great working relationship and we always feel heard and considered. He takes it very seriously if we are unhappy or dissatisfied about something.

Clearly, there was a great story being told in this movie, was the motto always story first?
Maryann: That was our aim. Even in the heavy action sequences if you don’t know who’s doing what or where they are going, then the action scene isn’t as enjoyable.

Maryann Brandon

What systems were you working on, and what codec were you working with in the offline edit?
Julian: We worked on eight Avid Media Composer 7 stations with ScriptSync, and eventually Avid Media Composer 8.4 alongside an ISIS 5000. We had an offline working resolution of DNxHD 115. On previous films we worked in DNxHD 36, but with Star Wars we jumped to 115, which worked and looked great for our needs.

Did you ever do any work in other programs such as After Effects?
Julian: We worked with our VFX editor Martin Kloner, who primarily worked in Media Composer. Once we were further along in the process, the VFX were sent out to various vendors like ILM but also Kelvin Optical, who were especially helpful because they worked out of Bad Robot where we worked.

Maryann: It was invaluable to have someone like Martin do temp VFX, speed ramps, split screens or backgrounds — anything we could do to simulate the end product helped tremendously when watching various cuts.

Do you edit with keyboard, mouse, Wacom tablet, etc?
Maryann: Mary Jo and I use a keyboard and mouse, but also a Logitech controller.

Julian: I use keyboard and Wacom tablet, but when I really need to speed up I’ll jump back to the keyboard and mouse, which seems to work faster for me.

Julian, since you are technically an associate editor on The Force Awakens, have you been able to edit scenes and work in a creative position more than just wrangling data?
Julian: So, technically, I have been the first assistant editor along with Matt, but with Star Wars — more so than previous films, because of Maryann and Mary Jo’s generosity — I’ve been given more creative input. We get to sit in the rooms and be a part of the creative process. Sometimes we make suggestions, and sometimes they work out and sometimes they don’t (Julian laughs a little here), but everyone is very supportive and it’s nothing more than trying to make a scene better.

I was an assistant editor for a while, and it’s great to see editors like you — Maryann and Mary Jo — keeping the traditional sense of an assistant editor alive.
Mary Jo: Matt has been my first assistant since Super 8, and he’s just incredibly valuable to show a cut to. If he says something isn’t working I have to believe him because it always turns out to be true.

Maryann: It’s great to have someone like Julian involved before it goes out to the bigger world, meaning J.J.; it’s great to bounce ideas off of someone you have a shorthand with.

Julian, how did it make you feel to have that kind of creative input on something like Star Wars?
Julian: It’s amazing; a dream come true! Matt and I have learned a lot about what works and what doesn’t work. Sometimes if Maryann is slammed with dailies, I can grab a scene and assemble it for her so she can use that as a starting point. To learn from these great and talented editors is a dream come true.

Julian, could you describe how your time is divided?
Julian: It’s hard to describe precisely, because it depends on where in the process we are. During dailies it’s very labor intensive with grouping multiple cameras and syncing sound, or even just dealing with the huge amount of footage that comes in. You need to make sure all of your technical work is solid so it doesn’t become a problem a year and a half away.

EP7_IA_105362_R-2

So you and Matt were responsible for syncing sound?
Julian: Yes. We shot primarily on 35mm film, and the telecine facility didn’t receive sound. Matt and I grouped and synced sound for every take manually and then prepped dailies for Maryann and Mary Jo. Once we finished with the more technical side of dailies, we could move more into the creative side with temp sound design, the 5.1 mix and such.

Do you have any favorite Avid shortcuts?
Mary Jo: Copy to Source monitor.

Maryann: Fit-to-Fill for quick speed ramps.

Julian: Because I live so much in the sound world with temp sound design, I would have to go with Option + Command + U, which allows you to insert any type of audio track in between other tracks. During large scenes like the Falcon chase we would use up to 20 tracks or more and need to insert a stereo track into the middle of that with that shortcut.

Mary Jo: I feel like we should have a meeting about all of this because I don’t know the ones you are talking about!

Maryann: Me too, I was like copy? We can copy?

Mary Jo: We’re going to have put a pamphlet together.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim-Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

 

Top 3: My picks from Adobe’s Creative Cloud update

By Brady Betzel

Adobe’s resolve to update its Creative Cloud apps on a regular basis has remained strong. The latest updates, released on December 1, really hammer home Adobe’s commitment to make editing video, creating visual effects and color correcting on a tablet a reality, but it doesn’t end there. They have made their software stronger across the board, whether you are using a tablet, mobile workstation or desktop.

After Effects and Stacked Panels

I know everyone is going to have their own favorites, but here are my top three from the latest release:

1. Stacked Panels
In both After Effects and Premiere you will notice the ability to arrange your menus in Stacked Panels. I installed the latest updates on a Sony VAIO tablet and these Stacked Panels were awesome!

It’s really a nice way to have all of your tools on screen without having them take up too much real estate. In addition, Adobe has improved touch-screen interaction with the improved ability to pinch and zoom different parts of the interface, like increasing the size of a thumbnail with a pinch-to-zoom.

In Premiere, to find the Stacked Panels you need to find the drop-down menu in the Project panel, locate Panel Group Settings and then choose Stacked Panel Group (and Solo Panels in Stack, if you want to view only one at a time). I highly recommend using Stacked Panels if you are using a touchscreen, like a tablet or some of the newer mobile workstations out in the world. Even if you aren’t, I really think it works well.

Premiere Pro and Optical Flow

2. Optical Flow Time Remapping
Most editors are probably thinking, “Avid has had this for years and years and years, just like Avid had Fluid Morph years before Adobe introduced Morph Cut.” While I thought the exact same thing, I really love that Adobe’s version is powered by the GPU, which really takes advantage of the latest HP z840 with Nvidia Quadro or GTX 980 Ti graphics cards and all of their CUDA cores. Be warned, though: Optical Flow (much like Morph Cut) works only in certain situations.

If you’ve ever used Twixtor or Fluid Motion in Media Composer, you know that sometimes a lot of work goes into making those effects look right. It’s not always the right solution for time remapping footage, especially if you are working on content that will air on broadcast television — even though Optical Flow may look great, some content will fail certain networks’ quality control because of the weird Jello-looking artifacting that can occur.
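For the curious, here is roughly what optical-flow retiming is doing conceptually: estimate per-pixel motion between two frames, then warp along that motion to synthesize an in-between frame. This is a generic OpenCV sketch with placeholder file names, not Adobe’s implementation, and the crude warp also hints at why Jello-like artifacts appear wherever the flow estimate is wrong.

```python
# Conceptual optical-flow retiming sketch: dense flow plus a warp toward the
# halfway point between two frames. File names are placeholders.
import cv2
import numpy as np

a = cv2.imread("frame_a.png")
b = cv2.imread("frame_b.png")
gray_a = cv2.cvtColor(a, cv2.COLOR_BGR2GRAY)
gray_b = cv2.cvtColor(b, cv2.COLOR_BGR2GRAY)

# Dense per-pixel motion from frame A to frame B.
flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

# Sample frame A halfway along the flow to approximate the missing middle frame
# (a crude backward sample of forward flow; real retimers do much more).
t = 0.5
h, w = gray_a.shape
grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
map_x = (grid_x + flow[..., 0] * t).astype(np.float32)
map_y = (grid_y + flow[..., 1] * t).astype(np.float32)
mid = cv2.remap(a, map_x, map_y, cv2.INTER_LINEAR)
cv2.imwrite("frame_mid.png", mid)
```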

After Effects and the Lumetri Color Panel

3. Lumetri Color inside of After Effects
While you might already have a favorite coloring app or plug-in to use, having the ability to take clips from Premiere to After Effects, while carrying over the color correction you made inside of the Lumetri panels, is key. In addition, you can use the Lumetri effect inside of After Effects (located under the Utility category) to quickly color your clips inside of After Effects.

Overall, this round of updates seemed to be par for the course, nothing completely revolutionary but definitely useful and wanted. Personally, I don’t think that adding HDR capabilities should have taken precedence over some other updates, such as collaboration improvements (think Avid Media Composer and Avid’s Shared Storage solution, ISIS), general stability improvements, media management, etc. But Adobe is holding true to their word and bringing some of the latest and greatest improvements to their users… and causing users (and manufacturers) of other tools to take notice.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim-Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

Everyday tips for editors

By Brady Betzel

I started in this industry as an intern, worked my way up to assistant editor and am now a full-time video editor. Getting here took some time, but I learned some valuable lessons along the way. Here are just a few tips for those of you who might be wondering what it takes to be successful over the long haul, and how to be part of the team — a creative asset — not just a tool by which to get the work done.

1) Learning to not be too nice was key… basically help others, but don’t forget to promote yourself. It’s walking that fine line — being a real person versus promoting yourself is hard, and I learn every day how to keep my “soul” in all my discussions/edits/personal conversations. It’s a constant learning process, much like life!

2) Don’t be too good at your job. Don’t misunderstand me, I’m not saying you should do a bad job, but if you are really good at being an assistant editor, make sure you are super-duper clear about your long-term goals. Being an assistant editor isn’t the only thing you could be great at, so let them know that.

3) Have an opinion. No one ever got mad or looked down on me — as far as I know — for having an opinion. I made sure I was never obnoxious about expressing my thoughts, and I made sure those opinions would add to the conversation as opposed to being unhelpful or negative. I like to hear other people’s ideas and opinions, whether or not I agree with them. It’s how you grow as an editor and a person.

I definitely believe that people I’ve worked with, and for, in the past still keep in contact with me because I have an educated opinion and because I’m not boring. Sitting in a room with me for hours at a time will not put anyone to sleep…. at least I hope not.

4) Being part therapist is definitely part of being a good editor, although a lot of editors just sit there and listen. While this might help some of the time, it won’t help all of the time. This goes back to having the confidence to have an opinion. Therapy in the edit bay is definitely about listening, but offering solutions and alternate views will go a long way in making the client feel better.

5) Don’t let yourself get taken advantage of… monetarily and idealistically. Walk the fine line of a good opinion versus being obnoxious, and talk about yourself and your particular assets.

For the first eight years of my career, I was completely on board and willing to help others because I believe that goodwill will eventually come back to you. Although I’ve been proven wrong on occasion — some people are takers — don’t let a few bad apples ruin the bunch.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim-Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

 

Review: Video CoPilot’s Element 3D v2.2

Improved UI, shadows, reflections and more

By Brady Betzel

If you’ve read my past reviews, you likely know how much I love Video CoPilot’s Element 3D. I can’t stop talking about it, in my reviews, at work, on social media, or even at home with my four-year-old son, who typically responds with, “Can it make Thomas the Train?” Luckily, I am able to respond, “Yes, as long as daddy has a Thomas 3D model.”

I’ve been using Element 3D, which is currently at v2.2, since v1. I have always found it comfortable and easy to use, especially as someone with a good sense of spatial 3D relationships and a moderate level of After Effects knowledge. So it should come as no surprise that the latest release lives up to its previous incarnations. Furthermore, it continues to evolve the pipeline between the 2.5D world of Adobe After Effects and true 3D applications like Maxon’s Cinema 4D and its Cineware offering.

What’s New?
The latest version of Element 3D has some great updates, including interface upgrades, group symmetry modes, matte shadows with alpha channels, general speed improvements and more. Element 3D (E3D) v2.2 is a free update for owners of v2. If you are still running E3D from the v1 era, it obviously still works, and works pretty well, but if you want the latest mind-blowing features for a sub-$200 plug-in, v2.2 is where you want to be — you even get a generous $100 discount if you are upgrading from a valid Element 3D v1 license. If you are feeling like you want to spend a little more money, purchase Pro Shaders 2 along with Element 3D — it even comes with a bundle discount that makes it just shy of $250. The Pro Shaders are a library of high-quality textures used inside of Element 3D and even Cinema 4D. I could go on all day about the products that Video CoPilot puts out, but instead I suggest you check out their website, and Andrew Kramer’s awesome tutorial and demo videos.

So let’s dive in and see why Element 3D v2.2 is worth the price of admission.

Digging In
I really love Element 3D for its particularly easy way of creating fantastic-looking 3D titles quickly. To create a quick and great-looking title in After Effects you can follow these steps: create a new text layer, type your text (usually works better with a thick and/or strong font), under the “Custom Layers” > “Custom Text and Masks” find your layer from the drop down and select it, launch the scene interface, click on extrude, and you are ready to apply a texture.

Once you are done coloring and texturing you can jump back into After Effects and create simple swooshing titles, or even titles that break apart and transform into another title, all within the Element 3D effect controls. You can even add cameras and lights, which will impact your creation. If you do any sort of fancy titling, Element 3D must be in your After Effects toolbox. If you are worried about how well your computer will handle it, I can tell you that you don’t need the behemoth Nvidia Quadro M6000 (although it would be nice). Element 3D just needs a graphics card with at least 512MB of video RAM and, while Intel graphics are not supported, there have been some cases where they work. Typically, you will want an Nvidia or AMD card.

If you’ve used E3D before, you are probably past the point of building 3D text and are ready for me to move on to the v2.2 meat and potatoes of Element 3D. So without further ado…

Group Symmetry Mode
In short, group symmetry allows you to designate a reflection of your group within the E3D scene interface. In one of his demos, Kramer uses a futuristic door as an example, and it really makes a lot of sense when describing how Group Symmetry Mode works. If you have one side of a door created and want to simply reflect that side on the X, Y and/or Z-axis, you can enable Group Symmetry by clicking on your group and checking the Group Symmetry box. If you want to animate those individual elements you can jump inside of your group and designate each object to be assigned to an auxiliary animation.

For example, if you have glass in front of your door that is within the same group, you can assign the glass to auxiliary animation channel 1 and the door to channel 2, allowing the glass and door to be animated separately while still operating in the same group. Another benefit of group symmetry is the ability to texture the object and, because E3D processes the symmetry as instances, any texture applied to one object gets replicated to each object within your group (with no slowdown). You can even save your group folder as an E3D file to use in another scene.

Matte Shadow With Alpha and 3D Noise
This feature is aimed at anyone doing 3D compositing over live action within After Effects. With Element 3D v2.2 you now have the ability to work with dynamic reflections and dynamic shadows. That means you can cast a shadow from a 3D object while using the alpha channel of a texture to create a more realistic roll-off into your live-action footage. Basically, it gets your composite looking more realistic, faster and more easily.

Third on my list is deforming and 3D noise. In E3D v2 you gained the ability to add primitives directly inside the scene interface without importing objects, and this makes it extremely easy to come up with organic and oftentimes wild animations within seconds. Simply drop any primitive object (or anything else, like text or even an external .OBJ) into your scene, bump up the number of segments in your 3D model, change your Surface Options > Normal to Dynamic (Deform), check off Optimize Mesh to speed up the rendering, then texture your object however you would like and exit back into After Effects.

water models

Now, under the numbered group that houses your object find the Particle Look and locate the Deform settings. This is where the crazy magic happens. Mess around with the Noise setting a little and you will immediately be looking for some sound effects to complement the weird organic jello-like movement. Check out the MotionPulse Sound Design Toolkit.

Summing Up
I’ve really only touched on a few of the latest updates to Video CoPilot’s Element 3D v2.2. Andrew Kramer and his team have done a great job with Element 3D from the very beginning, and v2.2 is no exception thanks to the addition of sub-surface scattering; Matte Reflection mode; the ability to export .OBJ files for other 3D apps; automatic texture importing with your external 3D objects; dynamic reflection maps; the ability to designate a transfer mode like Add or Screen to your textures; and so much more.

The only knock I have with Element 3D — well, it’s not really a knock — is that once you dive in you just want to keep learning more. It’s an extremely exciting rabbit hole of motion graphics and VFX that’s hard to get out of.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim-Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

 

Review: Wacom Cintiq Companion 2

A video editor puts this tablet to the test

By Brady Betzel

If you’ve ever used a Wacom Intuos or Cintiq tablet, then you know how efficient they can make your workflow, regardless of your job title. I’m a video editor, and after using a Wacom Intuos 5 I immediately noticed less wrist pain when compared to using a mouse.

Wacom makes very high-quality products that do not disappoint. The Wacom Cintiq Companion 2 is a tablet with the power of a laptop, coupled with the precision of the company’s famous line of pen tablets. Whether you’re an illustrator, a visual effects artist or even an editor, you should check out this tablet.

Under the Hood
While there are multiple configurations of the Wacom Cintiq Companion 2, I will be reviewing only the version I received to test. It’s loaded with an Intel i7 dual-core (four-thread) 3.1GHz processor, 8GB of DDR3 memory, a 256GB Toshiba SSD and Intel Iris Graphics 5100. It comes pre-loaded with Windows 8.1 Pro (if you purchase one with an i3/i5 processor it comes with Windows 8.1 standard). This configuration retails for $1,999.95.


Other configurations run from $1,299.95 all the way up to $2,499.95. In addition, the Companion 2 also comes with a carrying case, a stand, the Pro Pen (my favorite accessory), an AC adapter and the Cintiq Connect cable. There is a set of six express keys that I don’t often use — except when doing some Photoshop work — but they are programmable and they are there. Around the outside you get three USB 3.0 ports, a display port, a microSD card slot and a headphone port.

I really liked Wacom’s original Cintiq Companion. I thought it was great, but there were a few things I felt could be improved: the stand; the power supply, which was cumbersome and had many problems (Google it); and the ability to use it just like a normal Cintiq when connected to another computer.

With the Companion 2, Wacom has listened to what its customers wanted. They addressed the bad power supply connection, although the power connection still hangs off the side. Wacom also made a great improvement — allowing the Companion to be used in conjunction with the Cintiq Connect Cable and perform the same functions as its famous cousin the Cintiq. To use this function, however, you must download the drivers to the computer you want to use the Companion with, as well as have a computer with HDMI out and USB ports.

Unfortunately, one of my biggest problems with the Companion, the stand, has not changed. While it’s not a deal breaker, I find it cumbersome and, in my opinion, it should have been built in, much like the Microsoft Surface’s.

Testing it Out
Once I got the stand attached, the computer turned on fast, and within five seconds I was up and ready to run. If you haven’t used Windows before, don’t worry. It comes loaded with Windows 8.1 Pro and has recently been suggesting that I upgrade to Windows 10. If you are thinking about upgrading to Windows 10, I would be careful, because many pro apps are not yet certified for it.

I immediately downloaded the Adobe CC Suite, specifically After Effects and Premiere. When using this tablet, I wanted to concentrate on its video capabilities as opposed to its well-known illustration abilities. As most reviews and articles will tell you, the Companion 2 has 2,048 levels of sensitivity, as well as tilt and multi-touch offerings.

Not long after launching After Effects and Premiere I discovered that I really like to use touch over the Pro Pen for the most part, which is a true testament to Adobe and the improvements they have made to their apps for touch. The exception came when I was using bezier curves, masks or adjusting color curves. I could not get the same level of accuracy as I do with the pen.

Nonetheless, using the Cintiq Companion 2 as a video editing and effects machine proved to be a great experience — including the fact that I was able to use Video CoPilot’s Element 3D without a problem. It should also be noted that there will be some hiccups when editing multiple video layers; for that you will need way more memory and a dedicated graphics card. This brings up another point: technically, the Cintiq Companion 2 cannot be upgraded, so if you order the 8GB memory version, that’s it. My advice would be to spend a little more money and max it out as much as you can; your renders will thank you.


I tested the machine with an XDCAM 50 MOV file. The XDCAM codec is a notoriously processor-intensive codec that gives even the largest Mac Pro or HP z840 a run for its compression money, and the Companion stayed in the race nicely. I compressed the nine-minute, 11.2GB XDCAM MOV using Adobe Media Encoder and the YouTube 1080p preset: with OpenCL acceleration it took 12 minutes and 52 seconds, and with OpenCL turned off, using only software acceleration, it took 11 minutes and 37 seconds. It roughly kept pace with realtime encoding, and with 16GB of DDR3 we may have seen a slightly faster time.
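To put those times in perspective, here is a rough back-of-the-envelope throughput calculation in Python, using only the clip length, file size and encode times reported above (my own quick math, not an official benchmark):

# Rough throughput math for the XDCAM-to-H.264 test above (not an official benchmark).
source_gb = 11.2                    # size of the nine-minute XDCAM MOV
clip_seconds = 9 * 60               # length of the source clip
opencl_seconds = 12 * 60 + 52       # encode time with OpenCL acceleration
software_seconds = 11 * 60 + 37     # encode time with software-only acceleration

for label, secs in (("OpenCL", opencl_seconds), ("Software", software_seconds)):
    throughput_mb_s = source_gb * 1024 / secs   # rough MB/s chewed through
    realtime_ratio = clip_seconds / secs        # 1.0 would be exactly realtime
    print(f"{label}: {throughput_mb_s:.1f} MB/s, {realtime_ratio:.2f}x realtime")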

Summing Up
If you have the money and/or the need for Wacom’s high precision and craftsmanship, the Cintiq Companion 2 is the mobile Cintiq for you. In addition to the precision, the Companion 2 boasts a QHD screen with a resolution of 2560×1440 (an aspect ratio of 1.778 or 16:9) and a color gamut of 72 percent NTSC. While this isn’t the fastest tablet on the market, you will not find one with the same precision and quality that Wacom has become famous for.

I leave you with these highlights: the Companion 2 offers 2,048 levels of sensitivity with the Pro Pen; the Cintiq Connect Cable allows you to use the Companion like a standard Cintiq; and it offers QHD 2560×1440 screen resolution.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim-Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

Review: Tangent Element color panels

These compact surfaces can help editors gain control of their color work

By Brady Betzel

More and more these days offline editors are also color correcting or grading footage in some way. For those who are new to this and unsure of the differences between color correction and color grading, let me help…

Color correction is the process of balancing different cameras’ color properties, exposure and contrast to create a visibly and technically pleasing image — helped by an external waveform monitor, such as Tektronix products with the Double Diamond display. This can mean hours, days or weeks of work depending on factors such as white balancing or poor lighting.

Color grading, on the other hand, typically happens after the colorist balances the footage. This is where they will add “creative” looks to the content, such as the ubiquitous orange and teal look. While some software packages (Magic Bullet Looks, for example) are great, they are designed to be only color grading packages — for the most part, tools such as Blackmagic’s DaVinci Resolve, Digital Vision’s Nucoda Film Master, FilmLight’s Baselight, Adobe SpeedGrade and others are built to correct as well as grade.


There are a lot of conversations and arguments to be had about correcting and grading, but there is one thing that all colorists I’ve met agree on: color correcting and grading are more efficient and creative with hardware panels.

If you’ve never seen a colorist work, I highly suggest you find a way. At the beginning of my career, I had the opportunity to tag along with a friend to watch the colorist for a Barry Sonnenfeld show, called Pushing Daisies, at work. I was blown away.

To be honest, I don’t remember what panel or software was used (or even his name), but, like most “creative” people who work in any medium will tell you, it’s not the tool that should define you. In the end, I remember the colorist balancing and grading a day-for-night shot. It was incredible. I had seen what amateur colorists could do, but holy cow! A dedicated colorist really is a pro for a reason. It was magical.

Element Tk

Some vendors who produce color software also make color correction and grading panels — for example, the Blackmagic DaVinci Resolve Control Surface and the Digital Vision Nucoda Precision Panels. These surfaces target dedicated and high-end colorists and are often $20,000 to $30,000. Plus, there is the additional cost of equipment needed to run the software, like an HP z840 or Mac Pro, and a local storage server. It can set you back a lot of cash. This is where Tangent Element panels can help.

Affordable Control
In my opinion, the Tangent Element panels are great for someone who loves to learn everything about post production, including color correcting and grading… someone like me, for instance (an editor/VFX artist who wants to color but doesn’t want to commit $30K to a full panel), or a wedding videographer who really wants to dial in their color without that hefty price tag.

Tangent makes different panels and iOS apps that work well with a variety of software apps, including Resolve 12, Digital Vision Nucoda, Baselight and many more. If you’ve ever tried to color correct on your MacBook Pro or HP z800 with your mouse, a tablet, a keyboard, or a combination of all three, you probably understand how constrained your creativity becomes.

A color panel set typically contains different banks of buttons, knobs, scroll wheels, maybe a built-in tablet, roller balls and rings. Tangent sells a set of panels — that can be purchased separately or as a package — for way under the price tag of the Blackmagic control surface, the FilmLight Blackboard 2 or others in that price range.


Element Mf

The Tangent Element package costs in the neighborhood of $3,300, and the pieces are sold individually as well. For example, a place like B&H sells them individually: Element Mf costs $1,040; Element Bt costs $660; Element Kb costs $850; and Element Tk costs $1,135. Add that all up and it’s $3,686, but if you purchase the entire package all at once, you can save over $300. Oh, what do those letters after the panel’s name mean? They stand for each product’s function. Tk = Trackball, Mf = Multifunction, Kb = Knobs and Bt = Buttons. For an in-depth description of each, check out Tangent’s site.

Digging In!
When I opened the Tangent Element boxes and felt their weight for the first time, I was blown away by the build quality. I have been around some high-end color bays over the last few years, as an editor and online editor, and have been lucky enough to spin the trackballs a few times. The expensive and luxurious panels are awesome, smooth and easy to navigate, but did I mention they are also expensive? So when I picked up the Tangent Element panels I was expecting plastic, or a lightweight set — like the difference between a Hyundai and a Mercedes. While they both do the same basic function, the feeling and weight are incomparable. This was not the case with the Tangent panels. The knobs were smooth, the rings rotated gracefully and the balls rolled like butter. Considering the price, I was shocked at the quality.

What you will notice with a color panel is that every action has a button or a knob. Tangent requires that you download and use its software, Tangent Hub, including Mapper for certain applications. This helps in assigning functions to buttons in different programs. In some programs you are locked to what functions the manufacturer sets for the Tangent Element panels, such as Resolve, SpeedGrade and Baselight. This includes the Avid plug-ins for Baselight as well. Nucoda, however, allows for mapping using Tangent Mapper, which is a pretty big deal for such a powerful color application.


Element Bt

I tested the panels using Adobe SpeedGrade. As an editor, even if you just do a color balance pass, just one panel like the Tk can improve your color correcting tremendously. Keep in mind that when buying panels for Resolve, Tangent’s Application Compatibility list states that you must buy the Bt panel if you are buying the Kb panel. While it’s pretty awesome to have the full Elements set, if you wanted to go bare bones, you would likely want the Tk and the Kb panels, so it would kind of stink to have to shell out the extra $660 to get the knobs to work.

While I’m not digging too deep into the particulars of color correcting — and I’m looking at it from an editor’s perspective — the Tangent Element panels are a night and day difference when compared to color correcting with a mouse, tablet and/or keyboard. If you want some down and dirty talk about how Tangent Panels compare to others or whether functions like the soft clip are properly mapped to the panels in Resolve 12, you should sign up for the Lift Gamma Gain forums. They are one of my favorite resources next to Denver Riddle’s, where you can find some great tutorials to get you up to speed. (On a side note, I have Alexis Van Hurkman’s paperback book “Color Correction Handbook,” and it’s a phenomenal resource for color correction rules and techniques.)

For the price, the feel of these panels is great. The trackballs are great, and the rings are smooth and even removable. The rings are attached by magnets and can be removed for easy cleaning, although I would leave that to a professional. You definitely don’t want to clean the trackballs on your own if you are unsure; you will most likely damage your panels and trackballs permanently.

Summing Up
I love these panels! The trackballs are at a great height and the button placement is great. I chose not to magnetically connect my panels edge to edge because I like to have them angled a little… just a personal preference. My line-up, from left to right was Mf, Tk, Bt and Kb.

One thing you should keep in mind when purchasing a color correction surface sight unseen is button and trackball placement. Will you be comfortable with knobs and buttons above your trackballs? Personally, I find myself touching the trackballs and adjusting grades by accident when using compact color panels with the knobs and buttons placed on top, but the Tangent Element panels have few buttons above the trackballs for exactly this reason.

The one thing I wish was different? I would love it if the panels didn’t need a separate USB hub to connect them all (you have to purchase your own hub separately). It might be nice if one of the Element panels had a built-in hub to help clean up the cable mess, but it’s definitely not a deal breaker.

If you are an editor who finds yourself editing, onlining, coloring and mixing your work (which hopefully you can do at least at a basic level), then you want the right tools to do the job. The full Tangent Element panel set is definitely a luxury item for the editor who dabbles in color, but it will increase your efficiency tenfold, if not more. Like any tool with keyboard shortcuts, the more you practice the faster you become. Next to my Wacom Intuos tablet, I really feel that these Tangent panels are worth every penny. Check them out for yourself; I’m sure you will be impressed.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

Review: Telestream Episode Pro 6.5

By Brady Betzel

Over the last few years, transcoding has been a hot topic of discussion. With the line between offline and online editing becoming more blurred than ever, it’s crucial to implement a proper workflow that will erase any bottlenecks in production and post production from the very beginning.

Whether you are ingesting 1080p footage from a DSLR or 6K from a Red Dragon, it’s critical that all transcoding is fast, invisible (at least as invisible as possible) and able to run across multiple systems if you have them.

When I was an assistant editor I remember months of 24-hour-a-day transcoding. Typically, it was because of GoPro footage. Don’t get me wrong, I love GoPro from a size and usability standpoint, but getting that footage prepped for offline and online editing takes a huge chunk of time to transcode (not to mention checking for errors). Depending on the choice of the director of post or post supervisor, we would typically pick a “mezzanine” codec to transcode the footage to. A mezzanine codec is one that is high enough quality for your master outputs but also light enough for your NLE to play without hiccups. Apple’s ProRes, Avid’s DNxHD and DNxHR, as well as Cineform’s codecs, are all considered mezzanine codecs. Codecs such as H.264 or AVCHD are not easy on a processor and are typically converted to the mezzanine format of choice if you want to work efficiently.
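To see why the choice of codec matters for storage planning, here is a minimal sketch comparing rough storage-per-hour figures. The bitrates are approximate, nominal 1080p numbers I am assuming for illustration; check the codec documentation for exact rates:

# Rough storage-per-hour comparison between a camera-original H.264 and two
# common mezzanine codecs; bitrates are approximate nominal 1080p figures.
bitrates_mbps = {
    "H.264 camera original (~50 Mbps)": 50,
    "DNxHD 175 (8-bit)": 175,
    "ProRes 422 HQ": 220,
}

for codec, mbps in bitrates_mbps.items():
    gb_per_hour = mbps * 3600 / 8 / 1000   # megabits per second -> gigabytes per hour
    print(f"{codec}: roughly {gb_per_hour:.0f} GB per hour")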

Now that I’m an online editor and on the other side of the fence, so to speak, I see just how important it is for assistant editors to transcode to a proper codec while maintaining viability in the offline edit, as well as keeping image integrity for the online process.

So what I’m really getting at is this: what program will give you the fastest transcode time while offering the highest image quality for “dailies,” and even for outputting masters, sometimes with multi-channel audio? That’s a very loaded question, but hopefully my testing results will offer some guidance for your decision.

Episode Pro 6.5
There are many different programs that act as standalone transcoding solutions — Adobe’s Media Encoder, Apple’s Compressor, Divergent Media’s EditReady, Sorenson’s Squeeze, MPEG Streamclip. Transcoding is also offered via the NLEs themselves, and via color correction apps like Blackmagic’s DaVinci Resolve 12. However, in this review I am focusing just on Telestream’s Episode Pro 6.5.

Some of the latest features in Episode 6.5 are Closed Caption support; updated codec support, including HEVC, XAVC, VP9 and MXF AS-11; multi-bitrate streaming support, which includes MPEG-DASH; improved multitrack audio support and reassignment; and image sequence support. While I won’t be running through all of the new features, rest assured they are all very big additions.

Immediately when opening Episode Pro 6.5 I noticed how cleanly my options were presented. There are four main categories: Workflows, Sources, Encoders and Deployments. I then had to activate Episode Pro 6.5, which didn’t go as smoothly as I had hoped, but I will get to that later. You can use Episode’s preset workflows or create your own; it’s dead simple.

Up to the Test
To run a few tests I used an older MacBook Pro laptop (2.4GHz Intel Core Duo, 4GB of DDR3, Nvidia GeForce 9400M 256MB and an SSD boot drive). It’s slow compared to today’s barn-burner mobile workstations, but it is something that could be used by a production that wants to set up a standalone transcode station without purchasing new equipment. Plus, I can run comparisons with a few different transcoders.

I used two QuickTimes that I created to run a few speed tests. The first is a one-minute H.264 QuickTime from a GoPro Hero 3+ Black Edition, and the other is a 20-second ProRes QuickTime from a Blackmagic Pocket Cinema Camera; both are 1920×1080 at 23.976fps.

My first test was to transcode the one-minute, 356.7MB GoPro QuickTime into ProRes using a few different encoders to get a sense of speed and file size. Here are the results:

Episode Pro 6.5: 3 minutes, 11 seconds (878.3MB)
Adobe Media Encoder CC 2015: 2 minutes, 13 seconds (868.7MB)
EditReady 1.3.4: 57 seconds (901MB)

The second test was to transcode the 20-second, 131MB Blackmagic camera ProRes QuickTime to ProRes:
Episode Pro 6.5: 33 seconds (133.7MB)
Adobe Media Encoder CC 2015: 1 second (131.2MB)
EditReady 1.3.4: 10 seconds (135.9MB)

The third test was to transcode that same 20-second, 131MB Cinema Camera ProRes QuickTime to DNxHD 175 8-bit:
Episode Pro 6.5: 52 seconds (192.9MB)
Adobe Media Encoder CC 2015: 30 seconds (194.4MB)
EditReady 1.3.4: 29 seconds (195.3MB)
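If it helps to see those results as speed relative to realtime rather than as raw times, here is a minimal Python sketch of the math, using only the clip lengths and times reported in the first and third tests above (my own arithmetic, nothing more):

# Speed relative to realtime for the first and third tests above.
# Each entry is (clip length in seconds, transcode time in seconds).
tests = {
    "GoPro H.264 to ProRes": {
        "Episode Pro 6.5": (60, 3 * 60 + 11),
        "Adobe Media Encoder CC 2015": (60, 2 * 60 + 13),
        "EditReady 1.3.4": (60, 57),
    },
    "BMPCC ProRes to DNxHD 175 8-bit": {
        "Episode Pro 6.5": (20, 52),
        "Adobe Media Encoder CC 2015": (20, 30),
        "EditReady 1.3.4": (20, 29),
    },
}

for test_name, results in tests.items():
    print(test_name)
    for encoder, (clip_secs, job_secs) in results.items():
        print(f"  {encoder}: {clip_secs / job_secs:.2f}x realtime")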

Remember that while I do mention products like EditReady for transcoding, EditReady is limited in features when compared to Episode Pro, which is a fully featured encoding and transcoding solution — literally one transcoding tool that will do everything you need, including prepping and uploading your file to YouTube.

So what do I really think about those speeds? When transcoding a single file, Episode really isn’t the fastest app out there. However, when you need to run two or more jobs at once on a fast computer, like an HP z840 with 128GB of RAM and an Nvidia Quadro M6000, you will probably make up that time. In addition, if you enable a Cluster with all of your Episode-installed systems you will probably cut your time down tremendously (keep in mind you need a fully paid Episode license on each cluster system for this to work).
A Cluster is basically a workgroup of computers that can be used to run jobs simultaneously via IP connection. Clusters are relatively easy to set up, although I would suggest running it by your IT department first if you work in a proper workplace.

If you happen to have a few spare computers lying around your house, you can set up a Cluster and let Episode do the rest. This will allow up to two simultaneous jobs to be performed in parallel with an Episode Pro license and unlimited parallel jobs with Episode Engine (costing a little under $6K).

Another perk to the Episode Engine version is Split and Stitch, which allows one file to literally be split among many “nodes” of your encoding/transcoding Cluster, rendered and then stitched back together at the end. If you have a severe time crunch or just like to be the fastest encoder on the block you could save yourself tons of time with this feature.
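To get a feel for why Split and Stitch can matter, here is a purely illustrative sketch of the idea. The single-node time and the split/stitch overhead below are assumptions I made up for the example, not Telestream figures:

# Illustrative Split and Stitch estimate: the file is split across N nodes,
# each node transcodes its chunk in parallel, then the chunks are stitched.
# Both numbers below are assumptions for illustration, not Telestream specs.
single_node_minutes = 60.0       # assumed time to transcode the file on one node
split_stitch_overhead = 3.0      # assumed fixed cost (minutes) to split and re-stitch

for nodes in (1, 2, 4, 8):
    estimate = single_node_minutes / nodes + split_stitch_overhead
    print(f"{nodes} node(s): roughly {estimate:.0f} minutes")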

Episode Caption Insertion Screenshot

Closed Caption Insertion
What really got me pumped about testing Episode 6.5 was the closed caption insertion feature. If you’ve ever seen the bill for closed caption insertion, you know it might be worthwhile to check this feature out. When I first started out, I figured I would be able to encode a ProRes HQ QuickTime with a caption file and be on my way, sweet! Unfortunately, it’s not that easy: Episode 6.5 can handle captioning files (.scc or .mcc) but can only insert or pass them through into MXF, MPEG-2 and H.264 wrappers. You will need to purchase one of Telestream’s other products — MacCaption (for Mac) or CaptionMaker (for Windows) — in order to insert your captions into any sort of mastering format or Avid AAF. (A sweet feature I just learned about: MacCaption and CaptionMaker can create an Avid-compatible AAF that will allow you to place closed captioning on a data track within Media Composer for output, a cool new feature if it fits your workflow.)

Final Thoughts
I also wanted to touch on the differences between the versions. There are technically three versions of Episode: Episode ($594), Episode Pro ($1,194) and Episode Engine ($5,994). To me, besides the price differences, the real differentiating factors are the amount of parallel encoding and the higher-end format support. Episode Pro and Engine allow for formats such as MXF and MPEG-DASH, as well as image sequences. Episode Pro allows for two parallel jobs and Engine allows for unlimited, while the standard version allows for one job at a time.

If you have an encoding farm at the ready, or you’re a post house that needs to run two or more encodes/transcodes in parallel, Episode Pro or Engine is for you. While my tests showed Episode running a little slow on a single-system, single-file job, in the right cluster-based environment with multiple threads and multiple cluster nodes you could easily cut the transcode time in half. Telestream has put a ton of work into Episode, with real depth in the technical tweaks you can make to your resulting encodes. On the other side, it’s super easy to jump in and add a preset transcode setting to your workflow with very little prior knowledge.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

Review: Rampant Design Tools’ latest updates

By Brady Betzel

If it seems like I’m reviewing Rampant Design Tools’ latest releases every few months, it’s because I am. Sean and Stefanie Mullen, the creators of Rampant Design Tools, are creating brand new sets of overlays, transitions, paint strokes, flares and tons of other tools every month.

Typically when I do reviews there isn’t much personal interaction with the business owners, but Sean and Stefanie made themselves available for questions every step of the way. Even when I’m not doing a Rampant review, I am emailing them and they are always ready to help and even give advice. For them it’s about their customers, and they are continually releasing top shelf tools that I believe every editor and motion graphics artist should have in their toolbox.

Digging In
Before I get into what is new, you should download their free samples at www.4kfree.com. Almost every editor I show these to says, “I had no idea that’s what those were. I thought they were just stock footage elements.” Rampant Design Tools are not stock footage elements; they are color overlays, animated motion graphic elements, transitions, glitches and more. They are elements that can be used in any program that can apply an Add, Multiply, Screen or any other composite mode to footage — really, any NLE or VFX app ever made. If you are a Blackmagic Design DaVinci Resolve user, you can jump into the Edit mode, place the Rampant clip on top of your original clip, select the Rampant clip to composite, open the Inspector and, under the composite mode pop-up menu, select your desired mode.
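If you are curious what those composite modes actually do under the hood, the per-pixel math is simple. Here is a minimal sketch of the standard blend-mode formulas (my own illustration, not Rampant’s or Resolve’s code), with pixel values normalized to a 0-1 range:

# Per-pixel math behind common composite modes, with the footage (bottom layer)
# and the Rampant element (top layer) normalized to a 0-1 range.
def add(bottom, top):
    return min(bottom + top, 1.0)               # brightens; black overlay pixels drop out

def multiply(bottom, top):
    return bottom * top                          # darkens; white overlay pixels drop out

def screen(bottom, top):
    return 1.0 - (1.0 - bottom) * (1.0 - top)    # brightens more gently than Add

print(add(0.5, 0.3), multiply(0.5, 0.3), screen(0.5, 0.3))  # roughly 0.8, 0.15 and 0.65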


Paint Stroke

Typically, Add mode will do the job, but each mode has some cool differences that you will want to try out for yourself — for a stark contrast check out Hard Light. If you are an Avid Media Composer or Symphony user, check out my previous write-up on discovering the elusive composite or blending modes within Media Composer: https://postperspective.com/tutorial-blending-modes-rampant-inside-media-composer.

What’s New
I think of Rampant offerings as quick and efficient tools that can add texture and interest to footage. In their latest rollout of releases, Rampant has sets of Designer Overlays, Film Burns, Matte Transitions, Flare Transitions, Glitch Transitions, Paint Stroke Transitions, and even animated motion graphics for editors. I’ll go into a few of the ones I find particularly interesting, but to find out more check out http://rampantdesigntools.com/rampant-all-products.

Matte Transitions are really useful. Not only can they be used traditionally as transitions between scenes or footage, but they can also be used to reveal a color treatment. I really like to use Rampant Design Tools in non-traditional ways, such as using mattes to reveal color treatments or effects. In Adobe Premiere I will duplicate my footage in the timeline, apply a unique color treatment to the duplicate footage, add the “Set Matte” effect and tell it to use the alpha channel of the Matte Transition. While this is a unique way to transition a color effect, it can be used in all sorts of circumstances.
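Conceptually, that Set Matte trick is just a per-pixel mix between the treated and untreated clips, driven by the matte’s alpha. Here is a minimal sketch of the idea (my own illustration, not Adobe’s implementation):

# Per-pixel idea behind revealing a color treatment through a matte's alpha:
# where the matte is opaque you see the treated clip, where it is transparent
# you see the original, and in-between values blend the two.
def reveal(original, treated, matte_alpha):
    return treated * matte_alpha + original * (1.0 - matte_alpha)

# As the matte's alpha ramps from 0 to 1 over the transition, the treatment sweeps on.
for alpha in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(alpha, reveal(0.2, 0.8, alpha))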


Designer Overlays Sample

My favorite is when a producer or even another editor comes in and just wants something different; they don’t know what they want but they know it needs to be totally different. You can easily throw on a few different Rampant Design Tool overlays and get very different treatments quickly. You can even use the mattes to reveal text in a lower third or main title. It really adds depth to your work.

Paint Strokes are a really cool way to reveal or transition out of text or footage. I really like to use these to reveal color in a scene. Recently, I used it on a very desaturated piece I was working on. In the last 10 seconds of the piece I used a Paint Stroke to add a vibrant splash of paint to the project. The client really liked how it left a lasting impression of vibrancy and color.

If you have seen what is going on in the land of YouTube, you might have noticed how flashy and eye catching the videos are (and if you haven’t you better get over there and get inspired before you are asked to work on something and end up under-delivering in the “wow” department). One thing that gets tricky is designing new or altered transitions. Rampant Design has tons of transitions that are great to have in your editor’s toolbox. From the ever-popular Glitch transition to Flares, Paint Strokes and even Color Overlays. I like to add a white flash under a light leak to turn it into a transition sometimes.


Motion Graphics for Editors

Finally, my interest was captured by the “Motion Graphics for Editors” bundle. It contains lots of motion graphics elements, such as Grids, Signs, Rays, Loaders, Lines, pre-made aspect ratios and even Triangles. Typically, these little elements can take a ton of time to create. Usually, if you are looking for these elements, you are an editor who knows enough about motion graphics to be dangerous but who doesn’t have time to create them individually. Some uses for these are lower thirds, which would typically be a boring gradient with text over the top, or infographics; and while infographics seem easy, they are most definitely not. They take tons and tons of time if you want them to look great. With Rampant’s built-in alpha channels, these elements are really easy to drop in.

Summing Up
In the end if you are looking for elements that are not stock footage, but instead handcrafted elements like organic paint strokes or unique Designer Overlays, you need to get over to www.rampantdesigntools.com. I have experienced firsthand the power these elements have. I’ve been at the end of my rope on some projects that weren’t paying enough to validate the drain on my brain power, then, remembering I had Rampant Design Tools, spent about an hour applying about 20 different treatments, transitions and effects to footage, color and text.

Film Burns and Matte Transition

In the end the client was happy and I was happy that I didn’t have to spend my time creating the elements from scratch. Rampant Design Tools takes projects to the next level quickly and easily by dragging and dropping, allowing you to work faster and more efficiently, making you more money in the process. I leave you with these highlights: unique non-serialized graphic overlays; easily combine color corrections to make unique color grades; and the newly-added “Motion Graphics for Editors.”

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

Tips: What I know now but didn’t then

By Brady Betzel

I’ve passed my 10-year anniversary working in TV — specifically post production — and it’s really pretty crazy. When I started, I was an eager beaver willing to listen and do (almost) anything the “important” people told me I should do. Now, while I still like to think I am eager, I like to feel like I am a very informed beaver, albeit a pretty skeptical one.

The following are some myths about building a career based on my personal experience.

The Need to Say Yes to Everything
This one is a little polarizing because it touches on the working for free topic, which I don’t actively support. To me you aren’t working for free if you are able to develop a skill or use the project for your own benefit. Short term it might be “free” but the long-term benefits will pay off if you are able to learn and grow technically and/or creatively.

That being said, you don’t need to say yes to everything. Take this with a heavy dose of common sense, but if someone tells you to do something and your gut is saying the opposite, lean toward your inner voice. People tend to respect that more than if you always say yes, no matter the job. I learned this firsthand when I was offline editing — sometimes editors are tasked with showing the client what they say they want, but the client may think they want one thing and end up wanting a completely different end product.

I edited a sizzle reel — a cheap way of making a pseudo-pilot where the content is not fully fleshed out but may have a spark of an idea that editors sometimes cover in fancy light leaks and sparkles. The client said it would be easy (it wasn’t and never is), and they had a story producer who would give me editing points for a five-minute sizzle reel. Long story short, the story producer had a completely different (and frankly boring) story in mind for the sizzle reel.

As I watched all 12 hours of “awesome” material, I found about 30 seconds of real story… I thought. So while I edited their version, I also edited mine. Eventually they decided the whole thing needed to be redone. I then sent them my version and they took it. They had a couple of notes, but their five-minute, already-done sizzle reel turned into a completely different story in three and a half minutes. The moral of this story is don’t always be a “yes” person.

Moving Up the Ladder Quickly
Here is another one that has bugged me for a long time, and I still struggle with it. I was an assistant editor for four years, and I feel my rise to editor should have come faster. I always saw assistants moving up quickly around me, and the commonality (usually) was that they weren’t that good at their jobs. It seemed counterintuitive, but then I realized that just because you move up quickly in rank doesn’t always mean you are qualified for the job — your boss may just want you out of their hair.

In the assistant editor world, that could mean that you are messing up tons of stuff that other people are fixing without you knowing (not that I experienced that or anything like that). So if you aren’t moving up the ladder quickly don’t stress about it. Be assertive, but don’t be rude.

You Must Know Editing, Color, Mixing…
There is nothing like real-world experience. There is nothing like sitting in a color grading session with the colorist powering DaVinci Resolve color panels, or being in an audio mix stage for the first time and hearing how powerful different mixes are.

However, you don’t always get the luxury of being mentored while sitting next to the colorist. You don’t always get to play with the lift, gamma or gain without worrying about messing up. Don’t be afraid to watch tutorials on YouTube, Lynda.com, RippleTraining.com or other paid or free training sites. When I do get a free moment, I often watch tutorials on YouTube and learn techniques I would never have thought of before. It doesn’t matter if you watch a 10-year-old teaching After Effects expressions or Mocha tracking Big Bird into a scene; if you become a master wireframe remover thanks to YouTube videos, you may very well earn the same paycheck and work on the same films as someone who learned at USC.

Partying Vs. Networking
I firmly believe that you don’t need to live in Hollywood and go to the Chateau Marmont weekly to become an editor, or whatever post position you want to achieve. I live an hour and 20 minutes outside of Hollywood in an avocado orchard, and work on shows that millions of people watch each week. I rarely go to parties or events, and I still get jobs. My work speaks for itself.

However, I also feel that if I was more of a social person I may have different opportunities. So while partying isn’t always necessary, maybe take a middle road: do some networking (in-person and via social media) but also take some time away from the hustle and bustle of Highland.

Interning
I started my career as an intern on the show On Air with Ryan Seacrest, so this may be a little weird, but bear with me. I see a lot of people who work in TV turn their noses up when they hear that people who haven’t interned got jobs. I have to admit I was one of those people, until I realized you don’t need a college degree, an internship or, for that matter, any formal training.

If you know how to make an opening title graphic in Cinema 4D better than someone with a Master’s Degree in communication, the fact is that you just do. Don’t be ashamed and don’t feel like you don’t deserve a job over someone else. Just go for it.

Keep in mind that doesn’t give you an excuse to be complacent and uninformed about your job description and duties.

Obviously, all of these tips are to be taken with humility and common sense, but in the end, if you have the talent, drive and fortitude to stand up for your ideas, then you can make it in post production, even if it means taking a few extra years to become a quality audio mixer, sound designer, visual effects artist, motion graphics maniac or whatever.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

Review: G-Tech’s G-RAID removable dual drive system

By Brady Betzel

In our datacentric world, needing storage that exceeds 1TB or 2TB is commonplace. Beyond that, pros typically need something that can be quickly formatted to ever-changing RAID demands. One day I might need a RAID-0 configuration for multimedia tasks, including video editing, and the next day I might want to have a little safety net with RAID-1. Or I might want to treat my RAID as a bunch of different disks so I could eject and insert the drives at different times (known technically as a JBOD — just a bunch of disks).

RAID-0 combines both drives into one larger volume to achieve higher speeds than either drive could reach on its own. RAID-1 duplicates the data being transferred to both drives, so in case of a single-drive failure the other can still function with minimal data loss — although you are limited to the size of one of the drives (i.e., in this 8TB unit you only have 4TB of usable space). JBOD is the same as having both drives plugged in separately — it allows for flexibility in terms of inserting and ejecting disks (handy when running drives between multiple removable drive docks).
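The capacity trade-off between those configurations comes down to simple arithmetic. Here is a minimal sketch using this enclosure’s two 4TB drives (raw capacities, before formatting overhead):

# Usable capacity of a two-drive enclosure under each configuration
# (raw drive sizes; real formatted capacity will be somewhat lower).
drives_tb = [4, 4]

raid0_tb = sum(drives_tb)   # striped: the full 8TB, but no redundancy
raid1_tb = min(drives_tb)   # mirrored: 4TB usable, survives a single drive failure
jbod_tb = drives_tb         # two independent 4TB volumes

print(f"RAID-0: {raid0_tb}TB usable, RAID-1: {raid1_tb}TB usable, JBOD: {jbod_tb} TB volumes")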

Out of the Box
I was sent an 8TB G-RAID stocked with two HGST (remember Western Digital? HGST is owned by them) 4TB SATA 6.0Gb/s, 7200 RPM drives enclosed in removable drive bays. On first look, the external RAID looks exactly the same as any G-Tech drive you are used to seeing — the polished aluminum (much like the older Mac Pro towers), the front grate with the “G” logo, the status light directly under the “G,” and multiple connections on the rear.

I was intrigued by this drive because of the removable drive bays. I had seen G-Tech drives with non-removable drives and I was not a big fan, simply because if something goes down you have to send the entire thing in for warranty repair (if you are within your warranty). You can’t just get another drive and be quickly back up and running. With this latest release, if you are in a time-critical environment and a drive fails (barring the loss of data), you can have a spare and be back up and running in under an hour instead of days — all thanks to the removable drives. Keep in mind G-Technology does not warranty against data loss, only hardware failure due to technical problems, not wear and tear.

Removing Drives
Getting around the hardware is pretty simple. To open the bay and get to the removable drives, you push the G on the front of the external enclosure. Once inside you can push another G to release the drive bays, slide them out and back in. Around the back are the connections — the dual FireWire 800 ports, one USB 3.0 port and one eSATA port. In addition you have the power connection, power button, rear fan and Kensington lock port. For some reason I do not care for the push button power switches on the G-Tech external drives; they feel a bit flimsy to me, although it doesn’t affect the drive (unless the switch breaks). The power supply that was sent is made by Asian Power Devices, but is referred to as the G-RAID/G-Dock ev power adapter on www.g-technology.com, in case you need to buy a replacement or an extra for $24.95.


Once it was plugged in, I wanted to immediately format the RAID into RAID-0 for testing. Luckily for me, the G-RAID comes pre-formatted in a RAID-0 configuration for Mac computers. I quickly found out that in order to properly interact with the formatting on this drive I had to download the G-Technology RAID Configuration Utility. G-Tech makes a version for Mac and Windows, and I would suggest that you download it first, here: http://support.g-technology.com/support/g-raid-removable. This way your computer will be able to easily prepare the G-RAID for a RAID-0, RAID-1 or JBOD configuration. Keep in mind that this is necessary to communicate properly with the hardware RAID that is used onboard the G-RAID. You must run this utility when configuring a new RAID, then open Disk Utility on a Mac, or right-click on “My Computer” and click Manage in Windows. Formatting drives can be a little tricky if you aren’t familiar with it, so do some research before formatting any drives.

Once formatted, I used AJA’s System Test on each of the connection ports for each of the different RAID configurations, emulating a 16GB, 1920×1080 10-bit video file. Here are the results:

RAID-0
USB 3.0: Read 118.6MB/s, Write 187.6MB/s
FireWire 800: Read 67.9MB/s, Write 52.8MB/s
eSATA: Read 226.2MB/s, Write 207.1MB/s

RAID-1
USB 3.0: Read 115.4MB/s, Write 163.6MB/s
FireWire 800: Read 67.5MB/s, Write 52.6MB/s
eSATA: Read 163.6MB/s, Write 162.1MB/s

JBOD
USB 3.0: Read 116.0MB/s, Write 163.6MB/s
FireWire 800: Read 66.9MB/s, Write 51.4MB/s
eSATA: Read 163.8MB/s, Write 164.2MB/s
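One way to make those numbers concrete is to estimate how long the emulated 16GB test file would take to write at a few of the measured speeds. Here is a quick sketch of my own, using the write figures above:

# How long would the emulated 16GB test file take to write at the measured speeds?
file_mb = 16 * 1024   # 16GB expressed in MB

write_speeds_mb_s = {
    "RAID-0 over eSATA": 207.1,
    "RAID-0 over USB 3.0": 187.6,
    "RAID-1 over eSATA": 162.1,
    "RAID-0 over FireWire 800": 52.8,
}

for connection, speed in write_speeds_mb_s.items():
    minutes = file_mb / speed / 60
    print(f"{connection}: about {minutes:.1f} minutes")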

Keep in mind these are raw results. Factors such as an empty or full drive can affect the outcomes. But overall they seem par for the course. If you are looking for something that is far and away faster, definitely check out the G-RAID with Thunderbolt 2. It will knock your socks off.


Summing Up
In the end, the drive looks good next to a MacBook Pro, MacBook Air or older Mac Pro tower. The array of connections helps when lending the drive to a client or another editor. If you have a few extra bucks, I would suggest buying a second power adapter in case yours gets lost or somehow gets cut in half, and maybe even a spare drive. (I like preparing for the worst.) If only one drive goes out and you have your G-RAID in the RAID-1 configuration, you might be able to put that spare drive in and rebuild your RAID.

G-Tech gives a three-year limited warranty with this particular G-RAID, which will cover any errors in craftsmanship of the drives or enclosure itself. If you drop it, no warranty for you.

G-Tech is one of those names that everyone recognizes and has in mind when purchasing an external drive. Whether you are a single editor or a VFX company, check out G-Tech’s line of RAIDs, including this 8TB monster!

I leave you with these highlights: removable dual enterprise-class (7200RPM) drives; USB 3.0 (and 2.0), FireWire 800 and eSATA compatibility; and support for RAID-0, RAID-1 and JBOD configurations.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

Review: Lenovo ThinkPad W550s Ultrabook mobile workstation

By Brady Betzel

Over the last few years, I’ve done a lot of workstation reviews, including ones for HP’s z800 and z840, Dell’s mobile workstations and now the Lenovo ThinkPad W550s mobile workstation.

After each workstation review goes live, I’m always asked the same question: Why would anyone pay the extra money for a professional workstation when you can buy something that performs almost as well, if not better, for half the price? That’s a great question.

What separates workstations from consumer or DIY systems is primarily ISV (Independent Software Vendor) certification. Many companies, including Lenovo, work directly with software manufacturers like Autodesk and Adobe to ensure that the product you are receiving will work with the software you use, including drivers, displays, keypads, ports (like the mini display port) and so on. So while you are paying a premium to ensure compatibility, you are really paying for the peace of mind that your system will work with the software you use most. The Lenovo W550s has ISV-certified drivers with Autodesk, Dassault, Nemetschek Vectorworks, PTC and Siemens, all relating to drivers for the Nvidia Quadro K620M graphics card.


Beyond ISV driver certifications, the Lenovo ThinkPad W550s is a lightweight powerhouse with the longest battery life I have ever seen in a mobile workstation — all for around $2,500.

Out of the box I noticed two batteries charging when I powered on Windows 8.1 — you can choose Windows 7 (64-bit) or 8.1 (64-bit). One of the best features I have seen in a mobile workstation is the ability to swap batteries without powering down (I guess that’s the old man in me coming out), and Lenovo has found a way to do it without charging an arm and a leg, while physically exposing only one battery. For $50 (included in the $2,500 price), you can have a three-cell (44Whr) battery in the front and a six-cell (72Whr) battery in the back. I was able to work about three days in a row without charging.

This was intermittent work ranging from sending out tweets with 10 tabs up in Chrome to encoding a 4K H.264 for YouTube in Adobe Media Encoder. It was a very welcome surprise, and if I had a second battery I could swap them out without losing power because of the battery in the front (built-in).

Under the Hood
The battery life is the biggest feature in my opinion, but let’s lay out the rest of the specs… Processor: Intel Core i7-5600U (4MB cache, up to 3.20GHz – I got 2.6GHz); OS: Windows 8.1 Pro 64; Display: 15.5-inch 3K (2880×1620), IPS, multi-touch, with WWAN; Graphics: Nvidia Quadro K620M 2GB; Memory: 16GB PC3-12800 DDR3L; Keyboard: backlit with number keypad; Pointing device: TrackPoint (the little red joystick-looking mouse), touchpad and fingerprint reader; Camera: 720p; Hard drive: 512GB Serial ATA3 SSD; Battery: three-cell Li-Polymer 44Whr (front), six-cell Li-ion 72Whr Cyl HC (rear); Power cord: 65W AC adapter; Wireless: Intel 7265 AC/B/G/N dual-band wireless plus Bluetooth; Warranty: one-year carry-in (diagnosed by phone first).

The W550s has a bunch of great inputs, like the mini display port, which I got to work instantly with an external monitor; three USB 3.0 ports with one of them always on for charging of devices; a smart card reader, which I used a lot; and even a VGA port.


In terms of power, I received a nice Intel i7-5600U dual-core (four-thread) CPU running at 2.6GHz or higher. Combined with the Nvidia Quadro K620M and 16GB of DDR3L, the i7-5600U delivered enough power to encode my GoPro Hero 3+ Black Edition 4K timelapses quickly using the GoPro software and Adobe Media Encoder.

Encoding and layering effects is what really bogs a video editing system down, so what better way to see what the W550s is made of than by removing the fisheye with an effect applied to an image sequence of about 2,400 stills in Adobe Premiere, speeding up the timelapse by 1,000 percent and sending the sequence to Adobe Media Encoder? In the end, the W550s chewed through the render and spit out a 4K YouTube-compatible H.264 in around 15 minutes. The CUDA cores in the Nvidia Quadro K620M really helped, although this did kick the fans on. I did about six of these timelapses to verify that my results were consistent. If you want to see them, you can check them out on YouTube.

The Quadro K620M is on the lower end of the mobile Quadro family but boasts 384 CUDA cores that help with the encoding and transcoding of media using the Adobe Creative Suite. In fact, I needed a laptop to use in a presentation I did for the Editors’ Lounge. I wanted to run After Effects CC 2014 along with Video Copilot’s Element 3D V1.6 plug-in, Maxon Cinema 4D Studio R16 and Avid Media Composer 6.5, all while running Camtasia (screen capture software) the entire time. That’s a lot to run at once, and I decided to give the W550s the task.

In terms of processing power the W550s worked great — I even left After Effects running while I was inside of Cinema 4D doing some simple demos of House Builder and MoText work. I have to say I was expecting some lag when switching between the two powerhouse software programs, but I was running Element 3D without a hiccup, even replicating the text particle and adding materials and lighting to it – a testament to both a great plug-in and a great machine.

While power was not a problem for the W550s, I did encounter some interesting problems with the screen resolution. I have to preface this by saying that what I'm describing is definitely NOT Lenovo's fault; it has to do with Avid Media Composer not being optimized for a screen with this high a resolution. Avid Media Composer was almost unusable on the 15.5-inch 3K (2880×1620), IPS, multi-touch screen. The user interface has not been updated for today's high-resolution displays, including the one in the W550s. It is definitely something to be aware of when purchasing a workstation like this.

I ran a few benchmarks on this system using Maxon's Cinebench R15, which tests OpenGL and CPU performance against other systems with similar specs. The OpenGL test returned 35.32fps, while the CPU test returned 265cb. You can download Cinebench R15 here and test your current setup against my W550s results.

There are a couple of cosmetic things I am not as fond of on the W550s. When you purchase the larger rear battery, keep in mind that it adds about a quarter- to half-inch of lift, so the laptop no longer sits flat. The keyboard is very nice, and I found myself really liking the numeric keypad, especially when typing exact frame numbers in Premiere, but the touchpad has its three buttons on top instead of underneath, which is different from what I have typically encountered. On one hand, I can see how retraining yourself to use the three buttons with your left hand while your right hand works the touchpad might be more efficient; on the other hand, it will get annoying. I like the idea of a touchscreen in theory (it's nice for moving windows around), but practically speaking, from a video and motion graphics standpoint, it probably isn't worth the extra money, and I would stick with a non-touch screen for a mobile workstation.

The last item to cover is the warranty. Typically, workstations come with a pretty good one. Lenovo gives you a one-year carry-in warranty with the purchase of the W550s, which to me is short. This really hurts the value of the workstation, because getting a three-year warranty or better (one that will actually help you within a business day if a crisis arises) will cost you at least a few hundred dollars more.

Summing Up
In the end, the price and awesome battery life make the Lenovo ThinkPad W550s a lightweight mobile workstation that can crunch through renders quickly. If I were ordering one for myself, I would probably max out the memory at 32GB, skip the touchscreen (maybe even stick with the 1920×1080 version) and keep everything else… oh, and I would also upgrade to a better warranty.

Before you leave, take these highlights with you: extreme battery life, lightweight and durable, and powerful enough for multimedia use.

Review: Adobe Video CC 2015 Updates

By Brady Betzel

The big update to the Adobe video collection is here, and it features some heavy hitters. If you really want to see what the fuss is all about, go update your Adobe apps, read this write-up and get to playing… NOW!

One addition to the lineup that I think is very important to the future of the Creative Cloud ecosphere is Libraries. These aren't FCP X libraries or Lightroom libraries; the new Creative Cloud Libraries are basically a way to share common assets between Adobe apps, including the new iOS app Hue CC (more on that shortly).

A big gap in Adobe Premiere's data-sharing offerings is that it can't yet rival Avid's ISIS collaborative environment, where teams of editors work on sequences concurrently. Libraries are one step in that direction, though, and I hope Adobe continues to evolve the concept so it eventually works on an internal network where teams of two or 50 can work on the same Premiere, After Effects or SpeedGrade projects concurrently. While you can share moving media in Libraries, it's just that… a library, not a way to share projects or sequences.

Adobe Hue's Look Library

Hue
Something that really got me to bite on this Adobe update was the addition of the iOS app Hue. Simply put, you take a picture with your iPhone or iPad in the Hue app (which is connected to your Adobe Creative Cloud login via Libraries), and the app interprets the light and colors and creates a swatch. This swatch can then be applied to footage in Premiere, Premiere Clip or After Effects. Imagine witnessing a beautiful sunset in Hawaii with great purples, reds and oranges: take a pic in Hue, choose the colors you like and, immediately or later on, apply them to your footage. It's a great way to take advantage of current technology.
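
If you are wondering what "interpreting the light and colors" of a photo roughly amounts to, a common approach is to reduce the image to a handful of dominant colors and keep those as the swatch. Here is a small sketch of that general idea using Pillow; it is not Adobe's algorithm, and the file name is hypothetical:

```python
# Sketch of pulling a dominant-color swatch from a photo (the general idea
# behind apps like Hue, not Adobe's actual algorithm). Requires Pillow.
from PIL import Image

def dominant_colors(path, num_colors=5):
    """Return the most common colors in an image as a list of RGB tuples."""
    img = Image.open(path).convert("RGB")
    img.thumbnail((200, 200))                 # downsample for speed
    # quantize() reduces the image to a small palette (median cut by default)
    paletted = img.quantize(colors=num_colors)
    palette = paletted.getpalette()
    counts = sorted(paletted.getcolors(), reverse=True)  # (count, palette index)
    swatch = []
    for _, idx in counts[:num_colors]:
        r, g, b = palette[idx * 3: idx * 3 + 3]
        swatch.append((r, g, b))
    return swatch

if __name__ == "__main__":
    # Hypothetical file name for illustration.
    for color in dominant_colors("hawaii_sunset.jpg"):
        print("#{:02x}{:02x}{:02x}".format(*color))
```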

Premiere
In the past, Premiere had been thought of as the NLE in the back of the room, usable but never quite at the level of FCP or Avid. Over the past couple of years that view has changed and the tool has not only gained traction, but, in my opinion, has started to pass the competition… in some aspects. These days I use Premiere as the Swiss Army Knife in my post toolbox. It can open practically every video codec and resolution and decipher many XMLs or AAFs from other NLEs, coloring suites and VFX software packages. Oh and it’s an editor too.

The latest update to Premiere Pro adds some awesome preset workspaces. At the top there is now a menu with options for different workspaces, such as Editing, Effects and Color. These are basic preset workspaces that actually work quite nicely; they open common windows that make sense for certain modes, like color correction. You can delete, create or even modify existing workspaces if you like. While it's really just a reimagining of preset workspaces, I think it helps someone jump right into using Premiere to its fullest without having to fumble around finding where different windows are.

Premiere Pro's workspace editing

Up next is Premiere’s addition of pseudo-live scopes and consolidated color tools directly inside of Premiere in the new Color workspace. (After reading this breakdown you may ask yourself, “Will SpeedGrade be around much longer?” I’m really not sure if Adobe imagines Premiere to become more of a Resolve or not, but it seems like a logical progression.)

The new color workspace and tools are referred to as the Lumetri Color panel and Lumetri Scopes. Previous versions of Premiere had scopes, but they wouldn't update during playback, which, if we're being honest, really makes color correction difficult. The newly updated Lumetri Scopes update live while playing a video clip or sequence. I did notice a lag of a few frames when playing a sequence (I tried both a 1080p and a 4K clip with the same results). I went one frame at a time down the timeline, and even once I stopped, the scopes continued to update.

For this review, by the way, I am working on a Lenovo W550s mobile workstation with an Intel i7 2.6GHz processor (two cores, four threads), 16GB of RAM and an Nvidia Quadro K620M. It's not a slow computer, but it also isn't an HP Z840, so take from that what you will. Software scopes are nice for a quick reference, but if you are doing constant scope referencing (which you probably should be), you may want to take a look at ScopeBox or an external hardware scope.

Some things I would love to see in the future: the ability to zoom in on the vectorscopes, to reference below "0" in the RGB Parade, and to dock individual scopes into separate windows. If I had the luxury of three monitors, and my system could handle it, I would love to have the scopes docked on the third screen. Those are nit-picky wishes, I guess, but Premiere is on its road to glory, so why not get all the details sorted out?

Before I leave Premiere: under the new color workspace, the Lumetri Color panel also includes the familiar curves, a new Hue/Saturation tool, which can be very handy, and a Basic Correction tab where you can load a LUT or do some basic exposure correction. Adobe has also introduced a newly renovated three-way color corrector, which, combined with a nice control surface like the Tangent Element, should operate nicely. (I wasn't able to test this, but keep an eye on this space for an upcoming Tangent Element panel review.) The Creative tab is where you load a look or dial in your own creative grade. Overall, this is a phenomenal addition to Premiere's already vast toolset.

Character Animator
Adobe is also releasing a game-changing app called Character Animator. At the moment it is a separate app that can track your (and your friends') facial movements and, in realtime, apply them to a puppet's facial features with little programming knowledge required. It really is as simple as that. You can go much deeper, but for this review I will just say it's amazing, and you must try it for yourself to really understand what it can do.

You really get the feeling of how powerful this will be in the future. You turn on your webcam and you are controlling a puppet just by talking. I can’t say enough how amazing this is. Really, just download a trial of it already! I got caught up, playing with it for hours. Even my wife, who leaves the nerdy tech stuff to me, was blown away, and it’s really easy to use. Of course, you can dive in deeper and get super complicated if you want to.

After Effects
A huge update to After Effects in this latest release is the ability to preview while adjusting parameters in the timeline. In my testing, I couldn't get playback to continue while I was actively adjusting a parameter, but it picked right back up once I stopped. So it's not quite adjusting a curve while the video continues to roll; playback pauses for the moment you are adjusting and resumes when you let go. It is still an awesome and needed feature.

The new Face Tracker is something I find intriguing. I quickly tried tracking a face and got the outline of the face, one eye and the mouth to track adequately; the other eye, however, didn't lock on. It was pretty accurate when it worked, but it didn't always work. You can quickly see how this would speed up your workflow if you have a lot of tight face blurring or eye-color changes to do. Unfortunately, I wasn't able to get a good track on faces that weren't turned toward the camera.

Adobe Media Encoder
I wouldn't be doing my reviewer duties if I didn't mention a few of the Adobe Media Encoder updates in this release. First off, I love Adobe Media Encoder: it's fast and has pretty much every option I need. In this release Adobe has added Dolby Digital and Dolby Digital Plus support, QuickTime channelization and rewrapping (think one QuickTime with multiple audio layouts, eliminating multiple QuickTime deliveries; this sounds awesome to me), MXF-wrapped JPEG 2000 and Time Tuner.

Time Tuner is a weird one for me. While in theory it sounds great, I just don’t see it being used in many broadcast workflows. Time Tuner allows the encoding operator to shorten or lengthen a QuickTime based on time or percentage. Often networks require strict total run times when delivering master show files. For instance, if a network requires the total run time of your show to be 42:10 and the final edit is 42:00 for whatever reason (often indecision), what are you supposed to do if you absolutely can’t cut content to meet your total run time? Well Time Tuner is designed to rescue you… that is, if you don’t care where that time is stretched (or shortened) in your QuickTime. It is limited to plus or minus 10 percent of your total run time, so it isn’t completely crazy.

The issue I have is that 90 percent of the time the act structure of a show is specific, i.e. the act breaks must be :10 or acts must all start or end on zero frames, meaning you can’t arbitrarily add or remove video without destroying the exact start and stop of content. It’s possible you could luck out, but I wouldn’t gamble on that. On a positive note, I have seen shows that have alternate deliverables (often for international delivery) that don’t have strict time requirements, in addition to not having act breaks. In that instance, this could save your butt if you need an additional :30 of content.

Remember that all the program is doing is essentially speeding up or slowing down your content over the course of the entire QuickTime. If you are adjusting by only a little, it will probably not be noticeable; if you are pushing toward the 10 percent limit, it might not be acceptable. I tested this on a 19-second, 14-frame QuickTime that I needed to hit an even 20 seconds. The result looked great (I didn't notice a visible difference), but the encode with Time Tuner was expectedly longer, by about one to two seconds per frame. That will add up over an hour-long piece of content.
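
For what it's worth, the math behind a retime like that is easy to sanity-check before you commit to an encode. Here is a quick sketch of my own arithmetic for the example above (the 29.97fps non-drop timebase is an assumption for illustration; it is not anything from Media Encoder):

```python
# Back-of-the-envelope check of a Time Tuner-style retime (my own arithmetic,
# not Adobe's; the 29.97fps non-drop timebase is an assumption).
def stretch_percent(current_frames, target_frames):
    return (target_frames / current_frames - 1.0) * 100.0

FPS = 30  # frames counted per second on a 29.97 non-drop timebase

current = 19 * FPS + 14   # 19 seconds, 14 frames -> 584 frames
target = 20 * FPS         # an even 20 seconds -> 600 frames

pct = stretch_percent(current, target)
print(f"stretch needed: {pct:+.2f}%")            # about +2.74%
print("within Time Tuner's +/-10% range:", abs(pct) <= 10.0)
```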

Summing Up
The latest updates to the Adobe Creative Cloud video apps are really laying the groundwork for some great advances in production and post production technology. The new Character Animator is simply amazing, even if you and your kids just play around with it for now. I can’t even fully understand the future implications of this tracking technology.

Premiere Pro

The latest updates to Premiere push it further ahead in the race between the major NLE contenders. Premiere also added a feature called Morph Cut (think of it as an updated, more advanced version of Avid's Fluid Morph). If you have an interview shot where the person stumbles over a word and you want to cut it out, Morph Cut tries to stitch the cut together by searching the video for similar frames and/or morphing the footage to look seamless. Keep in mind that the circumstances have to be almost perfect for this to work correctly; if they aren't, it looks like a mistake or a glitch. Features like this remind me that Adobe is listening to its customers, and even if a feature isn't for me personally, they are pushing the boundaries to make great apps that work together beautifully and address many of their users' concerns.

After Effects features greatly improved preview functions in addition to Face Tracker, which could come in handy in the right circumstances. SpeedGrade, Prelude, Audition and Media Encoder all had various updates, but the real advancement is Creative Cloud Libraries. Adobe has gotten its feet wet with true team collaboration by integrating a live library between Adobe apps via the Creative Cloud, giving different apps access to shared assets that multiple Creative Cloud users can contribute to. Hopefully there will come a time when this extends to large amounts of video in a team-based environment where users can work on the same project simultaneously with proper file locking, much like Avid Media Composer and its ISIS collaboration.

I leave you with these top three highlights: Adobe Premiere's interface has been updated and improved with the integration of color correction toolsets; Adobe After Effects' preview engine will now resume playback as you make adjustments; and Adobe Hue (formerly known as Project Candy) lets you capture real-life color palettes with an iOS app and use them in Premiere and After Effects through Libraries.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously he was editing The Real World at Bunim Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

First Impressions: Blackmagic’s DaVinci Resolve 12

By Brady Betzel

While I wasn’t able to get to Las Vegas for NAB this year, I was definitely there in spirit thanks to constant Twitter updates and blog posts around the web. The company that stood out to me the most was Blackmagic Design. They introduced tons of awesome equipment and products, including the latest update to DaVinci Resolve. I was really interested in what I was seeing: multicam workflow, AAF exporting, 3D tracking… it was overwhelming.

You might have noticed that in addition to my day job as an editor at Margarita Mix, I do a lot of product reviews. I love the process. Why wouldn’t I? I get to play with the latest and greatest offerings in production and post.

While I don't have the DaVinci Resolve 12 update yet, Blackmagic's senior director of marketing and all-around guru, Paul Saccone, gave me an in-depth tour of what's coming in the latest version. Before I review the software, I wanted to share a couple of key updates that seem to be turning DaVinci Resolve into what many had hoped Avid Symphony would become.

Multicamera Workflow
Working with multiple cameras can often be tricky. Syncing and grouping them together isn’t always as straightforward as one would hope. When I was an assistant editor I remember spending hours and days grouping footage. Sometimes I would be able to sync by timecode and sometimes not. I would be lucky to get a clap or some sort of sync reference from the people recording in the field. When none of that was available and my clips seemingly had very little in common I would resort to using PluralEyes by Red Giant, which is still a great and useful tool. The only problem is that it’s an external app and if I can avoid it I would much rather work inside my NLE or online suite.

Blackmagic has added what seems to be an awesome integration of multicam workflow into Resolve 12. You can even sync by audio, just like PluralEyes does! That should be a great feature.

The best part about Resolve 12’s multicam workflow is the ability to modify and add to existing groups by simply editing the group like a sequence. If your group is out of sync, open up the group sequence, put it in sync and your group will be immediately updated. For us Avid users out there this means no more re-grouping yuck. You can even add cameras or audio tracks to your group later!

Nested Timelines
You can now nest a sequence inside of your current sequence. If you are assembling a final edit you may want to lay out your acts in linear order for timing reasons and then once all the acts are “final” (we know nothing is ever final), you can now “decompose in place,” meaning break out all of your clip-based edits in the same timeline you are working in without having to overcut. Really a great feature.

3D Keyer and Tracker
If you've seen how Imagineer Systems/Boris FX's Mocha Pro planar tracker works, or Adobe After Effects' 3D camera tracker, you know there are some amazing tracking options out there. Unfortunately, these usually aren't the tools you use to conform and online your work. Resolve 12 adds a new 3D tracker and 3D keyer that, at first glance, look like all you'll need for basic to semi-advanced work. They don't seem like full replacements for Keylight in After Effects or planar tracking in Mocha Pro, but if Blackmagic can keep me in one NLE/coloring/compositing platform without having to farm out tasks to After Effects or another program, I am definitely listening.

The features listed here are only a few of the ones I think are amazing. In addition, there is shot color matching, AAF-to-Pro Tools export, improved media management, improved trimming functions, an overall layout refresh, smart bins and much more.

I hope to review DaVinci Resolve 12 in a few months and am really excited to run it through its paces. I've been venturing deeper into different compositing apps, color correction packages and NLEs, and I'm really impressed by the way Blackmagic is digging in and starting to outpace other software and hardware makers. Maybe they really can make the ultimate NLE/compositor/color corrector; we'll have to wait and see.

If you want to get a quick video run through of the new features being released, check out Blackmagic Design’s website and click on “What’s New.” You can also follow them on Twitter @Blackmagic_News.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

Review: Red Giant’s Shooter Suite 12.6

A suite of seven tools helps you get organized, saving you time and money

By Brady Betzel

If you have ever been an assistant editor, an editor who had to prep their own content, or even a production assistant who was asked to go way beyond their job title and perform DIT duties on set, you know that without the right tools, that “quick and easy” job you took on Craigslist can quickly turn into a 12-hour day and immediately become not worth the time.

If you have ever regretted taking a job after realizing that the job description stated “content is already organized and in sync” but they really meant you are the one that will need to “organize and sync” (without jam sync timecode), then you need to pay close attention to this review of Red Giant Shooter Suite. Red Giant Shooter Suite (currently in version 12.6) contains seven tools that will help make offloading, organizing, proper metadata entry, synchronizing, denoising, upressing, deinterlacing and LUT-ing much more enjoyable.

Included in the Shooter Suite package are PluralEyes, Offload, Bulletproof, Denoiser, Instant 4K, Frames and LUT Buddy. In this review I will dig deep into just a few of the tools, but the whole package is worth way more than its $399 price (a savings of $345 versus buying each tool individually, although LUT Buddy is a free download). Keep in mind that every tool could potentially save you tons of time, money and, more importantly, your sanity.

Offload
First up is Red Giant Offload. Offload is one of those really simple tools that can potentially save your job. More often than not I see and read about jobs that ask for an “experienced” DIT, editor or assistant editor to be on set to transfer footage from cameras, log, add LUTs and many other responsibilities. The only problem is they usually want to pay $100/day — not a good rate-to-job responsibility ratio. (My advice is don’t take that job.) So to those posting these ads on Staff Me Up and Craigslist, do those poor new hires a favor and buy them Offload, at the least.

Offload lets the user (Mac or Windows) choose a media source folder (including XDCAM, GoPro, etc.), a folder to copy the assets to and a folder for a backup of the assets. This all happens very simply, and it even shows preview thumbnails of the files at the bottom. One issue I had: while transferring footage from my GoPro Hero3+ Black Edition, I plugged in a drive with XDCAM footage on it and crashed the program. So, one word of advice: let all of your files transfer before plugging in more drives.

Anybody could understand this awesomely simple interface; the real power, however, lies in the checksum verification that runs almost instantly after the files are copied or backed up. A checksum compares the transferred data against the original source media; keep in mind it doesn't repair errors, it only confirms that the source file and the destination file match exactly. In addition, the checksum text file can be located and copied. Trust me, keeping a copy of something like the checksum file may seem trivial, but when an editor or director wants to blame the low person on the totem pole, you can whip out this file and their argument ends there… hopefully.
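
Offload handles all of this for you, but if you are curious what a source-versus-copy checksum comparison boils down to, here is a minimal Python sketch of the general idea (this is not Offload's code, and the choice of MD5 and the file paths are assumptions purely for illustration):

```python
# Minimal sketch of source-vs-copy checksum verification (illustrative only;
# not Offload's code, and the MD5 choice is an assumption).
import hashlib

def file_checksum(path, algo="md5", chunk_size=1024 * 1024):
    """Hash a file in chunks so large camera files never need to fit in RAM."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_copy(source_path, copy_path):
    """Return True if the copied file matches the source byte for byte."""
    return file_checksum(source_path) == file_checksum(copy_path)

if __name__ == "__main__":
    # Hypothetical paths for illustration.
    ok = verify_copy("/Volumes/CARD/clip0001.mp4", "/Volumes/BACKUP/clip0001.mp4")
    print("match" if ok else "MISMATCH - recopy this file")
```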

Bulletproof
If your skill set goes beyond Offload, you'll want to open Bulletproof, which takes the idea of copy and backup to the next level with the addition of metadata editing, light color correction, curves adjustments and even LUT application. I really love the color adjusting with the Red Giant Colorista three-way and the LUT support; it even comes with a few LUTs, like Prolost Flat to Warm. It's quick and easy, and with some practice anyone with post knowledge could learn to become a makeshift DIT.

I don't have much real-world experience as a DIT, so I can't definitively say whether this is a lifesaver for that world, but with 10 years of experience in post production, I would feel comfortable asking my assistant editor to spend a half-hour in Bulletproof, read the online manual and come up with a preset Bulletproof workflow for creating ProRes Proxy files to edit with.

In terms of export options, you can choose between PhotoJPEG, ProRes or H.264/MPEG-4 AVC, with or without a timecode burn-in, embedded XMP metadata and appended file names. It really is a one-stop shop for dailies creation. I would definitely recommend a newer iMac or, better yet, a new Mac Pro with 64GB of RAM when dealing with high-resolution files; my five-year-old MacBook Pro was a little sluggish.

PluralEyes
My favorite part of the Red Giant Shooter Suite is by far PluralEyes. For years I have been using PluralEyes with Avid Media Composer and Symphony, always hoping Avid would eventually incorporate this technology into its grouping, but unfortunately that hasn't happened. It seems others have caught on, though, like the forthcoming DaVinci Resolve 12 from Blackmagic, but I digress. PluralEyes 3.5 is the latest release, and boy does it kick butt.

Pre-sync and post-sync

If you take that time-challenging job I talked about earlier, where the poster claimed the media was already organized and in sync when in reality it wasn't, PluralEyes is the tool that will solve your problems, as long as audio was recorded. From personal experience, I have taken over 40 hours of footage without jam-synced timecode, and PluralEyes lined it all up almost perfectly, although it took some time to match up the footage and isolate audio tracks. To be clear: PluralEyes needs audio on the clips to sync the media properly.

PluralEyes works by analyzing waveforms and matching up what it thinks should line up. Most of the time it's correct; occasionally it isn't, and when that happens you can adjust the sync manually. What I love best is that when using it with apps like Adobe Premiere and Media Composer, your media comes back fully editable.
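
For the curious, here is a rough sketch of the general technique behind waveform-based syncing: matching two recordings by cross-correlation. This is only the textbook idea with synthetic audio, not PluralEyes' actual implementation:

```python
# Rough sketch of audio-based sync via cross-correlation (the general idea,
# not PluralEyes' implementation). Assumes mono arrays at the same sample rate.
import numpy as np

def find_offset_samples(ref_audio, other_audio):
    """Return how many samples other_audio should be shifted to line up with ref_audio."""
    ref = ref_audio - np.mean(ref_audio)
    other = other_audio - np.mean(other_audio)
    corr = np.correlate(ref, other, mode="full")
    # The peak of the cross-correlation marks the lag between the recordings.
    return int(np.argmax(corr)) - (len(other) - 1)

if __name__ == "__main__":
    sample_rate = 8000
    t = np.arange(sample_rate) / sample_rate
    clap = np.sin(2 * np.pi * 440 * t) * np.exp(-6 * t)      # stand-in for a slate clap
    camera_a = np.concatenate([np.zeros(2000), clap, np.zeros(2000)])  # clap at 0.25s
    camera_b = np.concatenate([np.zeros(4000), clap])                  # same clap at 0.50s
    lag = find_offset_samples(camera_a, camera_b)
    # about -0.250: slide camera B a quarter-second earlier to line it up
    print(f"offset: {lag / sample_rate:+.3f} seconds")
```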

PluralEyes doesn't do any encoding or transcoding of your media files unless you want it to, so if you want to add new clips that don't have audio but do have proper slates with timecode, you can add them and start your multigrouping. If you do want PluralEyes to create media files with the in-sync video and audio, it can do that too, which helps in apps that don't support XML or AAF importing.

In Adobe Premiere CS6 or CC you can even use Red Giant's PluralEyes 3.5 Connector, which lets you transfer projects back and forth without the hassle of XMLs. To set up a project for analysis in PluralEyes: import your media; add it to a timeline or sequence with different cameras or isolated audio on different layers; export the XML or AAF with linked media to PluralEyes; and run PluralEyes' synchronize command. Depending on how many hours of media you have, you could be off and running in under an hour.

I’ve previously tested this on an ISIS-connected Media Composer and it worked pretty well when using an AAF and linking to the media. If you have specific questions about how to properly set up an AAF for things like round-tripping with PluralEyes, tweet me @allbetzroff — I might be able to help you out.

A caveat to be aware of: there are sync settings you may need to enable, like "Try Really Hard." If you have very low or rough audio, you may want to check this option; it digs a lot deeper when matching audio. The only downside is that syncing will take a lot longer.

Denoiser II

More Tools
The final few tools included in the Red Giant Shooter Suite are Denoiser II, Instant 4K, Frames and LUT Buddy. Denoiser II is one of those tools you can just drop on a noisy clip and for the most part it does its job without any tweaking (although you can do some fine tuning, like luma offset or chroma smoothing). I dropped in a free stock footage clip from www.dissolve.com featuring a coffee mug. The footage wasn’t terrible to begin with but after bringing it into After Effects CC, I zoomed in close and saw some noise. I dropped on Red Giant Denoiser II and without any tweaking saw a vast improvement. What I like is that it got rid of the noise and some blocky artifacting without having to specify anything. It really made the clip look great; I might add a little noise back into the clip, color, then export.

Instant 4K is a plug-in for Adobe Premiere and After Effects that will uprez any footage to 720p, 1080p, 2K, 4K and many other resolutions. I added the effect to a 1920×1080 piece of footage and upscaled it to my composition’s width of 4096. The footage looked pretty great even after being upscaled so much.
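
If you want a rough baseline to judge a dedicated uprezzer like Instant 4K against, a plain bicubic resize is the simplest comparison point. Here is a minimal Pillow sketch of that baseline (this is a generic upscale, not Instant 4K or its algorithm, and the file names are hypothetical):

```python
# Generic bicubic upscale for comparison purposes only; this is not Instant 4K
# and says nothing about its algorithm. File names are hypothetical.
from PIL import Image

def upscale_to_width(path, target_width, out_path):
    img = Image.open(path)
    scale = target_width / img.width              # 4096 / 1920 is roughly 2.13x
    target_height = round(img.height * scale)     # keeps the aspect ratio
    big = img.resize((target_width, target_height), Image.BICUBIC)
    big.save(out_path)
    return big.size

if __name__ == "__main__":
    print(upscale_to_width("frame_1080p.png", 4096, "frame_4096.png"))
```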

To find out about the last two tools, Frames and LUT Buddy, check out http://www.redgiant.com/products/shooter-suite or follow Red Giant on Twitter: @RedGiantNews. Also do yourself a favor: if you aren't already following Aharon Rabinowitz, follow him at @ABAOProductions. He has a host of tutorials and general Red Giant awesomeness going on.

Summing Up
In the end, if you want a complete toolset for copying, organizing, verifying, synchronizing, denoising and uprezzing your images, you will want the complete Red Giant Shooter Suite 12.6. Personally, I think PluralEyes and Bulletproof alone are worth the $399 cost of admission; add in the magic of Denoiser II and the four other Red Giant video tools and you have a true technical value pack. Anyone working freelance in production or post, whether that's on-set shooting, organizing media or assistant editing, should have Red Giant Shooter Suite in their toolbox.

Just to reiterate, here are my top Red Giant Shooter Suite 12.6 highlights: it contains seven highly specialized video tools in one package, saving over $345; it comes with the newly released Offload for easy, worry-free transfer and backup of master files from camera memory cards; and it includes the lifesaving PluralEyes 3.5 auto-syncing app, which can really save hours of work.

Review: Forbidden Technologies Forscene

This cloud-based editing platform doesn’t want to replace your NLE, but instead complement it.

By Brady Betzel

In December of 2013 I wrote a review of the Forscene platform from Forbidden Technologies. Back then, I went into some detail about the product and how nice it was to be able to do a rough edit from just about anywhere, including my local Starbucks. Simply put, Forscene is a cloud-based, browser-accessible platform for nonlinear logging, editing, reviewing and publishing.

When I first got a look at Forscene, a little over a year ago, I admit I was a little put off by the interface. It had the look of Avid Media Composer's light grey setting as if it were running on a Windows 95 PC. While looks aren't everything, looking good doesn't hurt. Since that time, Forscene has Continue reading

My ‘Top 3’ prosumer filmmaker necessities

By Brady Betzel

I love to tinker with every type of gadget that I can get my hands on, from After Effects plug-ins to GoPros and their accessories. But sometimes I forget the basics like a simple tripod, camera bag and a light.

Whether you’re a hobbyist or a product reviewer, like me, who enjoys shooting and getting great images, you will always need the basics. So here are my Top 3 prosumer filmmaker necessities:

1. The Adorama 3Pod V2AH Tripod

One thing that is a necessity for any filmmaker looking to get a stable shot is a tripod. If you are going cheap you can get the basic tripod without a fluid head for probably $100 to $125, but if you really want to be able to pan and tilt fluidly, you will need to find a tripod that can support a fluid head. Continue reading

Review: GenArts Sapphire 8

The newest version of this suite of plug-ins and presets

By Brady Betzel

Over the past year or so we’ve seen an explosion in the preset and plug-in world, offering users a variety of options regardless of their budget. For example, there is Red Giant’s Universe and Boris FX’s BCC 9 suite — both offer tons of powerful plug-ins and presets that can take any project from mediocre to awesome with a few mouse clicks and some creative thinking.

Red Giant offers a variety of ways to use the program: there is a “light” version of Universe that’s free, or you can pay $10 monthly, $99 yearly or $399 for a lifetime of updates. Boris FX BCC 9 ranges in price from $695 to $1,995, depending on the Continue reading

Review: Maxon Cinema 4D Studio R16

By Brady Betzel

It's not every day that I need a full-fledged 3D application when editing in reality television, but when I do I call on Maxon's Cinema 4D. The Cinema 4D Studio R16 release is chock full of features aimed at people like me who want to get in and out of their 3D app without pulling out all of their hair.

I previously reviewed Cinema 4D Studio R15, and that is when I began to fall in love with just how easy it was becoming to build raytraced titles or grow grass with the click of my Wacom stylus. Now we are seeing it evolve from a standard 3D app into a motion graphics powerhouse that can craft a powerful set of opening credits or seamlessly composite a beautiful flower vase using the new motion tracker, all inside Cinema 4D Studio R16.

I’ve grown up with Cinema 4D, so I may be a little partial to it, but luckily for me the great Continue reading

Review: HP Z1G2 All-in-One Workstation

By Brady Betzel

In today's broadcast post environments you typically see only a handful of workstation brands. This is usually because only a few vendors are truly certified to work with products from ISVs (independent software vendors) such as Adobe, Avid and Autodesk.

While many people will brush over the word workstation, it is not a term to take lightly. HP in particular spends a tremendous amount of time and resources making sure that if you buy a workstation-class system from them, it is going to work, and if it doesn't, it will be relatively easy to fix and get back up and running with very little downtime. Workstation-class computers are expensive (no matter the brand), so it's not a decision to take lightly.

In this review I will be covering the HP Z1 G2 All-In-One Workstation, a competitor to the Apple Continue reading

Rampant Design’s Budget VFX offers 40 HD looks for $29

By Brady Betzel

This week Rampant Design Tools has released its latest offering, Budget VFX, which are multiple collections of ProRes QuickTime overlays that add light leak looks (Beauty Light), Bokeh Effects, Film Clutter and more.

Each Budget VFX (@budgetvfx) collection is priced at $29 and offers 40 HD (1920×1080) QuickTime files that import into any NLE or VFX software; just adjust the Blending/Composite Mode and you are in business. If you are looking for formats larger than 1920×1080 you are going to want to check out the Rampant website, which includes products in 2K, 4K and even 5K resolutions, just not at the $29 price point.

Continue reading

Review: Imagineer System’s Mocha Pro 4.1 and Mocha Plus 4.1

By Brady Betzel

Most people working in modern post production have heard about "tracking." Many may have even dabbled in a little tracking to remove a dead pixel, do a simple sign replacement or stabilize a shaky shot. If you have been lucky enough to find success with the standard Avid tracker or the amazing Adobe After Effects 3D camera solver, then you will love what Imagineer Systems is doing with Mocha Pro 4.1 and Mocha Plus 4.1.

If you haven't touched a tracker and/or are scared, I don't blame you; it's not necessarily a painless procedure (especially if you don't have great source footage), but stick with it, because the results of a great track, along with some motion graphics creativity, can be Continue reading

Five Adobe After Effects Shortcuts

By Brady Betzel

As an editor, most of my day is spent inside of Avid Media Composer, but occasionally I will get to turn on my Spotify, groove to the music and crank out some Adobe After Effects or Maxon Cinema 4D work. Over the years I’ve found some shortcuts within After Effects that make my job easier, and I wanted to share five of my favorites… from an editor’s perspective.

Double Click in the project window to import an asset
When importing assets into an Adobe After Effects project, I often see people go the archaic route: File > Import. Instead, just double-click in the Project window and you will save yourself a few steps. Simple, but I see it all the time.

Tilde key (`) to make full screen
Continue reading

LaCie’s speedy Rugged Thunderbolt external drive

By Brady Betzel

LaCie is a familiar name to anyone who works with external hard drives… drives that look a bit different than most. The company hires top-shelf architects and designers, such as Philippe Starck, Porsche Design Group and Neil Poulton, to help its products stand out.

That said, design is maybe 10-15% of what I’m looking for in a quality external drive. The rest is about portability, stability and speed. Can I drop it while running around a shoot or running between edit bays with all data staying intact? Can it interface with the fastest and most widely used connections? The LaCie Rugged Thunderbolt SSD can.

Continue reading

My Top 5 Avid Media Composer shortcuts

By Brady Betzel

During my four years as an assistant editor and my two as a full-time editor, almost all of my work has been on the Avid Media Composer or Symphony.

Over the years I have collected a few shortcuts that I love to use and wanted to share. I hope you find them helpful.

Replacing the default “A” + “S” Go to Next/Previous Edit with Fast Forward and Rewind
When you open the default keyboard settings in Avid Media Composer, “A” and “S” are used as Go to Next and Go to Previous edit. While this serves its purpose of going to the next edit in the timeline and going into Trim Mode, I prefer it to be used to just jump to the next edit Continue reading

Review: Yanobox Nodes 2

By Brady Betzel

I realize, as most editors do, that to grow means to be constantly learning. Not a day goes by where I am not on the Web looking for the latest and greatest tools and tutorials to expand my creative toolbox.

When scouring Twitter one day, I found Eran Stern's (@sternfx) demo of Yanobox Nodes 2 from FxFactory (@fxfactory), which he used to create a promo for the After Effects World Conference. While the project required an advanced level of After Effects skill and finesse, it wasn't as hard as I thought it would be, and Nodes 2 was vital in creating such a spectacular spot.

Here is a link to his breakdown: http://www.sternfx.com/tutorials/140. There are many uses for Nodes 2, and motion graphics creator Jayse Hansen has created some stunning HUD Continue reading