Tag Archives: Adobe After Effects

A post engineer’s thoughts on Adobe MAX, new offerings

By Mike McCarthy

Last week, I had the opportunity to attend Adobe’s MAX conference at the LA Convention Center. Adobe showed me, and 15,000 of my closest friends, the newest updates to pretty much all of its Creative Cloud applications, as well as a number of interesting upcoming developments. From a post production perspective, the most significant pieces of news are the release of Premiere Pro 14 and After Effects 17 (a.k.a. the 2020 releases of those Creative Cloud apps).

The main show ran from Monday to Wednesday, with a number of pre-show seminars and activities the preceding weekend. My experience started off by attending a screening of the new Terminator: Dark Fate film at LA Live, followed by a Q&A with the director and post team. The new Terminator was edited in Premiere Pro, with project assets shared among a large team of editors and assistants, and with extensive use of After Effects, Adobe’s newly acquired Substance app and various other tools in the Creative Cloud.

The post team extolled the improvements in shared project support and project opening times since their last Premiere endeavor on the first Deadpool movie. Visual effects editor Jon Carr shared how they used the integration between Premiere and After Effects to facilitate rapid generation of temporary “postvis” effects. This helped the editors tell the story while they were waiting on the VFX teams to finish generating the final CGI characters and renders.

MAX
The conference itself kicked off with a keynote presentation of all of Adobe’s new developments and releases. The 150-minute presentation covered all aspects of the company’s extensive line of applications. “Creativity for All” is Adobe’s primary message, and the keynote focused on the tension between creativity and time: Adobe is trying to improve its products in ways that give users more time to be creative.

The three prongs of that approach for this iteration of updates were:
– Faster, more powerful, more reliable — fixing time-wasting bugs, improving hardware use.
– Create anywhere, anytime, with anyone — adding functionality via the iPad, and shared Libraries for collaboration.
– Explore new frontiers — specifically in 3D with Adobe’s Dimension, Substance and Aero.

Education is also an important focus for Adobe, with 15 million copies of CC in use in education around the world. They are also creating a platform for CC users to stream their working process to viewers who want to learn from them, directly from within the applications. That will probably integrate with the new expanded Creative Cloud app released last month. They also have released integration for Office apps to access assets in CC libraries.

The first application updates they showed off were in Photoshop. They have made the new locked-aspect-ratio scaling a toggle-able behavior, improved the warp tool and improved ways to navigate deep layer stacks by seeing which layers affect particular parts of an image. But the biggest improvement is AI-based object selection, which generates detailed masks from simple box selections or rough lassos. Illustrator now has GPU acceleration, improving performance on larger documents, and a path-simplifying tool to reduce the number of anchor points.

They released Photoshop for the iPad and announced that Illustrator will be following that path as well. Fresco is headed in the other direction and is now available on Windows. That is currently limited to Microsoft Surface products, but I look forward to being able to try it out on my ZBook x2 at some point. Adobe XD has new features, and apparently it is the best way to move complex Illustrator files into After Effects, which I learned at one of the sessions later in the week.

Premiere
Premiere Pro 14 has a number of new features, the most significant being AI-driven auto-reframe, which allows you to automatically convert your edited project into other aspect ratios for various deliverables. While 16×9 is obviously a standard size, certain web platforms are optimized for square or tall videos. The feature can also be used to reframe content from 2.35 to 16×9 or 4×3, which are frequent delivery requirements for the feature films I work on. My favorite aspect of this new functionality is that the user has complete control over the results.

Unlike other automated features such as Warp Stabilizer, which offers only an on/off choice for applying its results, auto-reframe simply generates motion-effect keyframes that can be further edited and customized by the user once the initial AI pass is complete. It also has a nesting option for retaining existing framing choices, which results in the creation of a new single-layer source sequence. I can envision this being useful for a number of other workflow processes, such as preparing for external color grading or texturing passes.
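To make the geometry concrete, here is a minimal Python sketch of the crop math behind an aspect-ratio reframe. This is an illustration only, not Adobe’s algorithm; the subject position that Premiere’s AI actually detects is passed in here as a hypothetical subject_x value.

```python
# Toy reframe: compute a target-aspect crop window inside the source
# frame, panning it toward a (hypothetical) detected subject position.

def crop_window(src_w, src_h, target_aspect, subject_x=None):
    """Return (x, y, w, h) of a target-aspect crop inside the source."""
    crop_w, crop_h = round(src_h * target_aspect), src_h
    if crop_w > src_w:  # target is wider than the source
        crop_w, crop_h = src_w, round(src_w / target_aspect)
    cx = src_w / 2 if subject_x is None else subject_x
    x = min(max(cx - crop_w / 2, 0), src_w - crop_w)  # clamp to frame edges
    y = (src_h - crop_h) / 2
    return round(x), round(y), crop_w, crop_h

# A 16x9 UHD master reframed to square, following a subject at x=2600;
# keyframing x per frame is essentially what the motion keyframes encode.
print(crop_window(3840, 2160, 1.0, subject_x=2600))  # (1520, 0, 2160, 2160)
```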

They also added better support for multi-channel audio workflows and effects, improved playback performance for many popular video formats, better HDR export options and a variety of changes to make the motion graphics tools more flexible and efficient for users who use them extensively. They also increased the range of values available for clip playback speed and volume, and added support for new camera formats and derivations.

The brains behind After Effects have focused on improving playback and performance for this release and have made some significant improvements in that regard. The other big feature that may actually make a difference is content-aware fill for video. This was sneak-previewed at MAX last year and first implemented in the NAB 2019 release of After Effects, but it has been refined in this version and is now twice as fast.
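For the curious, here is a heavily simplified Python sketch of the general idea behind flow-based fill (an illustration of the technique, not Adobe’s implementation): given a mask and a per-pixel motion field, each masked pixel is filled by following its motion vector back into a neighboring frame.

```python
import numpy as np

# Toy flow-based fill: copy background pixels from the previous frame
# into the masked region by following per-pixel motion vectors.

def flow_fill(frame, prev_frame, mask, flow):
    """frame, prev_frame: HxWx3 arrays; mask: HxW bool (True = remove);
    flow: HxWx2 per-pixel (dy, dx) motion from prev_frame to frame."""
    out = frame.copy()
    h, w = mask.shape
    ys, xs = np.nonzero(mask)                 # pixels to fill
    src_y = np.clip((ys - flow[ys, xs, 0]).round().astype(int), 0, h - 1)
    src_x = np.clip((xs - flow[ys, xs, 1]).round().astype(int), 0, w - 1)
    out[ys, xs] = prev_frame[src_y, src_x]    # pull clean background pixels
    return out
```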

They also greatly improved support for OpenEXR frame sequences, especially those with multiple render-pass channels. Channels can be labeled, and After Effects can generate a video contact sheet for viewing all the layers in thumbnail form. EXR playback performance is supposed to be greatly improved as well.

Character Animator is now at 3.0, and they have added keyframing of all editable values, trigger-able reposition “cameras” and trigger-able audio effects, among other new features. And Adobe Rush now supports publishing directly to TikTok.

Content Authenticity Initiative
Outside of individual applications, Adobe has launched the Content Authenticity Initiative in partnership with the NY Times and Twitter. It aims to fight fake news and restore consumer confidence in media. Its three main goals are trust, attribution and authenticity. It aims to show end users who created an image, whether it was edited or altered and, if so, in what ways. Seemingly at odds with that, Adobe also released a new mobile app that edits images upon capture, using AI-powered “lenses” for highly stylized looks, even providing a live view.

This opening keynote was followed by a selection of over 200 different labs and sessions available over the next three days. I attended a couple of sessions focused on After Effects, as that is a program I know I don’t use to its full capacity. (Does anyone, really?)

Partners
A variety of other partner companies were showing off their products in the community pavilion. HP was pushing 3D printing and digital manufacturing tools that integrate with Photoshop and Illustrator. Dell has a new 27-inch color-accurate monitor with a built-in colorimeter, presumably to compete with HP’s top-end DreamColor displays. Asus also has some new HDR monitors that are Dolby Vision-compatible. One is designed to be portable and is as thin and lightweight as a laptop screen. I have always wondered why that wasn’t a standard approach for desktop displays.

Keynotes
Tuesday opened with a keynote presentation from a number of artists of different types, speaking or being interviewed. Jason Levine’s talk with M. Night Shyamalan was my favorite part, even though thrillers aren’t really my cup of tea. Later, I was able to sit down and talk with Patrick Palmer, Adobe’s Premiere Pro product manager, about where Premiere is headed and the challenges of developing HDR creation tools when there is no unified set of standards for final delivery. But I am looking forward to being able to view my work in HDR while I am editing at some point in the future.

One of the highlights of MAX is the 90-minute Sneaks session on Tuesday night, where comedian John Mulaney “helped” a number of Adobe researchers demonstrate new media technologies they are working on. These will eventually improve audio quality, automate animation, analyze photographic authenticity and many other tasks once they are refined into final products at some point in the future.

This was only my second time attending MAX, and with Premiere Rush being released last year, video production was a key part of that show. This year, without that factor, it was much more apparent to me that I was an engineer attending an event catering to designers. Not that this is bad, but I mention it here because it is good to have a better idea of what you are stepping into when you are making decisions about whether to invest in attending a particular event.

Adobe focuses MAX on artists and creatives as opposed to engineers and developers, who have other events that are more focused on their interests and needs. I suppose that is understandable since it is not branded Creative Cloud for nothing. But it is always good to connect with the people who develop the tools I use, and the others who use them with me, which is a big part of what Adobe MAX is all about.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Review: Lenovo Yoga A940 all-in-one workstation

By Brady Betzel

While more and more creators are looking for alternatives to the iMac, iMac Pro and Mac Pro, there are few options with high-quality, built-in monitors: Microsoft Surface Studio, HP Envy, and Dell 7000 are a few. There are even fewer choices if you want touch and pen capabilities. It’s with that need in mind that I decided to review the Lenovo Yoga A940, a 27-inch, UHD, pen- and touch-capable Intel Core i7 computer with an AMD Radeon RX 560 GPU.

While I haven’t done a lot of all-in-one system reviews like the Yoga A940, I have had my eyes on the Microsoft Surface Studio 2 for a long time. The only problem is its hefty price tag of around $3,500. The Lenovo’s most appealing feature — in addition to the tech specs I will go over — is its price point: It’s available from $2,200 and up. (I saw Best Buy selling a system similar to the one I reviewed for around $2,299. The insides of the Yoga and the Surface Studio 2 aren’t that far off from each other either, at least not enough to make up for the $1,300 disparity.)

Here are the parts inside the Lenovo Yoga A940:
– Intel Core i7-8700 3.2GHz processor (up to 4.6GHz with Turbo Boost), six cores (12 threads), 12MB cache
– 27-inch 4K UHD IPS multitouch display, 100% Adobe RGB
– 16GB DDR4 2666MHz (SODIMM) memory
– 1TB 5400 RPM hard drive plus 256GB PCIe SSD
– AMD Radeon RX 560 4GB graphics processor
– 25-degree monitor tilt angle
– Dolby Atmos speakers
– Dimensions: 25 x 18.3 x 9.6 inches; weight: 32.2 pounds
– 802.11ac and Bluetooth 4.2 connectivity
– Side-panel inputs: Intel Thunderbolt, USB 3.1, 3-in-1 card reader, audio jack
– Rear-panel inputs: AC-in, RJ45, HDMI, four USB 3.0
– Bluetooth active pen (appears to be the Lenovo Active Pen 2)
– Qi wireless charging platform

Digging In
Right off the bat, I just happened to put my Android Galaxy phone on the odd little flat platform located on the right side of the all-in-one workstation, just under the monitor, and I saw my phone begin to charge wirelessly. Qi wireless charging is an amazing little addition to the Yoga; it really comes through in a pinch when I need my phone charged and don’t have a cable or charging dock around.

Other than that nifty feature, why would you choose a Lenovo Yoga A940 over any other all-in-one system? Well, as mentioned, the price point is very attractive, but you are also getting a near-professional-level system in a very tiny footprint — including Thunderbolt 3 and USB connections, an HDMI port, a network port and an SD card reader. While it would be incredible to have an Intel i9 processor inside the Yoga, the i7 clocks in at 3.2GHz with six cores. Not a beast, but enough to get the job done inside Adobe Premiere and Blackmagic’s DaVinci Resolve, though perhaps with transcoded files rather than Red raw or the like.

The Lenovo Yoga A940 is outfitted with a front-facing Dolby Atmos audio speaker as well as Dolby Vision technology in the IPS display. The audio could use a little more low end, but it is good. The monitor is surprisingly great — the whites are white and the blacks are black; something not everyone can get right. It has 100% Adobe RGB color coverage and is Pantone-validated. The HDR is technically Dolby Vision and looks great at about 350 nits (not the brightest, but it won’t burn your eyes out either). The Lenovo BT active pen works well. I use Wacom tablets and laptop tablets daily, so this pen had a lot to live up to. While I still prefer the Wacom pen, the Lenovo pen, with 4,096 levels of sensitivity, will do just fine. I actually found myself using the touchscreen with my fingers way more than the pen.

One feature that sets the A940 apart from the other all-in-one machines is the USB Content Creation dial. With the little time I had with the system, I only used it to adjust speaker volume when playing Spotify, but in time I can see myself customizing the dials to work in Premiere and Resolve. The dial has good action and resistance. To customize the dial, you can jump into the Lenovo Dial Customization Assistant.

Besides the Intel i7, there is an AMD Radeon RX 560 with 4GB of memory, two 3W and two 5W speakers, 32GB of DDR4 2666MHz memory, a 1TB 5400 RPM hard drive for storage and a 256GB PCIe SSD. I wish the 1TB drive was also an SSD, but obviously Lenovo has to keep that price point somehow.

Real-World Testing
I use Premiere Pro, After Effects and Resolve all the time and can understand the horsepower of a machine through these apps. Whether editing and/or color correcting, the Lenovo A940 is a good medium ground — it won’t be running much more than 4K Red raw footage in real time without cutting the debayering quality down to half if not one-eighth. This system would make a good “offline” edit system, where you transcode your high-res media to a mezzanine codec like DNxHR or ProRes for your editing and then up-res your footage back to the highest resolution you have. Or, if you are in Resolve, maybe you could use optimized media for 80% of the workflow until you color. You will really want a system with a higher-end GPU if you want to fluidly cut and color in Premiere and Resolve. That being said, you can make it work with some debayer tweaking and/or transcoding.

In my testing I downloaded some footage from Red’s sample library, which you can find here. I also used some BRAW clips to test inside of Resolve, which can be downloaded here. I grabbed 4K, 6K, and 8K Red raw R3D files and the UHD-sized Blackmagic raw (BRAW) files to test with.

Adobe Premiere
Using the same Red clips as above, I created two one-minute-long UHD (3840×2160) sequences. I also clicked “Set to Frame Size” for all the clips. Sequence 1 contained these clips with a simple contrast, brightness and color cast applied. Sequence 2 contained the same clips with the same color correction applied, plus a 110% resize, 100 sharpen and 20 Gaussian Blur. I then exported them to various codecs via Adobe Media Encoder, using OpenCL for processing. Here are my results:

QuickTime (.mov) H.264, no audio, UHD, 23.98fps, maximum render quality, 10Mb/s:
Color Correction Only: 24:07
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 26:11

DNxHR HQX 10-bit, UHD:
Color Correction Only: 25:42
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 27:03

ProRes HQ:
Color Correction Only: 24:48
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 25:34

As you can see, the export times are pretty long. And let me tell you, once the sequence with the Gaussian Blur and resize kicked in, so did the fans. While it wasn’t like a jet taking off, the sound of the fans definitely made me and my wife glance at the system. It was also throwing some heat out the back. Premiere relies heavily on the CPU over the GPU. Not that it doesn’t embrace the GPU but, as you will see later, Resolve takes more advantage of it. Either way, Premiere really taxed the Lenovo A940 when using 4K, 6K and 8K Red raw files. Playback in realtime wasn’t possible except for the 4K files. I probably wouldn’t recommend this system for someone working with lots of higher-than-4K raw files; that seems to be simply too much for it to handle. But if you transcode the files down to ProRes, you will be in business.

Blackmagic Resolve 16 Studio
Resolve seemed to take better advantage of the AMD Radeon RX 560 GPU in combination with the CPU, as well as the onboard Intel GPU. In this test I added in Resolve’s amazing built-in spatial noise reduction, so other than the Red R3D footage, this test and the Premiere test weren’t exactly comparing apples to apples. Overall the export times will be significantly higher (or, in theory, they should be). I also added in some BRAW footage to test for fun, and that footage was way easier to work and color with. Both sequences were UHD (3840×2160) 23.98. I will definitely be looking into working with more BRAW footage. Here are my results:

Playback: 4K realtime playback at half-res premium, 6K no realtime playback, 8K no realtime playback

H.264, no audio, UHD, 23.98fps, force sizing and debayering to highest quality. I ran each export three ways:
Export 1 (native renderer)
Export 2 (AMD renderer)
Export 3 (Intel QuickSync)

Color Only
Export 1: 3:46
Export 2: 4:35
Export 3: 4:01

Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur
Export 1: 36:51
Export 2: 37:21
Export 3: 37:13

BRAW 4K (4608×2592) Playback and Export Tests

Playback: Full-res would play at about 22fps; half-res plays at realtime

H.264, no audio, UHD, 23.98fps, force sizing and debayering to highest quality
Color Only
Export 1: 1:26
Export 2: 1:31
Export 3: 1:29
Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur
Export 1: 36:30
Export 2: 36:24
Export 3: 36:22

DNxHR 10 bit:
Color Correction Only: 3:42
Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur: 39:03

One takeaway from the Resolve exports is that the color-only export was much more efficient than in Premiere, taking roughly four times realtime for the intensive Red R3D files and around one and a half times realtime for BRAW.
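The arithmetic behind those multiples is simple: the timelines are one minute long, so the realtime multiple is just the export time in seconds divided by 60. A quick Python sanity check of the numbers above:

```python
# Convert an mm:ss export time into a realtime multiple for a
# one-minute timeline (export seconds / timeline seconds).

def realtime_multiple(mm_ss, timeline_seconds=60):
    m, s = mm_ss.split(":")
    return (int(m) * 60 + int(s)) / timeline_seconds

print(realtime_multiple("3:46"))   # ~3.8x (Resolve R3D, color only)
print(realtime_multiple("1:26"))   # ~1.4x (Resolve BRAW, color only)
print(realtime_multiple("24:07"))  # ~24x  (Premiere R3D, color only)
```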

Summing Up
In the end, the Lenovo A940 is a sleek-looking all-in-one touchscreen- and pen-compatible system. While it isn’t jam-packed with the latest high-end AMD GPUs or Intel i9 processors, the A940 is a mid-level system with an incredibly good-looking IPS Dolby Vision monitor and Dolby Atmos speakers. It has some other features — like an IR camera, Qi wireless charger and USB dial — that you might not necessarily be looking for but will love to find.

The power adapter is like a large laptop power brick, so you will need somewhere to stash that, but overall the monitor has a really nice 25-degree tilt that is comfortable when using just the touchscreen or pen, or when using the wireless keyboard and mouse.

Because the Lenovo A940 starts at around $2,299, I think it really deserves a look when you’re searching for a new system. If you are working primarily in HD video and/or graphics, this is the all-in-one system for you. Check out more at Lenovo’s website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producer’s Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

HPA’s 2019 Engineering Excellence Award winners  

The Hollywood Professional Association (HPA) Awards Committee has announced the winners of the 2019 HPA Engineering Excellence Award. They were selected by a judging panel after a session held at IMAX on June 22. Honors will be bestowed on November 21 at the 14th annual HPA Awards gala at the Skirball Cultural Center in Los Angeles.

The HPA Awards were founded in 2005 to recognize creative artistry and innovation in the professional media content industry. A coveted honor, the Engineering Excellence Award rewards outstanding technical and creative ingenuity in media, content production, finishing, distribution and archive.

“Every year, it is an absolute pleasure and a privilege to witness the innovative work that is brewing in our industry,” says HPA Awards Engineering Committee chair Joachim Zell. “Judging by the number of entries, which was our largest ever, there is genuine excitement within our community to push our capabilities to the next level. It was a close race and shows us that the future is being plotted by the brilliance that we see in the Engineering Excellence Awards. Congratulations to the winners, and all the entrants, for impressive and inspiring work.”


Here are the winners:
Adobe – Content-Aware Fill for Video in Adobe After Effects
Content-Aware Fill for video uses intelligent algorithms to automatically remove unwanted objects, like boom mics or distracting signs, from video. Using optical flow technology, Content-Aware Fill references the frames before, next to or after an object and fills the area automatically, making it look as if the object was never there.

Epic Games — Unreal Engine 4
Unreal Engine is a flexible and scalable realtime visualization platform enabling animation, simulation, performance capture and photorealistic renders at unprecedented speeds. Filmmakers, broadcasters and beyond use Unreal Engine to scout virtual locations and sets, complete previsualization, achieve in-camera final-pixel VFX on set, deliver immersive live mixed reality broadcasts, edit CG characters and more in realtime. Unreal Engine dramatically streamlines content creation and virtual production, affording creators greater flexibility and freedom to achieve their visions.

Pixelworks — TrueCut Motion
TrueCut Motion is a cinematic video tool for finely tuning motion appearance. It draws on Pixelworks’ 20 years of experience in video processing, together with a new motion appearance model and motion dataset. Used as part of the creative process, TrueCut Motion enables filmmakers to explore a broader range of motion appearances than previously possible.

Portrait Displays and LG Electronics — CalMan LUT based Auto-Calibration Integration with LG OLED TVs
OLED televisions are commonly used in Hollywood for various purposes, including as a client viewing monitor, an SDR BT.709 reference monitor and a QC monitor for consumer deliverables across broadcast, optical media and OTT. To be used in these professional settings, highly accurate color calibration is essential. Portrait Displays and LG Electronics partnered to bring 1D and 3D LUT-based, hardware-level CalMan AutoCal to 2018 and newer LG OLED televisions.

Honorable mentions were awarded to Ambidio for Ambidio Looking Glass, Grass Valley for creative grading and Netflix for Photon.

In addition to the honors for excellence in engineering, the HPA Awards will recognize excellence in 12 craft categories, including color grading, editing, sound and visual effects. The recipients of the Judges Award for Creativity and Innovation and Lifetime Achievement Award will be announced in the coming weeks.
Tickets for the 14th annual HPA Awards will be available for purchase later this summer.

Promoting a Mickey Mouse watch without Mickey

Imagine creating a spot for a watch that celebrates the 90th anniversary of Mickey Mouse — but you can’t show Mickey Mouse. Already Been Chewed (ABC), a design and motion graphics studio, developed a POV concept that met this challenge and also tied in the design of the actual watch.

Nixon, a California-based premium watch company that is releasing a series of watches around the Mickey Mouse anniversary, called on Already Been Chewed to create the 20-second spot.

“The challenge was that the licensing arrangement that Disney made with Nixon doesn’t allow Mickey’s image to be in the spot,” explains Barton Damer, creative director at Already Been Chewed. “We had to come up with a campaign that promotes the watch and has some sort of call to action that inspires people to want this watch. But, at the same time, what were we going to do for 20 seconds if we couldn’t show Mickey?”

After much consideration, Damer and his team developed a concept to determine if they could push the limits on this restriction. “We came up with a treatment for the video that would be completely point-of-view, and the POV would do a variety of things for us that were working in our favor.”

The solution was to show Mickey’s hands and feet without actually showing the whole character. In another instance, a silhouette of Mickey is seen in the shadows on a wall, sending a clear message to viewers that the spot is an official Disney and Mickey Mouse release and not just something that was inspired by Mickey Mouse.

Targeting the appropriate consumer demographic segment was another key issue. “Mickey Mouse has long been one of the most iconic brands in the history of branding, so we wanted to make sure that it also appealed to the Nixon target audience and not just a Disney consumer,” Damer says. “When you think of Disney, you could brand Mickey for children or you could brand it for adults who still love Mickey Mouse. So, we needed to find a style and vibe that would speak to the Nixon target audience.”

The Already Been Chewed team chose surfing and skateboarding as dominant themes, since 16- to 30-year-olds are the target demographic and also because Disney is a West Coast brand.

Damer comments, “We wanted to make sure we were creating Mickey in a kind of 3D, tangible way, with more of a feature film and 3D feel. We felt that it should have a little bit more of a modern approach. But at the same time, we wanted to mesh it with a touch of the old-school vibe, like 1950s cartoons.”

In that spirit, the team wanted the action to start with Mickey walking from his car and then culminate at the famous Venice Beach basketball courts and skate park. Here’s the end result.

“The challenge, of course, is how to do all this in 15 seconds so that we can show the logos at the front and back and a hero image of the watch. And that’s where it was fun thinking it through and coming up with the flow of the spot and seamless transitions with no camera cuts or anything like that. It was a lot to pull off in such a short time, but I think we really succeeded.”

Already Been Chewed achieved these goals with an assist from Maxon’s Cinema 4D and Adobe After Effects. With Damer as creative lead, here’s the complete cast of characters: head of production Aaron Smock; 3D design by Thomas King, Barton Damer, Bryan Talkish and Lance Eckert; animation by Bryan Talkish and Lance Eckert; character animation by Chris Watson; and soundtrack by DJ Sean P.

Adobe updates Creative Cloud

By Brady Betzel

You know it’s almost fall when pumpkin spice lattes are back and Adobe announces its annual updates. At this year’s IBC, Adobe had a variety of updates to its Creative Cloud line of apps. From more info on its new editing platform, Project Rush, to the addition of Characterizer to Character Animator, there are a lot of updates, so I’m going to focus on a select few that I think really stand out.

Project Rush

I use Adobe Premiere quite a lot these days; it’s quick and relatively easy to use and will work with pretty much every codec in the universe. In addition, the Dynamic Link between Adobe Premiere Pro and Adobe After Effects is an indispensable feature in my world.

With the 2018 fall updates, Adobe Premiere will be closer to a color tool like Blackmagic’s Resolve thanks to the addition of new hue saturation curves in the Lumetri Color toolset. In Resolve, these are some of the most important aspects of the color corrector, and I think that will be the same for Premiere. From Hue vs. Sat, which can help isolate a specific color and desaturate it, to Hue vs. Luma, which can help add or subtract brightness values from specific hues and hue ranges — these new color correcting tools further Premiere’s venture into true professional color correction. These new curves will also be available inside of After Effects.
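Conceptually, a Hue vs. Sat curve is just a function from hue to a saturation multiplier. Here is a minimal Python sketch of that idea (an illustration only, not how Lumetri is implemented), desaturating pixels whose hue falls near a chosen target:

```python
import colorsys

# Toy Hue vs. Sat curve: pull saturation down for hues near target_hue,
# with a linear falloff, and leave all other hues untouched.

def hue_vs_sat(rgb, target_hue=0.33, width=0.08, amount=1.0):
    """rgb: (r, g, b) floats in 0..1; target_hue 0.33 is roughly green."""
    h, l, s = colorsys.rgb_to_hls(*rgb)
    dist = min(abs(h - target_hue), 1.0 - abs(h - target_hue))  # hue wheel wraps
    if dist < width:
        falloff = 1.0 - dist / width       # 1 at the target hue, 0 at the edge
        s *= 1.0 - amount * falloff        # the "curve" dips saturation here
    return colorsys.hls_to_rgb(h, l, s)

print(hue_vs_sat((0.2, 0.8, 0.3)))  # a green pixel comes back far less saturated
```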

After Effects features many updates, but my favorites are the ability to access depth-matte data of 3D elements and the addition of a new JavaScript engine for building expressions.

There is one update that runs across both Premiere and After Effects that seems to be a sleeper update. The improvements to motion graphics templates, if implemented correctly, could be a time and creativity saver for both artists and editors.

AI
Adobe, like many other companies, seems to be diving heavily into the “AI” pool, which is amazing, but… with great power comes great responsibility. I realize others might not feel this way, but sometimes I don’t want all the work done for me. With new features like Auto Lip Sync and Color Match, editors and creators of all kinds should not lose the forest for the trees. I’m not telling people to ignore these features, but asking that they put a few minutes into discovering how the color of a shot was matched, so they can fix it if something goes wrong. You don’t want to be the editor who says, “Premiere did it” and doesn’t have a good solution for fixing it.

What Else?
I would love to see Adobe take a stab at digging up the bones of SpeedGrade and integrating that into the Premiere Pro world as a new tab. Call it Lumetri Grade, or whatever? A page with a more traditional colorist layout and clip organization would go a long way.

In the end, there are plenty of other updates to Adobe’s 2018 Creative Cloud apps, and you can read their blog to find out about other updates.

NAB: Adobe’s spring updates for Creative Cloud

By Brady Betzel

Adobe has had a tradition of releasing Creative Cloud updates prior to NAB, and this year is no different. The company has been focused on improving existing workflows and adding new features, some based on Adobe’s Sensei technology, as well as improved VR enhancements.

In this release, Adobe has announced a handful of Premiere Pro CC updates. While I personally don’t think that they are game changing, many users will appreciate the direction Adobe is going. If you are color correcting, Adobe has added the Shot Match function that allows you to match color between two shots. Powered by Adobe’s Sensei technology, Shot Match analyzes one image and tries to apply the same look to another image. Included in this update is the long-requested split screen to compare before and after color corrections.

Motion graphic templates have been improved with new adjustments like 2D position, rotation and scale. Automatic audio ducking has been included in this release as well. You can find this feature in the Essential Sound panel, and once applied it will essentially dip the music in your scene based on dialogue waveforms that you identify.

Still inside of Adobe Premiere Pro CC, but also applicable in After Effects, is Adobe’s enhanced Immersive Environment. This update is for people who use VR headsets to edit and/or process VFX. Team Projects workflows have been updated with better version tracking and realtime indicators of who is using bins and sequences.

New Timecode Panel
Overall, while these updates are helpful, none are barn burners. The thing that does have me excited is the new Timecode panel — it’s the biggest new update to the Premiere Pro CC app. For years now, editors have been clamoring for more than just one timecode view. You can view sequence timecodes, source media timecodes from the clips on the different video layers in your timeline, and you can even view the same sequence timecode in a different frame rate (great for editing those 23.98 shows to a 29.97/59.94 clock!). And one of my unexpected favorites is the clip name in the timecode window.
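That cross-rate view is simpler than it sounds: it is the same frame count rendered against two different clocks. A minimal non-drop-frame Python sketch of the idea (real 29.97 broadcast timecode is drop-frame, which skips numbers to stay on the wall clock; that bookkeeping is omitted here):

```python
# Render one absolute frame count as timecode at two different rates.

def to_timecode(frame, fps):
    secs, frames = divmod(frame, fps)
    return f"{secs // 3600:02d}:{secs % 3600 // 60:02d}:{secs % 60:02d}:{frames:02d}"

frame = 1000
print(to_timecode(frame, 24))  # 00:00:41:16  (a 23.98 show counts on a 24 base)
print(to_timecode(frame, 30))  # 00:00:33:10  (the same frame on a "30" clock)
```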

I was testing this feature in a pre-release version of Premiere Pro, and it was a little wonky. First, I couldn’t dock the timecode window. While I could add lines and access the different menus, my changes wouldn’t apply to the row I had selected. In addition, I could only right-click and try to change the first row of contents, but it would choose a random row to change. I am assuming the final release has this all fixed. If the wonkiness gets flushed out, this is a phenomenal (and necessary) addition to Premiere Pro.

Codecs, Master Property, Puppet Tool, more
There have been some codec compatibility updates as well, specifically Sony X-OCN raw (Venice), Canon Cinema Raw Light (C200) and Red IPP2.

After Effects CC has also been updated with Master Property controls. Adobe said it best during their announcement: “Add layer properties, such as position, color or text, in the Essential Graphics panel and control them in the parent composition’s timeline. Use Master Property to push individual values to all versions of the composition or pull selected changes back to the master.”

The Puppet tool has been given some love with a new Advanced Puppet Engine, improving mesh and starch workflows for animating static objects. Beyond making the Add Grain, Remove Grain and Match Grain effects multi-threaded, enhanced disk caching and project-management improvements have been added.

My favorite update for After Effects CC is the addition of data-driven graphics. You can drop in a CSV or JSON data file and pick-whip the data to layer properties to control them. In addition, you can drag and drop data right onto your comp to use the actual numerical value. Data-driven graphics is a definite game changer for After Effects.
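As a sketch of what such a data file might look like, here is a short Python snippet that writes a hypothetical JSON file for a lower-third template. The schema is made up for illustration; AE simply exposes whatever fields the file contains for pick-whipping.

```python
import json

# Hypothetical data for a sports-standings lower third; every field
# becomes pick-whippable once the file is dropped into the AE project.
standings = [
    {"team": "Blue", "score": 87, "color": [0.2, 0.4, 0.9]},
    {"team": "Red",  "score": 82, "color": [0.9, 0.2, 0.2]},
]

with open("standings.json", "w") as f:
    json.dump(standings, f, indent=2)
# In AE you could then pick-whip, say, the "score" value to a text
# layer's Source Text property, or "color" to a fill color.
```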

Audition
While Adobe Audition is an audio mixing application, it has some updates that will directly help anyone looking to mix their edit in Audition. In the past, to get audio to a mixing program like Audition, Pro Tools or Fairlight, you would have to export an AAF (or, if you are old like me, possibly an OMF). In the latest Audition update, you can simply open your Premiere Pro projects directly in Audition, relink video and audio and begin mixing.

I asked Adobe whether you could go back and forth between Audition and Premiere, but it seems like it is a one-way trip. They must be expecting you to export individual audio stems once done in Audition for final output. In the future, I would love to see back-and-forth capabilities between apps like Premiere Pro and Audition, much like the Fairlight tab in Blackmagic’s Resolve. There are some other updates, like larger tracks and under-the-hood improvements, which you can read more about at https://theblog.adobe.com/creative-cloud/.

Adobe Character Animator has some cool updates like overall character building updates, but I am not too involved with Character Animator so you should definitely read about things like the Trigger Improvements on their blog.

Summing Up
In the end, it is great to see Adobe moving forward on updates to its Creative Cloud video offerings. Data-driven animation inside of After Effects is a game-changer. Shot color matching in Premiere Pro is a nice step toward a professional color correction application. Importing Premiere Pro projects directly into Audition is definitely a workflow improvement.

I do have a wishlist though: I would love for Premiere Pro to concentrate on tried-and-true solutions before adding fancy updates like audio ducking. For example, I often hear people complain about how hard it is to export a QuickTime out of Premiere with either stereo or mono/discrete tracks. You need to set up the sequence correctly from the jump, adjust the pan on the tracks, as well as adjust the audio settings and export settings. Doesn’t sound streamlined to me.

In addition, while shot color matching is great, let’s get an Adobe SpeedGrade-style view tab into Premiere Pro so it works like a professional color correction app… maybe Lumetri Pro? I know if the color correction setup was improved I would be way more apt to stay inside of Premiere Pro to finish something instead of going to an app like Resolve.

Finally, consolidating and transcoding used clips with handles is hit or miss inside of Premiere Pro. Can we get a rock-solid consolidate and transcode feature inside of Premiere Pro? Regardless of some of the few negatives, Premiere Pro is an industry staple and it works very well.

Check out Adobe’s NAB 2018 update video playlist for details on each and every update.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Quick Chat: Creating graphics package for UN’s Equator Prize ceremony

Undefined Creative (UC) was recently commissioned by the United Nations Development Programme (UNDP) to produce a fresh package of event graphics for its Equator Prize 2017 Award Ceremony. This project is the latest in a series of motion design-centered work collaborations between the creative studio and the UN, a relationship that began when UC donated their skills to the Equator Prize in 2010.

The Equator Prize recognizes local and indigenous community initiatives from across the planet that are advancing innovative on-the-ground solutions to climate, environment and poverty challenges. Award categories honor achievement and local innovation in the thematic areas of oceans, forests, grasslands and wildlife protection.

For this year’s ceremony, UNDP wanted a complete refresh that gave the on-stage motion graphics a current vibe while incorporating the key icons behind its sustainable development goals (SDGs). Consisting of a “Countdown to Ceremony” screensaver, an opening sequence, 15 winner slates, three category slates and 11 presenter slates, the package had to align visually with a presentation from National Geographic Society, which was part of the evening’s program.

To bring it all together, UC drew from the SDG color palettes and relied on subject matter knowledge of both the UNDP and National Geographic in establishing the ceremony graphics’ overall look and feel. With only still photos available for the Equator Prize winners, UC created motion and depth by strategically intertwining the best shots with moving graphics and carefully selected stock footage. Naturally moving flora and fauna livened up the photography, added visual diversity and contributed to creating a unique aesthetic.

We reached out to Undefined Creative’s founder/creative director Maria Rapetskaya to find out more:

How early did you get involved in the project, and was the client open to input?
We got the call a couple of months before the event. The original show had been used multiple times since we created it in 2010, so the client was definitely looking for input on how we could refresh or even rebrand.

Any particular challenges for this one?
For non-commercial organizations, budgets and messaging are equally sensitive topics. We have to be conscious of costs, and also very aware of the do’s and don’ts when it comes to assets and their use. Our creative discussions took place over several calls, laying out options and ideas at different budget tiers — anything from simply updating the existing package to creating something entirely different. In the case of the latter, parameters had to be established right away for how different “different” could be.

For example, it was agreed that we should stick with photography provided by the 2017 award winners. However, our proposal to include stock for flora and fauna was agreed on by all involved. Which SDG icons would be used and how, what partner and UN organizational branding should be featured prominently as design inspiration, how this would integrate with content being produced for UNDP/Equator Prize by Nat Geo… all of these questions had to be addressed before we started any real ideation in order for the creative to stay on brand, on message, on budget and on time.

What tools did you use on the project?
We relied on Adobe CC, in particular After Effects, which is our staple software. On this particular project, we also relied heavily on stock from multiple vendors; Pond5 has a robust and cost-effective collection of the video elements we were seeking.

Why is this project important to you?
The majority of our clients are for-profit commercial entities, and while that’s wonderful, there’s always a different feeling of reward when we have the chance to do something for the good of humanity at large, however minuscule our contribution is. The winners are coming from such different corners of the globe — at times, very remote. They’re incredibly excited to be honored, on stage, in New York City, and we can only imagine what it feels like to see their faces, the faces of their colleagues and friends, the names of their projects, up on this screen in front of a large, live audience. This particular event brings us a lot closer to what we’re creating, on a really empathetic, human level.

Red Giant Trapcode Suite 14 now available

By Brady Betzel

Red Giant has released an update to its After Effects-focused plug-in toolset, Trapcode Suite 14, including new versions of Trapcode Particular and Form as well as an update to Trapcode Tao.

The biggest updates seem to be in Red Giant’s flagship product Trapcode Particular 3. Trapcode Particular is now GPU accelerated through OpenGL with a proclaimed 4X speed increase over previous versions. The Designer has been re-imagined and seems to take on a more Magic Bullet-esque look and feel. You can now include multiple particle systems inside the same 3D space, which will add to the complexity and skill level needed to work with Particular.

You can now also load your own 3D model OBJ files as emitters in the Designer panel or use any image in your comp as a particle. There are also a bunch of new presets that have been added to start you on your Particular system building journey — over 210 new presets, to be exact.

Trapcode Form has been updated to version 3 with the updated Designer, the ability to add 3D models and animated OBJ sequences as particle grids, the ability to load images for use as particles, a new graphing system for more precise control over the system and over 70 presets in the Designer.

Trapcode Tao has been updated with depth of field effects to allow for that beautiful camera-realistic blur that really sets pro After Effects users apart.

Trapcode Particular 3 and Form 3 are paid updates, while Tao is free for existing users. If you want to update only Tao, make sure you select only Tao in the update; otherwise you will install new Trapcode plug-ins over your old ones.

Trapcode Particular 3 is available now for $399. The update is $149 and the academic version is $199. You can also get it as a part of the Trapcode Suite 14 for $999.

Trapcode Form 3 is available now for $199. The update is $99 and the academic costs $99. It can be purchased as part of the Trapcode Suite 14 for $999.

Check out the new Trapcode Suite 14 bundle.


Mocha VR: An After Effects user’s review

By Zach Shukan

If you’re using Adobe After Effects to do compositing and you’re not using Mocha, then you’re holding yourself back. If you’re using Mettle SkyBox, you need to check out Mocha VR, the VR-enhanced edition of Mocha Pro.

Mocha Pro and Mocha VR are standalone programs where you work entirely within the Mocha environment and then export your tracks, shapes or renders to another program to do the rest of the compositing work. There are plugins for Maxon Cinema 4D, The Foundry’s Nuke, HitFilm and After Effects that allow you to do more with the Mocha data within your chosen 3D or compositing program. Limited-feature versions of Mocha (Mocha AE and Mocha HitFilm) come installed with the Creative Cloud versions of After Effects and HitFilm 4 Pro, and every update of these plugins gets closer to looking like a full version of Mocha running inside of the effects panel.

Maybe I’m old school, or maybe I just try to get the maximum performance from my workstation, but I always choose to run Mocha VR by itself and only open After Effects when I’m ready to export. In my experience, all the features of Mocha run more smoothly in the standalone than when they’re launched and run inside of After Effects.**

How does Mocha VR compare to Mocha Pro? If you’re not doing VR, stick with Mocha Pro. However, if you are working with VR footage, you won’t have to bend over backwards to keep using Mocha.

Last year was the year of VR, when all my clients wanted to do something with VR. It was a crazy push to be the first to make something, and I rode the wave all year. The thing is, there really weren’t many tools specifically designed to work with 360 video. Now, this year, the post tools for working with VR are catching up.

In the past, I forced previous versions of Mocha to work with 360 footage, but since Mocha added its VR-specific features, stabilizing a 360 camera became cake compared to the kludgy way it works with the industry-standard After Effects 360 plugin, SkyBox. Also, I’ve used Mocha to track objects in 360 before the addition of an equirectangular* camera, and it was super-complicated because I had to splice together a whole bunch of tracks to compensate for the 360 camera distortion. Now it’s possible to create a single track to follow objects as they travel around the camera. Read the footnote for an explanation of equirectangular, a fancy word that you need to know if you’re working in VR.

Now let’s talk about the rest of Mocha’s features…

Rotoscoping
I used to rotoscope by tracing every few frames and then refining the frames in between, until I found out about the Mocha way to rotoscope. Because Mocha combines rotoscoping with the tracking of arbitrary shapes, all you have to do is draw a shape and then use the track to follow and deform it all the way through. It’s way smarter and, more importantly, faster. Also, with the Uberkey feature, you can adjust your shapes on multiple frames at once. If you’re still rotoscoping with After Effects alone, you’re doing it the hard way.
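The reason this is faster is that the shape’s points ride on a per-frame transform from the tracker instead of being re-keyframed by hand. Here is a toy Python sketch of that idea (illustrative only; Mocha’s actual spline and surface math is more sophisticated), with a hypothetical 2x3 affine standing in for one frame’s track data:

```python
import numpy as np

# Carry a hand-drawn contour through a shot by applying each frame's
# tracked transform to the same keyframed points.

def transform_shape(points, affine):
    """points: Nx2 contour; affine: 2x3 matrix supplied by the tracker."""
    homog = np.hstack([points, np.ones((len(points), 1))])  # Nx3 homogeneous
    return homog @ affine.T                                 # back to Nx2

shape = np.array([[100.0, 100.0], [200.0, 100.0], [150.0, 180.0]])
affine = np.array([[1.01, 0.00, 4.0],    # slight scale-up, drift right...
                   [0.00, 1.01, 2.0]])   # ...and down, as a track might report
print(transform_shape(shape, affine))
```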

Planar Tracking
When I first learned about Mocha it was all about the planar tracker, and that really is still the heart of the program. Mocha’s basically my go-to when nothing else works. Recently, I was working on a shot where a woman had her dress tucked into her pantyhose, and I pretty much had to recreate a leg of a dress that swayed and flowed along with her as she walked. If it wasn’t for Mocha’s planar tracker I wouldn’t have been able to make a locked-on track of the soft-focus (solid color and nearly without detail) side of the dress. After Effects couldn’t make a track because there weren’t enough contrast-y details.

GPU Acceleration
I never thought Mocha’s planar tracking was slow, even though it is slower than point tracking, but then they added GPU acceleration a version or two ago and now it flies through shots. It has to be at least five times as fast now that it’s using my Nvidia Titan X (Pascal), and it’s not like my CPU was a slouch (an 8-core i7-5960X).

Object Removal
I’d be content using Mocha just to track difficult shots and for rotoscoping, but their object-removal feature has saved me hours of cloning/tracking work in After Effects, especially when I’ve used it to remove camera rigs or puppet rigs from shots.

Mocha’s remove module is the closest thing out there to automated object removal***. It’s as simple as 1) create a mask around the object you want to remove, 2) track the background that your object passes in front of, and then 3) render. Okay, there’s a little more to it, but compared to the cloning and tracking and cloning and tracking and cloning and tracking method, it’s pretty great. Also, a huge reason to get the VR edition of Mocha is that the remove module will work with a 360 camera.

Here I used Mocha object removal to remove ropes that pulled a go-cart in a spot for Advil.

VR Outside of After Effects?
I’ve spent most of this article talking about Mocha with After Effects, because it’s what I know best, but there is one VR pipeline that can match nearly all of Mocha VR’s capabilities: the Nuke plugin Cara VR. There is a cost to that workflow, though. More on this shortly.

Where you will hit the limit of Mocha VR (and After Effects in general) is if you are doing 3D compositing with CGI and real-world camera depth positioning. Mocha’s 3D Camera Solve module is not optimized for 360 and the After Effects 3D workspace can be limited for true 3D compositing, compared to software like Nuke or Fusion.

While After Effects sort of tacked on its 3D features to its established 2D workflow, Nuke is a true 3D environment as robust as Autodesk Maya or any of the high-end 3D software. This probably sounds great, but you should also know that Cara VR is $4,300 vs. $1,000 for Mocha VR (the standalone + Adobe plugin version) and Nuke starts at $4,300/year vs. $240/year for After Effects.

Conclusion
I think of Mocha as an essential companion to compositing in After Effects, because it makes routine work much faster and it does some things you just can’t do with After Effects alone. Mocha VR is a major release because VR has so much buzz these days, but in reality it’s pretty much just a version of Mocha Pro with the ability to also work with 360 footage.

*Equirectangular is a clever way of unwrapping a 360 spherical projection, a.k.a. the view we see in VR, by flattening it out into a rectangle. It’s a great way to see the whole 360 view in an editing program, but A: it’s very distorted, so it can cause problems for tracking, and B: anything moving around the camera will exit one side of the equirectangular frame and wrap around to the opposite side (a bit like Pac-Man when he exits the screen), and non-VR tracking programs will stop tracking when something exits the screen on one side.
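To make the footnote concrete, here is a minimal Python sketch of the equirectangular mapping: longitude and latitude on the viewing sphere map linearly to x and y in the frame, and the modulo on x is exactly the Pac-Man wrap that confuses non-VR trackers.

```python
# Map a direction on the viewing sphere to equirectangular pixel coords.

def sphere_to_equirect(lon_deg, lat_deg, width, height):
    """lon in [-180, 180), lat in [-90, 90] -> (x, y) pixel coordinates."""
    x = (lon_deg + 180.0) / 360.0 * width
    y = (90.0 - lat_deg) / 180.0 * height
    return x % width, y          # modulo = horizontal wrap-around

print(sphere_to_equirect(0, 0, 4096, 2048))    # (2048.0, 1024.0): frame center
print(sphere_to_equirect(185, 0, 4096, 2048))  # x wraps back near the left edge
```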

**Note: According to the developer, one of the main advantages of running Mocha as a plug-in (inside AE, Premiere, Nuke, etc.) for 360 video work is that you are using the host program’s render engine and proxy workflow. Having the ability to do all your tracking, masking and object removal at proxy resolutions is a huge benefit when working with large 360 formats that can be as big as 8K stereoscopic. Additionally, the Mocha modules that render, such as reorient for horizon stabilization or the remove module, will render inside the plug-in, making for a streamlined workflow.

***FayOut was a “coming soon” product that promised an even more automated method for object removal, but as of the publishing of this article it appears that it is no longer “coming soon” and may have folded, or perhaps the technology was purchased and will be included in a future product. We shall see…
Zach Shukan is the VFX specialist at SilVR and is constantly trying his hand at the latest technologies in the video post production world.

Adobe acquires Mettle’s SkyBox tools for 360/VR editing, VFX

Adobe has acquired all SkyBox technology from Mettle, a developer of 360-degree and virtual reality software. As more media and entertainment companies embrace 360/VR, there is a need for seamless, end-to-end workflows for this new and immersive medium.

The SkyBox toolset is designed exclusively for post production in Adobe Premiere Pro CC and Adobe After Effects CC and complements Adobe Creative Cloud’s existing 360/VR cinematic production technology. Adobe will integrate SkyBox plugin functionality natively into future releases of Premiere Pro and After Effects.

To further strengthen Adobe’s leadership in 360-degree and virtual reality, Mettle co-founder Chris Bobotis will join Adobe, bringing more than 25 years of production experience to his new role.

“We believe making virtual reality content should be as easy as possible for creators. The acquisition of Mettle SkyBox technology allows us to deliver a more highly integrated VR editing and effects experience to the film and video community,” says Steven Warner, VP of digital video and audio at Adobe. “Editing in 360/VR requires specialized technology, and as such, this is a critical area of investment for Adobe, and we’re thrilled Chris Bobotis has joined us to help lead the charge forward.”

“Our relationship started with Adobe in 2010 when we created FreeForm for After Effects, and has been evolving ever since. This is the next big step in our partnership,” says Bobotis, now director, professional video at Adobe. “I’ve always believed in developing software for artists, by artists, and I’m looking forward to bringing new technology and integration that will empower creators with the digital tools they need to bring their creative vision to life.”

Introduced in April 2015, SkyBox was the first plugin to leverage Mettle’s proprietary 3DNAE technology, and its success quickly led to additional development of 360/VR plugins for Premiere Pro and After Effects.

Today, Mettle’s plugins have been adopted by companies such as The New York Times, CNN, HBO, Google, YouTube, Discovery VR, DreamWorks TV, National Geographic, Washington Post, Apple and Facebook, as well as independent filmmakers and YouTubers.

Nice Shoes Creative Studio animates limited-edition Twizzlers packages

Twizzlers and agency Anomaly recently selected 16 artists to design a fun series of limited edition packages for the classic candy. Each depicts various ways people enjoy Twizzlers. New York’s Nice Shoes Creative Studio, led by creative director Matt Greenwood, came on board to introduce these packages with an animated 15-second spot.

Three of the limited-edition packages are featured in the fast-paced spot, bringing to life the scenarios of car DJing, “ugly crying” at the movies and studying in the library, before ending on a shot that incorporates all 16 packages. Each pack has its own style, characters and color scheme, unique to the original artists, and Nice Shoes was careful to preserve these as it crafted the spot.

“We were really inspired by the illustrations,” explains Greenwood. “We stayed close to the original style and brought them into a 3D space. There’s only a few seconds to register each package, so the challenge was to bring all the different styles and colors together within this time span. Select characters and objects carry over from one scene into the next, acting as transitional elements. The Twizzlers logo stays on-screen throughout, acting as a constant amongst the choreographed craziness.”

The Nice Shoes team used a balance of 3D and 2D animation, creating a CG pack while executing the characters on the packs with hand-drawn animation. Greenwood proposed taking advantage of the rich backgrounds that the artists had drawn, animating tiny background elements in addition to the main characters in order to “make each pack feel more alive.”

The main Twizzlers pack was modeled, lit, animated and rendered in Autodesk Maya, then composited in Adobe After Effects together with the supporting elements. These consisted of 2D hand-drawn animations created in Photoshop and 3D animated elements made with Maxon Cinema 4D.

“Once we had the timing, size and placement of the main pack locked, I looked at which shapes would make sense to bring into a 3D space,” says Greenwood. “For example, the pink ribbons and cars from the ‘DJ’ illustration worked well as 3D objects, and we had time to add touches of detail within these elements.”

The characters on the packs themselves were animated with After Effects and applied as textures within the pack artwork. “The flying books and bookcases were rendered with Sketch and Toon in Cinema 4D, and I like to take advantage of that software’s dynamics simulation system when I want a natural feel to objects falling onto surfaces. The shapes in the end mnemonic are also rendered with Sketch and Toon and they provide a ‘wipe’ to get us to the end lock-up,” says Greenwood.

The final step during production was to add a few frame-by-frame 2D animations (the splashes or car exhaust trail, for example), but Nice Shoes Creative Studio waited until everything was signed off before adding these final details.

“The nature of the illustrations allowed me to try a few different approaches and as long as everything was rendered flat or had minimal shading, I could combine different 2D and 3D techniques,” he concludes.

Frame.io 2.0 offers 100 new features, improvements for collaboration

Frame.io, developers of the video review and collaboration platform for content creators, has unveiled Frame.io 2.0, an upgrade offering over 100 new features and improvements. This new version features new client Review Pages, which expand content review and sharing. In addition, the new release offers deeper workflow integration with Final Cut Pro X and Avid Media Composer, plus a completely re-engineered player.

“Frame.io 2 is based on everything we’ve learned from our customers over the past two years and includes our most-requested features,” says Emery Wells, CEO of Frame.io.

Just as internal teams can collaborate using Frame.io’s comprehensive annotation and feedback tools, clients can now provide detailed feedback on projects with Review Pages, which is designed to make the sharing experience simple, with no log-in required.

Review Pages give clients the same commenting ability as collaborators, without exposing them to the full Frame.io interface. Settings are highly configurable to meet specific customer needs, including workflow controls (approvals), security (password protection, setting expiration date) and communication (including a personalized message for the client).

The Review Pages workflow simplifies the exchange of ideas, consolidating feedback in a succinct manner. For those using Adobe Premiere or After Effects, those thoughts flow directly into the timeline, where you can immediately take action and upload a new version. Client Review Pages are also now available in the Frame.io iOS app, allowing collaboration via iPhones and iPads.

Exporting and importing comments and annotations into Final Cut Pro X and Media Composer has gotten easier with the upgraded, free desktop companion app, which allows users to open downloaded comment files and bring them into the editor as markers. There is now no need to toggle between Frame.io and the NLE.

Users can also now copy and paste comments from one version to another. The information is exportable in a variety of formats, whether that’s a PDF containing a thumbnail, timecode, comment, annotation and completion status that can be shared and reviewed with the team or as a .csv or .xml file containing tons of additional data for further processing.
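
To give a sense of what that further processing can look like, here is a minimal Python sketch that groups exported comments by completion status. The column names are illustrative assumptions, not Frame.io’s documented export schema — check the header row of a real export first.

    import csv
    from collections import defaultdict

    def comments_by_status(path):
        # Group (timecode, comment) pairs by the completion flag.
        # "completed", "timecode" and "comment" are assumed column names.
        groups = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                groups[row["completed"]].append((row["timecode"], row["comment"]))
        return groups

    for status, items in comments_by_status("comments.csv").items():
        print(status, len(items))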

Also new to Frame.io 2.0 is a SMPTE-compliant source timecode display that works with both non-drop and drop-frame timecode. Users can now download proxies straight from Frame.io.
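
Drop-frame timecode is easy to get wrong, which is why a SMPTE-compliant display is worth calling out. As a reference, here is a sketch of the standard 29.97fps drop-frame conversion in Python — the same arithmetic any compliant display has to perform:

    def frames_to_df_timecode(frame, fps=30):
        # 29.97 drop-frame: two frame NUMBERS (not frames) are skipped at
        # the start of each minute, except every tenth minute.
        drop = 2
        frames_per_min = fps * 60 - drop           # 1798
        frames_per_10min = fps * 600 - drop * 9    # 17982
        tens, rem = divmod(frame, frames_per_10min)
        frame += drop * 9 * tens
        if rem > drop:
            frame += drop * ((rem - drop) // frames_per_min)
        ff = frame % fps
        ss = (frame // fps) % 60
        mm = (frame // (fps * 60)) % 60
        hh = frame // (fps * 3600)
        return "%02d:%02d:%02d;%02d" % (hh, mm, ss, ff)

    print(frames_to_df_timecode(17982))  # 00:10:00;00 -- ten real minutes in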

The Frame.io 2.0 player page now offers better navigation, efficiency and accountability. New “comment heads” let artists see at a glance who left a comment and where, so they can quickly find and prioritize feedback on any given project. Users can also preview the next comment, saving them time when one comment affects another.

The new looping feature, targeting motion and VFX artists, lets users watch the same short clip on loop. You can even select a range within a clip to really dive in deep. Frame.io 2.0’s asset slider makes it easy to navigate between assets from the player page.

The new Frame.io 2.0 dashboard has been redesigned for speed and simplicity. Users can manage collaborators for any given project from the new collaborator panel, where adding an entire team to a project takes one click. A simple search in the project search bar makes it easy to bring up a project. The breadcrumb navigation bar tracks every move deeper into a sub-sub-subfolder, helping artists stay oriented when getting lost in their work. The new list view option with mini-scrub gives users a bird’s-eye view of everything happening in Frame.io 2.0.

Copying and moving assets between projects takes up no additional storage, even when users make thousands of copies of a clip or project. Frame.io 2.0 also now offers the ability to publish direct to Vimeo, with full control over publishing options, so pros can create the description and set privacy permissions, right then and there.

Behind the Title: Artist/Creative Director Barton Damer

NAME: Barton Damer

COMPANY: Dallas-based Already Been Chewed

CAN YOU DESCRIBE YOUR COMPANY?
Already Been Chewed is a boutique studio that I founded in 2010. We have created a variety of design, motion graphics and 3D animated content for iconic brands, including Nike, Vans, Star Wars, Harry Potter and Marvel Comics. Check out our motion reel.

WHAT’S YOUR JOB TITLE?
Owner/Founding Artist/Creative Director

WHAT DOES THAT ENTAIL?
My job is to set the vibe for the types of projects, clients and style of work we create. I’m typically developing the creative, working with our chief strategy officer to land projects and then directing the team to execute the creative for the project.

WHAT WOULD SURPRISE PEOPLE ABOUT WHAT FALLS UNDER THAT TITLE?
When you launch out on your own, it’s surprising how much non-creative work there is to do. It’s no longer good enough to be great at what you do (being an artist). Now you have to be excellent with communication skills, people skills, business, organization, marketing, sales and leadership skills. It’s surprising how much you have to juggle in the course of a single day and still hit deadlines.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Developing a solution that will not only meet the client’s needs but also push us forward as a studio is always exciting. My favorite part of any job is making sure it looks amazing. That’s my passion. The way it animates is secondary. If it doesn’t look good to begin with, it won’t look better just because you start animating it.

WHAT’S YOUR LEAST FAVORITE?
Dealing with clients that stress me out for various reasons — whether it’s because they are scope creeping or not realizing that they signed a contract… or not paying a bill. Fortunately, I have a team of great people that help relieve that stress for me, but it can still be stressful knowing that they are fighting those battles for the company. We get a lot of clients who will sign a contract without even realizing what they agreed to. It’s always stressful when you have to remind them what they signed.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Night time! That’s when the freaks come out! I do my best creative at night. No doubt!

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Real estate investing/fixing up/flipping. I like all aspects of designing, including interior design. I’ve designed and renovated three different studio spaces for Already Been Chewed over the last seven years.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I blew out my ACL and tore my meniscus while skateboarding. I wanted to stay involved with my friends that I skated with, knowing that surgery and rehab would have me off the board for at least a full year. During that time, I began filming and editing skate videos of my friends. I quickly discovered that the logging and capturing of footage was my least favorite part, but I loved adding graphics and motion graphics to the skate videos. I then began to learn Adobe After Effects and Maxon Cinema 4D.

At this time I was already a full-time graphic designer, but didn’t even really know what motion graphics were. I had been working professionally for about five or six years before making the switch from print design to animation. That was after dabbling in Flash animations and discovering I didn’t want to code websites (this was around 2003-2004).

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We recently worked with Nike on various activations for the Super Bowl and March Madness, and got to create motion graphics for storefronts as part of the Equality Campaign Nike launched during Black History Month. It was cool to see our work in the flagship Niketown NYC store while visiting New York a few weeks ago.

We are currently working on a variety of projects for Nike, Malibu Boats, Training Mask, Marvel and DC Comics licensed product releases, as well as investing heavily in GPUs and creating 360 animated videos for VR content.

HOW DID THE NIKE EQUALITY MOTION GRAPHICS CAMPAIGN COME TO FRUITION?
Nike had been working on a variety of animated concepts to bring the campaign to life for storefronts. They had a library of animation styles that had already been done that they felt were not working. Our job was to come up with something that would benefit the campaign style.

We recreated 16 athlete portraits in 3D so that we could cast light and shadows across their faces to slowly reveal them from black and also created a seamless video loop transitioning between the athlete portraits and various quotes about equality.

CAN YOU DESCRIBE THE MOTION GRAPHICS SCOPE OF THE NIKE EQUALITY CAMPAIGN, AND THE SOFTWARE USED?
The video we created was used in various Nike flagship stores — Niketown NYC, Soho and LA, to name a few. We reformatted the video to work in a variety of sizes. We were able to see the videos at Niketown NYC, where they ran on the front of the window displays. It was also used on large LED walls on the interior, as well as on a four-story vertical screen in the store.

We created the portrait technique on all 16 athletes using Cinema 4D and Octane. The remainder of the video was animated in After Effects.

The portraits were sculpted in Cinema 4D and we used camera projection to accurately project real photos of the athletes onto the 3D portrait. This allowed us to keep 100 percent accuracy of the photos Nike provided, but be able to re-light and cast shadows accordingly to reveal the faces up from black.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
That’s a tough one. Usually, it’s whatever the latest project is. We’re blessed to be working on some really fun projects. That being said… working on Vans’ 50th Anniversary campaign for the Era shoe is pretty epic! Especially since I am a longtime skateboarder.

Our work was used globally on everything from POP displays to storefronts to an interactive website takeover and 3D animated spots for broadcast. It was amazing to see it being used across so many mediums.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
A computer, my iPhone and speakers!

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I’m very active on Instagram and Facebook. I chose to say “no” to Snapchat in hopes that it will go away so that I don’t have to worry about one more thing (he laughs), and Twitter is pretty much dead for me these days. I log in once a month and see if I have any notifications. I also use Behance and LinkedIn a lot, and Dribbble once in a blue moon.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? IF SO, WHAT KIND?
My 25-year-old self would cyber bully me for saying this but soft Drake is “Too Good” these days. Loving Travis Scott and Migos among a long list of others.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
First I bought a swimming pool to help me get away from the computer/emails and swim laps with the kids. That worked for a while, but then I bought a convertible BMW to try to ease the tension and enjoy the wind through my hair. Once that wore off and the stress came back, I bought a puppy. Then I started doing yoga. A year later I bought another puppy.

Quick Chat: Emery Wells discusses Frame.io for Adobe After Effects

By Randi Altman

Frame.io is a cloud-based video collaboration tool that was designed to combine the varied ways pros review and approve projects — think Dropbox, Vimeo or email. Frame.io allows you to create projects and add collaborators and files to share in realtime.

They are now offering integration with Adobe’s After Effects that includes features like realtime comments and annotations that sync to your comp, the ability to import comments and annotations into your comp as live shape layers, and uploads of project files and bins.

To find out more, I reached out to Frame.io’s co-founder/CEO Emery Wells.

You just launched a panel for Adobe After Effects. Why was this the next product you guys targeted?
We launched our first Adobe integration with Premiere Pro this past NAB. It was a huge amount of work to rebuild all the Frame.io collaboration features for the Adobe Extension architecture, but it was worth the effort. The response from the Premiere integration was one of the best and biggest we received. After Effects is Premiere’s best friend. It’s the workhorse of the post industry. From complex motion graphics and visual effects to simple comps and title sequences, After Effects is one of the key tools video pros rely on, so we knew we had to extend all of those capabilities into AE.

Can you discuss the benefits users get from this panel?
Workflow is often one of the biggest frustrations any post pro faces. You really just want to focus on making cool stuff, but inevitably that requires wrangling renders, uploading files everywhere, collecting feedback and generally just doing a bunch of stuff that has nothing to do with what you’re good at and what you enjoy. Frame.io for Adobe After Effects allows you to focus on the work you do well in the tool you use to do it. When you need to get feedback from someone, just upload your comp to Frame.io from within AE. Those people will immediately get a notification via email or their phone, and they can start leaving feedback immediately. That feedback then flows right back into your comp where you’re doing the work.

We just cut out all the inefficient steps in between. What it really provides, more than anything else, is rapid iteration. The absolute best work only comes through that creative iteration. We never nail something on our first try. It’s the 10th try, the 50th try. Being able to try things quickly and get feedback quickly not only saves time and money, but will actually produce better work.

Will there be more Adobe collaboration offerings to come?
The way we built the panel for Premiere and After Effects actually uses the entire Frame.io web application codebase. It essentially just has a different skin on it so it feels native to Adobe apps. What that essentially means is all the updates we do to the core web application get inherited by Premiere and After Effects, so there will be many more features to come.

Not long ago Frame.io got a huge infusion of cash thanks to some heavy-hitter investors. How has this changed the way you guys work?
It’s allowing us to move faster and in parallel. We’ve now shipped four really unique products in about a year and a half: the core web app, the Apple Design Award-winning iOS app, the full experiences that live inside Premiere and AE, and our desktop companion app that integrates with Final Cut Pro X. All these products require considerable resources to maintain and push forward, so the capital infusion will allow us to continue building a complete ecosystem of apps that all work together to solve the most essential creative collaboration challenges.

What’s next for Frame.io?
The integrations are a really key part of our strategy, and you’ll see more of them moving forward. We want to embed Frame.io as deeply as we can in the creative apps so it just becomes a seamless part of your experience.

Check out this video for more:

Review: The HP Z1G3 All-in-One workstation

By Brady Betzel

I’ll admit it. I’ve always been impressed with HP’s workstations — from the Z840 to the ZBook mobile workstation and now the HP Z1G3. Yes, I know, the HP line of workstations is not cheap. In fact, you can save quite a bit of money building your own system, but you will probably have tons of headaches unless you are very confident in your computer-building skills — and don’t mind standing in the return line at Fry’s Electronics.

HP spends tons of time and money on ISV certifications for their workstations. ISV stands for Independent Software Vendor. In plain English, it means that HP spends a lot of time and money making sure the hardware inside your workstation works with the software you use. For an industry pro, that means apps like Adobe’s Premiere Pro and After Effects, Avid Media Composer, Autodesk products like 3ds Max and many others.

For this review, I tested apps like Avid Media Composer, FilmLight’s Baselight for Media Composer color correction plug-in, Adobe Premiere Pro, Adobe Media Encoder and Adobe After Effects, as well as Blackmagic’s Resolve 12.5.2, which chewed through basic color correction. In terms of testing time, I typically keep a review computer system for a couple of months, but with this workstation I really wanted to test it as thoroughly as possible — I’ve had the workstation for three months and counting, and I’ve been running the system through all the appropriate paces.

I always love to review workstations like the HP Z1G3 because of the raw power they possess. While HP sent me one of the top-of-the-line Z1G3 configurations, which retails for a list price of $3,486, they have a pretty reasonable starting price at $1,349. From Intel i3, i5 and i7 configurations all the way up to the almighty Intel Xeon — the HP Z1G3 can be customized to fit into your workflow, whether you just need to check your email or color correct video from your GoPro.

Here are the specs that make up the HP Z1G3 All-in-One workstation I received:

● 23.6-inch UHD/4K non-glare and non-touch display (3840×2160)
● Intel Xeon E3-1270 v5 CPU, 3.6GHz (4 Cores / 8 Threads)
● 64GB DDR4 SODIMM 2133MHz (4 x 16GB)
● Nvidia Quadro M2000M graphics (4GB)
● Two Z Turbo drives (512GB, PCIe M.2)
● Wireless keyboard and mouse
● Two Thunderbolt 3/USB 3.1 ports
● USB charging port
● Media card reader
● DisplayPort out

As I mentioned earlier, I tested the Z1G3 with many different apps, but recently I’ve been diving deeper into color correction, and luckily for my testing this fits right in. A few of the most strenuous real-world tests for computer systems are running 3D modeling apps like Maxon Cinema 4D and color correction suites like Resolve. Of course, apps like After Effects are great tests as well, but adding nodes on nodes on nodes in Resolve will really tax your CPU, as well as your GPU.

One thing that can really set apart high-end systems like the Z1G3 is the delay when using a precision color correction panel like Tangent’s Elements or Ripple. Sometimes you will move one of the color wheel balls and half a second later the color wheel moves on screen. I tried adding a few clips and nodes on the timeline, and when using the panels I noticed no discernible delay (at least none beyond what I would expect). While this isn’t a scientific test, it is crucial for folks looking to plug in external devices.

For more scientific tests I stuck to apps like Cinebench from Maxon, AJA’s System Test and Blackmagic’s Disk Speed Test. In Cinebench, the Z1G3 ranked at the top of the list when compared to similar systems. In AJA’s System Test I tested the read/write speed of the non-OS drive (basically the editing or cache drive). It sustained around 1520MB/s read and 1490MB/s write. I say around because I couldn’t get the AJA app to display the entire read/write numbers because of the high-resolution/zoom in Windows; I tried scaling it down to 1920×1080, but no luck. In Blackmagic’s Disk Speed Test, I was running at 1560MB/s read and 1497.3MB/s write. The drive that I ran this test on is HP’s version of the M.2 PCIe SSD powered by Samsung, more affectionately known by HP as a Z Turbo drive. The only thing better at the moment would be a bunch of these drives arranged in a RAID-0 configuration. Luckily, you can do that through the Thunderbolt 3 port with some spare SSDs you have lying around.
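
For the curious, the core of what these disk tools measure is simple to sketch. This rough Python stand-in times a large sequential write and read; the real tools use unbuffered I/O and video-frame-sized blocks to dodge OS caching, so treat its numbers as optimistic:

    import os
    import time

    def throughput(path, size_mb=1024, block_mb=64):
        block = os.urandom(block_mb * 1024 * 1024)
        t0 = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range(size_mb // block_mb):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())  # force data to disk before stopping the clock
        write_mbs = size_mb / (time.perf_counter() - t0)
        t0 = time.perf_counter()
        with open(path, "rb") as f:
            while f.read(block_mb * 1024 * 1024):
                pass  # read the whole file back
        read_mbs = size_mb / (time.perf_counter() - t0)
        os.remove(path)
        return write_mbs, read_mbs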

Almost daily I ran Premiere Pro CC, Media Encoder and Resolve Studio 12.5.2. I was really happy with the performance in Premiere. When working with QuickTimes in inter-frame codecs like H.264 and AVC-HD (non-edit-friendly codecs), I was able to work without too much stuttering in the timeline. When I used intra-frame codecs like ProRes HQ from Blackmagic’s Pocket Cinema Camera, Premiere worked great. I even jumped into Adobe’s Lumetri color tools while using Tangent’s Ripple external color correction panel, and it worked with little discernible delay. I did notice that Premiere had a little more delay when using the external color correction panel than Media Composer and Resolve, but that seemed to be more of a software problem rather than a workstation problem.

One of my favorite parts about using a system with an Nvidia graphics card, especially a Quadro card like the M2000M, is the ability to encode multiple versions of a file at once. Once I was done editing some timelapses in Premiere, I exported using Media Encoder. I would apply three presets I made: one square 600×600 H.264 for Instagram, one 3840×2160 H.264 for YouTube and an Animated GIF at 480×360 for Twitter. Once I told Media Encoder to encode, it ran all three exports concurrently — a really awesome feature. With the Nvidia Quadro card installed, it really sped along the export.
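
Outside of Media Encoder, the same three-destination pattern can be reproduced with concurrent ffmpeg processes. A rough Python sketch with illustrative file names (and without Media Encoder’s presets or GPU acceleration; the square Instagram version would also want a crop, omitted here):

    import subprocess

    jobs = [
        ["ffmpeg", "-y", "-i", "timelapse.mov", "-vf", "scale=600:600",
         "-c:v", "libx264", "instagram.mp4"],
        ["ffmpeg", "-y", "-i", "timelapse.mov", "-vf", "scale=3840:2160",
         "-c:v", "libx264", "youtube.mp4"],
        ["ffmpeg", "-y", "-i", "timelapse.mov", "-vf", "fps=15,scale=480:360",
         "twitter.gif"],
    ]
    # Launch all three encodes at once, mirroring the concurrent exports,
    # then wait for each to finish.
    procs = [subprocess.Popen(cmd) for cmd in jobs]
    for p in procs:
        p.wait()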

Media Composer
Another app I wanted to test was Media Composer 8.6.3. Overall, Media Composer ran great except for the high-resolution display. As I’ve said in previous reviews, this isn’t really the fault of HP, but more of the software manufacturers who haven’t updated their interfaces to adapt to the latest UHD displays. I had filmed a little hike I took with my five-year-old. I gave him a GoPro while I had my own. Once we got the footage back home, I imported it into Media Composer, grouped the footage and edited it using the multi-cam edit workflow.

Simply put, the multi-camera split was on the left and the clip I had in the sequence was playing simultaneously on the right. Before I grouped the footage into a multi-group, I transcoded the H.264s into DNxHD 175, an intra-frame, edit-friendly codec. The transcode was nearly realtime, so it took 60 minutes to transcode a 60-minute H.264 — which is not bad. In the end, I was able to edit the two-camera multi-group at 1920×1080 resolution with only minor hiccups. Occasionally, I would get caught in fast forward for a few extra seconds when J-K-L editing, but nothing that made me want to throw my keyboard or mouse against the wall.
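
The equivalent transcode can be scripted with ffmpeg’s DNxHD encoder. A hedged sketch with illustrative paths — DNxHD ties bitrate to frame size and rate, so the 175Mb/s flavor here assumes 1920×1080 material at 23.976fps:

    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "gopro_clip.mp4",
        "-c:v", "dnxhd", "-b:v", "175M",        # DNxHD 175
        "-s", "1920x1080", "-r", "24000/1001",  # size/rate must match the bitrate
        "-pix_fmt", "yuv422p",
        "-c:a", "pcm_s16le",                    # edit-friendly PCM audio
        "gopro_clip_dnxhd.mov",
    ])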

Once done editing, I installed the FilmLight color correction plug-in for Media Composer. I had a really awesome experience coloring using Baselight in Media Composer on the Z1G3. I didn’t have any slowdowns, and the relationship between using the color correction panel and Baselight was smooth.

Resolve
The last app I tested with HP’s Z1G3 All-in-One workstation was Blackmagic’s Resolve 12.5.2. Much like my other tests, I concentrated on color correction with the Tangent Ripple and Element-Vs iOS app. I had four or five nodes going in the color correction page before I started to see a slowdown. I was using the native H.264 and ProRes HQ files from the cameras, so I didn’t make it easy for Resolve, but it still worked. Once I added a little sharpening to my clips, the HP Z1G3 really started to kick into gear. I heard the faint hum of fans, which up until this point hadn’t kicked in. This is also where the system started to slow down and become sluggish.

Summing Up
The Z1G3 is one of my favorite workstations, period. A while ago, I reviewed the previous All-in-One workstation from HP, the Z1G2, and at the time it was my favorite. One of my few complaints was that, while it was easy to fix, it was very heavy and bulky. When I opened the Z1G3 box, I immediately noticed how much lighter and streamlined the design was. It almost felt like they took away 50 percent of the bulk, which is something I really appreciate. I can tell that one of the main focuses with the Z1G3 was minimizing its footprint and weight, while increasing the power. HP really knocked it out of the park.

One of the few things I wish were different on the Z1G3 I tested was the graphics card. While the Nvidia Quadro M2000M is a great graphics card, it is a “mobile” version of a Quadro, which has 128 fewer CUDA cores and 26GB/s less bandwidth than its desktop equivalent, the M2000. I would love the option of a full-sized Quadro instead of the mobile version, but I also understand that power consumption would go up, as would the form factor, so maybe I give HP a pass here.

In the end, I know everyone reading this review is saying to themselves, “I love my iMac, so why would I want the HP Z1G3?” If you are a die-hard Apple user, or you just saw the new Microsoft Surface Studio announcement, then it might be a hard sell, but I love both Windows- and Mac OS-based systems, and the Z1G3 is awesome. What’s even more awesome is that it is easily upgradeable. I took off the back cover, and with a simple switch I could have added a 2.5-inch hard drive or two in under a minute. If you are looking for a powerful new workstation, one that stands up to Resolve and Premiere Pro CC, the HP Z1G3 is for you.


Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Red Giant’s Universe 2

By Brady Betzel

Throughout 2016, we have seen some interesting acquisitions in the world of post production software and hardware — Razer bought THX, Blackmagic bought Ultimatte and Fairlight, and Boris FX bought GenArts, to name a few. We’ve also seen a tremendous consolidation of jobs. Editors are now being tasked as final audio mixers, final motion graphics creators, final colorists and much more.

Personally, I love doing more than just editing, so knowing tools like Adobe After Effects and DaVinci Resolve, in addition to Avid Media Composer, has really helped me become not only an editor but someone who can jump into After Effects or Resolve and do good work.

Unfortunately, for some people it is the nature of the post beast to know everything. Plug-ins play a gigantic part in balancing my workload, available time and the quality of the final product. If I didn’t have plug-ins like Imagineer’s Mocha Pro, Boris FX’s Continuum Complete, GenArts’ Sapphire and Red Giant’s Universe 2, I would be forced to turn down work because the time it would take to create a finished piece would outweigh the fee I would be able to charge a client.

A while back, I reviewed Red Giant’s Universe when it was in version 1 (check it out here). In the beginning, Universe allowed for lifetime, annual and free memberships. It seems the belt has tightened a little for Red Giant, as Universe 2 is now $99 a year or $20 a month, with a 14-day free trial. No permanent free version or lifetime memberships are offered (if you downloaded the free Universe before June 28, you will still be able to access those free plug-ins in the Legacy group). Moreover, they have doubled the monthly fee from $10 to $20 — definitely trying to get everyone onto the annual subscription train.

Personally, I think this resulted from too much focus on the broad Universe, trying to jam in as many plug-ins/transitions/effects as possible and not working on specific plug-ins within Universe. I actually like the renewed focus of Red Giant toward a richer toolset as opposed to a full toolset.

Digging In
Okay, enough of my anecdotal narrative and on to some technical awesomeness. Red Giant’s Universe 2 is a vast plug-in collection that is compatible with Adobe’s Premiere Pro and After Effects CS6-CC 2015.3; Apple Final Cut Pro X 10.0.9 and later; Apple Motion 5.0.7 and later; Vegas 12 and 13; DaVinci Resolve 11.1 and later; and HitFilm 3 and 4 Pro. You must have a compatible GPU installed, as Universe does not have a CPU fallback for unsupported machines. Basically, you need a GPU with 2GB of memory or more — and don’t forget about Intel, as their graphics support has improved a lot lately. For more info on OS compatibility and specific GPU requirements, check out Red Giant’s compatibility page.

Universe 2 is loaded with great plug-ins that, once you dig in, you will want to use all the time. For instance, I really like the ease of use of Universe’s RGB Separation and Chromatic Glow. If you want a full rundown of each and every effect, you should download the Universe 2 trial and check it out. In this review I am only going to go over some of the newly added plug-ins — HUD Components, Line, Logo Motion and Color Stripe — but remember there are a ton more.

I will be bouncing around different apps like Premiere Pro and After Effects. Initially I wanted to see how well Universe 2 worked inside of Blackmagic’s DaVinci Resolve 12.5.2. Resolve gave me a little trouble at first; it crashed once I clicked on OpenFX in the Color page. I rebooted completely and got an error message saying the OpenFX had been disabled. I did a little research (and by research I mean I typed “Disabled OpenFX Resolve” into Google) and stumbled on a post on Blackmagic’s forum that suggested deleting “C:\ProgramData\Blackmagic Design\Davinci Resolve\Support\OFXPluginCache.xml” might fix it. Once I deleted that and rebooted Resolve, I clicked on the OpenFX tab in the Color page, waited 10 minutes, and it started working. From that point on it loaded fast. So, barring the Resolve installation hiccup, there were no problems installing in Premiere and After Effects.

Once installed, you will notice that Universe has a few folders inside your plug-in drop-down: Universe Blur, Universe Distort, Universe Generators, Universe Glow, Universe Legacy, Universe Motion Graphics, Universe Stylize and Universe Utilities. You may recognize some of these if you have used an earlier version of Universe, but something you will not recognize is that each Universe plug-in now has a “uni.” prefix.

I am still not sure whether I like this or hate it. On one hand, it’s easy to search for if you know exactly what you want in apps like Premiere. On the other hand, it runs counter to what I am used to as a grouchy old editor. In the end, I decided to run my tests in After Effects and Premiere. Resolve is great, but for tracking a HUD in 3D space I was more comfortable in After Effects.

HUD Components
First up is HUD Components, located under the Universe Motion Graphics folder and labeled “uni.HUD Components.” What used to take many Video Copilot tutorials and many inspirational views of HUD/UI master Jayse Hansen’s (@jayse_) work now takes me minutes thanks to the new HUD Components. Obviously, to make anything on the level of a master like Jayse Hansen will take hundreds of hours and thousands of attempts, but still — with Red Giant HUD Components you can make those sci-fi in-helmet elements quickly.

When you apply HUD Components to a solid layer in After Effects, you can immediately see the start of your HUD. To see what the composite over my footage would look like, I changed the blend mode to Add, which is listed under “Composite Settings.” From there you can see some awesome pre-built looks under the Choose a Preset button. The pre-built elements are all good starting points, but I would definitely dive further into customizing — maybe layer multiple HUDs over each other with different blend modes, for example.
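
The Add blend mode is what makes this composite work: pixel values are summed, so the solid’s black background drops out and the bright HUD lines glow over the footage. A minimal NumPy illustration of the math (my shorthand, not Red Giant’s code):

    import numpy as np

    def blend_add(footage, hud):
        # Add blend: sum and clip. Both arrays are float RGB in 0.0-1.0.
        # Black HUD pixels (0.0) leave the footage untouched.
        return np.clip(footage + hud, 0.0, 1.0)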

Diving further into HUD Components, there are four separate “Elements” that you can customize, each with different images, animations, colors, clone types and much more. One thing to remember is that transformation settings and the order of operations work from the top down. For instance, if you change the rotation on element one, it will affect each element under it, which is kind of handy if you ask me. Once you get the hang of how HUD Components works, it is really easy to make some unique UI components. I really like to use the uni.Point Zoom effect (listed under Universe Glow in Effects & Presets); it gives you a sort of projector-like effect with your HUD component.

There are so many ways to use and apply HUD Components in everyday work, from building dynamic lower thirds with all of the animatable arcs, clones and rotations to building sci-fi elements, applying Holomatrix to it and even Glitch to create awesome motion graphics elements with multiple levels of detail and color. I did try using HUD Components in Resolve when tracking a 3D object but couldn’t quite get the look I wanted, so I ditched it and used After Effects.

Line
Second up is the Line plug-in. While drawing lines along a path in After Effects isn’t necessarily hard, it’s kind of annoying — think having to make custom map graphics to and from different places daily. Line takes the hard work out of making line effects to and from different points. This plug-in also contains the prefix uni. and is located under Universe Motion Graphics labeled uni.Line.

This plug-in is very simple to use and animate. I quickly found a map, applied uni.Line, placed my beginning and end points, animated the line using two keyframes under “Draw On” and bam! I had an instant travel-vlog-style graphic that showed me going from California to Australia in under three minutes (yes, I know three minutes seems a little fast to travel to Australia, but that’s really how long it took, render and all). Under the Effect Controls you can find preset looks, beginning and ending shape options like circles or arrows, line types, segmented lines and curve types. You can even move the peak of the curve with the bezier style option.

Logo Motion
Third is Logo Motion, located under Universe Motion Graphics titled uni.LogoMotion. In a nutshell you can take a pre-built logo (or anything for that matter), pre-compose it, throw the uni.LogoMotion effect on top, apply a preset reveal, tweak your logo animation, dynamically adjust the length of your pre-comp — which directly affects the logo’s wipe on and off — and, finally, render.

This is another plug-in that makes my life as an editor who dabbles in motion graphics really easy. Red Giant even included some lower third animation presets that help create dynamic lower third movements. You can select from some of the pre-built looks, add some motion while the logo is “idle,” adjust things like rotation, opacity and blur under the start and end properties, and even add motion blur. The new preset browser in Universe 2 really helps with plug-ins like Logo Motion where you can audition animations easily before applying them. You can quickly add some life to any logo or object with one or two clicks; if you want to get detailed you can dial in the idle animation and/or transition settings.

Color Stripe
Fourth is Color Stripe, a transition that uses color layers to wipe across and reveal another layer. This one is a pretty niche use case, but it is still worth mentioning. In After Effects, transitions are generally a little cumbersome; I found the Universe 2 transitions infinitely easier to use in NLEs like Adobe Premiere. From the always-popular swish pan to exposure blur, there are some transitions you might use once and some you might use a bunch. Color Stripe is a transition that you probably won’t want to use too often, but when you do need it, it will be right at your fingertips. You can choose from different color schemes like analogous or tetradic, or even create a custom scheme to match your project.

In the end, Universe 2 has some effects that are essential once you begin using them, like uni.Unmult, uni.RGB Separation and the awesome uni.Chromatic Glow. The new ones are great too; I really like the ease of use of uni.HUD Components. Since these effects are GPU-accelerated, you might be surprised at how fast and fluid they work in your project without slowdowns. For anyone who likes apps like After Effects but can’t afford to spend hours dialing in the perfect UI interface and HUD, Universe 2 is perfect for you. Check out all of the latest Red Giant Universe 2 tools here.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Updates to Adobe Creative Cloud include project sharing, more

By Brady Betzel

Adobe has announced team project sharing!! You read that right — the next Adobe Creative Cloud update, to be released later this year, will have the one thing I’ve always said kept Adobe from breaking into Avid’s NLE stronghold with episodic TV and film editors.

While “one thing” is a bit of hyperbole, Team Projects will be much more than just simple sharing within Adobe Premiere Pro. Team Projects will also work with Adobe After Effects, but not with Adobe Audition… at least not in the initial release. Technically speaking, sharing projects within Creative Cloud seems like it will follow a check-in/check-out workflow, allowing you to approve another person’s updates to override yours, or vice versa.

During a virtual press demo, I was shown how the Team Projects will work. I asked if it would work “offline,” meaning without Internet connection. Adobe’s representative said that Team Projects will work with intermittent Internet disconnections, but not fully offline. I asked this because many companies do not allow their NLEs or their storage to be attached to any Internet-facing network connections. So if this is important to you, you may need to do a little more research once we actually can get our hands on this release.

My next question was whether Team Projects is a paid service. The Adobe rep said they are not talking about the business side of this update yet. I took this as an immediate yes, which is fine, but officially they have no comment on pricing or payment structure, or if it will even cost extra at all.

Immediately after I asked my last question, I realized that this will definitely tie in with the Creative Cloud service, which likely means a monthly fee. Then I wondered: where exactly will my projects live? In the cloud? I know the media can live locally on something like an Avid ISIS or Nexis, but will the projects be shared over the Internet? Will we be able to share individual sequences and/or bins, or just entire projects? There are so many questions and so many possibilities in my mind; it really could change the multi-editor NLE paradigm if Adobe can manage it properly. No pressure, Adobe.
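
To make the check-in/check-out idea concrete, here is a toy Python model of the workflow as described in the demo — purely illustrative, not Adobe’s implementation:

    class TeamProject:
        """Toy check-in/check-out: a check-in based on a stale version
        must either be reworked or explicitly override the newer one."""

        def __init__(self):
            self.version = 0
            self.checkouts = {}  # editor -> version they started from

        def check_out(self, editor):
            self.checkouts[editor] = self.version

        def check_in(self, editor, override=False):
            base = self.checkouts.pop(editor)
            if base != self.version and not override:
                raise RuntimeError("Newer changes exist: accept theirs or override")
            self.version += 1

    p = TeamProject()
    p.check_out("editor_a"); p.check_out("editor_b")
    p.check_in("editor_a")                 # fine: first one back in
    p.check_in("editor_b", override=True)  # knowingly overrides editor_a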

Other Updates
Some other Premiere Pro updates include:
– Improved caption and subtitling tools.
– Updated Lumetri Color tools, including a much-needed improvement to the HSL secondaries color picker.
– Automatic recognition of VR/360 video and the type of mapping it needs, plus an improved virtual reality workflow.
– Destination publishing that now includes Behance (no Instagram export option?).
– Improved Live Text Templates, including a simplified workflow that lets you share Live Text Templates with other users (it will even sync fonts from Typekit if they aren’t present) without the need for an After Effects license.
– Native DNxHD and DNxHR QuickTime export support.
– Audio effects from Adobe Audition.
– Global FX mute to toggle all video effects in a sequence on and off.
– And, best of all, a visual keyboard to map shortcuts! Finally, another prayer for Premiere Pro has been answered. Unfortunately, After Effects users will have to wait for a visual keyboard for shortcut assignment (bummer).

After Effects has some amazing updates in addition to Project Sharing, including a new 3D render engine! Wow! I know this has been an issue for anybody trying to do 3D inside of After Effects via Cineware. Most people will purchase Video Copilot’s Element 3D to get around this, but for those who want to work directly with Maxon’s Cinema 4D, this may be the update that alleviates some of your 3D disdain via Cineware. They even mentioned that you do not need a GPU for this to work well. Oh, how I would love for this to come to fruition. Finally, there’s a new video preview architecture for faster playback that will hopefully allow for a much more fluid and dynamic playback experience.

Adobe Character Animator has some updates too. If you haven’t played with Character Animator, you need to download it now and just watch the simple tutorials that come with the app — you will be amazed, or at least your kids will be. If you haven’t seen how The Simpsons used Character Animator, you should check it out with a YouTube search. It is pretty sweet. In terms of incoming updates, there will be faster and easier puppet creation, improved round trip workflow between Photoshop and Illustrator, and the ability to use grouped keyboard triggers.

Summing Up
In the end, the future is still looking up for Adobe Creative Cloud video products like Premiere Pro and After Effects. If there is one thing to jump out of your skin over in the forthcoming update, it is Team Projects. If Team Projects works, and works well, the NLE tide may be shifting. That is a big if, though, because there have been some issues with previous updates — like media management within Premiere Pro — that have yet to be completely ironed out.

Like I said, if Adobe does this right it will be game-changing for them in the shared editing environment. In my opinion, Adobe is beginning to get its head above water in the video department. I would love to see these latest updates come in guns blazing and working. From the demo I saw it looks promising, but really there is only one way to find out: hands-on experience.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Red Giant’s Trapcode Suite 13, Part 1

By Brady Betzel

Have you ever watched a commercial on YouTube and thought, how in the world do these companies have the budget for the VFX and motion graphics work featured? Well, many don’t, but they do have access to talented artists with affordable tools that deliver pricey-looking results. Most motion graphics creators have a toolbox full of goodies that help them build great-looking products. Whether it’s preset transitions, graphic overlays or plug-ins — there are ways to incorporate high production value without the million-dollar price tag.

One of those tools that many Adobe After Effects motion graphics artists have in their toolbox is Red Giant’s Trapcode Suite, which is currently in version 13. While it isn’t cheap, if you are focused on that style of motion graphics, it can definitely pay for itself after just a few jobs. Inside the suite are magical plug-ins like the famous Trapcode Particular, Trapcode Form, Trapcode Mir, Trapcode Tao, Trapcode Shine, Trapcode Lux, Trapcode 3D Stroke, Trapcode Echospace, Trapcode Starglow, Trapcode Sound Keys and Trapcode Horizon. Holy cow, that is a lot.

The complete Trapcode Suite 13 works with After Effects (CS6 through CC 2015 officially, including the latest 2015.3 update; just make sure to download the update installer from Red Giant, since it might not appear in your Red Giant Link updater), and a couple of the plug-ins — Shine, 3D Stroke and Starglow — will also work in Adobe Premiere (with the same version compatibility as After Effects). A good resource to get your feet wet is the Red Giant tutorial page, where you can find a lot of info and in-depth tutorials from the likes of the master Harry Frank (@graymachine) and Chad Perkins (@chad_perkins).

That being said, if you have no idea what the Trapcode Suite entails, buckle up. It is one of the most useful but intricate plug-in suites you will see, with a $999 price tag to match ($199 if you are upgrading). Of course, you can pick and choose the product you want, such as Shine for $99 or even Particular for $399, but the entire suite is worth the investment.

As an editor, I spend the majority of my time inside of a nonlinear editor like Adobe Premiere or Avid Media Composer/Symphony — probably 80 percent if I had to estimate; the other 20 percent is divided between color correction solutions and VFX/graphics packages like After Effects, Blackmagic Resolve and others. Because I don’t get a lot of time to play around creatively, I really need to know the suite I am working in and be as efficient as possible. For instance, products like Mocha Pro, Keylight in After Effects and Red Giant’s Trapcode Suite 13 are enhancers that help me be as efficient as I can be as an editor without sacrificing quality for time.

In the latest Trapcode Suite 13 update, Trapcode Particular 2.5 seems to have been updated the most, while Trapcode Tao is a new addition to the suite; the rest were given modest enhancements as well. I will try to touch on each of the products, so this will be a two-part review.

Particular
Trapcode Particular is one of the plug-ins that most After Effects nerds/aficionados/experts have encountered. If you have been a little wary of and intimidated by Particular because of its complexity, now is the time to dive into Red Giant’s incredible particle-building system. In the 2.5 update, Red Giant added the Effects Builder, which seems to resemble the Magic Bullet Looks builder a little, and I love that. Like I said earlier, I don’t typically have eight hours to creatively throw darts at a particle system in hopes of creating a solar system fly-through.

Luckily, the new Effect Builder allows you to easily create your particle system and be emitting (or exploding) in minutes. While it isn’t “easy,” per se, to create a particle system like those featured on Trapcode creator Peder Norrby’s (@trapcode_lab) website, the Effects Builder, along with some tutorial watching (mixed with some patience and love) will send you down a Trapcode rabbit hole that will allow you to create some of the most stunning artwork I’ve seen created in After Effects. Don’t give up if you find it overwhelming, because this is one of those plug-ins that will make you money if you can grasp it. One thing I did notice was the Effects Builder interface was tiny and did not scale with the resolution I was using on my system (2560×1440), but After Effects appeared fine.

If you are an experienced user of Trapcode Particular you should be happier with the updated graphing system that lets you set size and opacity over the life of your particle by directly drawing points on your graph, smoothing, deleting and even randomizing. I really loved using this graph. I immediately saw results that mimic using color curves against an RGB Parade and Waveform on a color scope. Particular has also bumped its particle count up from 20 to 30 million, which will matter to someone creating fireworks back plates for the Fourth of July, I’m sure.

Shine
Second on my Trapcode Suite 13 hit list is Trapcode Shine, which might not be the most obviously glamorous update to many people, but it still has its merits. The largest update is the ability to attach Shine to After Effects light sources easily. Before, you would have to do some fancy footwork that most editors don’t have the time or interest in doing, but as long as your light is named “Shine,” with proper spelling and capitalization, your light now controls the light rays produced by the Shine plug-in.

One thing that most After Effects users know to be a staple is the use of Fractal Noise. Whether you are trying to replicate light rays with realistic and organic effects or build a fancy text reveal where you use a Fractal Noise mask as your transition, Fractal Noise is a must-use effect. Trapcode Shine now has Fractal Noise built into the plug-in, including 3D fractal noise to create a type of parallax within your light ray work. Simply put, parallax is the way the foreground moves in relation to the background. Think of a camera on a slider: as it moves from left to right, your foreground might stay in relatively the same position while the background moves much more — this is your parallax.
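
A quick worked example in plain Python shows why layered 3D noise sells depth: for a given camera move, a layer’s apparent on-screen shift falls off with its distance, so near and far noise layers drift at different rates (a toy pinhole model, not Trapcode’s math):

    def apparent_shift(camera_move, depth, focal_length=35.0):
        # On-screen shift is inversely proportional to distance.
        return camera_move * focal_length / depth

    for depth in (100.0, 500.0, 2000.0):  # near, mid and far layers
        print(depth, round(apparent_shift(10.0, depth), 3))
    # 100.0 -> 3.5, 500.0 -> 0.7, 2000.0 -> 0.175: the near layer shifts
    # 20x more than the far one, and that relative drift reads as depth.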

One thing that you will always use when applying Fractal Noise is animating the Evolution to add realism. Plus, appending “*time” to the Evolution parameter’s expression to multiply it by the comp time is an easy way to keep the fractal noise moving along its path. Shine has an “Evolution Speed” control under the Fractal Noise heading that allows you to easily adjust the evolution without any scripting (I love this!). Being able to quickly add fractal noise into your light rays really improves my efficiency when a client asks for “that fancy text with those light rays poking through,” but wants to pay exactly zero dollars and zero cents.

Lux and Starglow
Trapcode Lux and Starglow are some other light-focused plug-ins that can add that subtle or dramatic detail to your work, setting you apart from the rest of the general motion graphics population. Lux is a fast and easy way to add volumetric drama to point and spotlights. Much like the other plug-ins, you apply Lux to a new solid, adjust the specific parameters for the spot or point lights in your composition and — my favorite part — tell Lux whether it should apply to any light or only to lights named “Lux,” “Front” or “Back.”

Simply put, instead of just seeing the light emanating from an After Effects light source, you will now see the physical light source when Lux is added. Lux really shows its power when you need to add a light source to something like an afterburner on a jet or the tip of a comet-like fireball. Adding physical light points so easily really opened up my way of thinking. It’s a relatively small feature, but it’s like knowing how to do something while also knowing it would take four hours to accomplish — because of the diminishing returns, you just move along. Now I can do that same thing in little to no time and add that finishing touch easily. This makes me more money and makes the client more confident.

Trapcode Starglow is a small-yet-powerful plug-in that gives a lifelike glow to bright objects. Think of the star or cross-hatch streaks that can appear on stars or street lights in TV shows and movies. Presets are included across the Trapcode Suite, and Starglow is no different with 49 of them, each containing its own ray length, color, ray direction and more — all of which are the starting points I like to use when figuring out just what type of Starglow I want to go with.

So far, I’ve covered four of the 11 plug-ins contained in the Trapcode Suite 13, all of which are amazing and full of ideas that will undoubtedly elevate your work to a higher level. Something I have noticed over the last few years is a lot of amazing work coming from those using After Effects; most of it, though, has the scent of a preset and/or tutorial that someone watched, duplicated and exported for display. One tip that will help you step past that ordinary look is to double- and triple-stack effects (in particular, the same effect) to add varying levels of depth, color and detail that you couldn’t get with just one instance of a plug-in.

In Part 2 of my Red Giant Trapcode Suite 13 Review, I will tackle the rest of this behemoth plug-in set: Trapcode Form, Trapcode Mir, Trapcode Tao, Trapcode 3D Stroke, Trapcode Echospace, Trapcode Sound Keys, and Trapcode Horizon.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Brickyard VFX creates a chocolate world for Ghirardelli

Brickyard VFX’s Santa Monica studio worked with production company Mercy Brothers and ad agency FCB West on a new two-spot Ghirardelli campaign. Brickyard provided complete creative solutions for the campaign from start to finish, including production, edit, motion design, VFX, and finishing.

Chocolate Carving brings viewers into a completely chocolate world with detailed animated drawings that come to life in chocolate, carving out iconic San Francisco hallmarks, such as the Golden Gate Bridge, a cable car and, of course, Ghirardelli Square. Brickyard motion designer Anton Thallner created the illustrations of each element in Adobe After Effects, which CG supervisor/creative director David Blumenfeld then animated in Autodesk Maya to appear as if they were embossed within a 3D chocolate canvas.

Chocolate Carving

“They presented us with a carved chocolate tour of San Francisco with Ghirardelli Square as the destination. We created the visual palette,” explains Brickyard managing partner Steve Michaels. “Anton Thallner designed the chocolate story and created all the iconic elements of San Francisco, while David Blumenfeld turned Anton’s line animation into chocolate-carved images and timed the piece to tell the story.”

As for the TimeLapse spot, the agency afforded Brickyard a great amount of creative freedom. “We decided to bring Chachi Ramirez of Mercy Brothers in to direct the spot and take on the construction of the entire Ghirardelli Square,” explains Michaels. “They presented the concept of a Ghirardelli fan so enamored with the product that he creates an homage to the factory using only the mini chocolates. The design and execution was a labor of love between the forces of Brickyard VFX, Mercy Brothers and Bix Pix Entertainment.”


Review: Maxon’s Cinema 4D R17

By Brady Betzel

Over the years, I have seen Maxon take Cinema 4D from something that only lived on the periphery of my workflow to an active tool alongside apps such as Adobe After Effects, Adobe Premiere and Avid’s Media Composer/Symphony.

What I have seen happen is a simplification of workflow and a growth in capabilities with each Cinema 4D release. This brings us to the latest release: Cinema 4D R17. It not only builds on the previous R16 release, with improved Motion Tracking and Shaders, but Maxon continues to add new functionality with things like the Take System, the Color Chooser and the Variation Shader.

Variation Shader

Because I work in television, I previously thought that I only needed Cinema 4D when creating titles — I couldn’t quite get the gravitas I was looking for in apps like Media Composer’s Title Tool, After Effects or even Photoshop (i.e., the raytracing, great reflections and ambient occlusion that Cinema 4D always conveyed to me). These days I am searching out tutorials and examples of new concepts and getting close to committing to designing one thing a day, much like @beeple or @gsg3d’s daily renders.

Doing a daily render is a great way to get really good at a tool like Cinema 4D. It feels like Maxon is shaping a tool that, much like After Effects, is becoming usable by almost anyone who can get their hands on it — which is a lot of people, especially if you subscribe to Adobe’s Creative Cloud with After Effects, because Cinema 4D Lite/Cineware is included.

Since I am no EJ Hassenfratz (@eyedesyn), I won’t be covering the minute details of Cinema 4D R17, but I do want to write about a few of my favorite updates in hopes that you’ll get excited and jump into the sculpting, modeling or compositing world inside of Cinema 4D R17.

The Take System
If you’ve ever had to render out multiple versions of the same scene, you will want to pay attention to the new Take System in Cinema 4D R17. Typically, you build many projects with duplicated scenes to create different versions of the same scene. Maybe you are modeling a kitchen and want to create five different animations, each with its own materials for the cabinet faces as well as unique camera moves. Thanks to Cinema 4D R17’s Take System, you can create different takes within the same project, saving tons of time and file size.

Take System

Under the Objects tab you will see a new Takes tab. From there you can generate new takes, enable Auto Take (much like auto keyframing, it saves each unique take’s actions) and perform other take-specific functions like overrides. The Take System uses a hierarchical structure that allows child takes to inherit the properties of their parents. At the top is the main take, and any changes made there affect all of its children underneath.

Say you want your kitchen to have the same floor but different cabinet face materials. You would first create your scene as you want it to look in the overall sense, then in the Take menu you would add takes for each version you want, name them appropriately for easy navigation later, enable Auto Take, change any attributes for that specific take, save and render!
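
The Take System is also scriptable. Below is a hedged sketch using the R17 Python Takes API as I understand it (doc.GetTakeData() and TakeData.AddTake()); run it from the Script Manager with a document open, and verify the call names against the R17 SDK docs before relying on them:

    import c4d

    def add_cabinet_takes(doc, names):
        # One child take per cabinet-material variation, under the main
        # take; material/camera overrides are then set per take.
        take_data = doc.GetTakeData()  # R17+ Takes API (assumed)
        if take_data is None:
            return
        main = take_data.GetMainTake()
        for name in names:
            take_data.AddTake(name, main, None)  # name, parent, clone source
        c4d.EventAdd()

    # In the Script Manager, "doc" is the active document.
    add_cabinet_takes(doc, ["Oak", "Walnut", "Matte White"])

Pair this with the token-based file naming described next, and each take can render to its own sensibly named file.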

In the Render Settings under Save > File… you can choose from the drop-down menu how you want your takes named upon export. There are a bunch of presets in there that will get you started. Technically, Maxon refers to this update as Variable Path and File Names, or “Tokens.”
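Tokens amount to simple string substitution at render time. As a rough illustration (the $prj, $take and $frame names below stand in for the idea; check Maxon's documentation for the real token list):

    # Hypothetical token expansion; Cinema 4D's actual tokens and
    # syntax may differ, this only shows the substitution concept.

    def expand_tokens(template, values):
        for token, value in values.items():
            template = template.replace("$" + token, value)
        return template

    template = "renders/$prj/$take/$take_$frame.png"
    print(expand_tokens(template, {"prj": "kitchen", "take": "walnut", "frame": "0042"}))
    # renders/kitchen/walnut/walnut_0042.png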

This is a gigantic addition to Cinema 4D's powerful toolset, and it should bring a sigh of relief to anyone who has had to export multiple versions of the same scene. Now, instead of multiple projects, you can save all of your versions in one place.

Pen and Spline Tools
One of the hardest things to wrap my head around when I was diving into the world of 3D was how someone actually draws in 3D space. What the hell is a z-axis anyway? I barely know x and y! Well, after Googling what a z-axis is, you will also understand that, technically, with a modern-day computing setup you can't literally draw in 3D space without some special hardware.

pen tool

However, in Cinema 4D you can draw on one plane (i.e., Front View), then place that shape inside of a Lathe and bam! — you have drawn in 3D space complete with x,y and z dimensions. So while that is a super-basic example, the new Pen Tool and Spline Tool options allow for someone with little to no 3D experience to jump into Cinema 4D R17 and immediately get modeling.

For example, say you grab the Pen tool, draw some sort of geometry and then want to cut a hole in it. You can now grab a new circle, place it where you want it to intersect the beautiful object you just drew, highlight the shape that will do the cutting (if you use Spline Subtract), then hold Control on Windows or Command on Mac and click the object you want to cut from. Then go into the Pen/Spline menu and click Spline Subtract, Spline Union, Spline And, Spline Or or Spline Intersect. You now have a permanent and much more efficient way to alter your geometry. Try it yourself; it's a lot easier than reading about it.
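If you want to see the same boolean logic outside of Cinema 4D, here it is in Python using the shapely library (purely for illustration; Cinema 4D performs these operations on splines in a view plane, not through shapely):

    # 2D boolean operations on shapes, analogous to the new
    # Spline Subtract/Union/Intersect commands.

    from shapely.geometry import Point, Polygon

    square = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])  # the drawn geometry
    hole = Point(2, 2).buffer(1.0)                      # a circle to cut with

    cut = square.difference(hole)        # like Spline Subtract
    merged = square.union(hole)          # like Spline Union
    overlap = square.intersection(hole)  # like Spline Intersect

    print(round(cut.area, 2))  # 16 minus the circle's area, about 12.86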

I used this to create some — I’ll call them unique — shapes and was able to make intersection cuts easily and painlessly.

I also like the Spline Smooth tools. You've drawn your spline but want to add some flair — click on the Spline Smooth tools and, under the options, check off exactly what you want your brush to do to your spline (think of Spline Smooth like the Liquify tool in Photoshop, where you can bulge, flatten or even spiral your work). The options include Smooth, Flatten, Random, Pull, Spiral, Inflate and Project. The Spiral function is a great way to give some unique wave-like looks to your geometry.
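Mathematically, a smooth brush just pulls each point toward the average of its neighbors. Here is a tiny Python sketch of that core idea (Cinema 4D's brushes add strength and falloff controls on top):

    # Minimal "Smooth" pass over a 2D spline: each interior point
    # moves partway toward the midpoint of its two neighbors.

    def smooth_spline(points, strength=0.5):
        smoothed = [points[0]]  # keep the endpoints fixed
        for i in range(1, len(points) - 1):
            px, py = points[i]
            mx = (points[i - 1][0] + points[i + 1][0]) / 2.0
            my = (points[i - 1][1] + points[i + 1][1]) / 2.0
            smoothed.append((px + (mx - px) * strength, py + (my - py) * strength))
        smoothed.append(points[-1])
        return smoothed

    jagged = [(0, 0), (1, 3), (2, -2), (3, 4), (4, 0)]
    print(smooth_spline(jagged))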

Color Chooser
Another update to Cinema 4D R17 that I really love is the updated Color Chooser. While in theory it's a small update, it's a huge one for me. I really like to use different color harmonies when doing anything related to color and color correction. In Cinema 4D R17 you can choose from RGB, HSV or Kelvin color modes. In RGB there are presets to help guide you toward harmonious color choices: Monochromatic, Analogous, Complementary, Tetrad, Semi-Complementary and Equiangular. If you don't have much experience in color theory, it might be a good time to run to your local library and find a book; it will really help you make conscious and appropriate color choices when you need to.

Color Chooser

Besides the updated color theory-based layouts, you can import your own image and create a custom color swatch that can be saved. In addition, a personal favorite is the Color Mixer. You can choose two colors and use a slider to find a mix of the two. A lot of great experimentation can happen here.
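For the curious, both features boil down to simple math: harmonies are rotations around the hue wheel, and the Color Mixer is a linear blend. A rough Python sketch, not Maxon's actual implementation:

    # Color harmonies as hue rotations, and the Color Mixer as a
    # linear blend; RGB values are floats in the 0..1 range.

    import colorsys

    def rotate_hue(rgb, degrees):
        h, s, v = colorsys.rgb_to_hsv(*rgb)
        return colorsys.hsv_to_rgb((h + degrees / 360.0) % 1.0, s, v)

    base = (0.8, 0.2, 0.1)
    complementary = rotate_hue(base, 180)                 # opposite side of the wheel
    analogous = [rotate_hue(base, d) for d in (-30, 30)]  # near neighbors

    def mix(color_a, color_b, t):
        # t = 0 gives color_a, t = 1 gives color_b: the Mixer's slider
        return tuple(a + (b - a) * t for a, b in zip(color_a, color_b))

    print(mix((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.5))  # (0.5, 0.0, 0.5)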

Lens Distortion
When compositing objects, text or whatever else you can think of into a scene, it can get frustrating to deal with footage that has extreme lens curvature. In Cinema 4D R17 you can easily create a Lens Profile that can then be applied as either a shader or a post effect to your final render.

To do this you need to build a Lens Profile by going to the Tools menu and clicking Lens Distortion, then loading the image you want to use as reference. From there you need to tell Cinema 4D R17 what it should consider a straight line — like a sidewalk, which in theory should be horizontally straight, or a light pole, which should be vertically straight.

Lens Distortion

To do this, click Add N-Point Line and line it up against your "straight" object (you can add multiple points as necessary to follow changes in the line's angle). Choose a lens distortion model that should be close to your lens type (3D Standard Classic is a good one to start with), click Auto Solve and then save your profile to apply when you render your scene. To load the profile at render time, go to Render Settings > Effects > Lens Distortion and load it from there.
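A profile essentially stores a distortion model. The classic radial form scales each point by 1 + k1*r^2; here is a toy Python version of distorting and un-distorting a point (real solvers like Cinema 4D's fit more coefficients than this single term):

    # Toy one-parameter radial lens model. Negative k1 gives barrel
    # distortion; applying the inverse straightens the bowed lines.

    def distort(x, y, k1):
        # x, y are normalized coordinates centered on the image
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2
        return x * scale, y * scale

    def undistort(x, y, k1, iterations=10):
        # Invert the model by fixed-point iteration
        ux, uy = x, y
        for _ in range(iterations):
            r2 = ux * ux + uy * uy
            ux = x / (1.0 + k1 * r2)
            uy = y / (1.0 + k1 * r2)
        return ux, uy

    dx, dy = distort(0.5, 0.5, k1=-0.2)  # barrel distortion pulls the point inward
    print(undistort(dx, dy, k1=-0.2))    # roughly (0.5, 0.5) recovered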

Book Generator
I love that Maxon includes some shiny bells and whistles in its updates. Whether it's R16's staircase or Grow Grass, I always love updates that make me say, "Wow, that's cool." Whether or not I use them a lot is another story.

In Cinema 4D R17, the Book Generator is the "wow" factor for me. Obviously it has a very niche use, but it's still really cool. In your Content Browser, just search for Book Generator and drop it into your scene. To make the books land on a shelf you need to first create the shelves, make them editable, then click "Add Selection as One Group" or "Add Selection as Separate Groups" if you want to control them individually. Afterwards, under the Book Generator object you can click on the Selection, which represents the actual books. Under User Data you can customize things like overall book size, type of books or magazines, randomness, textures and bookends, and even make the books lean on each other if they are spaced out.

book generator

It’s pretty sweet once you understand how it works. If you want different pre-made textures for your magazines or books you can search for “book” in the Content Browser. They have many different kinds of textures including one for the inside pages.

Summing Up
I detailed just a few of the great updates in Maxon's Cinema 4D R17, but there are tons more. The ability to import SketchUp files directly into Cinema 4D R17 is very handy, and the keyframe-handling updates and the Variation Shader open up even more possibilities.

If you aren't ready to drop the $3,695 on the Cinema 4D R17 Studio edition, $2,295 on the Visualize edition, $1,695 on the Broadcast edition or $995 on the Prime edition, make sure you check out the version that comes with Adobe After Effects CC (Cineware/Cinema 4D Lite). While it won't be as robust as the other versions, it will give you a taste of what is possible and may even spark your imagination to try something new, like modeling! Check out the different versions here: http://www.maxon.net/products/general-information/general-information/product-comparison.html.

Keep in mind that if you are new to the world of 3D modeling or Cinema 4D and want to find some great learning resources, you should check out Sean Frangella on YouTube (https://www.youtube.com/user/seanfrangella, @seanfrangella), Greyscalegorilla (www.greyscalegorilla.com/blog), Cineversity and Motionworks (www.motionworks.net, @motionworks). Cineversity even used my alma mater, California Lutheran University, in its tutorials!

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim-Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

Top 3: My picks from Adobe’s Creative Cloud update

By Brady Betzel

Adobe’s resolve to update its Creative Cloud apps on a regular basis has remained strong. The latest updates, released on December 1, really hammer home Adobe’s commitment to make editing video, creating visual effects and color correcting on a tablet a reality, but it doesn’t end there. They have made their software stronger across the board, whether you are using a tablet, mobile workstation or desktop.

After Effects and Stacked Panels

I know everyone is going to have their own favorites, but here are my top three from the latest release:

1. Stacked Panels
In both After Effects and Premiere you will notice the ability to arrange your menus in Stacked Panels. I installed the latest updates on a Sony VAIO tablet and these Stacked Panels were awesome!

It's really a nice way to have all of your tools on screen without having them take up too much real estate. In addition, Adobe has improved touchscreen interaction, with the ability to pinch and zoom different parts of the interface, such as increasing the size of a thumbnail with a pinch-to-zoom.

In Premiere, to find the Stacked Panels you need to open the drop-down menu in the project panel, locate Panel Group Settings and then choose Stacked Panel Group (and Solo Panels in Stack if you want to view only one at a time). I highly recommend using Stacked Panels if you are using a touchscreen, like a tablet or some of the newer mobile workstations out in the world. Even if you aren't, I really think it works well.

Premiere Pro and Optical Flow

2. Optical Flow Time Remapping
Most editors are probably thinking, "Avid has had this for years and years and years, just like Avid had FluidMorph years before Adobe introduced Morph Cut." While I thought the exact same thing, I really love that Adobe's version is powered by the GPU. This really takes advantage of the speed of the latest HP Z840 with Nvidia Quadro or GeForce GTX 980 Ti graphics cards and all their CUDA cores. Be warned though: Optical Flow (much like Morph Cut) works only in certain situations.

If you've ever used Twixtor or FluidMotion in Media Composer, you know that sometimes a lot of work goes into making those effects look right. It's not always the right solution for time-remapping footage, especially if you are working on content that will air on broadcast television — even though Optical Flow may look great, some content will fail certain networks' quality control because of the weird Jello-looking artifacting that can occur.
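For a feel of what happens under the hood, here is a rough Python/OpenCV sketch of flow-based retiming. To be clear, this is not Adobe's implementation, just the general technique: estimate per-pixel motion, then warp to synthesize an in-between frame (the file names are hypothetical):

    # Flow-based frame interpolation sketch using OpenCV.
    import cv2
    import numpy as np

    frame_a = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
    frame_b = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)

    # Dense optical flow from frame A to frame B (Farneback's method).
    flow = cv2.calcOpticalFlowFarneback(frame_a, frame_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    t = 0.5  # halfway between the two frames
    h, w = frame_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x - t * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - t * flow[..., 1]).astype(np.float32)

    # Backward-warp frame A along the flow. Occluded areas, where no
    # good motion estimate exists, are where the "Jello" artifacts
    # mentioned above come from.
    in_between = cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)
    cv2.imwrite("frame_0001_5.png", in_between)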

After Effects and the Lumetri Color Panel

3. Lumetri Color inside of After Effects
You might already have a favorite coloring app or plug-in, but being able to take clips from Premiere to After Effects while carrying over the color correction you made inside the Lumetri panel is key. In addition, you can use the Lumetri effect inside of After Effects (located under the Utility category) to quickly color your clips there.

Overall, this round of updates seemed to be par for the course: nothing completely revolutionary, but definitely useful and welcome. Personally, I don't think that adding HDR capabilities should have taken precedence over some other updates, such as collaboration improvements (think Avid Media Composer and Avid's shared storage solution, ISIS), general stability improvements, media management, etc. But Adobe is holding true to its word and bringing some of the latest and greatest improvements to its users… and causing users (and manufacturers) of other tools to take notice.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.

Vidcheck heading to NAB with ‘Vidapps’ for After Effects, Premiere

Video processed with Adobe After Effects or Premiere Pro will soon be able to take advantage of Vidcheck’s intelligent “Vidapps-Video” plug-in to correct RGB gamut and YUV levels within the NLE.

UK-based Vidcheck, which makes automated quality control software with patented intelligent video and audio correction, will be at NAB this year showing the latest version of its Vidchecker and Vidfixer product suites. These have been extended to include a range of video applications (Vidapps) for Adobe After Effects and Premiere Pro, enabling users to check and automatically correct video and audio errors without leaving the Adobe environment.

This means that users of Adobe After Effects and Premiere Pro can correct illegal video levels to broadcast-safe parameters using Vidcheck's patented algorithms, which correct the video without clamping it. (Clamping is an antiquated means of achieving broadcast-safe content that can cause undesirable degradation of the resulting picture.)
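To see why that matters, compare a hard clamp with a soft roll-off in a few lines of Python. This is generic level math, not Vidcheck's patented algorithm:

    # Hard clamping vs. a soft knee on 8-bit luma values, where
    # broadcast-legal video tops out at 235.

    import numpy as np

    def hard_clamp(luma, ceiling=235.0):
        return np.minimum(luma, ceiling)  # everything above 235 goes flat

    def soft_knee(luma, knee=225.0, ceiling=235.0):
        # Compress values above the knee so highlight detail survives.
        over = np.clip(luma - knee, 0.0, None)
        rolled = knee + (ceiling - knee) * (1.0 - np.exp(-over / (ceiling - knee)))
        return np.where(luma > knee, rolled, luma)

    highlights = np.array([220.0, 230.0, 240.0, 250.0])
    print(hard_clamp(highlights))  # [220. 230. 235. 235.]  detail destroyed
    print(soft_knee(highlights))   # gradient preserved, all below 235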

Vidcheck’s Vidapps-Video provides checking and correction as part of the edit process and is designed to be used as the last stage of post production, immediately before the media is rendered. This approach means the user can be confident that the rendered media will fully comply with the specified requirements before it leaves the Adobe environment.

As part of using Vidapps, an XML report can be generated and saved as a record that QC corrections were made on the media file and, if required, forwarded to the client. The report can also be 'skinned' with the logo and colors of the post house/video editor to make it highly specific and identifiable to them.

Additional Vidapps plug-ins are currently available for audio and photosensitive epilepsy (PSE) checking, and others are in development for introduction later in 2015.

Vidcheck’s core Vidchecker and Vidfixer AQC products are scalable from low-cost versions for post production to sophisticated Vidchecker/Vidfixer Grid systems suitable for larger enterprises. In addition to watch folder automation, Vidcheck’s API has been integrated into many MAM and workflow engine solutions across the industry for seamless addition of complex AQC into any workflow.

Making ‘Being Evel’: James Durée walks us through post

Compositing played a huge role in this documentary film.

By Randi Altman

Those of us of a certain age will likely remember being glued to the TV as a child watching Evel Knievel jump his motorcycle over cars and canyons. It felt like the world held its collective breath, hoping that something horrible didn’t happen… or maybe wondering what it would be like if something did.

Well, Johnny Knoxville, of Jackass and Bad Grandpa fame, was one of those kids, as witnessed by, well, his career. Knoxville and Oscar-winning filmmaker Daniel Junge (Saving Face) combined to make Being Evel, a documentary on the daredevil’s life and career. Produced by Knoxville’s Dickhouse Productions (yup, that’s right) and HeLo, it premiered at Sundance this year.


Five Adobe After Effects Shortcuts

By Brady Betzel

As an editor, most of my day is spent inside of Avid Media Composer, but occasionally I will get to turn on my Spotify, groove to the music and crank out some Adobe After Effects or Maxon Cinema 4D work. Over the years I’ve found some shortcuts within After Effects that make my job easier, and I wanted to share five of my favorites… from an editor’s perspective.

Double-click in the Project Window to import an asset
When importing assets into an Adobe After Effects project, I often see people do it the archaic way: File > Import. Instead, if you just double-click in the Project Window you will save yourself a few steps. Simple, but I see it all the time.

Tilde key (`) to make full screen