Tag Archives: Adobe After Effects

Quick Chat: Creating graphics package for UN’s Equator Prize ceremony

Undefined Creative (UC) was recently commissioned by the United Nations Development Programme (UNDP) to produce a fresh package of event graphics for its Equator Prize 2017 Award Ceremony. This project is the latest in a series of motion design collaborations between the creative studio and the UN, a relationship that began when UC donated its skills to the Equator Prize in 2010.

The Equator Prize recognizes local and indigenous community initiatives from across the planet that are advancing innovative on-the-ground solutions to climate, environment and poverty challenges. Award categories honor achievement and local innovation in the thematic areas of oceans, forests, grasslands and wildlife protection.

For this year’s ceremony, UNDP wanted a complete refresh that gave the on-stage motion graphics a current vibe while incorporating the key icons behind its Sustainable Development Goals (SDGs). Consisting of a “Countdown to Ceremony” screensaver, an opening sequence, 15 winner slates, three category slates and 11 presenter slates, the package had to align visually with a presentation from the National Geographic Society, which was part of the evening’s program.

To bring it all together, UC drew from the SDG color palettes and relied on subject matter knowledge of both the UNDP and National Geographic in establishing the ceremony graphics’ overall look and feel. With only still photos available for the Equator Prize winners, UC created motion and depth by strategically intertwining the best shots with moving graphics and carefully selected stock footage. Naturally moving flora and fauna livened up the photography, added visual diversity and contributed to a unique aesthetic.

We reached out to Undefined Creative’s founder/creative director Maria Rapetskaya to find out more:

How early did you get involved in the project, and was the client open to input?
We got the call a couple of months before the event. The original show had been used multiple times since we created it in 2010, so the client was definitely looking for input on how we could refresh or even rebrand.

Any particular challenges for this one?
For non-commercial organizations, budgets and messaging are equally sensitive topics. We have to be conscious of costs, and also very aware of the do’s and don’ts when it comes to assets and their use. Our creative discussions took place over several calls, laying out options and ideas at different budget tiers — anything from simply updating the existing package to creating something entirely different. In the case of the latter, parameters had to be established right away for how different “different” could be.

For example, it was agreed that we should stick with photography provided by the 2017 award winners. Our proposal to include stock footage of flora and fauna, however, was approved by all involved. Which SDG icons would be used and how, what partner and UN organizational branding should be featured prominently as design inspiration, how this would integrate with content being produced for UNDP/Equator Prize by Nat Geo… all of these questions had to be addressed before we started any real ideation in order for the creative to stay on brand, on message, on budget and on time.

What tools did you use on the project?
We relied on Adobe CC — in particular After Effects, which is our staple software. On this project, we also relied heavily on stock from multiple vendors. Pond5 has a robust and cost-effective collection of the video elements we were seeking.

Why is this project important to you?
The majority of our clients are for-profit commercial entities, and while that’s wonderful, there’s always a different feeling of reward when we have the chance to do something for the good of humanity at large, however minuscule our contribution is. The winners come from such different corners of the globe — at times, very remote. They’re incredibly excited to be honored, on stage, in New York City, and we can only imagine what it feels like to see their faces, the faces of their colleagues and friends, the names of their projects, up on this screen in front of a large, live audience. This particular event brings us a lot closer to what we’re creating, on a really empathetic, human level.

Red Giant Trapcode Suite 14 now available

By Brady Betzel

Red Giant has released Trapcode Suite 14, an update to its Adobe After Effects-focused plug-in toolset, including new versions of Trapcode Particular and Trapcode Form as well as an update to Trapcode Tao.

The biggest updates seem to be in Red Giant’s flagship product Trapcode Particular 3. Trapcode Particular is now GPU accelerated through OpenGL with a proclaimed 4X speed increase over previous versions. The Designer has been re-imagined and seems to take on a more Magic Bullet-esque look and feel. You can now include multiple particle systems inside the same 3D space, which will add to the complexity and skill level needed to work with Particular.

You can now also load your own 3D model OBJ files as emitters in the Designer panel or use any image in your comp as a particle. There are also a bunch of new presets that have been added to start you on your Particular system building journey — over 210 new presets, to be exact.

Trapcode Form has been updated to Version 3 with the updated Designer, the ability to add 3D models and animated OBJ sequences as particle grids, the ability to load images for use as particles, a new graphing system for more precise control over the system, and over 70 presets in the Designer.

Trapcode Tao has been updated with depth of field effects to allow for that beautiful camera-realistic blur that really sets pro After Effects users apart.

Trapcode Particular 3 and Form 3 are paid updates, while Tao is a free update for existing users. If you want to update only Tao, make sure you select only Tao during the update; otherwise you will install new Trapcode plug-ins over your old ones.

Trapcode Particular 3 is available now for $399. The update is $149 and the academic version is $199. You can also get it as a part of the Trapcode Suite 14 for $999.

Trapcode Form 3 is available now for $199. The update is $99 and the academic version is $99. It can be purchased as part of the Trapcode Suite 14 for $999.

Check out the new Trapcode Suite 14 bundle.

 

Mocha VR: An After Effects user’s review

By Zach Shukan

If you’re using Adobe After Effects to do compositing and you’re not using Mocha, then you’re holding yourself back. If you’re using Mettle SkyBox, you need to check out Mocha VR, the VR-enhanced edition of Mocha Pro.

Mocha Pro and Mocha VR are standalone programs where you work entirely within the Mocha environment and then export your tracks, shapes or renders to another program to do the rest of the compositing work. There are plug-ins for Maxon Cinema 4D, The Foundry’s Nuke, HitFilm and After Effects that allow you to do more with the Mocha data within your chosen 3D or compositing program. Limited-feature versions of Mocha (Mocha AE and Mocha HitFilm) come installed with the Creative Cloud versions of After Effects and HitFilm 4 Pro, and every update of these plug-ins gets closer to looking like a full version of Mocha running inside of the effects panel.

Maybe I’m old school, or maybe I just try to get the maximum performance from my workstation, but I always choose to run Mocha VR by itself and only open After Effects when I’m ready to export. In my experience, all the features of Mocha run more smoothly in the standalone than when they’re launched and run inside of After Effects.**

How does Mocha VR compare to Mocha Pro? If you’re not doing VR, stick with Mocha Pro. However, if you are working with VR footage, you won’t have to bend over backwards to keep using Mocha.

Last year was the year of VR, when all my clients wanted to do something with VR. It was a crazy push to be the first to make something, and I rode the wave all year. The thing is, there really weren’t many tools specifically designed to work with 360 video. Now, this year, the post tools for working with VR are catching up.

In the past, I forced previous versions of Mocha to work with 360 footage, but since Mocha added its VR-specific features, stabilizing a 360 camera has become a piece of cake compared to the kludgy way it works with the industry-standard After Effects 360 plug-in, SkyBox. Also, I’d used Mocha to track objects in 360 before the addition of an equirectangular* camera, and it was super-complicated because I had to splice together a whole bunch of tracks to compensate for the 360 camera distortion. Now it’s possible to create a single track to follow objects as they travel around the camera. Read the footnote for an explanation of equirectangular, a fancy word you need to know if you’re working in VR.

Now let’s talk about the rest of Mocha’s features…

Rotoscoping
I used to rotoscope by tracing every few frames and then refining the frames in between, until I found out about the Mocha way to rotoscope. Because Mocha combines rotoscoping with the tracking of arbitrary shapes, all you have to do is draw a shape and then use tracking to follow and deform it all the way through the shot. It’s way smarter and, more importantly, faster. Also, with the Uberkey feature, you can adjust your shapes on multiple frames at once. If you’re still rotoscoping with After Effects alone, you’re doing it the hard way.

Planar Tracking
When I first learned about Mocha it was all about the planar tracker, and that really is still the heart of the program. Mocha’s basically my go-to when nothing else works. Recently, I was working on a shot where a woman had her dress tucked into her pantyhose, and I pretty much had to recreate a leg of a dress that swayed and flowed along with her as she walked. If it wasn’t for Mocha’s planar tracker I wouldn’t have been able to make a locked-on track of the soft-focus (solid color and nearly without detail) side of the dress. After Effects couldn’t make a track because there weren’t enough contrast-y details.

GPU Acceleration
I never thought Mocha’s planar tracking was slow, even though it is slower than point tracking, but then they added GPU acceleration a version or two ago and now it flies through shots. It has to be at least five times as fast now that it’s using my Nvidia Titan X (Pascal), and it’s not like my CPU was a slouch (an 8-core i7-5960X).

Object Removal
I’d be content using Mocha just to track difficult shots and for rotoscoping, but their object-removal feature has saved me hours of cloning/tracking work in After Effects, especially when I’ve used it to remove camera rigs or puppet rigs from shots.

Mocha’s remove module is the closest thing out there to automated object removal***. It’s as simple as 1) create a mask around the object you want to remove, 2) track the background that your object passes in front of, and then 3) render. Okay, there’s a little more to it, but compared to the cloning and tracking and cloning and tracking and cloning and tracking method, it’s pretty great. Also, a huge reason to get the VR edition of Mocha is that the remove module will work with a 360 camera.

Here I used Mocha object removal to remove ropes that pulled a go-cart in a spot for Advil.

VR Outside of After Effects?
I’ve spent most of this article talking about Mocha with After Effects, because it’s what I know best, but there is one VR pipeline that can match nearly all of Mocha VR’s capabilities: Cara VR, The Foundry’s plug-in for Nuke. That workflow comes at a cost, though; more on this shortly.

Where you will hit the limit of Mocha VR (and After Effects in general) is if you are doing 3D compositing with CGI and real-world camera depth positioning. Mocha’s 3D Camera Solve module is not optimized for 360 and the After Effects 3D workspace can be limited for true 3D compositing, compared to software like Nuke or Fusion.

While After Effects sort of tacked on its 3D features to its established 2D workflow, Nuke is a true 3D environment as robust as Autodesk Maya or any of the high-end 3D software. This probably sounds great, but you should also know that Cara VR is $4,300 vs. $1,000 for Mocha VR (the standalone + Adobe plugin version) and Nuke starts at $4,300/year vs. $240/year for After Effects.

Conclusion
I think of Mocha as an essential companion to compositing in After Effects, because it makes routine work much faster and it does some things you just can’t do with After Effects alone. Mocha VR is a major release because VR has so much buzz these days, but in reality it’s pretty much just a version of Mocha Pro with the ability to also work with 360 footage.

*Equirectangular is a clever way of unwrapping a 360 spherical projection (a.k.a. the view we see in VR) by flattening it out into a rectangle. It’s a great way to see the whole 360 view in an editing program, but (a) it’s very distorted, so it can cause problems for tracking, and (b) anything that is moving up or down in the equirectangular frame will wrap around to the opposite side (a bit like Pac-Man when he exits the screen), and non-VR tracking programs will stop tracking when something exits the screen on one side.
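
As a rough illustration of that mapping (my own sketch, not from the article), here is how a unit 3D view direction lands on an equirectangular frame; the function name and resolution are hypothetical:

```python
import math

def dir_to_equirect(x, y, z, width, height):
    # Longitude (left/right) spreads across the image width,
    # latitude (up/down) across the height. Assumes (x, y, z)
    # is a unit direction vector; y is up, z is straight ahead.
    lon = math.atan2(x, z)        # -pi .. pi
    lat = math.asin(y)            # -pi/2 .. pi/2
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v

# Looking straight ahead lands dead center in a 4K 360 frame:
print(dir_to_equirect(0, 0, 1, 3840, 1920))    # (1920.0, 960.0)
# Looking straight behind lands on the frame edge, where content
# wraps around to the other side (the Pac-Man effect above):
print(dir_to_equirect(0, 0, -1, 3840, 1920))   # (3840.0, 960.0)
```

The heavy stretching near the top and bottom of the frame (latitudes approaching ±90°) is exactly the distortion that trips up non-VR trackers.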

**Note: According to the developer, one of the main advantages of running Mocha as a plug-in (inside AE, Premiere, Nuke, etc.) for 360 video work is that you are using the host program’s render engine and proxy workflow. Having the ability to do all your tracking, masking and object removal at proxy resolutions is a huge benefit when working with large 360 formats that can be as big as 8K stereoscopic. Additionally, the Mocha modules that render, such as Reorient for horizon stabilization or the Remove module, will render inside the plug-in, making for a streamlined workflow.

***FayOut was a “coming soon” product that promised an even more automated method for object removal, but as of the publishing of this article it appears the company is no longer “coming soon” and may have folded, or perhaps its technology was purchased and will be included in a future product. We shall see…
________________________________________
Zach Shukan is the VFX specialist at SilVR and is constantly trying his hand at the latest technologies in the video post production world.

Adobe acquires Mettle’s SkyBox tools for 360/VR editing, VFX

Adobe has acquired all SkyBox technology from Mettle, a developer of 360-degree and virtual reality software. As more media and entertainment companies embrace 360/VR, there is a need for seamless, end-to-end workflows for this new and immersive medium.

The SkyBox toolset is designed exclusively for post production in Adobe Premiere Pro CC and Adobe After Effects CC, and complements Adobe Creative Cloud’s existing 360/VR cinematic production technology. Adobe will integrate SkyBox plugin functionality natively into future releases of Premiere Pro and After Effects.

To further strengthen Adobe’s leadership in 360-degree and virtual reality, Mettle co-founder Chris Bobotis will join Adobe, bringing more than 25 years of production experience to his new role.

“We believe making virtual reality content should be as easy as possible for creators. The acquisition of Mettle SkyBox technology allows us to deliver a more highly integrated VR editing and effects experience to the film and video community,” says Steven Warner, VP of digital video and audio at Adobe. “Editing in 360/VR requires specialized technology, and as such, this is a critical area of investment for Adobe, and we’re thrilled Chris Bobotis has joined us to help lead the charge forward.”

“Our relationship started with Adobe in 2010 when we created FreeForm for After Effects, and has been evolving ever since. This is the next big step in our partnership,” says Bobotis, now director, professional video at Adobe. “I’ve always believed in developing software for artists, by artists, and I’m looking forward to bringing new technology and integration that will empower creators with the digital tools they need to bring their creative vision to life.”

Introduced in April 2015, SkyBox was the first plugin to leverage Mettle’s proprietary 3DNAE technology, and its success quickly led to additional development of 360/VR plugins for Premiere Pro and After Effects.

Today, Mettle’s plugins have been adopted by companies such as The New York Times, CNN, HBO, Google, YouTube, Discovery VR, DreamWorks TV, National Geographic, Washington Post, Apple and Facebook, as well as independent filmmakers and YouTubers.

Nice Shoes Creative Studio animates limited-edition Twizzlers packages

Twizzlers and agency Anomaly recently selected 16 artists to design a fun series of limited edition packages for the classic candy. Each depicts various ways people enjoy Twizzlers. New York’s Nice Shoes Creative Studio, led by creative director Matt Greenwood, came on board to introduce these packages with an animated 15-second spot.

Three of the limited edition packages are featured in the fast-paced spot, bringing to life the scenarios of car DJing, “ugly crying” at the movies and studying in the library, before ending on a shot that incorporates all 16 packages. Each pack has its own style, characters and color scheme, unique to the original artists, and Nice Shoes was careful to preserve this as it crafted the spot.

“We were really inspired by the illustrations,” explains Greenwood. “We stayed close to the original style and brought them into a 3D space. There’s only a few seconds to register each package, so the challenge was to bring all the different styles and colors together within this time span. Select characters and objects carry over from one scene into the next, acting as transitional elements. The Twizzlers logo stays on-screen throughout, acting as a constant amongst the choreographed craziness.”

The Nice Shoes team used a balance of 3D and 2D animation, creating a CG pack while executing the characters on the packs with hand-drawn animation. Greenwood proposed taking advantage of the rich backgrounds that the artists had drawn, animating tiny background elements in addition to the main characters in order to “make each pack feel more alive.”

The main Twizzlers pack was modeled, lit, animated and rendered in Autodesk Maya, then composited in Adobe After Effects together with the supporting elements. These consisted of 2D hand-drawn animations created in Photoshop and 3D animated elements made with Maxon Cinema 4D.

“Once we had the timing, size and placement of the main pack locked, I looked at which shapes would make sense to bring into a 3D space,” says Greenwood. “For example, the pink ribbons and cars from the ‘DJ’ illustration worked well as 3D objects, and we had time to add touches of detail within these elements.”

The characters on the packs themselves were animated with After Effects and applied as textures within the pack artwork. “The flying books and bookcases were rendered with Sketch and Toon in Cinema 4D, and I like to take advantage of that software’s dynamics simulation system when I want a natural feel to objects falling onto surfaces. The shapes in the end mnemonic are also rendered with Sketch and Toon and they provide a ‘wipe’ to get us to the end lock-up,” says Greenwood.

The final step during the production was to add a few frame-by-frame 2D animations (the splashes or car exhaust trail, for example) but Nice Shoes Creative Studio waited until everything was signed off before they added these final details.

“The nature of the illustrations allowed me to try a few different approaches and as long as everything was rendered flat or had minimal shading, I could combine different 2D and 3D techniques,” he concludes.

Frame.io 2.0 offers 100 new features, improvements for collaboration

Frame.io, developer of the video review and collaboration platform for content creators, has unveiled Frame.io 2.0, an upgrade offering over 100 new features and improvements. The new version features client Review Pages, which expand content review and sharing. In addition, the release offers deeper workflow integration with Final Cut Pro X and Avid Media Composer, plus a completely re-engineered player.

“Frame.io 2 is based on everything we’ve learned from our customers over the past two years and includes our most-requested features,” says Emery Wells, CEO of Frame.io.

Just as internal teams can collaborate using Frame.io’s comprehensive annotation and feedback tools, clients can now provide detailed feedback on projects with Review Pages, which are designed to make the sharing experience simple, with no log-in required.

Review Pages give clients the same commenting ability as collaborators, without exposing them to the full Frame.io interface. Settings are highly configurable to meet specific customer needs, including workflow controls (approvals), security (password protection, setting expiration date) and communication (including a personalized message for the client).

The Review Pages workflow simplifies the exchange of ideas, consolidating feedback in a succinct manner. For those using Adobe Premiere or After Effects, those thoughts flow directly into the timeline, where you can immediately take action and upload a new version. Client Review Pages are also now available in the Frame.io iOS app, allowing collaboration via iPhones and iPads.

Exporting and importing comments and annotations into Final Cut Pro X and Media Composer has gotten easier with the upgraded, free desktop companion app, which allows users to open downloaded comment files and bring them into the editor as markers. There is now no need to toggle between Frame.io and the NLE.

Users can also now copy and paste comments from one version to another. The information is exportable in a variety of formats, whether that’s a PDF containing a thumbnail, timecode, comment, annotation and completion status that can be shared and reviewed with the team or as a .csv or .xml file containing tons of additional data for further processing.

Also new to Frame.io 2.0 is a SMPTE-compliant source timecode display that works with both non-drop and drop-frame timecode. Users can now download proxies straight from Frame.io.
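
For context (my own sketch, not from the Frame.io announcement): drop-frame timecode at 29.97fps skips the numbers ;00 and ;01 at the start of every minute, except every tenth minute, so the displayed clock stays in step with wall-clock time. The conversion from a frame count can be sketched in Python; the function name is hypothetical:

```python
def frames_to_dropframe(frame):
    # 29.97fps drop-frame: timecode numbers ;00 and ;01 are skipped
    # at the start of every minute except minutes divisible by ten,
    # dropping 108 numbers per hour (no actual frames are dropped).
    fps, drop = 30, 2
    per_10min = 10 * 60 * fps - 9 * drop   # 17982 real frames per 10 minutes
    per_min = 60 * fps - drop              # 1798 real frames per dropped minute
    tens, rem = divmod(frame, per_10min)
    frame += drop * 9 * tens               # numbers skipped in full 10-min blocks
    if rem > drop:
        frame += drop * ((rem - drop) // per_min)
    f = frame % fps
    s = (frame // fps) % 60
    m = (frame // (fps * 60)) % 60
    h = frame // (fps * 3600)
    return f"{h:02d}:{m:02d}:{s:02d};{f:02d}"

print(frames_to_dropframe(1800))   # 00:01:00;02 -- ;00 and ;01 were skipped
```

Non-drop timecode, by contrast, is a plain running count, which is why a display that handles both (conventionally marked with “;” vs. “:” separators) matters for broadcast delivery.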

The Frame.io 2.0 player page now offers better navigation, efficiency and accountability. New “comment heads” let artists see at a glance who left a comment and where, so they can quickly find and prioritize feedback on any given project. Users can also preview the next comment, saving them time when one comment affects another.

The new looping feature, targeting motion and VFX artists, lets users watch the same short clip on loop. You can even select a range within a clip to really dive in deep. Frame.io 2.0’s asset slider makes it easy to navigate between assets from the player page.

The new Frame.io 2.0 dashboard has been redesigned for speed and simplicity. Users can manage collaborators for any given project from the new collaborator panel, where adding an entire team to a project takes one click. A simple search in the project search bar makes it easy to bring up a project. The breadcrumb navigation bar tracks every move deeper into a sub-sub-subfolder, helping artists stay oriented when getting lost in their work. The new list view option with mini-scrub gives users a bird’s-eye view of everything happening in Frame.io 2.0.

Copying and moving assets between projects takes up no additional storage, even when users make thousands of copies of a clip or project. Frame.io 2.0 also now offers the ability to publish direct to Vimeo, with full control over publishing options, so pros can create the description and set privacy permissions, right then and there.

Behind the Title: Artist/Creative Director Barton Damer

NAME: Barton Damer

COMPANY: Dallas-based Already Been Chewed

CAN YOU DESCRIBE YOUR COMPANY?
Already Been Chewed is a boutique studio that I founded in 2010. We have created a variety of design, motion graphics and 3D animated content for iconic brands, including Nike, Vans, Star Wars, Harry Potter and Marvel Comics. Check out our motion reel.

WHAT’S YOUR JOB TITLE?
Owner/Founding Artist/Creative Director

WHAT DOES THAT ENTAIL?
My job is to set the vibe for the types of projects, clients and style of work we create. I’m typically developing the creative, working with our chief strategy officer to land projects and then directing the team to execute the creative for the project.

WHAT WOULD SURPRISE PEOPLE ABOUT WHAT FALLS UNDER THAT TITLE?
When you launch out on your own, it’s surprising how much non-creative work there is to do. It’s no longer good enough to be great at what you do (being an artist). Now you have to be excellent with communication skills, people skills, business, organization, marketing, sales and leadership skills. It’s surprising how much you have to juggle in the course of a single day and still hit deadlines.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Developing a solution that will not only meet the client’s needs but also push us forward as a studio is always exciting. My favorite part of any job is making sure it looks amazing. That’s my passion. The way it animates is secondary. If it doesn’t look good to begin with, it won’t look better just because you start animating it.

WHAT’S YOUR LEAST FAVORITE?
Dealing with clients that stress me out for various reasons, whether it’s because they are scope creeping, not realizing that they signed a contract… or not paying a bill. Fortunately, I have a team of great people that help relieve that stress for me, but it can still be stressful knowing that they are fighting those battles for the company. We get a lot of clients who will sign a contract without even realizing what they agreed to. It’s always stressful when you have to remind them what they signed.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Night time! That’s when the freaks come out! I do my best creative at night. No doubt!

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Real estate investing/fixing up/flipping. I like all aspects of designing, including interior design. I’ve designed and renovated three different studio spaces for Already Been Chewed over the last seven years.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I blew out my ACL and tore my meniscus while skateboarding. I wanted to stay involved with my friends that I skated with knowing that surgery and rehab would have me off the board for at least a full year. During that time, I began filming and editing skate videos of my friends. I quickly discovered that the logging and capturing of footage was my least favorite part, but I loved adding graphics and motion graphics to the skate videos. I then began to learn Adobe After Effects and Maxon Cinema 4D.

At that time I was already a full-time graphic designer, but didn’t even really know what motion graphics were. I had been working professionally for about five or six years before making the switch from print design to animation. That was after dabbling in Flash animations and discovering I didn’t want to code websites (this was around 2003-2004).

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We recently worked with Nike on various activations for the Super Bowl and March Madness, and got to create motion graphics for storefronts as part of the Equality campaign Nike launched during Black History Month. It was cool to see our work in the flagship Niketown NYC store while visiting New York a few weeks ago.

We are currently working on a variety of projects for Nike, Malibu Boats, Training Mask, Marvel and DC Comics licensed product releases, as well as investing heavily in GPUs and creating 360 animated videos for VR content.

HOW DID THE NIKE EQUALITY MOTION GRAPHICS CAMPAIGN COME TO FRUITION?
Nike had been working on a variety of animated concepts to bring the campaign to life for storefronts. They had a library of animation styles that had already been done that they felt were not working. Our job was to come up with something that would benefit the campaign style.

We recreated 16 athlete portraits in 3D so that we could cast light and shadows across their faces to slowly reveal them from black, and also created a seamless video loop transitioning between the athlete portraits and various quotes about equality.

CAN YOU DESCRIBE THE MOTION GRAPHICS SCOPE OF THE NIKE EQUALITY CAMPAIGN, AND THE SOFTWARE USED?
The video we created was used in various Nike flagship stores — Niketown NYC, Soho and LA, to name a few. We reformatted the video to work in a variety of sizes. We were able to see the videos at Niketown NYC where it was on the front of the window displays. It was also used on large LED walls on the interior as well as a four-story vertical screen in store.

We created the portrait technique on all 16 athletes using Cinema 4D and Octane. The remainder of the video was animated in After Effects.

The portraits were sculpted in Cinema 4D, and we used camera projection to accurately project real photos of the athletes onto the 3D portraits. This allowed us to keep 100 percent accuracy to the photos Nike provided, while still being able to re-light and cast shadows to reveal the faces up from black.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
That’s a tough one. Usually, it’s whatever the latest project is. We’re blessed to be working on some really fun projects. That being said… working on the Vans 50th Anniversary campaign for the Era shoe is pretty epic! Especially since I am a longtime skateboarder.

Our work was used globally on everything from POP displays to storefronts to interactive Website takeover and 3D animated spots for broadcast. It was amazing to see it being used across so many mediums.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
A computer, my iPhone and speakers!

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I’m very active on Instagram and Facebook. I chose to say “no” to Snapchat in hopes that it will go away so that I don’t have to worry about one more thing (he laughs), and Twitter is pretty much dead for me these days. I log in once a month and see if I have any notifications. I also use Behance and LinkedIn a lot, and Dribbble once in a blue moon.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? IF SO, WHAT KIND?
My 25-year-old self would cyber bully me for saying this but soft Drake is “Too Good” these days. Loving Travis Scott and Migos among a long list of others.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
First I bought a swimming pool to help me get away from the computer/emails and swim laps with the kids. That worked for a while, but then I bought a convertible BMW to try to ease the tension and enjoy the wind through my hair. Once that wore off and the stress came back, I bought a puppy. Then I started doing yoga. A year later I bought another puppy.

Quick Chat: Emery Wells discusses Frame.io for Adobe After Effects

By Randi Altman

Frame.io is a cloud-based video collaboration tool that was designed to combine the varied ways pros review and approve projects — think Dropbox, Vimeo or email. Frame.io allows you to create projects and add collaborators and files to share in realtime.

They are now offering integration with Adobe’s After Effects that includes features like realtime comments and annotations that sync to your comp, the ability to import comments and annotations into your comp as live shape layers, and uploads of project files and bins.

To find out more, I reached out to Frame.io’s co-founder/CEO Emery Wells.

You just launched a panel for Adobe After Effects. Why was this the next product you guys targeted?
We launched our first Adobe integration, with Premiere Pro, this past NAB. It was a huge amount of work to rebuild all the Frame.io collaboration features for the Adobe extension architecture, but it was worth the effort. The response to the Premiere integration was one of the best and biggest we have received. After Effects is Premiere’s best friend. It’s the workhorse of the post industry. From complex motion graphics and visual effects to simple comps and title sequences, After Effects is one of the key tools video pros rely on, so we knew we had to extend all of those capabilities into AE.

Can you discuss the benefits users get from this panel?
Workflow is often one of the biggest frustrations any post pro faces. You really just want to focus on making cool stuff, but inevitably that requires wrangling renders, uploading files everywhere, collecting feedback and generally just doing a bunch of stuff that has nothing to do with what you’re good at and what you enjoy. Frame.io for Adobe After Effects allows you to focus on the work you do well in the tool you use to do it. When you need to get feedback from someone, just upload your comp to Frame.io from within AE. Those people will immediately get a notification via email or their phone, and they can start leaving feedback immediately. That feedback then flows right back into your comp where you’re doing the work.

We just cut out all the inefficient steps in between. What it really provides, more than anything else, is rapid iteration. The absolute best work only comes through that creative iteration. We never nail something on our first try. It’s the 10th try, the 50th try. Being able to try things quickly and get feedback quickly not only saves time and money, but will actually produce better work.

Will there be more Adobe collaboration offerings to come?
The way we built the panel for Premiere and After Effects actually uses the entire Frame.io web application codebase. It essentially just has a different skin on it so it feels native to Adobe apps. That means all the updates we do to the core web application get inherited by Premiere and After Effects, so there will be many more features to come.

Not long ago Frame.io got a huge infusion of cash thanks to some heavy-hitter investors. How has this changed the way you guys work?
It’s allowing us to move faster and in parallel. We’ve now shipped four really unique products in about a year and a half: the core web app, the Apple Design Award-winning iOS app, the full experiences that live inside Premiere and AE, and our desktop companion app that integrates with Final Cut Pro X. All these products require considerable resources to maintain and push forward, so the capital infusion will allow us to continue building a complete ecosystem of apps that all work together to solve the most essential creative collaboration challenges.

What’s next for Frame.io?
The integrations are a really key part of our strategy, and you’ll see more of them moving forward. We want to embed Frame.io as deeply as we can in the creative apps so it just becomes a seamless part of your experience.

Check out this video for more:

Review: The HP Z1G3 All-in-One workstation

By Brady Betzel

I’ll admit it. I’ve always been impressed with HP’s workstations — from the Z840 to the ZBook mobile workstation and now the HP Z1G3 All-in-One. Yes, I know, the HP line of workstations is not cheap. In fact, you can save quite a bit of money building your own system, but you will probably have tons of headaches unless you are very confident in your computer-building skills, and don’t mind standing in the return line at Fry’s Electronics.

HP spends tons of time and money on ISV (Independent Software Vendor) certifications for their workstations. In plain English, that means making sure the hardware inside your workstation works with the software you use. For an industry pro, that means apps like Adobe’s Premiere Pro and After Effects, Avid Media Composer, Autodesk products like 3ds Max and many others.

For this review, I tested apps like Avid Media Composer, FilmLight’s Baselight for Media Composer color correction plug-in, Adobe Premiere Pro, Adobe Media Encoder and Adobe After Effects, as well as Blackmagic’s Resolve 12.5.2, which chewed through basic color correction. In terms of testing time, I typically keep a review computer system for a couple of months, but with this workstation I really wanted to test it as thoroughly as possible — I’ve had the workstation for three months and counting, and I’ve been running the system through all the appropriate paces.

I always love to review workstations like the HP Z1G3 because of the raw power they possess. While HP sent me one of the top-of-the-line Z1G3 configurations, which retails for a list price of $3,486, the line has a pretty reasonable starting price of $1,349. From Intel i3, i5 and i7 configurations all the way up to the almighty Intel Xeon, the HP Z1G3 can be customized to fit into your workflow, whether you just need to check your email or color correct video from your GoPro.

Here are the specs that make up the HP Z1G3 All-in-One workstation I received:

● 23.6-inch UHD/4K non-glare and non-touch display (3840×2160)
● Intel Xeon E3-1270 v5 CPU, 3.6GHz (4 Cores / 8 Threads)
● 64GB DDR4 SODIMM 2133MHz (4 x 16GB)
● Nvidia Quadro M2000M graphics (4GB)
● Two Z Turbo drives (512GB, PCIe M.2)
● Wireless keyboard and mouse
● Two Thunderbolt 3/USB 3.1 ports
● USB charging port
● Media card reader
● DisplayPort out

As I mentioned earlier, I tested the Z1G3 with many different apps, but recently I’ve been diving deeper into color correction, and luckily for my testing this fits right in. A few of the most strenuous real-world tests for computer systems are running 3D modeling apps like Maxon Cinema 4D and color correction suites like Resolve. Of course, apps like After Effects are great tests as well, but adding nodes on nodes on nodes in Resolve will really tax your CPU, as well as your GPU.

One thing that can really set apart high-end systems like the Z1G3 is the delay when using a precision color correction panel like Tangent’s Elements or Ripple. Sometimes you will move one of the color wheel balls and half a second later the color wheel moves on screen. I tried adding a few clips and nodes on the timeline, and when using the panels I noticed no discernible delay (at least no more than I would expect). While this isn’t a scientific test, it is crucial for folks looking to plug in external devices.

For more scientific tests I stuck to apps like Cinebench from Maxon, AJA’s System Test and Blackmagic’s Disk Speed Test. In Cinebench, the Z1G3 ranked at the top of the list when compared to similar systems. In AJA’s System Test I tested the read/write speed of the non-OS drive (basically the editing or cache drive). It sustained around 1520MB/s read and 1490MB/s write. I say around because I couldn’t get the AJA app to display the full read/write numbers because of the high-resolution display scaling in Windows; I tried scaling down to 1920×1080, but no luck. In Blackmagic’s Disk Speed Test, I was running at 1560MB/s read and 1497.3MB/s write. The drive I ran this test on is HP’s version of the M.2 PCIe SSD powered by Samsung, more affectionately known by HP as a Z Turbo drive. The only thing better at the moment would be a bunch of these drives arranged in a RAID-0 configuration. Luckily, you can do that through the Thunderbolt 3 port with some spare SSDs you have lying around.

Almost daily I ran Premiere Pro CC, Media Encoder and Resolve Studio 12.5.2. I was really happy with the performance in Premiere. When working with QuickTimes in inter-frame codecs like H.264 and AVC-HD (non-edit-friendly codecs), I was able to work without too much stuttering in the timeline. When I used intra-frame codecs like ProRes HQ from Blackmagic’s Pocket Cinema Camera, Premiere worked great. I even jumped into Adobe’s Lumetri color tools while using Tangent’s Ripple external color correction panel, and it worked with little discernible delay. I did notice that Premiere had a little more delay when using the external color correction panel than Media Composer and Resolve, but that seemed to be more of a software problem than a workstation problem.

One of my favorite parts about using a system with an Nvidia graphics card, especially a Quadro card like the M2000M, is the ability to encode multiple versions of a file at once. Once I was done editing some timelapses in Premiere, I exported using Media Encoder. I would apply three presets I made: one square 600×600 H.264 for Instagram, one 3840×2160 H.264 for YouTube and an Animated GIF at 480×360 for Twitter. Once I told Media Encoder to encode, it ran all three exports concurrently — a really awesome feature. With the Nvidia Quadro card installed, it really sped along the export.

Media Composer
Another app I wanted to test was Media Composer 8.6.3. Overall, Media Composer ran great, except for issues with the high-resolution display. As I’ve said in previous reviews, this isn’t really the fault of HP, but more of the software manufacturers who haven’t updated their interfaces to adapt to the latest UHD displays. I had filmed a little hike I took with my five-year-old. I gave him a GoPro while I had my own. Once we got the footage back home, I imported it into Media Composer, grouped the footage and edited it using the multi-cam edit workflow.

Simply put, the multi-camera split was on the left and the clip I had in the sequence was playing simultaneously on the right. Before I grouped the footage into a multi-group, I transcoded the H.264s into DNxHD 175, an intra-frame, edit-friendly codec. The transcode was nearly realtime, so it took 60 minutes to transcode a 60-minute H.264 — which is not bad. In the end, I was able to edit the two-camera multi-group at 1920×1080 resolution with only minor hiccups. Occasionally, I would get caught in fast forward for a few extra seconds when J-K-L editing, but nothing that made me want to throw my keyboard or mouse against the wall.
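
That near-realtime figure is easy to sanity-check with simple arithmetic. A quick sketch (my own illustration, not part of any Avid tooling) of transcode time as a function of clip length and transcode speed, expressed as a ratio of realtime:

```python
# Estimate transcode time from clip length and transcode speed.
# realtime_ratio is speed relative to playback: 1.0 = realtime, 2.0 = twice realtime.
def transcode_minutes(source_minutes: float, realtime_ratio: float = 1.0) -> float:
    return source_minutes / realtime_ratio

print(transcode_minutes(60))       # a 60-minute H.264 at realtime -> 60.0 minutes
print(transcode_minutes(60, 2.0))  # the same clip at 2x realtime -> 30.0 minutes
```

So a full realtime transcode of an hour of footage costs you an hour, which is worth budgeting for before a multi-cam edit session.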

Once done editing, I installed the FilmLight color correction plug-in for Media Composer. I had a really awesome experience coloring using Baselight in Media Composer on the Z1G3. I didn’t have any slowdowns, and the relationship between using the color correction panel and Baselight was smooth.

Resolve
The last app I tested on HP’s Z1G3 All-in-One workstation was Blackmagic’s Resolve 12.5.2. Much like my other tests, I concentrated on color correction with the Tangent Ripple and Element-Vs iOS app. I had four or five nodes going in the color correction page before I started to see a slowdown. I was using the native H.264 and ProRes HQ files from the cameras, so I didn’t make it easy for Resolve, but it still worked. Once I added a little sharpening to my clips, the HP Z1G3’s fans really kicked into gear; I heard their faint hum, which up until this point hadn’t been audible. This is also where the system started to become sluggish.

Summing Up
The Z1G3 is one of my favorite workstations, period. A while ago, I reviewed the previous All-in-One workstation from HP, the Z1G2, and at the time it was my favorite. One of my few complaints was that, while it was easy to service, it was very heavy and bulky. When I opened the Z1G3 box, I immediately noticed how much lighter and more streamlined the design was. It almost felt like they took away 50 percent of the bulk, which is something I really appreciate. I can tell that one of the main focuses with the Z1G3 was minimizing its footprint and weight while increasing the power. HP really knocked it out of the park.

One of the only things I wish were different on the Z1G3 I tested is the graphics card. While the Nvidia Quadro M2000M is a great graphics card, it is a “mobile” version of a Quadro, which has 128 fewer CUDA cores and 26GB/s less memory bandwidth than its desktop equivalent, the M2000. I would love the option of a full-sized Quadro instead of the mobile version, but I also understand that power consumption would go up, as would the form factor, so maybe I’ll give HP a pass here.

In the end, I know everyone reading this review is saying to themselves, “I love my iMac, so why would I want the HP Z1G3?” If you are a die-hard Apple user, or you just saw the new Microsoft Surface Studio announcement, then it might be a hard sell, but I love both Windows- and Mac OS-based systems, and the Z1G3 is awesome. What’s even more awesome is that it is easily upgradeable. I took off the back cover, and with a simple switch I could have added a 2.5-inch hard drive or two in under a minute. If you are looking for a powerful new workstation that stands up to Resolve and Premiere Pro CC, the HP Z1G3 is for you.


Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Review: Red Giant’s Universe 2

By Brady Betzel

Throughout 2016, we have seen some interesting acquisitions in the world of post production software and hardware — Razer bought THX, Blackmagic bought Ultimatte and Fairlight and Boris FX bought GenArts, to name a few. We’ve also seen a tremendous consolidation of jobs. Editors are now being tasked as final audio mixers, final motion graphics creators, final colorists and much more.

Personally, I love doing more than just editing, so knowing tools like Adobe After Effects and DaVinci Resolve, in addition to Avid Media Composer, has really helped me become not only an editor but someone who can jump into After Effects or Resolve and do good work.

Unfortunately, for some people it is the nature of the post beast to know everything. Plug-ins play a gigantic part in balancing my workload, available time and the quality of the final product. If I didn’t have plug-ins like Imagineer’s Mocha Pro, Boris FX’s Continuum Complete, GenArts’ Sapphire and Red Giant’s Universe 2, I would be forced to turn down work because the time it would take to create a finished piece would outweigh the fee I would be able to charge a client.

A while back, I reviewed Red Giant’s Universe when it was in version 1 (check it out here). In the beginning, Universe allowed for lifetime, annual and free memberships. It seems the belt has tightened a little for Red Giant, as Universe 2 is now $99 a year or $20 a month, with a 14-day free trial. No permanent free version or lifetime memberships are offered (if you downloaded the free Universe before June 28, you will still be able to access those free plug-ins in the Legacy group). Moreover, they have doubled the monthly fee from $10 to $20 — definitely trying to get everyone onto the annual subscription train.

Personally, I think this resulted from too much focus on the broad Universe, trying to jam in as many plug-ins/transitions/effects as possible and not working on specific plug-ins within Universe. I actually like the renewed focus of Red Giant toward a richer toolset as opposed to a full toolset.

Digging In
Okay, enough of my anecdotal narrative and on to some technical awesomeness. Red Giant’s Universe 2 is a vast plug-in collection that is compatible with Adobe’s Premiere Pro and After Effects CS6-CC 2015.3; Apple Final Cut Pro X 10.0.9 and later; Apple Motion 5.0.7 and later; Vegas 12 and 13; DaVinci Resolve 11.1 and later; and HitFilm 3 and 4 Pro. You must have a compatible GPU installed, as Universe does not have a CPU fallback for unsupported machines. Basically, you need a GPU with 2GB of memory or more, and don’t forget about Intel, as its integrated graphics support has improved a lot lately. For more info on OS compatibility and specific GPU requirements, check out Red Giant’s compatibility page.

Universe 2 is loaded with great plug-ins that, once you dig in, you will want to use all the time. For instance, I really like the ease of use of Universe’s RGB Separation and Chromatic Glow. If you want a full rundown of each and every effect, you should download the Universe 2 trial and check this out. In this review I am only going to go over some of the newly added plug-ins — HUD Components, Line, Logo Motion and Color Stripe — but remember there are a ton more.

I will be bouncing around different apps like Premiere Pro and After Effects. Initially, I wanted to see how well Universe 2 worked inside of Blackmagic’s DaVinci Resolve 12.5.2. Resolve gave me a little trouble at first; it began by crashing once I clicked on OpenFX in the Color page. I rebooted completely and got an error message that OpenFX had been disabled. I did a little research (and by research I mean I typed “Disabled OpenFX Resolve” into Google) and stumbled on a post on Blackmagic’s forum that suggested deleting “C:\ProgramData\Blackmagic Design\Davinci Resolve\Support\OFXPluginCache.xml” might fix it. Once I deleted that file and relaunched Resolve, I clicked on the OpenFX tab in the Color page, waited 10 minutes, and it started working. From that point on it loaded fast. So, barring the Resolve installation hiccup, there were no problems installing in Premiere and After Effects.
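
That forum fix amounts to deleting a single cache file so Resolve rescans its OpenFX plug-ins on the next launch. A minimal sketch of the same step in Python (my own illustration, using the Windows path quoted in the forum post; on other platforms or non-default installs the file will live elsewhere, so treat the path as an assumption):

```python
# Clear DaVinci Resolve's OpenFX plug-in cache so it rebuilds on relaunch.
# Path is the Windows location cited in the forum post (an assumption here);
# nothing is deleted if the file isn't present.
import os

cache = r"C:\ProgramData\Blackmagic Design\Davinci Resolve\Support\OFXPluginCache.xml"

if os.path.exists(cache):
    os.remove(cache)
    print("Deleted OpenFX plug-in cache; Resolve will rescan plug-ins on next launch.")
else:
    print("Cache file not found; nothing to delete.")
```

Quit Resolve before removing the cache, and expect the first relaunch to take a while as the plug-in list is rebuilt.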

Once installed, you will notice that Universe has a few folders inside your plug-in drop-down: Universe Blur, Universe Distort, Universe Generators, Universe Glow, Universe Legacy, Universe Motion Graphics, Universe Stylize and Universe Utilities. You may recognize some of these if you have used an earlier version of Universe, but something you will not recognize is that each Universe plug-in now has a “uni.” prefix.

I am still not sure whether I like this or hate this. On one hand, it’s easy to search for if you know exactly what you want in apps like Premiere. On the other hand, it runs counter to what I am used to as a grouchy old editor. In the end, I decided to run my tests in After Effects and Premiere. Resolve is great, but for tracking a HUD in 3D space I was more comfortable in After Effects.

HUD Components
First up is HUD Components, located under the Universe Motion Graphics folder and labeled “uni.HUD Components.” What used to take many Video CoPilot tutorials and many inspirational views of HUD/UI master Jayse Hansen’s (@jayse_) work now takes me minutes thanks to the new HUD Components. Obviously, making anything on the level of a master like Jayse Hansen will take hundreds of hours and thousands of attempts, but still — with Red Giant’s HUD Components you can make those sci-fi in-helmet elements quickly.

When you apply HUD Components to a solid layer in After Effects you can immediately see the start of your HUD. To see what the composite over my footage would look like, I went to change the blend mode to Add, which is listed under “Composite Settings.” From there you can see some awesome pre-built looks under the Choose a Preset button. The pre-built elements are all good starting points, but I would definitely dive further into customizing, maybe layer multiple HUDs over each other with different Blend Modes, for example.

Diving further into HUD Components, there are four separate “Elements” that you can customize, each with different images, animations, colors, clone types and much more. One thing to remember is that the transformation settings and order of operations work from the top down. For instance, if you change the rotation on element one, it will affect each element under it, which is kind of handy if you ask me. Once you get the hang of how HUD Components works, it is really easy to make some unique UI components. I really like to use the uni.Point Zoom effect (listed under Universe Glow in the Effects & Presets); it gives you a sort of projector-like effect with your HUD component.

There are so many ways to use and apply HUD Components in everyday work, from building dynamic lower thirds with all of the animatable arcs, clones and rotations to building sci-fi elements, applying Holomatrix to it and even Glitch to create awesome motion graphics elements with multiple levels of detail and color. I did try using HUD Components in Resolve when tracking a 3D object but couldn’t quite get the look I wanted, so I ditched it and used After Effects.

Line
Second up is the Line plug-in. While drawing lines along a path in After Effects isn’t necessarily hard, it’s kind of annoying — think having to make custom map graphics to and from different places daily. Line takes the hard work out of making line effects to and from different points. This plug-in also carries the uni. prefix and is located under Universe Motion Graphics, labeled uni.Line.

This plug-in is very simple to use and animate. I quickly found a map, applied uni.Line, placed my beginning and end points, animated the line using two keyframes under “Draw On” and bam! I had an instant travel-vlog-style graphic that showed me going from California to Australia in under three minutes (yes, I know three minutes seems a little fast to travel to Australia, but that’s really how long it took, render and all). Under the Effect Controls you can find preset looks, beginning and ending shape options like circles or arrows, line types, segmented lines and curve types. You can even move the peak of the curve under the bezier style option.

Logo Motion
Third is Logo Motion, located under Universe Motion Graphics titled uni.LogoMotion. In a nutshell you can take a pre-built logo (or anything for that matter), pre-compose it, throw the uni.LogoMotion effect on top, apply a preset reveal, tweak your logo animation, dynamically adjust the length of your pre-comp — which directly affects the logo’s wipe on and off — and, finally, render.

This is another plug-in that makes my life as an editor who dabbles in motion graphics really easy. Red Giant even included some lower third animation presets that help create dynamic lower third movements. You can select from some of the pre-built looks, add some motion while the logo is “idle,” adjust things like rotation, opacity and blur under the start and end properties, and even add motion blur. The new preset browser in Universe 2 really helps with plug-ins like Logo Motion where you can audition animations easily before applying them. You can quickly add some life to any logo or object with one or two clicks; if you want to get detailed you can dial in the idle animation and/or transition settings.

Color Stripe
Fourth is Color Stripe, a transition that uses color layers to wipe across and reveal another layer. This one is a pretty niche use case, but it is still worth mentioning. In After Effects, transitions are generally a little cumbersome; I found the Universe 2 transitions infinitely easier to use in NLEs like Adobe Premiere. From the always-popular swish pan to exposure blur, there are some transitions you might use once and some you might use a bunch. Color Stripe is a transition that you probably won’t want to use too often, but when you do need it, it will be right at your fingertips. You can choose from different color schemes like analogous or tetradic, or even create a custom scheme to match your project.

In the end, Universe 2 has some effects that are essential once you begin using them, like uni.Unmult, uni.RGB Separation and the awesome uni.Chromatic Glow. The new ones are great too; I really like the ease of use of uni.HUD Components. Since these effects are GPU accelerated, you might be surprised at how fast and fluidly they work in your project without slowdowns. For anyone who likes apps like After Effects but can’t afford to spend hours dialing in the perfect UI interface and HUD, Universe 2 is perfect for you. Check out all of the latest Red Giant Universe 2 tools here.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.