Tag Archives: Adobe After Effects

Promoting a Mickey Mouse watch without Mickey

Imagine creating a spot for a watch that celebrates the 90th anniversary of Mickey Mouse — but you can’t show Mickey Mouse. Already Been Chewed (ABC), a design and motion graphics studio, developed a POV concept that met this challenge and also tied in the design of the actual watch.

Nixon, a California-based premium watch company that is releasing a series of watches around the Mickey Mouse anniversary, called on Already Been Chewed to create the 20-second spot.

“The challenge was that the licensing arrangement that Disney made with Nixon doesn’t allow Mickey’s image to be in the spot,” explains Barton Damer, creative director at Already Been Chewed. “We had to come up with a campaign that promotes the watch and has some sort of call to action that inspires people to want this watch. But, at the same time, what were we going to do for 20 seconds if we couldn’t show Mickey?”

After much consideration, Damer and his team developed a concept to determine if they could push the limits on this restriction. “We came up with a treatment for the video that would be completely point-of-view, and the POV would do a variety of things for us that were working in our favor.”

The solution was to show Mickey’s hands and feet without actually showing the whole character. In another scene, a silhouette of Mickey appears in the shadows on a wall, sending a clear message to viewers that the spot is an official Disney and Mickey Mouse release and not just something that was inspired by Mickey Mouse.

Targeting the appropriate consumer demographic segment was another key issue. “Mickey Mouse has long been one of the most iconic brands in the history of branding, so we wanted to make sure that it also appealed to the Nixon target audience and not just a Disney consumer,” Damer says. “When you think of Disney, you could brand Mickey for children or you could brand it for adults who still love Mickey Mouse. So, we needed to find a style and vibe that would speak to the Nixon target audience.”

The Already Been Chewed team chose surfing and skateboarding as dominant themes, since 16- to 30-year-olds are the target demographic and also because Disney is a West Coast brand.
Damer comments, “We wanted to make sure we were creating Mickey in a kind of 3D, tangible way, with more of a feature film and 3D feel. We felt that it should have a little bit more of a modern approach. But at the same time, we wanted to mesh it with a touch of the old-school vibe, like 1950s cartoons.”

In that spirit, the team wanted the action to start with Mickey walking from his car and then culminate at the famous Venice Beach basketball courts and skate park.

“The challenge, of course, is how to do all this in 15 seconds so that we can show the logos at the front and back and a hero image of the watch. And that’s where it was fun thinking it through and coming up with the flow of the spot and seamless transitions with no camera cuts or anything like that. It was a lot to pull off in such a short time, but I think we really succeeded.”

Already Been Chewed achieved these goals with an assist from Maxon’s Cinema 4D and Adobe After Effects. With Damer as creative lead, here’s the complete cast of characters: head of production Aaron Smock; 3D design by Thomas King, Barton Damer, Bryan Talkish and Lance Eckert; animation by Bryan Talkish and Lance Eckert; character animation by Chris Watson; and soundtrack by DJ Sean P.

Adobe updates Creative Cloud

By Brady Betzel

You know it’s almost fall when pumpkin spice lattes are back and Adobe announces its annual updates. At this year’s IBC, Adobe had a variety of updates to its Creative Cloud line of apps. From more info on its new editing platform Project Rush to the addition of Characterizer to Character Animator — there are a lot of updates, so I’m going to focus on a select few that I think really stand out.

Project Rush

I use Adobe Premiere quite a lot these days; it’s quick and relatively easy to use and will work with pretty much every codec in the universe. In addition, the Dynamic Link between Adobe Premiere Pro and Adobe After Effects is an indispensable feature in my world.

With the 2018 fall updates, Adobe Premiere will be closer to a color tool like Blackmagic’s Resolve with the addition of new hue saturation curves in the Lumetri Color toolset. In Resolve these are some of the most important aspects of the color corrector, and I think that will be the same for Premiere. From Hue vs. Sat, which can help isolate a specific color and desaturate it, to Hue vs. Luma, which can help add or subtract brightness values from specific hues and hue ranges — these new color correcting tools further Premiere’s venture into true professional color correction. These new curves will also be available inside of After Effects.

After Effects features many updates, but my favorites are the ability to access depth matte data of 3D elements and the addition of the new JavaScript engine for building expressions.
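For what it’s worth, the new engine means expressions can finally use modern JavaScript syntax. Here’s a rough sketch of the kind of math a typical expression does, written as plain JavaScript so it can run anywhere — in After Effects the body would sit on a property and use host-supplied values like time, and the helper names here are mine, not Adobe’s:

```javascript
// Rough sketch of expression-style math in modern JavaScript.
// In After Effects, `time` is supplied by the host; here it's a parameter.
// These helper names are illustrative, not part of any Adobe API.
const oscillate = (time, freq = 2, amp = 30) =>
  amp * Math.sin(2 * Math.PI * freq * time);

// A damped variant, e.g. for an overshoot that settles:
const dampedOscillate = (time, freq = 2, amp = 30, decay = 3) =>
  oscillate(time, freq, amp) * Math.exp(-decay * time);

console.log(oscillate(0)); // 0 at t = 0
console.log(dampedOscillate(0.1));
```

As I understand it, the win is less about any single expression and more about speed plus modern syntax (arrow functions, const, template literals) being available at all.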

There is one update that runs across both Premiere and After Effects that seems to be a sleeper update. The improvements to motion graphics templates, if implemented correctly, could be a time and creativity saver for both artists and editors.

AI
Adobe, like many other companies, seems to be diving heavily into the “AI” pool, which is amazing, but… with great power comes great responsibility. Maybe it’s just me, but sometimes I don’t want all the work done for me. With new features like Auto Lip Sync and Color Match, editors and creators of all kinds shouldn’t lose the forest for the trees. I’m not telling people to ignore these features; I’m asking that they put a few minutes into learning how the color of a shot was matched, so they can step in when something goes wrong. You don’t want to be the editor who says, “Premiere did it” and has no solution of your own when it breaks.

What Else?
I would love to see Adobe take a stab at digging up the bones of SpeedGrade and integrating that into the Premiere Pro world as a new tab. Call it Lumetri Grade, or whatever? A page with a more traditional colorist layout and clip organization would go a long way.

In the end, there are plenty of other updates to Adobe’s 2018 Creative Cloud apps, and you can read their blog to find out about other updates.

NAB: Adobe’s spring updates for Creative Cloud

By Brady Betzel

Adobe has had a tradition of releasing Creative Cloud updates prior to NAB, and this year is no different. The company has been focused on improving existing workflows and adding new features, some based on Adobe’s Sensei technology, as well as improved VR enhancements.

In this release, Adobe has announced a handful of Premiere Pro CC updates. While I personally don’t think that they are game changing, many users will appreciate the direction Adobe is going. If you are color correcting, Adobe has added the Shot Match function that allows you to match color between two shots. Powered by Adobe’s Sensei technology, Shot Match analyzes one image and tries to apply the same look to another image. Included in this update is the long-requested split screen to compare before and after color corrections.

Motion graphic templates have been improved with new adjustments like 2D position, rotation and scale. Automatic audio ducking has been included in this release as well. You can find this feature in the Essential Sound panel, and once applied it will essentially dip the music in your scene based on dialogue waveforms that you identify.

Still inside of Adobe Premiere Pro CC, but also applicable in After Effects, is Adobe’s enhanced Immersive Environment. This update is for people who use VR headsets to edit and/or process VFX. Team Project workflows have been updated with better version tracking and indicators of who is using bins and sequences in realtime.

New Timecode Panel
Overall, while these updates are helpful, none are barn burners. The thing that does have me excited is the new Timecode Panel — it’s the biggest new update to the Premiere Pro CC app. For years now, editors have been clamoring for more than just one timecode view. You can view sequence timecodes, source media timecodes from the clips on the different video layers in your timeline, and you can even view the same sequence timecode in a different frame rate (great for editing those 23.98 shows to a 29.97/59.94 clock!). And one of my unexpected favorites is the clip name in the timecode window.
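To illustrate that cross-rate trick, here’s a rough sketch of the arithmetic behind reading one sequence against another clock. This is my own illustration of the math, not Adobe’s implementation, and it uses non-drop timecode to keep things simple:

```javascript
// My own illustration of the cross-rate math, not Adobe's implementation.
// Convert a frame count at one rate to the equivalent count at another,
// i.e. the same wall-clock moment shown against a different clock.
function convertFrames(frames, fromRate, toRate) {
  return Math.round(frames * (toRate / fromRate));
}

// Format a frame count as non-drop HH:MM:SS:FF (real 29.97 broadcast
// timecode is usually drop-frame; non-drop keeps this sketch simple).
function toTimecode(frames, rate) {
  const fps = Math.round(rate); // nominal base: 24 for 23.98, 30 for 29.97
  const pad = (n) => String(n).padStart(2, "0");
  const totalSec = Math.floor(frames / fps);
  return [
    pad(Math.floor(totalSec / 3600)),
    pad(Math.floor(totalSec / 60) % 60),
    pad(totalSec % 60),
    pad(frames % fps),
  ].join(":");
}

// One hour of 23.976 material read against a 29.97 clock
// (both rates share the 1000/1001 factor, so the ratio is exactly 1.25):
const frames = convertFrames(24 * 3600, 23.976, 29.97);
console.log(toTimecode(frames, 29.97)); // "01:00:00:00"
```

The panel presumably handles drop-frame and all the messy edge cases; the point is just that the conversion is a rate ratio, not a re-edit.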

I was testing this feature in a pre-release version of Premiere Pro, and it was a little wonky. First, I couldn’t dock the timecode window. While I could add lines and access the different menus, my changes wouldn’t apply to the row I had selected. In addition, I could only right click and try to change the first row of contents, but it would choose a random row to change. I am assuming the final release has this all fixed. If the wonkiness gets worked out, this is a phenomenal (and necessary) addition to Premiere Pro.

Codecs, Master Property, Puppet Tool, more
There have been some compatible codec updates, specifically Raw Sony X-OCN (Venice), Canon Cinema Raw Light (C200) and Red IPP2.

After Effects CC has also been updated with Master Property controls. Adobe said it best during their announcement: “Add layer properties, such as position, color or text, in the Essential Graphics panel and control them in the parent composition’s timeline. Use Master Property to push individual values to all versions of the composition or pull selected changes back to the master.”

The Puppet Tool has been given some love with a new Advanced Puppet Engine, which improves the mesh and starch workflows used to animate static objects. Beyond that, the Add Grain, Remove Grain and Match Grain effects are now multi-threaded, and enhanced disk caching and project management improvements have been added.

My favorite update for After Effects CC is the addition of data-driven graphics. You can drop a CSV or JSON data file and pick-whip data to layer properties to control them. In addition, you can drag and drop data right onto your comp to use the actual numerical value. Data-driven graphics is a definite game changer for After Effects.
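To make the idea concrete, here’s a sketch of what driving properties from JSON amounts to, written as plain JavaScript rather than Adobe’s actual expression API. The data shape and the scale mapping below are made up for illustration:

```javascript
// Plain-JavaScript sketch of the data-driven concept, not Adobe's API.
// The data shape and the scale mapping are hypothetical examples.
const data = JSON.parse(`{
  "teams": [
    { "name": "Home", "score": 3 },
    { "name": "Away", "score": 1 }
  ]
}`);

// A pick-whipped text property ends up evaluating to something like:
const homeScoreText = String(data.teams[0].score);

// ...and a numeric value can drive, say, a bar's vertical scale directly:
const barScaleY = data.teams[0].score * 25; // hypothetical mapping

console.log(homeScoreText, barScaleY); // prints: 3 75
```

Swap in a new JSON file and every property wired to it updates — that’s why this is such a big deal for templated graphics.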

Audition
While Adobe Audition is an audio mixing application, it has some updates that will directly help anyone looking to mix their edit in Audition. In the past, to get audio to a mixing program like Audition, Pro Tools or Fairlight you would have to export an AAF (or if you are old like me possibly an OMF). In the latest Audition update you can simply open your Premiere Pro projects directly into Audition, re-link video and audio and begin mixing.

I asked Adobe whether you could go back and forth between Audition and Premiere, but it seems like it is a one-way trip. They must be expecting you to export individual audio stems once done in Audition for final output. In the future, I would love to see back and forth capabilities between apps like Premiere Pro and Audition, much like the Fairlight tab in Blackmagic’s Resolve. There are some other updates, like larger tracks and under-the-hood improvements, which you can read more about at https://theblog.adobe.com/creative-cloud/.

Adobe Character Animator has some cool updates as well, including improvements to character building, but I am not too involved with Character Animator, so you should definitely read about things like the trigger improvements on Adobe’s blog.

Summing Up
In the end, it is great to see Adobe moving forward on updates to its Creative Cloud video offerings. Data-driven animation inside of After Effects is a game-changer. Shot color matching in Premiere Pro is a nice step toward a professional color correction application. Importing Premiere Pro projects directly into Audition is definitely a workflow improvement.

I do have a wishlist though: I would love for Premiere Pro to concentrate on tried-and-true solutions before adding fancy updates like audio ducking. For example, I often hear people complain about how hard it is to export a QuickTime out of Premiere with either stereo or mono/discrete tracks. You need to set up the sequence correctly from the jump, adjust the pan on the tracks, as well as adjust the audio settings and export settings. Doesn’t sound streamlined to me.

In addition, while shot color matching is great, let’s get an Adobe SpeedGrade-style view tab into Premiere Pro so it works like a professional color correction app… maybe Lumetri Pro? I know if the color correction setup was improved I would be way more apt to stay inside of Premiere Pro to finish something instead of going to an app like Resolve.

Finally, consolidating and transcoding used clips with handles is hit or miss inside of Premiere Pro. Can we get a rock-solid consolidate and transcode feature? Despite these few negatives, Premiere Pro is an industry staple and it works very well.

Check out Adobe’s NAB 2018 update video playlist for details on each and every update.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Quick Chat: Creating graphics package for UN’s Equator Prize ceremony

Undefined Creative (UC) was recently commissioned by the United Nations Development Programme (UNDP) to produce a fresh package of event graphics for its Equator Prize 2017 Award Ceremony. This project is the latest in a series of motion design-centered work collaborations between the creative studio and the UN, a relationship that began when UC donated their skills to the Equator Prize in 2010.

The Equator Prize recognizes local and indigenous community initiatives from across the planet that are advancing innovative on-the-ground solutions to climate, environment and poverty challenges. Award categories honor achievement and local innovation in the thematic areas of oceans, forests, grasslands and wildlife protection.

For this year’s ceremony, UNDP wanted a complete refresh that gave the on-stage motion graphics a current vibe while incorporating the key icons behind its sustainable development goals (SDGs). Consisting of a “Countdown to Ceremony” screensaver, an opening sequence, 15 winner slates, three category slates and 11 presenter slates, the package had to align visually with a presentation from National Geographic Society, which was part of the evening’s program.

To bring it all together, UC drew from the SDG color palettes and relied on subject matter knowledge of both the UNDP and National Geographic in establishing the ceremony graphics’ overall look and feel. With only still photos available for the Equator Prize winners, UC created motion and depth by strategically intertwining the best shots with moving graphics and carefully selected stock footage. Naturally moving flora and fauna livened up the photography, added visual diversity and contributed to a unique aesthetic.

We reached out to Undefined Creative’s founder/creative director Maria Rapetskaya to find out more:

How early did you get involved in the project, and was the client open to input?
We got the call a couple of months before the event. The original show had been used multiple times since we created it in 2010, so the client was definitely looking for input on how we could refresh or even rebrand.

Any particular challenges for this one?
For non-commercial organizations, budgets and messaging are equally sensitive topics. We have to be conscious of costs, and also very aware of the do’s and don’ts when it comes to assets and use. Our creative discussions took place over several calls, laying out options and ideas at different budget tiers — anything from simply updating the existing package to creating something entirely different. In the case of the latter, parameters had to be established right away for how different “different” could be.

For example, it was agreed that we should stick with photography provided by the 2017 award winners. However, our proposal to include stock for flora and fauna was agreed on by all involved. Which SDG icons would be used and how, what partner and UN organizational branding should be featured prominently as design inspiration, how this would integrate with content being produced for UNDP/Equator Prize by Nat Geo… all of these questions had to be addressed before we started any real ideation in order for the creative to stay on brand, on message, on budget and on time.

What tools did you use on the project?
We relied on Adobe CC, in particular After Effects, which is our staple software. For this project, we also relied heavily on stock from multiple vendors. Pond5 has a robust and cost-effective collection of the video elements we were seeking.

Why is this project important to you?
The majority of our clients are for-profit commercial entities, and while that’s wonderful, there’s always a different feeling of reward when we have the chance to do something for the good of humanity at large, however minuscule our contribution is. The winners are coming from such different corners of the globe — at times, very remote. They’re incredibly excited to be honored, on stage, in New York City, and we can only imagine what it feels like to see their faces, the faces of their colleagues and friends, the names of their projects, up on this screen in front of a large, live audience. This particular event brings us a lot closer to what we’re creating, on a really empathetic, human level.

Red Giant Trapcode Suite 14 now available

By Brady Betzel

Red Giant has released Trapcode Suite 14, an update to its After Effects-focused plug-in toolset, including new versions of Trapcode Particular and Form as well as an update to Trapcode Tao.

The biggest updates seem to be in Red Giant’s flagship product Trapcode Particular 3. Trapcode Particular is now GPU accelerated through OpenGL with a proclaimed 4X speed increase over previous versions. The Designer has been re-imagined and seems to take on a more Magic Bullet-esque look and feel. You can now include multiple particle systems inside the same 3D space, which will add to the complexity and skill level needed to work with Particular.

You can now also load your own 3D model OBJ files as emitters in the Designer panel or use any image in your comp as a particle. There are also a bunch of new presets that have been added to start you on your Particular system building journey — over 210 new presets, to be exact.

Trapcode Form has been updated to version 3 with the updated Designer, the ability to add 3D models and animated OBJ sequences as particle grids, the ability to load images for use as particles, a new graphing system for more precise control over the system, and over 70 presets in the Designer.

Trapcode Tao has been updated with depth of field effects to allow for that beautiful camera-realistic blur that really sets pro After Effects users apart.

Trapcode Particular 3 and Form 3 are paid updates, while Tao is free for existing users. If you only want to update Tao, make sure you select only Tao in the installer; otherwise you will install new Trapcode plug-ins over your old ones.

Trapcode Particular 3 is available now for $399. The update is $149 and the academic version is $199. You can also get it as a part of the Trapcode Suite 14 for $999.

Trapcode Form 3 is available now for $199. The update is $99 and the academic costs $99. It can be purchased as part of the Trapcode Suite 14 for $999.

Check out the new Trapcode Suite 14 bundle.

 

Mocha VR: An After Effects user’s review

By Zach Shukan

If you’re using Adobe After Effects to do compositing and you’re not using Mocha, then you’re holding yourself back. If you’re using Mettle Skybox, you need to check out Mocha VR, the VR-enhanced edition of Mocha Pro.

Mocha Pro and Mocha VR are both standalone programs where you work entirely within the Mocha environment and then export your tracks, shapes or renders to another program to do the rest of the compositing work. There are plugins for Maxon Cinema 4D, The Foundry’s Nuke, HitFilm and After Effects that allow you to do more with the Mocha data within your chosen 3D or compositing program. Limited-feature versions of Mocha (Mocha AE and Mocha HitFilm) come installed with the Creative Cloud versions of After Effects and HitFilm 4 Pro, and every update of these plugins gets closer to looking like a full version of Mocha running inside of the effects panel.

Maybe I’m old school, or maybe I just try to get the maximum performance from my workstation, but I always choose to run Mocha VR by itself and only open After Effects when I’m ready to export. In my experience, all the features of Mocha run more smoothly in the standalone than when they’re launched and run inside of After Effects.**

How does Mocha VR compare to Mocha Pro? If you’re not doing VR, stick with Mocha Pro. However, if you are working with VR footage, you won’t have to bend over backwards to keep using Mocha.

Last year was the year of VR, when all my clients wanted to do something with VR. It was a crazy push to be the first to make something and I rode the wave all year. The thing is there really weren’t many tools specifically designed to work with 360 video. Now this year, the post tools for working with VR are catching up.

In the past, I forced previous versions of Mocha to work with 360 footage before the VR version, but since Mocha added its VR-specific features, stabilizing a 360-camera became cake compared to the kludgy way it works with the industry standard After Effects 360 plugin, Skybox. Also, I’ve used Mocha to track objects in 360 before the addition of an equirectangular* camera and it was super-complicated because I had to splice together a whole bunch of tracks to compensate for the 360 camera distortion. Now it’s possible to create a single track to follow objects as they travel around the camera. Read the footnote for an explanation of equirectangular, a fancy word that you need to know if you’re working in VR.

Now let’s talk about the rest of Mocha’s features…

Rotoscoping
I used to rotoscope by tracing every few frames and then refining the frames in between, until I found out about the Mocha way to rotoscope. Because Mocha combines rotoscoping with tracking of arbitrary shapes, all you have to do is draw a shape and then use the tracker to follow and deform it all the way through. It’s way smarter and, more importantly, faster. Also, with the Uberkey feature, you can adjust your shapes on multiple frames at once. If you’re still rotoscoping with After Effects alone, you’re doing it the hard way.

Planar Tracking
When I first learned about Mocha it was all about the planar tracker, and that really is still the heart of the program. Mocha’s basically my go-to when nothing else works. Recently, I was working on a shot where a woman had her dress tucked into her pantyhose, and I pretty much had to recreate a leg of a dress that swayed and flowed along with her as she walked. If it wasn’t for Mocha’s planar tracker I wouldn’t have been able to make a locked-on track of the soft-focus (solid color and nearly without detail) side of the dress. After Effects couldn’t make a track because there weren’t enough contrast-y details.

GPU Acceleration
I never thought Mocha’s planar tracking was slow, even though it is slower than point tracking, but then they added GPU acceleration a version or two ago and now it flies through shots. It has to be at least five times as fast now that it’s using my Nvidia Titan X (Pascal), and it’s not like my CPU was a slouch (an 8-core i7-5960X).

Object Removal
I’d be content using Mocha just to track difficult shots and for rotoscoping, but their object-removal feature has saved me hours of cloning/tracking work in After Effects, especially when I’ve used it to remove camera rigs or puppet rigs from shots.

Mocha’s remove module is the closest thing out there to automated object removal***. It’s as simple as 1) create a mask around the object you want to remove, 2) track the background that your object passes in front of, and then 3) render. Okay, there’s a little more to it, but compared to the cloning and tracking and cloning and tracking and cloning and tracking method, it’s pretty great. Also, a huge reason to get the VR edition of Mocha is that the remove module will work with a 360 camera.

Here I used Mocha object removal to remove ropes that pulled a go-cart in a spot for Advil.

VR Outside of After Effects?
I’ve spent most of this article talking about Mocha with After Effects, because it’s what I know best, but there is one VR pipeline that can match nearly all of Mocha VR’s capabilities: the Nuke plugin Cara VR, but there is a cost to that workflow. More on this shortly.

Where you will hit the limit of Mocha VR (and After Effects in general) is if you are doing 3D compositing with CGI and real-world camera depth positioning. Mocha’s 3D Camera Solve module is not optimized for 360 and the After Effects 3D workspace can be limited for true 3D compositing, compared to software like Nuke or Fusion.

While After Effects sort of tacked on its 3D features to its established 2D workflow, Nuke is a true 3D environment as robust as Autodesk Maya or any of the high-end 3D software. This probably sounds great, but you should also know that Cara VR is $4,300 vs. $1,000 for Mocha VR (the standalone + Adobe plugin version) and Nuke starts at $4,300/year vs. $240/year for After Effects.

Conclusion
I think of Mocha as an essential companion to compositing in After Effects, because it makes routine work much faster and it does some things you just can’t do with After Effects alone. Mocha VR is a major release because VR has so much buzz these days, but in reality it’s pretty much just a version of Mocha Pro with the ability to also work with 360 footage.

*Equirectangular is a clever way of unwrapping a 360 spherical projection, a.k.a. the view we see in VR, by flattening it out into a rectangle. It’s a great way to see the whole 360 view in an editing program, but A: it’s very distorted so it can cause problems for tracking and B: anything that is moving up or down in the equirectangular frame will wrap around to the opposite side (a bit like Pacman when he exits the screen), and non-VR tracking programs will stop tracking when something exits the screen on one side.
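For the curious, the distortion and the wrap-around both fall straight out of the standard latitude/longitude mapping. A minimal sketch in plain JavaScript (the frame size and axis conventions here are my assumptions, not from any particular tool):

```javascript
// Standard lat/long projection of a 3D view direction onto a WxH frame.
// Frame size and axis conventions are assumptions for illustration.
function dirToEquirect(x, y, z, width, height) {
  const lon = Math.atan2(x, z);                   // -PI..PI, left/right
  const lat = Math.asin(y / Math.hypot(x, y, z)); // -PI/2..PI/2, up/down
  const u = ((lon / Math.PI + 1) / 2) * width;    // wraps at the frame edges
  const v = (0.5 - lat / Math.PI) * height;
  return [u, v];
}

// Looking straight ahead lands dead center...
console.log(dirToEquirect(0, 0, 1, 3840, 1920)); // [ 1920, 960 ]
// ...while looking directly behind maps to the frame's edge, which is
// exactly the wrap-around that defeats conventional 2D trackers.
console.log(dirToEquirect(0, 0, -1, 3840, 1920)); // [ 3840, 960 ]
```

Near the poles, tiny changes in direction produce huge horizontal moves in the rectangle, which is the distortion problem in a nutshell.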

**Note: According to the developer, one of the main advantages to running Mocha as a plug-in (inside AE, Premiere, Nuke, etc) for 360 video work is that you are using the host program’s render engine and proxy workflow. Having the ability to do all your tracking, masking and object removal on proxy resolutions is a huge benefit when working at large 360 formats that can be as large as 8k stereoscopic. Additionally, the Mocha modules that render, such as reorient for horizon stabilization or remove module will render inside the plug-in making for a streamlined workflow.

***FayOut was a “coming soon” product that promised an even more automated method for object removal, but as of the publishing of this article it appears that they are no longer “coming soon” and may have folded or maybe their technology was purchased and it will be included in a future product. We shall see…
________________________________________
Zach Shukan is the VFX specialist at SilVR and is constantly trying his hand at the latest technologies in the video post production world.

Adobe acquires Mettle’s SkyBox tools for 360/VR editing, VFX

Adobe has acquired all SkyBox technology from Mettle, a developer of 360-degree and virtual reality software. As more media and entertainment companies embrace 360/VR, there is a need for seamless, end-to-end workflows for this new and immersive medium.

The SkyBox toolset is designed exclusively for post production in Adobe Premiere Pro CC and Adobe After Effects CC and complements Adobe Creative Cloud’s existing 360/VR cinematic production technology. Adobe will integrate SkyBox plugin functionality natively into future releases of Premiere Pro and After Effects.

To further strengthen Adobe’s leadership in 360-degree and virtual reality, Mettle co-founder Chris Bobotis will join Adobe, bringing more than 25 years of production experience to his new role.

“We believe making virtual reality content should be as easy as possible for creators. The acquisition of Mettle SkyBox technology allows us to deliver a more highly integrated VR editing and effects experience to the film and video community,” says Steven Warner, VP of digital video and audio at Adobe. “Editing in 360/VR requires specialized technology, and as such, this is a critical area of investment for Adobe, and we’re thrilled Chris Bobotis has joined us to help lead the charge forward.”

“Our relationship started with Adobe in 2010 when we created FreeForm for After Effects, and has been evolving ever since. This is the next big step in our partnership,” says Bobotis, now director, professional video at Adobe. “I’ve always believed in developing software for artists, by artists, and I’m looking forward to bringing new technology and integration that will empower creators with the digital tools they need to bring their creative vision to life.”

Introduced in April 2015, SkyBox was the first plugin to leverage Mettle’s proprietary 3DNAE technology, and its success quickly led to additional development of 360/VR plugins for Premiere Pro and After Effects.

Today, Mettle’s plugins have been adopted by companies such as The New York Times, CNN, HBO, Google, YouTube, Discovery VR, DreamWorks TV, National Geographic, Washington Post, Apple and Facebook, as well as independent filmmakers and YouTubers.

Nice Shoes Creative Studio animates limited-edition Twizzlers packages

Twizzlers and agency Anomaly recently selected 16 artists to design a fun series of limited edition packages for the classic candy. Each depicts various ways people enjoy Twizzlers. New York’s Nice Shoes Creative Studio, led by creative director Matt Greenwood, came on board to introduce these packages with an animated 15-second spot.

Three of the limited edition packages are featured in the fast-paced spot, bringing to life the scenarios of car DJing, “ugly crying” at the movies, and studying in the library, before ending on a shot that incorporates all 16 packages. Each pack has its own style, characters and color scheme, unique to the original artists, and Nice Shoes was careful to preserve this as they crafted the spot.

“We were really inspired by the illustrations,” explains Greenwood. “We stayed close to the original style and brought them into a 3D space. There’s only a few seconds to register each package, so the challenge was to bring all the different styles and colors together within this time span. Select characters and objects carry over from one scene into the next, acting as transitional elements. The Twizzlers logo stays on-screen throughout, acting as a constant amongst the choreographed craziness.”

The Nice Shoes team used a balance of 3D and 2D animation, creating a CG pack while executing the characters on the packs with hand-drawn animation. Greenwood proposed taking advantage of the rich backgrounds that the artists had drawn, animating tiny background elements in addition to the main characters in order to “make each pack feel more alive.”

The main Twizzlers pack was modeled, lit, animated and rendered in Autodesk Maya, then composited in Adobe After Effects together with the supporting elements. These consisted of 2D hand-drawn animations created in Photoshop and 3D animated elements made with Maxon Cinema 4D.

“Once we had the timing, size and placement of the main pack locked, I looked at which shapes would make sense to bring into a 3D space,” says Greenwood. “For example, the pink ribbons and cars from the ‘DJ’ illustration worked well as 3D objects, and we had time to add touches of detail within these elements.”

The characters on the packs themselves were animated with After Effects and applied as textures within the pack artwork. “The flying books and bookcases were rendered with Sketch and Toon in Cinema 4D, and I like to take advantage of that software’s dynamics simulation system when I want a natural feel to objects falling onto surfaces. The shapes in the end mnemonic are also rendered with Sketch and Toon and they provide a ‘wipe’ to get us to the end lock-up,” says Greenwood.

The final step during the production was to add a few frame-by-frame 2D animations (the splashes or car exhaust trail, for example) but Nice Shoes Creative Studio waited until everything was signed off before they added these final details.

“The nature of the illustrations allowed me to try a few different approaches and as long as everything was rendered flat or had minimal shading, I could combine different 2D and 3D techniques,” he concludes.

Frame.io 2.0 offers 100 new features, improvements for collaboration

Frame.io, developers of the video review and collaboration platform for content creators, has unveiled Frame.io 2.0, an upgrade offering over 100 new features and improvements. The release introduces client Review Pages, which expand content review and sharing. In addition, it offers deeper workflow integration with Final Cut Pro X and Avid Media Composer, plus a completely re-engineered player.

“Frame.io 2 is based on everything we’ve learned from our customers over the past two years and includes our most-requested features,” says Emery Wells, CEO of Frame.io.

Just as internal teams can collaborate using Frame.io’s comprehensive annotation and feedback tools, clients can now provide detailed feedback on projects with Review Pages, which is designed to make the sharing experience simple, with no log-in required.

Review Pages give clients the same commenting ability as collaborators, without exposing them to the full Frame.io interface. Settings are highly configurable to meet specific customer needs, including workflow controls (approvals), security (password protection, setting expiration date) and communication (including a personalized message for the client).

The Review Pages workflow simplifies the exchange of ideas, consolidating feedback in a succinct manner. For those using Adobe Premiere or After Effects, that feedback flows directly into the timeline, where you can immediately take action and upload a new version. Client Review Pages are also now available in the Frame.io iOS app, allowing collaboration via iPhones and iPads.

Exporting and importing comments and annotations into Final Cut Pro X and Media Composer has gotten easier with the upgraded, free desktop companion app, which allows users to open downloaded comment files and bring them into the editor as markers. There is now no need to toggle between Frame.io and the NLE.

Users can also now copy and paste comments from one version to another. The information is exportable in a variety of formats, whether that’s a PDF containing a thumbnail, timecode, comment, annotation and completion status that can be shared and reviewed with the team or as a .csv or .xml file containing tons of additional data for further processing.

Also new to Frame.io 2.0 is a SMPTE-compliant source timecode display that works with both non-drop and drop-frame timecode. Users can now download proxies straight from Frame.io.

The Frame.io 2.0 player page now offers better navigation, efficiency and accountability. New “comment heads” let artists see at a glance who left a comment and where, so they can quickly find and prioritize feedback on any given project. Users can also preview the next comment, saving them time when one comment affects another.

The new looping feature, targeting motion and VFX artists, lets users watch the same short clip on loop. You can even select a range within a clip to really dive in deep. Frame.io 2.0’s asset slider makes it easy to navigate between assets from the player page.

The new Frame.io 2.0 dashboard has been redesigned for speed and simplicity. Users can manage collaborators for any given project from the new collaborator panel, where adding an entire team to a project takes one click. A simple search in the project search bar makes it easy to bring up a project. The breadcrumb navigation bar tracks every move deeper into a sub-sub-subfolder, helping artists stay oriented when getting lost in their work. The new list view option with mini-scrub gives users a bird’s-eye view of everything happening in Frame.io 2.0.

Copying and moving assets between projects takes up no additional storage, even when users make thousands of copies of a clip or project. Frame.io 2.0 also now offers the ability to publish direct to Vimeo, with full control over publishing options, so pros can create the description and set privacy permissions, right then and there.

Behind the Title: Artist/Creative Director Barton Damer

NAME: Barton Damer

COMPANY: Dallas-based Already Been Chewed

CAN YOU DESCRIBE YOUR COMPANY?
Already Been Chewed is a boutique studio that I founded in 2010. We have created a variety of design, motion graphics and 3D animated content for iconic brands, including Nike, Vans, Star Wars, Harry Potter and Marvel Comics. Check out our motion reel.

WHAT’S YOUR JOB TITLE?
Owner/Founding Artist/Creative Director

WHAT DOES THAT ENTAIL?
My job is to set the vibe for the types of projects, clients and style of work we create. I’m typically developing the creative, working with our chief strategy officer to land projects and then directing the team to execute the creative for the project.

WHAT WOULD SURPRISE PEOPLE ABOUT WHAT FALLS UNDER THAT TITLE?
When you launch out on your own, it’s surprising how much non-creative work there is to do. It’s no longer good enough to be great at what you do (being an artist). Now you have to be excellent with communication skills, people skills, business, organization, marketing, sales and leadership skills. It’s surprising how much you have to juggle in the course of a single day and still hit deadlines.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Developing a solution that will not only meet the client’s needs but also push us forward as a studio is always exciting. My favorite part of any job is making sure it looks amazing. That’s my passion. The way it animates is secondary. If it doesn’t look good to begin with, it won’t look better just because you start animating it.

WHAT’S YOUR LEAST FAVORITE?
Dealing with clients who stress me out for various reasons — whether it’s because they are scope creeping, not realizing that they signed a contract… or not paying a bill. Fortunately, I have a team of great people who help relieve that stress for me, but it can still be stressful knowing that they are fighting those battles for the company. We get a lot of clients who will sign a contract without even realizing what they agreed to. It’s always stressful when you have to remind them what they signed.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Night time! That’s when the freaks come out! I do my best creative at night. No doubt!

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Real estate investing/fixing up/flipping. I like all aspects of designing, including interior design. I’ve designed and renovated three different studio spaces for Already Been Chewed over the last seven years.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I blew out my ACL and tore my meniscus while skateboarding. I wanted to stay involved with the friends I skated with, knowing that surgery and rehab would have me off the board for at least a full year. During that time, I began filming and editing skate videos of my friends. I quickly discovered that the logging and capturing of footage was my least favorite part, but I loved adding graphics and motion graphics to the skate videos. I then began to learn Adobe After Effects and Maxon Cinema 4D.

At this time I was already a full-time graphic designer, but didn’t even really know what motion graphics were. I had been working professionally for about five or six years before making the switch from print design to animation. That was after dabbling in Flash animations and discovering I didn’t want to code websites (this was around 2003-2004).

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We recently worked with Nike on various activations for the Super Bowl and March Madness, and got to create motion graphics for storefronts as part of the Equality Campaign they launched during Black History Month. It was cool to see our work in the flagship Niketown NYC store while visiting New York a few weeks ago.

We are currently working on a variety of projects for Nike, Malibu Boats, Training Mask, and Marvel and DC Comics licensed product releases, as well as investing heavily in GPUs and creating 360 animated videos for VR content.

HOW DID THE NIKE EQUALITY MOTION GRAPHICS CAMPAIGN COME TO FRUITION?
Nike had been working on a variety of animated concepts to bring the campaign to life for storefronts. They had a library of animation styles that had already been done, but felt they were not working. Our job was to come up with an approach that fit the style of the campaign.

We recreated 16 athlete portraits in 3D so that we could cast light and shadows across their faces to slowly reveal them from black. We also created a seamless video loop transitioning between the athlete portraits and various quotes about equality.

CAN YOU DESCRIBE THE MOTION GRAPHICS SCOPE OF THE NIKE EQUALITY CAMPAIGN, AND THE SOFTWARE USED?
The video we created was used in various Nike flagship stores — Niketown NYC, Soho and LA, to name a few. We reformatted the video to work in a variety of sizes. We were able to see the videos at Niketown NYC, where they ran on the front of the window displays. The video was also used on large LED walls in the interior, as well as on an in-store four-story vertical screen.

We created the portrait technique on all 16 athletes using Cinema 4D and Octane. The remainder of the video was animated in After Effects.

The portraits were sculpted in Cinema 4D, and we used camera projection to accurately project real photos of the athletes onto the 3D portraits. This allowed us to keep 100 percent accuracy of the photos Nike provided while still being able to re-light and cast shadows to reveal the faces from black.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
That’s a tough one. Usually, it’s whatever the latest project is. We’re blessed to be working on some really fun projects. That being said… working on Vans’ 50th Anniversary campaign for the Era shoe is pretty epic! Especially since I am a longtime skateboarder.

Our work was used globally on everything from POP displays to storefronts to an interactive website takeover and 3D animated spots for broadcast. It was amazing to see it being used across so many mediums.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
A computer, my iPhone and speakers!

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I’m very active on Instagram and Facebook. I chose to say “no” to Snapchat in hopes that it will go away so that I don’t have to worry about one more thing (he laughs), and Twitter is pretty much dead for me these days. I log in once a month and see if I have any notifications. I also use Behance and LinkedIn a lot, and Dribbble once in a blue moon.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? IF SO, WHAT KIND?
My 25-year-old self would cyber bully me for saying this but soft Drake is “Too Good” these days. Loving Travis Scott and Migos among a long list of others.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
First I bought a swimming pool to help me get away from the computer/emails and swim laps with the kids. That worked for a while, but then I bought a convertible BMW to try to ease the tension and enjoy the wind through my hair. Once that wore off and the stress came back, I bought a puppy. Then I started doing yoga. A year later I bought another puppy.