Arraiy 4.11.19


Behind the Title: Nice Shoes animator Yandong Dino Qiu

This artist/designer has taken to sketching people on the subway to keep his skills fresh and mind relaxed.

NAME: Yandong Dino Qiu

COMPANY: New York’s Nice Shoes

CAN YOU DESCRIBE YOUR COMPANY?
Nice Shoes is a full-service creative studio. We offer design, animation, VFX, editing, color grading and VR/AR, working with agencies, brands and filmmakers to help realize their creative vision.

WHAT’S YOUR JOB TITLE?
Designer/Animator

WHAT DOES THAT ENTAIL?
Helping our clients to explore different looks in the pre-production stage, while aiding them in getting as close as possible to the final look of the spot. There’s a lot of exploration and trial and error as we try to deliver beautiful still frames that inform the look of the moving piece.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Not so much for the title, but for myself, design and animation can be quite broad. People may assume you’re only 2D, but it also involves a lot of other skill sets such as 3D lighting and rendering. It’s pretty close to a generalist role that requires you to know nearly every software as well as to turn things around very quickly.

WHAT TOOLS DO YOU USE?
Photoshop, After Effects, Illustrator, InDesign — the full Adobe Creative Suite — and Maxon Cinema 4D.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Pitch and exploration. At that stage, all possibilities are open. The job is alive… like a baby. You’re seeing it form and helping to make new life. Before this, you have no idea what it’s going to look like. After this phase, everyone has an idea. It’s very challenging, exciting and rewarding.

WHAT’S YOUR LEAST FAVORITE?
Revisions. Especially toward the end of a project. Everything is set up. One little change will affect everything else.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
2:15pm. It’s right after lunch. You know you have the whole afternoon. The sun is bright. The mood is light. It’s not too late for anything.

Sketching on the subway.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I would be a Manga artist.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
La Mer. Frontline. Friskies. I’ve also been drawing during my commute every day, sketching the people I see on the subway. I’m trying to post every week on Instagram. I think it’s important for artists to keep to a routine. I started this at the beginning of 2019, and there are already about 50 drawings. Artists need to keep their pens sharp all the time. By doing these sketches, I’m not only benefiting my drawing skills, but also improving my observation of shapes and compositions, which is extremely valuable for work. Being able to break down shapes and components is a key principle of design, and honing that skill helps me in responding to client briefs.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
TED-Ed What Is Time? We had a lot of freedom in figuring out how to animate Einstein’s theories in a fun and engaging way. I worked with our creative director Harry Dorrington to establish the look and then with our CG team to ensure that the feel we established in the style frames was implemented throughout the piece.

TED-Ed What Is Time?

The film was extremely well received. There was a lot of excitement at Nice Shoes when it premiered, and TED-Ed’s audience seemed to respond really warmly as well. It’s rare to see so much positivity in the YouTube comments.

NAME SOME TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My Wacom tablet for drawing and my iPad for reading.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I take time and draw for myself. I love that drawing and creating is such a huge part of my job, but it can get stressful and tiring only creating for others. I’m proud of that work, but when I can draw something that makes me personally happy, any stress or exhaustion from the work day just melts away.

FilmLight offers additions to Baselight toolkit

FilmLight will be at NAB showing updates to its Baselight toolkit, including T-Cam v2. This is FilmLight’s new and improved color appearance model, which allows the user to render an image for all formats and device types with confidence of color.

It combines with the Truelight Scene Looks and ARRI Look Library, now implemented within the Baselight software. “T-CAM color handling with the updated Looks toolset produces a cleaner response compared to creative, camera-specific LUTs or film emulations,” says Andrea Chlebak, senior colorist at Deluxe’s Encore in Hollywood. “I know I can push the images for theatrical release in the creative grade and not worry about how that look will translate across the many deliverables.”

FilmLight has also added what it calls “a new approach to color grading” with the Texture Blend tools, which allow the colorist to apply any color grading operation dependent on image detail. This gives the colorist fine control over the interaction of color and texture.

Other workflow improvements aimed at speeding the process include enhanced cache management; a new client view that displays a live web-based representation of a scene showing current frame and metadata; and multi-directory conform for a faster and more straightforward conform process.

The latest version of Baselight software also includes per-pixel alpha channels, eliminating the need for additional layer mattes when compositing VFX elements. Tight integration with VFX suppliers, including Foundry Nuke and Autodesk, means that new versions of sequences can be automatically detected, with the colorist able to switch quickly between versions within Baselight.


VFX house Rodeo FX acquires Rodeo Production

Visual effects studio Rodeo FX, whose high-profile projects include Dumbo, Aquaman and Bumblebee, has purchased Rodeo Production and added its roster of photographers and directors to its offerings.

The two companies, whose shared name is just a coincidence, will continue to operate as distinct entities. Rodeo Production’s 10-year-old Montreal office will continue to manage photo and video production, but will now also offer Rodeo FX’s post production services and technical expertise.

In Toronto, Rodeo FX plans to open an Autodesk Flame suite in Rodeo Production’s studio and expand its Toronto roster of photographers and directors, with the goal of developing stronger production and post services for clients in the city’s advertising, television and film industries.

“This is a milestone in our already incredible history of growth and expansion,” says Sébastien Moreau, founder/president of Rodeo FX, which has offices in LA and Munich in addition to Montreal.

“I have always worked hard to give our artists the best possible opportunities, and this partnership was the logical next step,” says Rodeo Production’s founder Alexandra Saulnier. “I see this as a fusion of pure creativity and innovative technology. It’s the kind of synergy that Montreal has become famous for; it’s in our DNA.”

Rodeo Production clients include Ikea, Under Armour and Mitsubishi.


Quick Chat: Lord Danger takes on VFX-heavy Devil May Cry 5 spot

By Randi Altman

Visual effects for spots have become more and more sophisticated, and the recent Capcom trailer promoting the availability of its game Devil May Cry 5 is a perfect example.

 The Mike Diva-directed Something Greater starts off like it might be a commercial for an anti-depressant with images of a woman cooking dinner for some guests, people working at a construction site, a bored guy trimming hedges… but suddenly each of our “Everyday Joes” turns into a warrior fighting baddies in a video game.

Josh Shadid

The hedge trimmer’s right arm turns into a futuristic weapon, the construction worker evokes a panther to fight a monster, and the lady cooking is seen with guns a blazin’ in both hands. When she runs out of ammo, and to the dismay of her dinner guests, her arms turn into giant saws. 

Lord Danger’s team worked closely with Capcom USA to create this over-the-top experience, and they provided everything from production to VFX to post, including sound and music.

We reached out to Lord Danger founder/EP Josh Shadid to learn more about their collaboration with Capcom, as well as their workflow.

How much direction did you get from Capcom? What was their brief to you?
Capcom’s fight-games director of brand marketing, Charlene Ingram, came to us with a simple request — make a memorable TV commercial that did not use gameplay footage but still illustrated the intensity and epic-ness of the DMC series.

What was it shot on and why?
We shot on both the Arri Alexa Mini and the Phantom Flex 4K using Zeiss Super Speed MKII prime lenses, thanks to our friends at Antagonist Camera, and a Technodolly motion control crane arm. We used the Phantom on the Technodolly to capture the high-speed shots. That setup let us speed ramp through character actions while maintaining 4K resolution for post in both the garden and kitchen transformations.

We used the Alexa Mini on the rest of the spot. It’s our preferred camera for most of our shoots because we love the combination of its size and image quality. The Technodolly allowed us to create frame-accurate, repeatable camera movements around the characters so we could seamlessly stitch together multiple shots as one. We also needed to cue the fight choreography to sync up with our camera positions.

You had a VFX supervisor on set. Can you give an example of how that was beneficial?
We did have a VFX supervisor on site for this production. Our usual VFX supervisor is one of our lead animators, and having him on site means we’re often starting elements of our post production workflow while we’re still shooting.

Assuming some of it was greenscreen?
We shot elements of the construction site and gardening scene on greenscreen. We used pop-ups to film these elements on set so we could mimic camera moves and lighting perfectly. We also took photogrammetry scans of our characters to help rebuild parts of their bodies during transition moments, and to emulate flying without requiring wire work — which would have been difficult to control outside during windy and rainy weather.

Can you talk about some of the more challenging VFX?
The shot of the gardener jumping into the air while the camera spins around him twice was particularly difficult. The camera starts on a 45-degree frontal, swings behind him and then returns to a 45-degree frontal once he’s in the air.

We had to digitally recreate the entire street, so we used the technocrane at the highest position possible to capture data from a slow pan across the neighborhood in order to rebuild the world. We also had to shoot this scene in several pieces and stitch it together. Since we didn’t use wire work to suspend the character, we also had to recreate the lower half of his body in 3D to achieve a natural-looking jump position. That, combined with the CG weapon elements, made for a challenging composite, but in the end it turned out really dramatic (and pretty cool).

Were any of the assets provided by Capcom? All created from scratch?
We were provided with the character and weapons models from Capcom — but these were in-game assets, and if you’ve played the game you’ll see that the environments are often dark and moody, so the textures and shaders really didn’t apply to a real-world scenario.

Our character modeling team had to recreate and re-interpret what these characters and weapons would look like in the real world — and they had to nail it — because game culture wouldn’t forgive a poor interpretation of these iconic elements. So far the feedback has been pretty darn good.

In what ways did being the production company and the VFX house on the project help?
The separation of creative from production and post production is an outdated model. The time it takes to bring each team up to speed, to manage the communication of ideas between creatives and to ensure there is a cohesive vision from start to finish, increases both the costs and the time it takes to deliver a final project.

We shot and delivered all of Devil May Cry’s Something Greater in four weeks total, all in-house. We find that working as the production company and VFX house reduces the ratio of managers per creative significantly, putting more of the money into the final product.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


Review: Yanobox Nodes 3 — plugins for Premiere, AE, FCPX, Motion

By Brady Betzel

Did you ever see a plugin preview and immediately think, “I need to have that”? Well, Nodes 3 by Yanobox is that plugin for me. Imagine if Video Copilot’s Element 3D and Red Giant’s Trapcode Particular and Form had a baby — you would probably end up with something like Nodes 3.

Nodes 3 is a macOS-only plugin for Adobe’s After Effects and Premiere Pro and Apple’s Final Cut Pro X and Motion. I know what you are thinking: Why isn’t this made for Windows? Good question, but I don’t think it will ever be ported over.

Final Cut Pro

What is it? Nodes 3 is a particle, text, .obj and point cloud replicator, as well as an overall mind-blower. With just one click in the preset library you can create stunning fantasy user interfaces (FUIs), such as HUDs and the like. From Transformer-like HUDs to visual data representations interconnected with text and bar graphs, Nodes 3 needs to be seen to be believed. OK, enough gushing; let’s get to the meat and potatoes.

A Closer Look
Nodes 3 features a new replicator, animation module and preset browser. The replicator allows you to not only create your HUD or data representation, but also replicates it onto other 2D and 3D primitive shapes (like circles or rectangles) and animates those replications individually or as a group. One thing I really love is the ability to randomize node and/or line values — Yanobox labels this “Probabilities.” You can immediately throw multiple variations of your work together with a few mouse-clicks instead of lines of scripting.

As I mentioned earlier, Nodes 3 is essentially a mix of Element 3D and Trapcode: it’s part replicator, part particle generator, and it works easily with After Effects’ 3D cameras (if you are working inside of After Effects, obviously) to affect rotations, scale and orientation. The result is particle replication that feels organic and fresh instead of static and stale. The Auto-Animations offering allows you to quickly animate up to four parts of a structure you’ve built, with 40 parameter choices under each of the four slots. You can animate the clockwise rotation of an ellipse with a point on it while also rotating the entire structure around the z-axis.

Replicator

The newly updated preset browser allows you to save a composition as a preset and open it from within any other compatible host. This allows you to make something with Nodes 3 inside of After Effects and then work with it inside of Final Cut Pro X. That can be super handy and help streamline VFX work. From importing an .obj file to real video, you can generate point clouds from unlimited objects and literally explode them into hundreds of interconnecting points and lines, all animated randomly. It’s amazing.

If you are seeing this and thinking about using Nodes for data representation, that is one of the more beautiful functions of this plugin. First, check out how to turn seemingly boring bar graphs into mesmerizing creations.

For me, Nodes really began to click when I learned that each node is defined by an index number. Each node is assigned an even or odd index, which allows for some computer-science geekiness, like skipping even or odd rows and adding animated oscillations for some really engrossing graph work.
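As a conceptual illustration only (this is plain Python, not the Nodes 3 interface, and the function and parameter names are hypothetical), the even/odd skipping and per-index oscillation idea might be sketched like this:

```python
import math

# Hypothetical sketch, not the Nodes 3 API: treat each bar/node in a graph
# as an indexed element, skip the odd-indexed nodes, and add a sine-based
# offset whose phase depends on the index, so a wave runs across the graph.
def animate_graph(values, time, amplitude=0.1, speed=2.0):
    """Return (index, height) pairs for even-indexed nodes, each
    oscillating with a phase offset derived from its index."""
    frames = []
    for i, v in enumerate(values):
        if i % 2 == 1:  # skip odd-indexed nodes (the "even/odd row" trick)
            continue
        offset = amplitude * math.sin(speed * time + i * 0.5)
        frames.append((i, v + offset))
    return frames

# At time 0 the phase term is i * 0.5, so node 0 sits at its raw value
# while higher even indices are already displaced along the wave.
print(animate_graph([1.0, 2.0, 3.0, 4.0], time=0.0))
```

Sampling this function once per frame with an increasing `time` value is what produces the gently oscillating, interconnected look the plugin achieves without any scripting.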

When I reviewed Nodes 2 back in 2014, what really gave me a “wow” moment was a demo showing a map of the United States along with text for each state and its capital. From there you could animate an After Effects 3D camera to reproduce a fly-over, but with this futuristic HUD/FUI.

Adobe Premiere

On a primal motion graphics level, this really changed and evolved my way of thinking. Maps of the United States no longer had to be plain graphics with animated dotted lines; they could be reimagined with sine-wave-based animations or even gently oscillating data points. Nodes 3 really can turn boring into mesmerizing quickly. The only limiting factor is your mind and some motion graphic design creativity.

To get a relatively quick look into the new replicator options inside of Nodes 3, go to FxFactory Plugins’ YouTube page for great tutorials and demos.

If you get even a tiny bit excited when seeing work from HUD masters like Jayse Hansen or plugins like Element 3D, run over to fxfactory.com and download their plugin app to use Yanobox Nodes 3. You can even get a fully working trial to just test out some of their amazing presets. And if you like what you see, you should definitely hand them $299 for the Nodes 3 plugin.

One slight negative for me — I’m not a huge fan of the FxFactory installer. Not because it messes anything up, but because I have to download a plugin loader for the plugin — double download and potential bloating. Not that I see any slowdown on my system, but it would be nice if I could just download Nodes 3 and nothing else. That is small potatoes though; Nodes 3 is really an interesting and unbridled way to visualize 2D and 3D data quickly.

Oh, and if you are curious, Yanobox has been used on big-name projects from The Avengers to Rise of the Planet of the Apes — HUDs, FUIs and GUIs have been created using Yanobox Nodes.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


VFX supervisor Christoph Schröer joins NYC’s Artjail

New York City-based VFX house Artjail has added Christoph Schröer as VFX supervisor. Previously a VFX supervisor/senior compositor at The Mill, Schröer brings over a decade of experience to his new role at Artjail. His work has been featured in spots for Mercedes-Benz, Visa, Volkswagen, Samsung, BMW, Hennessy and Cartier.

Combining his computer technology expertise and a passion for graffiti design, Schröer applied his degree in Computer and Media Sciences to begin his career in VFX. He started off working at visual effects studios in Germany and Switzerland where he collaborated with a variety of European auto clients. His credits from his tenure in the European market include lead compositor for multiple Mercedes-Benz spots, two global Volkswagen campaign launches and BMW’s “Rev Up Your Family.”

In 2016, Schröer made the move to New York to take on a role as senior compositor and VFX supervisor at The Mill. There, he teamed with directors such as Tarsem Singh and Derek Cianfrance, and worked on campaigns for Hennessy, Nissan Altima, Samsung, Cartier and Visa.


Roper Technologies set to acquire Foundry

Roper Technologies, a technology company and a constituent of the S&P 500, Fortune 1000 and Russell 1000 indices, is set to purchase Foundry. The deal is expected to close in April 2019, subject to regulatory approval and customary closing conditions. Foundry makes software tools used to create visual effects and 3D content for the media and entertainment world, including Nuke, Modo, Mari and Katana.

Craig Rodgerson

It’s a substantial move that enables Foundry to remain an independent company, with Roper assuming ownership from Hg. Roper has a successful history of acquiring well-run technology companies in niche markets that have strong, sustainable growth potential.

“We’re excited about the opportunities this partnership brings. Roper understands our strategy and chose to invest in us to help us realize our ambitious growth plans,” says Foundry CEO Craig Rodgerson. “This move will enable us to continue investing in what really matters to our customers: continued product improvement, R&D and technology innovation and partnerships with global leaders in the industry.”


Shooting, posting New Republic’s Indie film, Sister Aimee

After a successful premiere at the Sundance Film Festival, New Republic Studios’ Sister Aimee screened at this month’s SXSW. The movie tells the story of an infamous American evangelist of the 1920s, Sister Aimee Semple McPherson, who gets caught up in her lover’s dreams of Mexico and finds herself on a road trip toward the border.

Sister Aimee shot at the newly renovated New Republic Studios near Austin, Texas, over two and a half weeks. “Their crew used our 2,400-square-foot Little Bear soundstage, our 3,000-square-foot Lone Wolf soundstage, our bullpen office space and numerous exterior locations in our backlot,” reports New Republic Studios president Mindy Raymond, adding that the Sister Aimee production also had access to two screening rooms with 5.1 surround sound, HDMI hookups to 4K monitors and theater-style leather chairs to watch dailies. The film also hit the road, shooting in the New Mexico desert.

L-R: Directors Samantha Buck, Marie Schlingmann at SXSW. Credit: Harrison Funk

Co-written and co-directed by Samantha Buck and Marie Schlingmann, the movie takes some creative license with the story of Aimee. “We don’t look for factual truth in Aimee’s journey,” they explain. “Instead we look for a more timeless truth that says something about female ambition, the female quest for immortality and, most of all, the struggle for women to control their own narratives. It becomes a story about storytelling itself.”

The film, shot by cinematographer Carlos Valdes-Lora in 3.2K ProRes 4444 XQ on an Arri Alexa Mini, was posted at Dallas- and Austin-based Charlieuniformtango.

We reached out to the DP and the post team to find out more.

Carlos, why did you choose the package of the Alexa and Cooke Mini S4 Primes?
Carlos Valdes-Lora: In early conversations with the directors, we all agreed that we didn’t want Sister Aimee to feel like a traditional period movie. We didn’t want to use softening filters or vintage lenses. We aimed instead for clear images, deep focus and a rich color palette that remains grounded in the real world. We felt that this would lend the story a greater sense of immediacy and draw the viewer closer to the characters. Following that same thinking, we worked very extensively with the 25mm and 32mm, especially in closeups and medium closeups, emphasizing accessibility.

The Cooke Mini S4s are a beautiful and affordable set (relative to our other options). We like the way they give deep dimensionality and warmth to faces, and how they create a slightly lower-contrast image compared to the other modern lenses we looked at. They quickly became the right choice for us, striking the right balance between quality, size and value.

The Cookes paired with the Alexa Mini gave us a lightweight camera system with a very contained footprint, and we needed to stay fast and lean due to our compressed shooting schedule and often tight shooting quarters. The Chapman Cobra dolly was a big help in that regard as well.

What was the workflow to post like?
Charlieuniformtango producers Bettina Barrow, Katherine Harper, David Hartstein: Post took place primarily between Charlieuniformtango’s Dallas and Austin offices. Post strategizing started months before the shoot, and active post truly began when production began in July 2018.

Tango’s Evan Linton handled dailies brought in from the shoot, working alongside editor Katie Ennis out of Tango’s Austin studio, to begin assembling a rough cut as shooting continued. Ennis continued to cut at the studio through August with directors Schlingmann and Buck.

Editorial then moved back to the directors’ home state of New York to finish the cut for Sundance. (Ennis, who four-walled out of Tango Austin for the first part of post, went to New York with the directors, working out of a rented space.)

VFX and audio work started early at Tango, with continuously updated timelines coming from editorial. The team worked to finish certain locked shots for the Sundance submission while saving much of the cleanup and other CG-heavy shots for the final picture lock.

Tango audio engineer Nick Patronella also tackled dialogue edit, sound design and mix for the submission out of the Dallas studio.

Can you talk about the VFX?
Barrow, Harper, Hartstein: The cut was locked in late November, and the heavy lifting really began. With delivery looming, Tango’s Flame artists Allen Robbins, Joey Waldrip, David Hannah, David Laird, Artie Peña and Zack Smith divided the effects shots, which ranged from environmental cleanup, period-specific cleanup and beauty work such as de-aging to crowd simulation and CG sign creation.

(L-R) Tango’s Artie Peña, Connor Adams and Allen Robbins in one of the studio’s Flame suites.

3D artist Connor Adams used Houdini, Mixamo and Maya to create CG elements and crowds, with final comps done in Nuke and sent to Flame for final color. Over 120 VFX shots were handled in total, and Flame was the go-to for effects. Color and much of the effects work happened simultaneously. It was a nice workflow, as the project didn’t have major VFX needs that would have impacted color.

What about the color grade?
Barrow, Harper, Hartstein: Directors Buck and Schlingmann and DP Valdes-Lora worked with Tango colorist Allen Robbins to craft the final look of the film, with the color grade also done in Flame. The trio had prepped for a Kodachrome-style look, especially for the exteriors but really throughout the film. They found important reference in selections of Robert Capa photographs.

Buck, Schlingmann and Valdes-Lora responded mostly to Kodachrome’s treatment of blues, browns, tans, greens and reds (while staying true to skin tones), but also to its gamma values, not being afraid of deep shadows and contrast wherever appropriate. Valdes-Lora wanted to avoid lighting and exposing to a custom LUT on set that reflected this Kodachrome look, in case they wanted to change course during the process. With the help of Tango, however, they discovered that dialing back the Capa look grounded the film a little more and made the characters feel more accessible. The roots of the inspiration remained in the image, but a little more naturalism, a little more softness, served the story better.

Because of that, they monitored on set with Alexa 709, which Valdes-Lora felt he could expose for while still preserving enough latitude. Production designer Jonathan Rudak (another regular collaborator with the directors) was on the same page during prep in terms of reflecting this Capa color style, and the practical team did what they could to make sure the set elements complemented this approach.
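To illustrate the “dialing back” idea in the simplest possible terms (a sketch only; the function and pixel values below are hypothetical, not Charlieuniformtango’s actual pipeline), a creative look can be softened by linearly blending each graded pixel back toward the ungraded one:

```python
# Conceptual sketch of dialing back a creative look: blend each RGB channel
# between the ungraded pixel and the fully graded pixel. strength=1.0 keeps
# the full Kodachrome-style look; lower values move back toward naturalism.
def dial_back(ungraded, graded, strength):
    """Per-channel blend: result = ungraded + strength * (graded - ungraded)."""
    return tuple(u + strength * (g - u) for u, g in zip(ungraded, graded))

pixel_in = (0.40, 0.35, 0.30)    # ungraded camera value (example numbers)
pixel_look = (0.50, 0.30, 0.20)  # same pixel through the full creative look

# Retain 60% of the look's shift, pulling 40% back toward the original.
print(dial_back(pixel_in, pixel_look, 0.6))
```

A colorist working in Flame does this with far more sophisticated tools, of course, but the underlying intuition — keeping the roots of the inspiration while restoring some naturalism — is the same weighted compromise between two versions of the image.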

What about the audio post?
Barrow, Harper, Hartstein: With the effects and color almost complete, the team headed to Skywalker Ranch for a week of final dialogue edit, mix, sound design and Foley, led by Skywalker’s Danielle Dupre, Kim Foscato and E. Larry Oatfield. The team also was able to simultaneously approve color sections in Skywalker’s Stag Theater allowing for an ultra-efficient schedule. With final mix in hand, the film was mastered just after Christmas so that DCP production could begin.

Since a portion of the film was musical, how complex was the audio mix?
Skywalker sound mixer Dupre: The musical number was definitely one of the most challenging but rewarding scenes to design and mix. It was such a strong creative idea that played so deeply into the main character. The challenge was in striking a balance between tying it into the realism of the film while also leaning into the grandiosity of the musical to really sell the idea.

It was really fun to play with a combination of production dialogue and studio recordings to see how we could make it work. It was also really rewarding to create a soundscape that starts off minimally and simply and transitions to Broadway scale almost undetectably — one of the many exciting parts to working with creative and talented filmmakers.

What was the biggest challenge in post?
Barrow, Harper, Hartstein: Finishing a film in five to six weeks during the holidays was no easy feat. Luckily, we were able to have our directors hands-on for all final color, VFX and mix. Collaborating in the same room is always the best when you have no time to spare. We had a schedule where each day was accounted for — and we stuck to it almost down to the hour.

 


Alkemy X: A VFX guide to pilot season

Pilot season is an important time for visual effects companies that work in television. Pilots offer an opportunity to establish the look of key aspects of a show and, if the show gets picked up, present the potential of a long-term gig. But pilots also offer unique challenges.

Time is always short and budgetary resources are often in even shorter supply, yet expectations may be sky-high. Alkemy X, which operates visual effects studios in New York and Los Angeles, has experienced the trials and enjoyed the fruits of pilot season, delivering effects for shows that have gone on to successful runs, including Frequency, Time After Time, Do No Harm, The Leftovers, Flesh and Bone, Outcast, Mr. Robot, Deception and The Marvelous Mrs. Maisel.

Mark Miller

We recently reached out to Mark Miller, executive producer/business development, at Alkemy X to find out how his company overcomes the obstacles of time and budget to produce great effects for hopeful, new shows.

How does visual effects production for pilots differ from a regular series?
The biggest difference between a series and a pilot is that with a pilot you are establishing the look of the show. You work in concert with the director to implement his or her vision and offer ideas on how to get there.

Typically, we work on pilots with serious needs for VFX to drive the stories. We are not often told how to get there; we simply listen to the producers, interpret their vision and do our best to give it to them on screen. The quality of the visuals we create is often the difference between a pick-up and a pass.

In the case of one show I was involved with, the time and budget available made it impossible to complete all the required visual effects. As a result, the VFX supervisor decided to put the time and money they had into the most important plot points in the script and use simple tests as placeholders for less important VFX. That sold the studio and the show went to series.

Had we attempted to complete the show in its entirety, it may not have seemed as viable by the studio. Again, that was a collaborative decision made by the director, studio, VFX supervisor and VFX company.

Mr. Robot

What should studios consider in selecting a visual effects provider for a pilot?
Often the deciding factors in choosing a VFX vendor are its cost and location within an incentivized region. Usually the final arbiter is the VFX supervisor, occasionally with restrictions as to which company he or she may use. I find that good-quality VFX companies, shops with strong creative vision and the ability to deliver shots with little pain, are unable to meet a production’s budget, even if they are in a favorable region. That drives productions to smaller shops and results in less-polished shows.

Shots may not be delivered on time or may not have the desired creative impact. We are all aware that, even if a pilot you work on goes to series, there is no guarantee you will get the work. These days, many pilots employ feature directors and their crew. So, when one is picked up, it usually has a whole new crew.

The other issue with pilots is time. When the shoot runs longer than anticipated, it delays the director’s cut and VFX work can’t begin until that is done. Even a one-day delay in turnover can impact the quality of the visual effects. And it’s not a matter of throwing more artists at a shot. Many shots are not shareable among multiple artists so adding more artists won’t shorten the time to completion. Visual effects are like fine-art painting; one artist can’t create the sky while another works on the background. Under the best circumstances, it is hard to deliver polished work for pilots and such delays add to the problem. With pilots, our biggest enemy is time.

The Leftovers

How do you handle staffing and workflow issues in managing short-term projects like pilots?
You need to be very smart and nimble. A big issue for New York-based studios is infrastructure. Many buildings lack enough electricity to accommodate high power demands, high-speed connectivity and even the physical space required by visual effects studios.

New York studios therefore have to be as efficient as possible with streamlined pipelines built to push work through. We are addressing this issue by increasingly relying on cloud solutions for software and infrastructure. It helps us maximize flexibility.

Staffing is also an ongoing issue. Qualified artists are in short supply. More and more, we look to schools whose programs are designed by VFX supervisors, artists and producers for junior artists with the skills to hit the ground running.

Review: Red Giant’s Trapcode Suite 15

By Brady Betzel

We are now comfortably into 2019 and enjoying the Chinese Year of the Pig — or at least I am! So readers, you might remember that with each new year comes a Red Giant Trapcode Suite update. And Red Giant didn’t disappoint with Trapcode Suite 15.

Every year Red Giant adds more amazing features to its already impressive particle generator and emitter toolset, Trapcode Suite, and this year is no different. Trapcode Suite 15 retains tools like 3D Stroke, Shine, Starglow, Sound Keys, Lux, Tao, Echospace and Horizon while significantly updating Particular, Form and Mir.

I won’t be covering each plugin in this review, but you can check out what each one does on Red Giant’s website.

Particular 4
The bread and butter of the Trapcode Suite has always been Particular, and Version 4 continues to be a powerhouse. The biggest things separating a true 3D app like Maxon’s Cinema 4D or Autodesk Maya from Adobe After Effects are features like true raytraced rendering and particle systems that interact through fluid dynamics. After Effects isn’t technically a 3D app, but with plugins like Particular you can create pseudo-3D particle systems that affect, and are affected by, other particle emitters in your scenes. Trapcode Suite 15, and Particular 4 in particular (all the pun intended), has evolved to another level with Dynamic Fluids, which lets particle systems with the fluid-physics engine enabled interact with one another and create mind-blowing liquid-like simulations inside of After Effects.

What’s even more impressive is that with the Particular Designer and over 335 presets, you don’t need a master’s degree to make impressive motion graphics. While I love to work in After Effects, I don’t always have eight hours to make a fluidly dynamic particle system bounce off 3D text, or have two systems interact with each other for a text reveal. This is where Particular 4 really pays for itself. With a little research and tutorial watching, you will be up and rendering within 30 minutes.

When I was using Particular 4, I simply wanted to recreate the Dynamic Fluid interaction I had seen in one of Red Giant’s promos. Basically, two emitters crash into each other as a viscous fluid and then interact. While it isn’t necessarily easy, if you have a slightly above-beginner amount of After Effects knowledge you will be able to do this. Apply the Particular plugin to a new solid and open up the Particular Designer in Effect Controls. From there you can designate emitter type, motion, particle type, particle shadowing, particle color and dispersion types, as well as add multiple instances of emitters, adjust physics and much more.
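If you’re curious what an emitter is actually doing under the hood, it boils down to a simple loop: spawn particles with random velocities, then integrate forces every frame. Here’s a toy Python sketch of that idea — my own illustration, not Red Giant’s code, and all the names are made up:

```python
import random

# Toy particle emitter, conceptual only: Particular's real engine is far
# more sophisticated. Each particle gets a random initial velocity, then
# is integrated forward with simple gravity every frame.

GRAVITY = (0.0, -9.8)

def emit(n, pos=(0.0, 0.0), speed=5.0):
    """Spawn n particles at pos with random velocities."""
    particles = []
    for _ in range(n):
        vx = random.uniform(-speed, speed)
        vy = random.uniform(0.0, speed)  # launched upward
        particles.append({"pos": list(pos), "vel": [vx, vy], "age": 0.0})
    return particles

def step(particles, dt=1.0 / 24.0):
    """Advance every particle one frame (24 fps by default)."""
    for p in particles:
        p["vel"][0] += GRAVITY[0] * dt
        p["vel"][1] += GRAVITY[1] * dt
        p["pos"][0] += p["vel"][0] * dt
        p["pos"][1] += p["vel"][1] * dt
        p["age"] += dt
    return particles

particles = emit(100)
for _ in range(24):  # one second of animation
    step(particles)
```

Everything the Designer exposes — emitter type, dispersion, physics — is ultimately a variation on which forces get applied inside that loop.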

The presets for all of these options can be accessed by clicking the “>” symbol in the upper left of the Designer interface. You can access all of the detailed settings and building “Blocks” of each of these categories by clicking the “<” in the same area. With a few hours spent watching tutorials on YouTube, you can be up and running with particle emitters and fluid dynamics. The preset emitters are pretty amazing, including my favorite, the two-emitter fluid dynamic systems that interact with one another.

Form 4
The second plugin in Trapcode Suite 15 to be updated is Form 4. Form is a plugin that (literally) creates forms using particles that live forever in a unified 3D space, allowing for interaction. Form 4 adds the updated Designer, which makes particle grids a little more accessible and easier to construct for non-experts. Form 4 also includes the same Fluid Dynamics engine that Particular gained. The Fluid Dynamics engine really adds another level of beauty to Form projects, allowing you to create fluid-like particle grids from the 150 included presets or even your own .obj files.

My favorite settings to tinker with are Swirl and Viscosity. Using both settings in tandem can help create an ooey-gooey liquid particle grid that can interact with other Form systems to build pretty incredible scenes. To test out how .obj models work within Form, I clicked over to www.sketchfab.com and downloaded an .obj 3D model. If you search for free, downloadable models, you can use them in your projects under Creative Commons licensing, as long as you credit the creator. When in doubt, always read the license (the Creative Commons site explains the different licenses); in any case, they make great practice models.
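Conceptually, Swirl pushes each particle tangentially around a center point while Viscosity damps velocity every frame, which is where that syrupy feel comes from. This toy Python sketch is my own guess at the idea, not Red Giant’s implementation; the parameter names are just borrowed from the Designer for illustration:

```python
import math

# Sketch of Swirl/Viscosity-style controls acting on one particle.
# Not Red Giant's code: a minimal illustration of the two forces.

def apply_swirl(p, center=(0.0, 0.0), strength=2.0, dt=1.0 / 24.0):
    """Push the particle tangentially around a center point."""
    dx = p["pos"][0] - center[0]
    dy = p["pos"][1] - center[1]
    r = math.hypot(dx, dy) or 1e-9
    # The perpendicular direction (-dy, dx) produces the swirl motion.
    p["vel"][0] += (-dy / r) * strength * dt
    p["vel"][1] += (dx / r) * strength * dt

def apply_viscosity(p, viscosity=0.5, dt=1.0 / 24.0):
    """Damp velocity each frame, giving the ooey-gooey feel."""
    damp = max(0.0, 1.0 - viscosity * dt)
    p["vel"][0] *= damp
    p["vel"][1] *= damp

p = {"pos": [1.0, 0.0], "vel": [0.0, 0.0]}
for _ in range(24):  # one second at 24 fps
    apply_swirl(p)
    apply_viscosity(p)
    p["pos"][0] += p["vel"][0] * (1.0 / 24.0)
    p["pos"][1] += p["vel"][1] * (1.0 / 24.0)
```

Crank the swirl strength up and the viscosity down and the motion gets wild; do the opposite and everything oozes, which matches how the two sliders feel in the Designer.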

Anyway, Form 4 allows us to import .obj files, including animated .obj sequences, as well as their textures. I found a Day of the Dead-type skull created by JMUHIST, pointed Form to the .obj as well as its included texture, added a couple of After Effects lights and a camera, and I was in business. Form has a great replicator feature (much like Element 3D). There are a ton of options, including fog distance under Visibility, animation properties, and even the ability to quickly add a null object linked to your model for quick alignment of other elements in the scene.
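For the curious, a Wavefront .obj is just plain text: `v` lines are vertices, `f` lines are faces with 1-based vertex indices. A minimal Python parser shows the shape of the format — a sketch only, since real .obj files also carry the normals, UVs and material references that Form reads for texturing:

```python
# Minimal Wavefront .obj reader: vertices and faces only.
# Real files also hold normals (vn), UVs (vt) and material refs (mtllib).

def parse_obj(text):
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            vertices.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "f":
            # Face entries may look like "1/1/1"; keep the vertex index,
            # converting from 1-based to 0-based.
            faces.append(tuple(int(e.split("/")[0]) - 1 for e in parts[1:]))
    return vertices, faces

# A single triangle, the simplest possible mesh.
triangle = """
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
f 1 2 3
"""
verts, faces = parse_obj(triangle)
```

That simplicity is why .obj remains the lingua franca between Sketchfab downloads and plugins like Form, Mir and Element 3D.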

Mir 3
Up last is Trapcode Mir 3. Mir 3 is used to create 3D terrains, objects and wireframes in After Effects. In this latest update, Mir has added the ability to import .obj models and textures. Using fractal displacement mapping, you can quickly create some amazing terrains, from mountain-like peaks to alien landscapes. Mir is a great supplement to plugins like Video Copilot’s Element 3D, letting you add endless tunnels or terrains to your 3D scenes quickly and easily.
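Fractal displacement works by stacking octaves of noise, each at double the frequency and half the amplitude, and using the sum as a height map. Here’s a rough Python sketch of that idea; the hash-based value noise is a stand-in I made up, not the noise function Mir actually uses:

```python
import math

# Fractal (fBm) displacement sketch: stacked octaves of value noise
# produce a terrain-like height field. Illustrative only.

def hash_noise(ix, iy):
    """Deterministic pseudo-random value in [0, 1) per grid point."""
    n = (ix * 374761393 + iy * 668265263) & 0xFFFFFFFF
    n = (n ^ (n >> 13)) * 1274126177 & 0xFFFFFFFF
    return (n & 0xFFFF) / 65536.0

def value_noise(x, y):
    """Bilinearly interpolated hash noise: smooth-ish 2D noise."""
    ix, iy = int(math.floor(x)), int(math.floor(y))
    fx, fy = x - ix, y - iy
    top = hash_noise(ix, iy) * (1 - fx) + hash_noise(ix + 1, iy) * fx
    bot = hash_noise(ix, iy + 1) * (1 - fx) + hash_noise(ix + 1, iy + 1) * fx
    return top * (1 - fy) + bot * fy

def fbm_height(x, y, octaves=4):
    """Sum octaves: each doubles the frequency, halves the amplitude."""
    height, amp, freq = 0.0, 0.5, 1.0
    for _ in range(octaves):
        height += amp * value_noise(x * freq, y * freq)
        amp *= 0.5
        freq *= 2.0
    return height  # stays below 1.0 for these amplitudes

terrain = [[fbm_height(x * 0.1, y * 0.1) for x in range(32)] for y in range(32)]
```

Fewer octaves give smooth rolling hills; more octaves pile on jagged detail, which is essentially what Mir’s fractal controls are dialing in.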

And if you don’t own Element 3D, you will really enjoy the particle replication system: use one 3D object, then duplicate, twist, distort and animate multiple instances of it quickly. The best part about all of these Trapcode Suite tools is that they interact with the cameras and lighting native to After Effects, making for a unified animating experience (instead of animating separate camera and lighting rigs like in the old days). Two of my favorite features from the last update are the ability to use quad- or triangle-based polygons to texture your surfaces, which can quickly give an 8-bit or low-poly feel, and a second-pass wireframe that adds a grid-like surface to your terrain.

Summing Up
Red Giant’s Trapcode Suite 15 is amazing. If you have a previous version of the Trapcode Suite, you’re in luck: the upgrade is “only” $199. If you need to purchase the full suite, it will cost you $999. Students get a bit of a break at $499.

If you are on the fence about it, go watch Daniel Hashimoto’s Cheap Tricks: Aquaman Underwater Effects tutorial (Part 1 and Part 2). He explains how you can use all of the Red Giant Trapcode Suite effects with other plugins like Video Copilot’s Element 3D and Red Giant’s Universe and offers up some pro tips when using www.sketchfab.com to find 3D models.

I think I even saw him using Video Copilot’s FX Console, a free plugin that makes accessing effects much faster in After Effects. You may have seen his work as @ActionMovieKid on Twitter or @TheActionMovieKid on Instagram. He does some amazing VFX with his kids; he’s a must follow. Red Giant made a power move to get him to make tutorials for them! Anyway, his Aquaman Underwater Effects tutorials take you step by step through each part of the Trapcode Suite 15 in an amazing way. He makes it look a little too easy, but I guess that is a combination of his VFX skills and the Trapcode Suite toolset.

If you are excited about 3D objects, particle systems and fluid dynamics you must check out Trapcode Suite 15 and its latest updates to Particular, Mir and Form.

After I finished the Trapcode Suite 15 review, Red Giant released the Trapcode Suite 15.1 update. The 15.1 update includes Text and Mask Emitters for Form and Particular 4.1, an updated Designer, Shadowlet particle-type matching, Shadowlet softness and 21 additional presets.

This is a free update that can be downloaded from the Red Giant website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.