
NAB 2019: Maxon acquires Redshift Rendering Technologies

Maxon, maker of Cinema 4D, has purchased Redshift Rendering Technologies, developer of the Redshift rendering engine. Redshift is a flexible GPU-accelerated renderer targeting high-end production, with an extensive suite of features that makes rendering complicated 3D projects faster. It is available as a plugin for Cinema 4D and other industry-standard 3D applications.

“Rendering can be the most time-consuming and demanding aspect of 3D content creation,” said David McGavran, CEO of Maxon. “Redshift’s speed and efficiency combined with Cinema 4D’s responsive workflow make it a perfect match for our portfolio.”

“We’ve always admired Maxon and the Cinema 4D community, and are thrilled to be a part of it,” said Nicolas Burtnyk, co-founder/CEO, Redshift. “We are looking forward to working closely with Maxon, collaborating on seamless integration of Redshift into Cinema 4D and continuing to push the boundaries of what’s possible with production-ready GPU rendering.”

Redshift is used by post companies, including Technicolor, Digital Domain, Encore Hollywood and Blizzard. Redshift has been used for VFX and motion graphics on projects such as Black Panther, Aquaman, Captain Marvel, Rampage, American Gods, Gotham, The Expanse and more.

Autodesk’s Flame 2020 features machine learning tools

Autodesk’s new Flame 2020 offers a new machine-learning-powered feature set with a host of new capabilities for Flame artists working in VFX, color grading, look development or finishing. This latest update will be showcased at the upcoming NAB Show.

Advancements in computer vision, photogrammetry and machine learning have made it possible to extract motion vectors, Z depth and 3D normals based on software analysis of digital stills or image sequences. The Flame 2020 release adds built-in machine learning analysis algorithms to isolate and modify common objects in moving footage, dramatically accelerating VFX and compositing workflows.

New creative tools include:
· Z-Depth Map Generator— Enables Z-depth map extraction using machine learning for live-action scene depth reclamation. This allows artists doing color grading or look development to quickly analyze a shot and apply effects accurately based on distance from camera.
· Human Face Normal Map Generator— Since all human faces have common recognizable features (relative distance between eyes, nose, location of mouth), machine learning algorithms can be trained to find these patterns. This tool can be used to simplify accurate color adjustment, relighting and digital cosmetic/beauty retouching.
· Refraction— With this feature, a 3D object can now refract, distorting background objects based on its surface material characteristics. To achieve convincing transparency through glass, ice, windshields and more, the index of refraction can be set to an accurate approximation of real-world material light refraction.
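The index-of-refraction setting described above is standard optics. As a rough illustration of the underlying physics (a sketch of Snell's law, not Autodesk's implementation), the bend angle for light entering a material can be computed like this:

```python
import math

def refract_angle(theta_incident_deg, n1=1.0, n2=1.52):
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
    Defaults assume light passing from air (n1 ~ 1.0) into glass (n2 ~ 1.52)."""
    sin_t2 = (n1 / n2) * math.sin(math.radians(theta_incident_deg))
    if abs(sin_t2) > 1.0:
        return None  # total internal reflection (possible only when n1 > n2)
    return math.degrees(math.asin(sin_t2))

# A ray hitting glass at 45 degrees bends toward the surface normal:
print(round(refract_angle(45.0), 2))  # roughly 27.7 degrees
```

Setting n2 near 1.0 (air) yields almost no distortion, while higher values such as roughly 1.31 for ice bend the background more aggressively, which is why an accurate IOR value reads as convincing glass or ice.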

Productivity updates include:
· Automatic Background Reactor— Triggered immediately after a shot is modified, this mode sends jobs to process. Accelerated, automated background rendering allows Flame artists to keep projects moving, using GPU and system capacity to the fullest. This feature is available on Linux only and can function on a single GPU.
· Simpler UX in Core Areas— A new expanded full-width UX layout for MasterGrade, Image surface and several Map user interfaces is now available, making key tools easier to discover and access.
· Manager for Action, Image, Gmask—A simplified list schematic view, Manager makes it easier to add, organize and adjust video layers and objects in the 3D environment.
· Open FX Support—Flame, Flare and Flame Assist version 2020 now include comprehensive support for industry-standard Open FX creative plugins, which can be used as Batch/BFX nodes or on the Flame timeline.
· Cryptomatte Support—Available in Flame and Flare, support for the Cryptomatte open source advanced rendering technique offers a new way to pack alpha channels for every object in a 3D rendered scene.

Linux customers can now opt for monthly, yearly and three-year single-user licensing options. Customers with an existing Mac-only single-user license can transfer their license to run Flame on Linux.
Flame, Flare, Flame Assist and Lustre 2020 will be available on April 16, 2019 at no additional cost to customers with a current Flame Family 2019 subscription. Pricing details can be found at the Autodesk website.

VFX and color for new BT spot via The Mill

UK telco BT wanted to create a television spot that showcased the WiFi capabilities of its broadband hub and underlined its promise of “whole home coverage.” Sonny director Fredrik Bond visualized a fun and fast-paced spot for agency AMV BBDO, and The Mill London was brought on board to help with VFX and color. The spot is called Complete WiFi.

In the piece, the hero comes home to find the house full of soldiers, angels, dancers, fairies, a giant and a horse — characters from the myriad games and movies the family is watching simultaneously. Obviously, the look depends on multiple layers of compositing, which have to be carefully scaled to be convincing.

They also need to be very carefully color matched, with similar lighting applied, so all the layers sit together. In a traditional workflow, this would have meant a lot of loops between VFX and grading to get the best from each layer, and a certain amount of compromise as the colorist imposed changes on virtual elements to make the final grade.

To avoid this, and to speed progress, The Mill recently started using BLG for Flame, a FilmLight plugin that allows Baselight grades to be rendered identically within Flame — with no back and forth to the color suite to render out new versions of shots. It means the VFX supervisor is continually seeing the latest grade, and the colorist can access the latest Flame elements to match them in.

“Of course it was frustrating to grade a sequence and then drop the VFX on top,” explains VFX supervisor Ben Turner. “To get the results our collaborators expect, we were constantly pushing material to and fro. We could end up with more than a hundred publishes on a single job.”

With the BLG for Flame plugin, the VFX artist sees the latest Baselight grade automatically applied, either from FilmLight’s BLG format files or directly from a Baselight scene, even while the scene is still being graded — although Turner says he prefers to be warned when updates are coming.

This works because all systems have access to the raw footage. Baselight grades non-destructively, by building up layers of metadata that are imposed in realtime. The metadata includes all the grading information, multiple windows and layers, effects and relights, textures and more – the whole process. This information can be imposed on the raw footage by any BLG-equipped device (there are Baselight Editions software plugins for Avid and Nuke, too) for realtime rendering and review.
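The layered-metadata idea can be sketched in a few lines. This is a hypothetical illustration of non-destructive grading in general, not FilmLight's BLG format: the grade lives as a stack of operations that any equipped device imposes on the untouched raw footage at render time.

```python
def lift(pixel, amount):
    """Add a constant offset to each channel, clamped to 0-1."""
    return [min(max(c + amount, 0.0), 1.0) for c in pixel]

def gain(pixel, amount):
    """Scale each channel, clamped to 0-1."""
    return [min(max(c * amount, 0.0), 1.0) for c in pixel]

# The "grade" is metadata: a list of (operation, amount) pairs,
# not baked-in pixel values.
grade_stack = [(lift, 0.05), (gain, 1.2)]

def render(pixel, stack):
    # Impose the grade on the raw pixel at view time.
    for op, amount in stack:
        pixel = op(pixel, amount)
    return pixel

source = [0.5, 0.4, 0.3]  # raw footage pixel (RGB, 0-1 range)
print([round(c, 2) for c in render(source, grade_stack)])
print(source)  # unchanged: the source stays raw, the grade is metadata
```

Because the source never changes, updating the grade means swapping the metadata stack; every system reading the same raw footage then renders an identical result.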

That is important because it also allows remote viewing. For this BT spot, director Bond was back in Los Angeles by the time post began. He sat in a calibrated room at The Mill in LA, where he could see the graded images at every stage and react quickly to the first animation tests.

“I can render a comp and immediately show it to a client with the latest grade from The Mill’s colorist, Dave Ludlam,” says Turner. “When the client really wants to push a certain aspect of the image, we can ensure through both comp and grade that this is done sympathetically, maintaining the integrity of the image.”

(L-R) VFX supervisor Ben Turner and colorist Dave Ludlam.

Turner admits that it means more to-ing and fro-ing, but that is a positive benefit. “If I need to talk to Dave then I can pop in and solve a specific challenge in minutes. By creating the CGI to work with the background, I know that Dave will never have to push anything too hard in the final grade.”

Ludlam agrees that this is a complete change, but extremely beneficial. “With this new process, I am setting looks but I am not committing to them,” he says. “Working together I get a lot more creative input while still achieving a much slicker workflow. I can build the grade and only lock it down when everyone is happy.

“It is a massive speed-up, but more importantly it has made our output far superior. It gives everyone more control and — with every job under huge time pressure — it means we can respond quickly.”

The spot was offlined by Patric Ryan from Marshall Street. Audio post was via 750mph with sound designers Sam Ashwell and Mike Bovill.

FilmLight offers additions to Baselight toolkit

FilmLight will be at NAB showing updates to its Baselight toolkit, including T-CAM v2, the company's new and improved color appearance model, which allows the user to render an image for all formats and device types with confidence in color.

It combines with the Truelight Scene Looks and ARRI Look Library, now implemented within the Baselight software. “T-CAM color handling with the updated Looks toolset produces a cleaner response compared to creative, camera-specific LUTs or film emulations,” says Andrea Chlebak, senior colorist at Deluxe’s Encore in Hollywood. “I know I can push the images for theatrical release in the creative grade and not worry about how that look will translate across the many deliverables.”

FilmLight has added what it calls “a new approach to color grading” with the addition of Texture Blend tools, which allow the colorist to apply any color grading operation dependent on image detail, giving fine control over the interaction of color and texture.

Other workflow improvements aimed at speeding the process include enhanced cache management; a new client view that displays a live web-based representation of a scene showing current frame and metadata; and multi-directory conform for a faster and more straightforward conform process.

The latest version of Baselight software also includes per-pixel alpha channels, eliminating the need for additional layer mattes when compositing VFX elements. Tight integration with VFX suppliers, including Foundry Nuke and Autodesk, means that new versions of sequences can be automatically detected, with the colorist able to switch quickly between versions within Baselight.

VFX house Rodeo FX acquires Rodeo Production

Visual effects studio Rodeo FX, whose high-profile projects include Dumbo, Aquaman and Bumblebee, has purchased Rodeo Production and added its roster of photographers and directors to its offerings.

The two companies, whose shared name is coincidental, will continue to operate as distinct entities. Rodeo Production’s 10-year-old Montreal office will continue to manage photo and video production, but will now also offer Rodeo FX’s post production services and technical expertise.

In Toronto, Rodeo FX plans to open an Autodesk Flame editing suite in the Rodeo Production studio and expand its Toronto roster of photographers and directors, with the goal of developing stronger production and post services for clients in the city’s advertising, television and film industries.

“This is a milestone in our already incredible history of growth and expansion,” says Sébastien Moreau, founder/president of Rodeo FX, which has offices in LA and Munich in addition to Montreal.

“I have always worked hard to give our artists the best possible opportunities, and this partnership was the logical next step,” says Rodeo Production’s founder Alexandra Saulnier. “I see this as a fusion of pure creativity and innovative technology. It’s the kind of synergy that Montreal has become famous for; it’s in our DNA.”

Rodeo Production clients include Ikea, Under Armour and Mitsubishi.

Quick Chat: Lord Danger takes on VFX-heavy Devil May Cry 5 spot

By Randi Altman

Visual effects for spots have become more and more sophisticated, and the recent Capcom trailer promoting the availability of its game Devil May Cry 5 is a perfect example.

The Mike Diva-directed Something Greater starts off like it might be a commercial for an antidepressant, with images of a woman cooking dinner for guests, people working at a construction site, a bored guy trimming hedges… but suddenly each of our “Everyday Joes” turns into a warrior fighting baddies in a video game.

Josh Shadid

The hedge trimmer’s right arm turns into a futuristic weapon, the construction worker evokes a panther to fight a monster, and the lady cooking is seen with guns a blazin’ in both hands. When she runs out of ammo, and to the dismay of her dinner guests, her arms turn into giant saws. 

Lord Danger’s team worked closely with Capcom USA to create this over-the-top experience, and they provided everything from production to VFX to post, including sound and music.

We reached out to Lord Danger founder/EP Josh Shadid to learn more about their collaboration with Capcom, as well as their workflow.

How much direction did you get from Capcom? What was their brief to you?
Capcom’s director of brand marketing for fight games, Charlene Ingram, came to us with a simple request: make a memorable TV commercial that did not use gameplay footage but still illustrated the intensity and epicness of the DMC series.

What was it shot on and why?
We shot on both the ARRI Alexa Mini and the Phantom Flex 4K using Zeiss Super Speed MKII prime lenses, thanks to our friends at Antagonist Camera, and a Technodolly motion control crane arm. We used the Phantom on the Technodolly to capture the high-speed shots, speed ramping through character actions while maintaining 4K resolution for post in both the garden and kitchen transformations.

We used the Alexa Mini on the rest of the spot. It’s our preferred camera for most of our shoots because we love the combination of its size and image quality. The Technodolly allowed us to create frame-accurate, repeatable camera movements around the characters so we could seamlessly stitch together multiple shots as one. We also needed to cue the fight choreography to sync up with our camera positions.

You had a VFX supervisor on set. Can you give an example of how that was beneficial?
We did have a VFX supervisor on site for this production. Our usual VFX supervisor is one of our lead animators, so having him on set means we’re often starting elements of our post production workflow while we’re still shooting.

Assuming some of it was greenscreen?
We shot elements of the construction site and gardening scene on greenscreen. We used pop-ups to film these elements on set so we could mimic camera moves and lighting perfectly. We also took photogrammetry scans of our characters to help rebuild parts of their bodies during transition moments, and to emulate flying without requiring wire work — which would have been difficult to control outside during windy and rainy weather.

Can you talk about some of the more challenging VFX?
The shot of the gardener jumping into the air while the camera spins around him twice was particularly difficult. The camera starts on a 45-degree frontal, swings behind him and then returns to a 45-degree frontal once he’s in the air.

We had to digitally recreate the entire street, so we used the technocrane at the highest position possible to capture data from a slow pan across the neighborhood in order to rebuild the world. We also had to shoot this scene in several pieces and stitch it together. Since we didn’t use wire work to suspend the character, we also had to recreate the lower half of his body in 3D to achieve a natural-looking jump position. That, combined with the CG weapon elements, made for a challenging composite — but in the end, it turned out really dramatic (and pretty cool).

Were any of the assets provided by Capcom? All created from scratch?
We were provided with the character and weapons models from Capcom — but these were in-game assets, and if you’ve played the game you’ll see that the environments are often dark and moody, so the textures and shaders really didn’t apply to a real-world scenario.

Our character modeling team had to recreate and re-interpret what these characters and weapons would look like in the real world — and they had to nail it — because game culture wouldn’t forgive a poor interpretation of these iconic elements. So far the feedback has been pretty darn good.

In what ways did being the production company and the VFX house on the project help?
The separation of creative from production and post production is an outdated model. The time it takes to bring each team up to speed, manage the communication of ideas between creatives and ensure a cohesive vision from start to finish increases both the cost and the time it takes to deliver a final project.

We shot and delivered all of Devil May Cry’s Something Greater in four weeks total, all in-house. We find that working as the production company and VFX house reduces the ratio of managers per creative significantly, putting more of the money into the final product.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Review: Yanobox Nodes 3 — plugins for Premiere, AE, FCPX, Motion

By Brady Betzel

Did you ever see a plugin preview and immediately think, “I need to have that?” Well, Nodes 3 by Yanobox is that plugin for me. Imagine if Video CoPilot’s Element 3D and Red Giant’s Trapcode and Form had a baby — you would probably end up with something like Nodes 3.

Nodes 3 is a MacOS-only plugin for Adobe’s After Effects and Premiere Pro and Apple’s Final Cut Pro X and Motion. I know what you are thinking: Why isn’t this made for Windows? Good question, but I don’t think it will ever be ported over.

Final Cut Pro

What is it? Nodes 3 is a particle, text, .obj and point cloud replicator, as well as overall mind-blower. With just one click in their preset library you can create stunning fantasy user interfaces (FUIs), such as HUDs or the like. From Transformer-like HUDs to visual data representations interconnected with text and bar graphs, Nodes 3 needs to be seen to be believed. Ok, enough gloating and fluff, let’s get to the meat and potatoes.

A Closer Look
Nodes 3 features a new replicator, animation module and preset browser. The replicator allows you to not only create your HUD or data representation, but also replicates it onto other 2D and 3D primitive shapes (like circles or rectangles) and animates those replications individually or as a group. One thing I really love is the ability to randomize node and/or line values — Yanobox labels this “Probabilities.” You can immediately throw multiple variations of your work together with a few mouse-clicks instead of lines of scripting.

As I mentioned earlier, Nodes 3 is essentially a mix of Element 3D and Trapcode — it’s part replicator/part particle generator and it works easily with After Effects’ 3D cameras (obviously, if you are working inside After Effects) to affect rotations, scale and orientation. The result is particle replication that feels organic and fresh instead of static and stale. The Auto-Animations offering allows you to quickly animate up to four parts of a structure you’ve built, with 40 parameter choices under each of the four slots. You can animate the clockwise rotation of an ellipse with a point on it while also rotating the entire structure toward the z-axis.

Replicator

The newly updated preset browser allows you to save a composition as a preset and open it from within any other compatible host. This allows you to make something with Nodes 3 inside of After Effects and then work with it inside of Final Cut Pro X. That can be super handy and help streamline VFX work. From importing an .obj file to real video, you can generate point clouds from unlimited objects and literally explode them into hundreds of interconnecting points and lines, all animated randomly. It’s amazing.

If you are seeing this and thinking about using Nodes for data representation, that is one of the more beautiful functions of this plugin. First, check out how to turn seemingly boring bar graphs into mesmerizing creations.

For me, Nodes really began to click when they described how each node is defined by an index number. Because every node carries an even or odd index, you can treat the two groups differently — skipping even or odd rows, say, and adding animated oscillations for some really engrossing graph work.
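The parity trick can be sketched simply. This is a hypothetical illustration of the idea, not Yanobox’s actual API: each node carries an index, and the index’s parity decides whether an oscillating offset is applied.

```python
import math

def node_offset(index, frame, amplitude=10.0, frequency=0.1):
    """Offset for one node at a given frame; odd-indexed nodes are skipped."""
    if index % 2 == 1:
        return 0.0
    phase = index * 0.5  # stagger even nodes so the oscillation travels
    return amplitude * math.sin(frequency * frame + phase)

frame = 24
offsets = [round(node_offset(i, frame), 2) for i in range(6)]
print(offsets)  # odd indices stay at 0.0; even ones oscillate
```

Driving bar heights or node positions from expressions like this is what turns a static graph into the kind of animated, HUD-style data representation described above.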

When I reviewed Nodes 2 back in 2014, what really gave me a “wow” moment was when they showed a map of the United States along with text for each state and its capital. From there you could animate an After Effects 3D camera to produce a fly-over with this futuristic HUD/FUI.

Adobe Premiere

On a motion graphics level, this really changed and evolved my way of thinking. United States graphics no longer had to be plain maps with animated dotted lines; they could be reimagined with sine-wave-based animations or even gently oscillating data points. Nodes 3 really can turn boring into mesmerizing quickly. The only limiting factor is your mind and some motion graphics design creativity.

To get a relatively quick look into the new replicator options inside of Nodes 3, go to FxFactory Plugins’ YouTube page for great tutorials and demos.

If you get even a tiny bit excited when seeing work from HUD masters like Jayse Hansen or plugins like Element 3D, run over to fxfactory.com and download their plugin app to use Yanobox Nodes 3. You can even get a fully working trial to just test out some of their amazing presets. And if you like what you see, you should definitely hand them $299 for the Nodes 3 plugin.

One slight negative for me — I’m not a huge fan of the FxFactory installer. Not because it messes anything up, but because I have to download a plugin loader for the plugin — double download and potential bloating. Not that I see any slowdown on my system, but it would be nice if I could just download Nodes 3 and nothing else. That is small potatoes though; Nodes 3 is really an interesting and unbridled way to visualize 2D and 3D data quickly.

Oh, and if you are curious, Yanobox has been used on big-name projects from The Avengers to Rise of the Planet of the Apes — HUDs, FUIs and GUIs have been created using Yanobox Nodes.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

VFX supervisor Christoph Schröer joins NYC’s Artjail

New York City-based VFX house Artjail has added Christoph Schröer as VFX supervisor. Previously a VFX supervisor/senior compositor at The Mill, Schröer brings over a decade of experience to his new role at Artjail. His work has been featured in spots for Mercedes-Benz, Visa, Volkswagen, Samsung, BMW, Hennessy and Cartier.

Combining his computer technology expertise and a passion for graffiti design, Schröer applied his degree in Computer and Media Sciences to begin his career in VFX. He started off working at visual effects studios in Germany and Switzerland where he collaborated with a variety of European auto clients. His credits from his tenure in the European market include lead compositor for multiple Mercedes-Benz spots, two global Volkswagen campaign launches and BMW’s “Rev Up Your Family.”

In 2016, Schröer made the move to New York to take on a role as senior compositor and VFX supervisor at The Mill. There, he teamed with directors such as Tarsem Singh and Derek Cianfrance, and worked on campaigns for Hennessy, Nissan Altima, Samsung, Cartier and Visa.

Roper Technologies set to acquire Foundry

Roper Technologies, a technology company and a constituent of the S&P 500, Fortune 1000 and Russell 1000 indices, is set to purchase Foundry. The deal is expected to close in April 2019, subject to regulatory approval and customary closing conditions. Foundry makes software tools used to create visual effects and 3D content for the media and entertainment world, including Nuke, Modo, Mari and Katana.

Craig Rodgerson

It’s a substantial move that enables Foundry to remain an independent company, with Roper assuming ownership from Hg. Roper has a successful history of acquiring well-run technology companies in niche markets that have strong, sustainable growth potential.

“We’re excited about the opportunities this partnership brings. Roper understands our strategy and chose to invest in us to help us realize our ambitious growth plans,” says Foundry CEO Craig Rodgerson. “This move will enable us to continue investing in what really matters to our customers: continued product improvement, R&D and technology innovation and partnerships with global leaders in the industry.”

Alkemy X: A VFX guide to pilot season

Pilot season is an important time for visual effects companies that work in television. Pilots offer an opportunity to establish the look of key aspects of a show and, if the show gets picked up, present the potential of a long-term gig. But pilots also offer unique challenges.

Time is always short and budgetary resources are often in even shorter supply, yet expectations may be sky-high. Alkemy X, which operates visual effects studios in New York and Los Angeles, has experienced the trials and enjoyed the fruits of pilot season, delivering effects for shows that have gone on to successful runs, including Frequency, Time After Time, Do No Harm, The Leftovers, Flesh and Bone, Outcast, Mr. Robot, Deception and The Marvelous Mrs. Maisel.

Mark Miller

We recently reached out to Mark Miller, executive producer/business development, at Alkemy X to find out how his company overcomes the obstacles of time and budget to produce great effects for hopeful, new shows.

How does visual effects production for pilots differ from a regular series?
The biggest difference between a series and a pilot is that with a pilot you are establishing the look of the show. You work in concert with the director to implement his or her vision and offer ideas on how to get there.

Typically, we work on pilots with serious VFX needs that drive the stories. We are not often told how to get there; we simply listen to the producers, interpret their vision and do our best to put it on screen. The quality of the visuals we create is often the difference between a pick-up and a pass.

In the case of one show I was involved with, the time and budget available made it impossible to complete all the required visual effects. As a result, the VFX supervisor decided to put the time and money they had into the most important plot points in the script and use simple tests as placeholders for less important VFX. That sold the studio and the show went to series.

Had we attempted to complete the show in its entirety, it may not have seemed as viable by the studio. Again, that was a collaborative decision made by the director, studio, VFX supervisor and VFX company.

Mr. Robot

What should studios consider in selecting a visual effects provider for a pilot?
Often the deciding factors in choosing a VFX vendor are its cost and location within an incentivized region. Usually the final arbiter is the VFX supervisor, occasionally with restrictions as to which company he or she may use. I find that good-quality VFX companies (shops with strong creative vision and the ability to deliver shots with little pain) are unable to meet a production’s budget, even if they are in a favorable region. That drives productions to smaller shops and results in less-polished shows.

Shots may not be delivered on time or may not have the desired creative impact. We are all aware that, even if a pilot you work on goes to series, there is no guarantee you will get the work. These days, many pilots employ feature directors and their crew. So, when one is picked up, it usually has a whole new crew.

The other issue with pilots is time. When the shoot runs longer than anticipated, it delays the director’s cut and VFX work can’t begin until that is done. Even a one-day delay in turnover can impact the quality of the visual effects. And it’s not a matter of throwing more artists at a shot. Many shots are not shareable among multiple artists so adding more artists won’t shorten the time to completion. Visual effects are like fine-art painting; one artist can’t create the sky while another works on the background. Under the best circumstances, it is hard to deliver polished work for pilots and such delays add to the problem. With pilots, our biggest enemy is time.

The Leftovers

How do you handle staffing and workflow issues in managing short-term projects like pilots?
You need to be very smart and nimble. A big issue for New York-based studios is infrastructure. Many buildings lack enough electricity to accommodate high power demands, high-speed connectivity and even the physical space required by visual effects studios.

New York studios therefore have to be as efficient as possible with streamlined pipelines built to push work through. We are addressing this issue by increasingly relying on cloud solutions for software and infrastructure. It helps us maximize flexibility.

Staffing is also an ongoing issue. Qualified artists are in short supply. More and more, we look to schools whose programs are designed by VFX supervisors, artists and producers to find junior artists with the skills to hit the ground running.