

GTC: GPUs power The Mandalorian’s in-camera VFX, realtime workflows

By Mike McCarthy

Each year, Nvidia hosts a series of conferences that focus on new developments in GPU-based computing. Originally, these were about graphics and visualization, which were the most advanced things being done with GPUs. Now they focus on everything from supercomputing and AI to self-driving cars and VR. The first GTC conference I attended was in 2016, when Nvidia announced its Pascal architecture. (Dedicated Tensor cores arrived a generation later, with Volta.) While that was targeted at supercomputer users, there was still a lot of graphics-based content to explore, especially with VR.

Over time, the focus has shifted from visual applications to AI applications that aren’t necessarily graphics-based; they just have similar parallel computing requirements to graphics processing and are optimal tasks to be accelerated on GPU hardware. This has made GTC more relevant to programmers and similar users, but the hardware developments that enable those capabilities also accelerate the more traditional graphics workflows — and new ways of using that power are constantly being developed.

I was looking forward to going to March’s GTC to hear the details on what was expected to be an announcement about Nvidia’s next generation of hardware architecture and to see all of the other presentations about how others have been using current GPU technology. Then came the coronavirus, and the world changed. Nvidia canceled even the online keynote; a few SDK updates were released, but all major product announcements have been deferred for the time being. What Nvidia did offer was a selection of talks and seminars that were remotely recorded and hosted as videos to watch. These are available to anyone who registers for the free online version of GTC, instead of paying the hundreds of dollars it would cost to attend in person.

One that really stood out to me was “Creating In-Camera VFX with Realtime Workflows.” It highlighted the Unreal Engine and what that technology allowed on The Mandalorian — it was amazing. The basic premise is to replace greenscreen composites with VFX projections behind the elements being photographed. This was done years ago for exteriors of in-car scenes using flat prerecorded footage, but technology has progressed dramatically since then. The main advances are in motion capture, 3D rendering and LED walls.

From the physical standpoint, LED video walls have greater brightness, allowing them not only to match the lit foreground subjects, but to light those subjects for accurate shadows and reflections without post compositing. And if that background imagery can be generated in real time — instead of recordings or renders — it can respond to the movement of the camera as well. That is where Unreal comes in — as a 3D game rendering engine that is repurposed to generate images corrected for the camera’s perspective in order to project on the background. This allows live-action actors to be recorded in complex CGI environments as if they were real locations. Actors can see the CGI elements they are interacting with, and the crew can see it all working together in real time without having to imagine how it’s going to look after VFX. We looked at using this technology on the last film I worked on, but it wasn’t quite there yet at the scale we needed; we used greenscreens instead, but it looks like this use of the technology has arrived. And Nvidia should be happy, because it takes a lot more GPU power to render the whole environment in real time than it does to render just what the camera sees after filming. But the power is clearly available, and even more is coming.
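The geometric trick that makes this work is worth spelling out: the background on the wall is rendered through an asymmetric (off-axis) frustum computed from the tracked camera's position relative to the wall plane, so the parallax on screen stays correct as the camera moves. The sketch below is a minimal, illustrative version of that math (the classic "generalized perspective projection" construction); the wall dimensions, camera positions and function names are my own assumptions, not anything from The Mandalorian's actual pipeline.

```python
# Illustrative sketch: off-axis frustum for an LED wall, computed from a
# tracked camera position. All geometry and names here are hypothetical.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def norm(a):
    m = sum(x * x for x in a) ** 0.5
    return tuple(x / m for x in a)

def wall_frustum(pa, pb, pc, eye, near):
    """Asymmetric frustum bounds (left, right, bottom, top at the near
    plane) for a planar screen with corners pa (lower-left),
    pb (lower-right), pc (upper-left), seen from camera position `eye`."""
    vr = norm(sub(pb, pa))      # screen right axis
    vu = norm(sub(pc, pa))      # screen up axis
    vn = norm(cross(vr, vu))    # screen normal, pointing toward the camera
    va, vb, vc = sub(pa, eye), sub(pb, eye), sub(pc, eye)
    d = -dot(va, vn)            # perpendicular eye-to-screen distance
    left   = dot(vr, va) * near / d
    right  = dot(vr, vb) * near / d
    bottom = dot(vu, va) * near / d
    top    = dot(vu, vc) * near / d
    return left, right, bottom, top

# A hypothetical 4m x 2m wall at z=0; camera 3m back, centered, then
# dollied 1m to the right.
pa, pb, pc = (-2, -1, 0), (2, -1, 0), (-2, 1, 0)
centered = wall_frustum(pa, pb, pc, eye=(0, 0, 3), near=0.1)
dollied  = wall_frustum(pa, pb, pc, eye=(1, 0, 3), near=0.1)
```

With the camera centered, the frustum comes out symmetric; once the camera dollies sideways, the bounds become asymmetric, which is exactly the perspective correction that keeps the wall reading as a window into the CGI environment rather than a flat backdrop.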

While no new Nvidia technology has been announced, something is always right around the corner. The current Turing generation of GPUs, which has been available for over 18 months, brought dedicated RT cores for realtime raytracing. The coming generation is expected to scale up the number of CUDA cores and the amount of memory by using smaller transistors than Turing’s 12nm process. This should offer more processing power for less money, which is always a welcome development.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Chaos offering V-Ray Education Collection via single license

Chaos Group has launched its V-Ray Education Collection, a new offering that provides access to 11 V-Ray and Phoenix FD products through a single license. Students, schools and educators can now use Chaos Group’s renderer and fluid simulation software for $149 a year (an 86% savings compared to purchasing individually).

“We wanted to make it easier for students and educators to access a full suite of software, as they learn and teach visualization for architecture, design and visual effects,” says Chaos Group education director Veselina Zheleva. “With our new collection, users can easily switch between the leading 3D applications, helping them branch out, reduce costs and expand their curriculums.”

The V-Ray Education Collection offers students more flexibility as they begin to master workflows and build their portfolios. As coursework and interests develop, students can apply different versions of V-Ray to their own challenges. With options for architecture (Revit, SketchUp, Rhino, 3ds Max), visual effects (Maya, Houdini), realtime (Unreal) and more, students can now home in on the areas they are most focused on.

With one price across products, schools and teachers can choose the right product for the job, and then switch it if plans or industry trends start to shift. As cost is a historical barrier to departmental growth, Chaos says the V-Ray Education Collection has been priced to spur new classes and expanded curriculums, so administrators won’t have to wait for new budgets when ideas hit.

The V-Ray Education Collection includes full versions of 11 products and free upgrades for the length of the license. Free access to Chaos Group’s commercial support team is also included, providing on-demand support from set-up to settings.

The V-Ray Education Collection Includes: V-Ray for 3ds Max, V-Ray for Maya, V-Ray for SketchUp, V-Ray for Rhino, V-Ray for Revit, V-Ray for Modo, V-Ray for Unreal, V-Ray for Houdini, V-Ray for Cinema 4D, Phoenix FD for 3ds Max and Phoenix FD for Maya.

Main Image: Courtesy of ZEILT Productions 


Autodesk’s 3ds Max 2021 now available

Autodesk has introduced 3ds Max 2021, a new version of its 3D modeling, animation and rendering software. This latest release offers new tools designed to give 3D artists the ability to work across design visualization and media and entertainment with a fully scriptable baking experience, simpler install process, viewport and rendering improvements, and integrated Python 3 support, among other enhancements.

Highlights include:
• New texture baking experience supports physically based rendering (PBR), overrides and OSL workflows and provides an intuitive new tool set.
• Updated installer allows users to get up and running quickly and easily.
• Integrated support for Python 3 and an improved pymxs API that ensure developers and technical artists can better customize pipelines.
• Native integration with the Arnold Renderer V6.0 offers a high-end rendering experience out of the box, while included scripts efficiently convert V-Ray and Corona files to the Physical Material for added flexibility.
• Performance enhancements simplify the use of PBR workflows across renderers, including with realtime game engines; provide direct access to high-fidelity viewports; improve the OSL user experience; significantly accelerate file I/O; and enhance control over modeling with a new weighted normals modifier.
• Tool set advancements to SketchUp import, Substance, ProSound and FBX streamline the creation and movement of high-quality 3D assets.
• New plugin interop and improvements – from support for AMG and OSL shaders to scene converter extensions – allow for a broader range of plugins to easily hook into 3ds Max while also simplifying plugin development and installation.

Early 3ds Max 2021 adopter Eloi Andaluz Fullà, a freelance VFX artist on the beta, reported, “The revamped viewport, IBL controls and persistent Ambient Occlusion speed up the client review process because I can easily share high-quality viewport images without having to wait for renders. The new bake to texture update is also a huge time-saver because we can easily modify multiple parameters at once, while other updates simplify day-to-day tasks.”

3ds Max 2021 is now available as a stand-alone subscription or with the Autodesk Media & Entertainment Collection.


Maxon to live-stream NAB news and artist presentation

With the Las Vegas NAB Show now canceled, Maxon will be hosting a virtual NAB presence on C4DLive.com featuring a lineup of working artists. Starting on Monday, April 20, and running through Thursday, April 23, these pros — who were originally slated to appear in Las Vegas — will share production tips, techniques and inspiration and talk about working with Maxon’s Cinema 4D, Red Giant and Redshift product lines.

For over a decade, Maxon has supplemented its physical booth presence with live streaming presentations. This has allowed show attendees, and those unable to attend events in person, to benefit from demos, technology updates and interaction with the guest artists in real time. First up will be CEO Dave McGavran, who will talk about Maxon’s latest news and recent merger with Red Giant.

In terms of artists, Penelope Nederlander will break down her latest end credit animation for Birds of Prey; filmmaker Seth Worley will walk through some of the visual effects shots from his latest short film, Darker Colors; Doug Appleton will share the creative processes behind creating the technology for Spider-Man: Far From Home; Jonathan Winbush will demo importing C4D scenes into Unreal Engine for rendering or VR/AR output; and Veronica Falconieri Hays will share how she builds cellular landscapes and molecular structures in order to convey complex scientific stories.

The line-up of artists also includes Mike “Beeple” Winkelmann, Stu Maschwitz, EJ Hassenfratz, Chris Schmidt, Angie Feret, Kelcey Steele, Daniel “Hashi” Hashimoto, Dan Pierse, Andy Needham, Saida Saetgareeva and many more.

Additional presenters’ info and a live streaming schedule will be available at C4DLive.com.

Main Image: (L-R) Saida Saetgareeva and Penelope Nederlander


Quick Chat: Scholar’s Will Johnson and William Campbell

By Randi Altman

In celebrating its 10th anniversary, animation and design company Gentleman Scholar has relaunched as Scholar and has put a new emphasis on its live-action work. Started by directors/partners William Campbell and Will Johnson in Los Angeles, the company has grown over the years and now boasts a New York City location as well.

Recent Scholar projects include the animated Timberland Legends Club spot, the live-action and animated Porsche Pop Star and the live-action Acura TLX.

Considering their new name change and website rebrand, we decided to reach out to “The Wills” to find out more about Scholar’s work philosophy and what this change means to the company.

Audi Q3

Why did you decide to rename and relaunch as Scholar?
Will Johnson: After 10 years, it felt like a good time to redefine how the world views us: not only as a one-stop shop that can handle all of your design and animation needs, but also as a live-action and storytelling powerhouse.

Will Campbell: The new name evokes cleanliness and sophistication and better represents how we have evolved. Gentleman Scholar was fun, quirky and playful. We’re still all of those things, but we feel like we’ve also become more cinematic, more polished and better collaborators who understand production more clearly… which allows us to navigate the industry better as a whole.

Even when it comes to live action and carrying our film into post, we can assess solutions on-set quicker and more fluidly, understanding the restrictions or additions we can take with us into the software. Scholar has changed immensely over the past 10 years. We have grown up and become smarter, faster and better. The rebrand is a window to who we have already become and who we plan to be.

How is the business different, and what’s stayed the same?
Johnson: It’s more refined. We’ve learned a lot about how to conduct ourselves in a competitive art world — the positive ways we approach each project, letting the stress of the job kick us in the ass without letting it guide the decisions we make. It’s also about being patient with our team as well as with our own decision-making.

Creativity is a process, and “turning it on” every day isn’t always easy. Understanding that not every idea you have is a great idea and how to be comfortable with your creative self is important. To trust in the “why” you are making something versus the “what” that you make. And that’s reflected in the new company name and our new website design. It’s the same us. The same wild bunch of creative explorers intent on pushing the boundaries of design and live action. We are just more certain of who we are and the stories we tell, and therefore more inclusive in our path to get there.

Acura

Campbell: We now have a decade’s worth of work to back up our thoughts and collaborations. This is enormous when you need to show how capable you are, not just in the standard we hold ourselves to visually, but in the quality and sophistication of our evolving storytelling. We have fine-tuned our production processes, enabling the pipelines of our edit, animation, CG and composite teams to more easily embrace the techniques and tools we use to craft the stories we want to tell… so we can be more decisive with the concepts we put on the table. From the software to the hardware, we are more refined.

Can you talk about how the industry has changed over the past 10 years?
Johnson: It’s more spread out than it’s ever been. There is more content that reaches more eyes in more places. From social to OOH to broadcast, the need to pull everyone together and create something that speaks to everyone all at once feels like it’s stronger and more apparent than before. And we’ve seen it all at this point, from vertical campaigns to entirely experiential ones. The era of “do more with less” is here.

Campbell: For us, we were very young when we opened Scholar. We were in our 20s, and everything was a fire drill and we thrived off the chaos. We have learned to harness the inspiration that comes with chaos and channel it into focused, productive creation.

Have you embraced working in the cloud — storage, rendering, review and approval, etc. — and if so, in what way?
Johnson: Yes. We know it’s a fast-paced world, and in the current climate much of the globe is embracing a cloud-based way of thinking. Luckily, we have an amazing team of technologists, so we can tap into our home-base server from anywhere at any time. From rendering to storage to reviews and approvals — it keeps us all united, focused and organized when we’re moving a million miles a minute in any direction.

Campbell: Scholar has been testing the technology as it is getting better and cheaper, but we are always balancing convenience versus security, and those swing on a job-by-job basis. We’ve written tools to take advantage of storage and rendering resources on both coasts and use Aspera to facilitate file syncing between each office.

Can you talk about the tools you use for your work?
Johnson: The tangible ones are the usual suspects. Adobe’s Creative Suite and 3D tools like Autodesk Maya, Maxon Cinema 4D, Foundry Nuke and all of the animation and time-based ones, like Adobe Premiere and Avid Media Composer. But my favorite tools tend to be the brains and skills of our team… the words on paper and the channeling of art and thought into something tactile. As creators, we lust to make things, and seeing that circuit board of craft and making is something amazing to watch.

Campbell: Scholar has always been a mixed-media studio. We love getting our hands dirty with new software or cameras. We fundamentally want to do what’s right for the job and not rest inside our comfort zone. Thinking about what style is right for a client, not “how do I make my style fit,” is just how we are wired. The tool is always a means to an end. My favorite jobs are the ones where the technique is invisible, and it’s all about the experience.

We are operating in an entirely new world these days with the coronavirus and working remotely. How are you guys embracing the change?
Campbell: With an office on each coast, we have already had to learn to work as a team remotely. The years of unifying groups from a distance and finding ways for technology to bring artists closer together has set the stage for us right now. We have transitioned our workforce to 100% remote. It’s early days yet, but everyone is in good spirits, and we feel as connected as ever, although I do miss our lunch table.

Johnson: We’re definitely thankful for the staff and talent that we surround ourselves with and how they’ve handled their work-from-home routines. The check-ins, the mind melds and the daily (hourly) hangouts have helped. We’re using the change in the world as an opportunity to showcase our adaptability — how we can scale up and down even in the remote world — as a way to continue to grow our relationships and push the creative boundaries.

As people who find it hard to simply sit still, we’ve changed how we approach and talk about a project as each script comes in. The conversations about techniques are important — how we look at animation with a live-action lens, how 2D can become 3D, or vice versa. We’re more easily adaptable and change purely out of the need to discover what’s new.

Main Image: (L-R) Will Johnson and Will Campbell


VFX studio One Of Us adds CTO Benoit Leveau

Veteran post technologist Benoit Leveau has joined London’s One of Us as CTO. The studio, which is in its 16th year, employs 200 VFX artists.

Leveau, who joins One of Us from Milk VFX, has been in the industry for 18 years, starting out in his native France before moving to MPC in London. He then joined Prime Focus, integrating the company’s Vancouver and Mumbai pipelines with London. In 2013, he joined Milk in its opening year as head of pipeline. He helped to build that department and later led the development of Milk’s cloud rendering system.

The studio, which depends on what it calls “the efficient use of existing technology and the timely adoption of new technology,” says Leveau’s knowledge and experience will ensure that “their artists’ creativity has the technical foundation which allows it to flourish.”


Behind the Title: Loyalkaspar EP Scott Lakso

“People probably don’t expect that sometimes being an EP involves jumping into After Effects to render something, or contributing written ideas to strategic and conceptual projects.”

Name: Scott Lakso

Company: New York City’s Loyalkaspar

What does Loyalkaspar do?
We’re a creative branding agency specializing in brand strategy, identity, marketing and production. In human terms: We like to make good work that people will enjoy, and we try to do it for companies that make the world better!

SyFy rebrand

What’s your job title?
Executive Producer

What does that entail?
It entails a little bit of everything you’d expect, but mostly it involves making sure our clients are happy so that they’ll want to keep working with us on new projects. It also means establishing relationships with potential clients. At the office, it means overseeing the team of producers and making sure that everyone is happy and productive. There are a lot of proposals, budgets and timelines as part of that, but all of the nitty-gritty stuff is in service of fostering healthy relationships inside the company and with outside clients.

What would surprise people the most about what falls under that title?
People probably don’t expect that sometimes it involves jumping into After Effects to render something or contributing written ideas to strategic and conceptual projects. The title makes it sound like a reductive position, as in an “executive” producer doesn’t do any of the tasks they used to do as a coordinator or mid-level producer, but it’s actually more of a cumulative role — all of the skills I’ve developed over the 11 years it took to get to EP are still used anytime it seems appropriate.

What’s your favorite part of the job?
My favorite aspect is having the freedom, capacity and trust of the company leadership to do whatever I feel is best for our people, our clients and Loyalkaspar as a whole. Sometimes that’s helping a client out of a bind on short notice, encouraging a staffer to vent over a pint or organizing a spontaneous karaoke night when the time is right… which is more often than you might think.

What’s your least favorite?
When the circumstances of a project or situation require me to work reactively rather than proactively. I’m not a fan of winging it! It feels like driving at night with the headlights turned off. I’m much happier when I can plan a few steps ahead and help everyone avoid the headaches of hazardous speed bumps.

What is your most productive time of day?
Anytime that I can tune out distractions and focus on the task at hand. That’s more about creating a productive window in which to work rather than waiting for a specific time of day.

If you didn’t have this job, what would you be doing instead?
I’d be doing literally any job that NASA would be willing to hire me for, given my lack of astronautics knowledge and experience. So I’d probably be scrubbing dishes in the Cape Canaveral food court or something equally unglamorous.

How did you choose this profession?
“Chose this profession” is a strong phrase, given that I had no idea this kind of work existed until I moved to New York after college. I think I technically stumbled into it. That being said, at some point while stage-managing high school theater, I probably subconsciously chose to go down the path that would lead me to something like this as an adult.

Super Bowl halftime show graphics 2010

Can you name some recent projects?
For the past few months, I’ve been mostly dedicated to the brand identity development for Peacock, the new streaming platform from NBCUniversal. But other recent standout projects have been an interactive film for a museum in Philadelphia and involvement in pitches to the Sesame Workshop and Full Frontal with Samantha Bee.

Do you have a project you are most proud of?
It’s hard to pick only one, but producing the Super Bowl halftime show graphics in 2010 and overseeing our all-encompassing rebrand of SyFy in 2017 are a couple of personal favorites.

Name three pieces of technology you can’t live without.
I’d have a hard time living in a world that didn’t have the technology to enjoy music and movies/television, so let’s say a good screen of some kind, a record player/stereo/iPod and some good headphones.

What social media channels do you follow?
At this point, only Instagram. I don’t think I’m alone in thinking that most social media content makes people feel worse about themselves and the world. At least on Instagram, people seem interested in posting things that others will enjoy rather than just broadcasting whatever will get them the most attention.

Do you listen to music while you work? If so what kind?
I find it impossible to work without music on. In terms of what specifically, almost anything instrumental is good for working to, but I really love old, cheesy music like bossa nova, retro Italian film soundtracks, 1960s/1970s library music, Burt Bacharach, etc. That probably makes me sound pretentious, or maybe like a dork, but I’m not exactly proud of my weird taste in music.

What do you do to de-stress from it all?
There are tons of options! When time permits, traveling and hiking outside of the city (especially outside of the country) are great for stress. I know that exercise is good for stress but that doesn’t make it any more enjoyable, so I have to trick myself into accidentally getting a workout while doing something like being in nature or exploring a foreign country. On a smaller scale, just drinking wine with my wife, going to a movie with my phone turned off or doing anything you can find in a book on “hygge” (like reading in my pajamas or cooking comforting food).


Maxon plugin allows for integration of Cinema 4D assets into Unity


Maxon is now a Unity Technologies Verified Solutions Partner and is distributing a plugin for Unity called Cineware by Maxon. The new plugin provides developers and creatives with seamless integration of Cinema 4D assets into Unity. Artists can easily create models and animations in Cinema 4D for use in realtime 3D (RT3D), interactive 2D, 3D, VR and AR experiences. The Cineware by Maxon plugin is now available free of charge on the Unity Asset Store.

The plugin is compatible with Cinema 4D Release 21, the latest version of the software, and Unity’s latest release, 2019.3. The plugin does not require a license of Cinema 4D as long as Cinema 4D scenes have been “Saved for Cineware.” By default, imported assets will appear relative to the asset folder or imported asset. The plugin also supports user-defined folder hierarchies.

Cineware by Maxon currently supports:

Geometry:
• Vertex Position, Normals, UV, Skinning Weight, Color
• Skin and Binding Rig
• Pose Morphs as Blend Shapes
• Lightmap UV2 Generation on Import

Materials:
• PBR Reflectance Channel Materials conversion
• Albedo/Metal/Rough
• Normal Map
• Bump Map
• Emission

Animated Materials:
• Color including Transparency
• Metalness
• Roughness
• Emission Intensity, Color
• Alpha Cutout Threshold

Lighting:
• Spot, Directional, Point
• Animated properties supported: Cone, Intensity, Color

Cameras:
• Animated properties
• Field of View (FOV)

Main Image: Courtesy of Cornelius Dämmrich


Foundry Nuke 12.1 offers upgrades across product line

Foundry has released Nuke 12.1, with UI enhancements and tool improvements across the entire Nuke family. The largest update to Blink and BlinkScript in recent years improves Cara VR node performance and introduces new tools for developers, while extended functionality in the timeline-based applications speeds up and enriches artist and team review.

Here are the upgrade highlights:
– New Shuffle node updates the classic checkboxes with an artist-friendly, node-based UI that supports up to eight channels per layer (Nuke’s limit) and consistent channel ordering, offering a more robust tool set at the heart of Nuke’s multi-channel workflow.
– Lens Distortion Workflow improvements: The LensDistortion node in NukeX is updated to have a more intuitive workflow and UI, making it easier and quicker to access the faster and more accurate algorithms and expanded options introduced in Nuke 11.
– Blink and BlinkScript improvements: Nuke’s architecture for GPU-accelerated nodes and the associated API can now store data on the GPU between operations, resulting in what Foundry says are “dramatic performance improvements to chains of nodes with GPU caching enabled.” This new functionality is available to developers using BlinkScript, along with bug fixes and a debug print out on Linux.
– Cara VR GPU performance improvements: The Cara VR nodes in NukeX have been updated to take advantage of the new GPU-caching functionality in Blink, offering performance improvements in viewer processing and rendering when using chains of these nodes together. Foundry’s internal tests on production projects show rendering time that’s up to 2.4 times faster.
– Updated Nuke Spherical Transform and Bilateral: The Cara VR versions of the Spherical Transform and Bilateral nodes have been merged with the Nuke versions of these nodes, adding increased functionality and GPU support in Nuke. Both nodes take advantage of the GPU performance improvements added in Nuke 12.1. They are now available in Nuke and no longer require a NukeX license.
– New ParticleBlinkScript node: NukeX now includes a new ParticleBlinkScript node, allowing developers to write BlinkScripts that operate on particles. Nuke 12.1 ships with more than 15 new gizmos, offering a starting point for artists who work with particle effects and developers looking to use BlinkScript.
– QuickTime audio and surround sound support: Nuke Studio, Hiero and HieroPlayer now support multi-channel audio. Artists can now import Mov containers holding audio on Linux and Windows without needing to extract and import the audio as a separate Wav file.
– Faster HieroPlayer launch and Nuke Flipbook integration: Foundry says new instances of HieroPlayer launch 1.2 times faster on Windows and up to 1.5 times faster on Linux in internal tests, improving the experience for artists using HieroPlayer for review. With Nuke 12.1, artists can also use HieroPlayer as the Flipbook tool for Nuke and NukeX, giving them more control when comparing different versions of their work in progress.
– High DPI Windows and Linux: UI scaling when using high-resolution monitors is now available on Windows and Linux, bringing all platforms in line with high-resolution display support added for macOS in Nuke 12.0 v1.
– Extended ARRI camera support: Nuke 12.1 adds support for ARRI formats, including Codex HDE .arx files, ProRes MXFs and the popular Alexa Mini LF. Foundry also says there are performance gains when debayering footage on CUDA GPUs, and there’s an SDK update.
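The GPU-caching change is the thread running through several of the items above: without it, each node in a chain round-trips its image through host memory, and with it, only the ends of the chain do. A toy cost model makes the payoff concrete — the numbers and function below are purely illustrative assumptions, not Foundry's figures or API.

```python
# Toy cost model (hypothetical numbers, not Foundry's) showing why keeping
# intermediate results on the GPU speeds up chains of GPU-accelerated nodes.

TRANSFER_MS = 8.0   # assumed host<->GPU copy cost per image
KERNEL_MS = 2.0     # assumed per-node GPU processing cost

def chain_cost_ms(num_nodes, gpu_caching):
    """Estimated cost of running `num_nodes` GPU nodes back to back."""
    if gpu_caching:
        transfers = 2                # one upload at the head, one download at the tail
    else:
        transfers = 2 * num_nodes    # every node round-trips through host memory
    return transfers * TRANSFER_MS + num_nodes * KERNEL_MS

uncached = chain_cost_ms(6, gpu_caching=False)  # e.g. a six-node stitch chain
cached = chain_cost_ms(6, gpu_caching=True)
speedup = uncached / cached
```

Under these made-up costs, a six-node chain goes from 108ms to 28ms, a roughly 3.9x speedup; the longer the chain and the larger the frames, the more the saved transfers dominate, which is consistent with the gains Foundry reports for chained Cara VR nodes.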

The Call of the Wild director Chris Sanders on combining live-action, VFX

By Iain Blair

The Fox family film The Call of the Wild, based on the Jack London tale, tells the story of a big-hearted dog named Buck who is stolen from his California home and transported to the Canadian Yukon during the Gold Rush. Director Chris Sanders called on the latest visual effects and animation technology to bring the animals in the film to life. The film stars Harrison Ford and is based on a screenplay by Michael Green.

Sanders’ crew included two-time Oscar–winning cinematographer Janusz Kaminski; production designer Stefan Dechant; editors William Hoy, ACE, and David Heinz; composer John Powell; and visual effects supervisor Erik Nash.

I spoke with Sanders — who has helmed the animated films Lilo & Stitch, The Croods and How to Train Your Dragon — about making the film, which features a ton of visual effects.

You’ve had a very successful career in animation but wasn’t this a very ambitious project to take on for your live-action debut?
It was. It’s a big story, but I felt comfortable because it has such a huge animated element, and I felt I could bring a lot to the party. I also felt up to the task of learning — and having such an amazing crew made all of that as easy as it could possibly be.

Chris Sanders on set.

What sort of film did you set out to make?
As true a version as we could tell in a family-friendly way. No one’s ever tried to do the whole story. This is the first time. Before, people just focused on the last 30 pages of the novel and focused on the relationship between Buck and John Thornton, played by Harrison. And that makes perfect sense, but what you miss is the whole origin story of how they end up together — how Buck has to learn to become a sled dog, how he meets the wolves and joins their world. I loved all that, and also all the animation needed to bring it all alive.

How early on did you start integrating post and all the visual effects?
Right away, and we began with previs.

Your animation background must have helped with all the previs needed on this. Did you do a lot of previs, and what was the most demanding sequence?
We did a ton. In animation it’s called layout, a rough version, and on this we didn’t arrive on set without having explored the sequence many times in previs. It helped us place the cameras and block it all, and we also improvised and invented on set. But previs was a huge help with any heavy VFX element, like when Thornton’s going down river. We had real canoes in a river in Canada with inertial measurement devices and inertial recorders, and that was the most extensive recording we had to do. Later in post, we had to replace the stuntman in the canoe with Thornton and Buck in an identical canoe with identical movements. That was so intensive.

 

How was it working with Harrison Ford?
The devotion to his craft and professionalism… he really made me understand what “preparing for a role” really means, and he really focused on Thornton’s back story. The scene where he writes the letter to his wife? Harrison dictated all of that to me and I just wrote it down on top of the script. He invented all that. He did that quite a few times. He made the whole experience exciting and easy.

The film has a sort of retro look. Talk about working with DP Janusz Kaminski.
We talked about the look a lot, and we both wanted to evoke those old Disney films we saw as kids — something very rich with a magical storybook feel to it. We storyboarded a lot of the film, and I used all the skills I’d learned in animation. I’d see sequences a certain way, draw them out, and sometimes we’d keep them and cut them into editorial, which is exactly what you do in animation.

How tough was the shoot? It must have been quite a change of pace for you.
You’re right. It was about 50 days, and it was extremely arduous. It’s the hardest thing I’ve ever done physically, and I was not fully prepared for how exhausted you get — and there’s no time to rest. I’d be driving to set by 4:30am every day, and we’d be shooting by 6am. And we weren’t even in the Yukon — we shot here in California, a mixture of locations doubling for the Yukon and stage work.

 

Where did you post?
All on the Fox lot, and MPC Montreal did all the VFX. We cut it in relatively small offices. I’m so used to post, as all animation is basically post. I wish it was faster, but you can’t rush it.

You had two editors — William Hoy and David Heinz. How did that work?
We sent them dailies and they divided up the work since we had so much material. Having two great voices helps, as long as everyone’s making the same movie.

What were the big editing challenges?
The creative process in editorial is very different from animation, and I was floored by how malleable this thing was. I wasn’t prepared for that. You could change a scene completely in editorial, and I was blown away at what they could accomplish. It took a long time because we came back with over three hours of material in the first assembly, and we had to crush that down to 90 minutes. So we had to lose a huge amount, and what we kept had to be really condensed, and the narrative would shift a lot. We’d take comedic bits and make them more serious and vice versa.

Visual effects play a key role. Can you talk about working on them with VFX supervisor Erik Nash?
I love working with VFX, and they were huge in this. I believe there are fewer than 30 shots in the whole film that don’t have some VFX. And apart from creating Buck and most of the other dogs and animals, we had some very complex visual effects scenes, like the avalanche and the sledding sequence.

L-R: Director Chris Sanders and writer Iain Blair

We had VFX people on set at all times. Erik was always there supervising the reference. He’d also advise us on camera angles now and then, and we’d work very closely with him all the time. The cameras were hooked up to send data to our recording units so that we always knew what lens was on what camera at what focal length and aperture, so later the VFX team knew exactly how to lens the scenes with all the set extensions and how to light them.

The music and sound also play a key role, especially for Buck, right?
Yes, because music becomes Buck’s voice. The dogs don’t talk like they do in Lion King, so it was critical. John Powell wrote a beautiful score that we recorded on the Newman Stage at Fox, and then we mixed at 5 Cat Studios.

Where did you do the DI, and how important is it to you?
We did it at Technicolor with colorist Mike Hatzer, and I’m pretty involved. Janusz did the first pass and set the table, and then we fine-tuned it, and I’m very happy with the rich look we got.

Do you want to direct another live-action film?
Yes. I’m much more comfortable with the idea now that I know what goes into it. It’s a challenge, but a welcome one.

What’s next?
I’m looking at all sorts of projects, and I love the idea of doing another hybrid like this.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: Blue Bolt VFX supervisor Richard Frazer

“If we have done our job well, the viewers should never notice the work and instead just be enjoying the storytelling.”

Name: Richard Frazer

Company: London’s BlueBolt

Can you describe your company?
For the last four years, I’ve worked at BlueBolt, a Soho-based visual effects company in London. We work on high-end TV and feature films, and our main area of specialty is creating CG environments and populating them. BlueBolt is a privately owned company run by two women, which is pretty rare. They believe in nurturing good talent and training artists up to help them break through the glass ceiling, if an artist is up for it.

What’s your job title?
I joined as a lead compositor with a view to becoming a 2D supervisor, and now I am one of the studio’s main core VFX supervisors.

What does that entail?
It means I oversee all of the visual effects work for a specific TV show or movie — from script stage to final delivery. That includes working with the director and DP in preproduction to determine what they would like to depict on the screen. We then work out what is possible to shoot practically, or if we need to use visual effects to help out.

I’ll then often be on the set during the shoot to make sure we correctly capture everything we need for post work. I’ll work with the VFX producer to calculate the costs and time scales of the VFX work. Finally, I will creatively lead our team of talented artists to create those rendered images and make sure it all fits in with the show in a visually seamless way.

What would surprise people the most about what falls under that title?
The staggering amount of time and effort involved by many talented people to create something that an audience should be totally unaware exists. If we have done our job well, the viewers should never notice the work and instead just be enjoying the storytelling.

How long have you been working in VFX?
For around a decade. I started out as a rotoscope artist in 2008 and then became a compositor. I did my first supervisor job back in 2012.

How has the VFX industry changed in the time you’ve been working?
A big shift has been just how much more visual effects work there is on TV shows and how much the standard of that work has improved. It used to be that TV work was looked down on as the poor cousin of feature film work. But shows like Game of Thrones have set audience expectations so much higher now. I worked on nothing but movies for the first part of my career, but the majority of my work now is on TV shows.

Did a particular film inspire you along this path in entertainment?
I grew up on ‘80s sci-fi and horror, so movies like Aliens and The Thing were definitely inspirations. This was back when effects were almost all done practically, so I wanted to get into model-making or prosthetics. The first time I remember being blown away by digital VFX work was seeing Terminator 2 at the cinema. I’ve ended up doing the type of work I dreamed of as a kid, just in a digital form.

Did you go to film school?
No, I actually studied graphic design. I worked for some time doing animation, video editing and motion graphics. I taught myself compositing for commercials using After Effects. But I always had a love of cinema and decided to try specializing in this area. Almost all of what I’ve learned has been on the job. I think there’s no better training than just throwing yourself at the work, absorbing everything you can from the people around you and just being passionate about what you do.

What’s your favorite part of the job?
Each project has its own unique set of challenges, and every day involves creative problem-solving. I love the process of translating what only exists in someone’s imagination and the journey of creating those images in a way that looks entirely real.

I also love the mix of being at the offices one day creating things that only exist in a virtual world, while the next day I might be on a film set shooting things in the real world. I get to travel to all kinds of random places and get paid to do so!

What’s your least favorite?
There are so many moving parts involved in creating a TV show or movie — so many departments all working together trying to complete the task at hand, as well as factors that are utterly out of your control. You have to have a perfectly clear idea of what needs to be done, but also be able to completely scrap that and come up with another idea at a moment’s notice.

If you didn’t have this job, what would you be doing instead?
Something where I can be creative and make things that physically exist. I’m always in awe of people who build and craft things with their hands.

Can you name some recent projects you have worked on?
Recent work has included Peaky Blinders, The Last Kingdom and Jamestown, as well as a movie called The Rhythm Section.

What is the project that you are most proud of?
I worked on a movie called Under the Skin a few years ago, which was a very technically and creatively challenging project. It was a very interesting piece of sci-fi that people seem to either love or hate, and everyone I ask seems to have a slightly different interpretation of what it was actually about.

What tools do you use day to day?
Almost exclusively Foundry Nuke. I use it for everything from drawing up concepts to reviewing artists’ work. If there’s functionality that I need from it that doesn’t exist, I’ll just write Python code to add features.

Where do you find inspiration now?
In the real world, if you just spend the time observing it in the right way. I often find myself distracted by how things look in certain light. And Instagram — it’s the perfect social media for me, as it’s just beautiful images, artwork and photography.

What do you do to de-stress from it all?
The job can be quite mentally and creatively draining and you spend a lot of time in dark rooms staring at screens, so I try to do the opposite of that. Anything that involves being outdoors or doing something physical — I find cycling or boxing are good ways to unwind.

I recently went on a paragliding trip in the French Alps, which was great, but I found myself looking at all these beautiful views of sunsets over mountains and just analyzing how the sunlight was interacting with the fog and the atmospheric hazing. Apparently, I can never entirely turn off that part of my brain.

Kent Zambrana joins design house ATK PLN as senior producer

Dallas-based design studio ATK PLN has added Kent Zambrana as senior producer. Zambrana has over a decade of experience in production, overseeing teams of artists working on live action, animation and design projects. Over the years, he has worked at a number of creative shops across the agency and production sides of the business, where he developed media campaigns for omni-channel video ecosystems, interactive projects and future tech.

Adds Zambrana, “ATK PLN’s offerings across design, animation and live action fit nicely within my skill set. I’m looking forward to leveraging my production expertise and direct-to-brand and agency perspectives to better serve their clients and continue to grow their offerings.”

Zambrana, who studied radio, television and film at the University of Texas at Austin, started his pro career in Los Angeles, where he spent four years producing work across The Simpsons properties, including the television series, theme park ride, digital platforms and promotional campaigns. He brought that experience with him when he returned to Texas, where he became a supervising producer at Invodo, building out the video content library of the startup’s digital video platform. He then landed as head of production, producing animation, live action, 2D and 3D work. When the company was acquired by Industrial Color in 2018, he led both the Dallas and Austin offices as senior director of production before joining The Marketing Arm to lead their in-house production shop.

How does Zambrana relax? He can be found rehearsing, recording and performing with his indie pop band, Letting Up Despite Great Faults.

PBS celebrates 50 years with new on-air graphics

LA-based Nathaniel Howe Studios (NHS) has partnered with PBS and creative consultancy Lippincott to create a new on-air graphics package to coincide with the public broadcaster’s updated identity. This includes a refreshed logo, bolder color palette and custom typeface. The new on-air look for PBS — home to shows such as Masterpiece, Nature and Frontline — will roll out throughout 2020 as the network celebrates its 50th anniversary.

PBS looked to NHS to translate its new identity for modern screens while providing brand coherency at both the national and local levels with its 300-plus member stations — the studio called on Adobe’s Creative Suite to create the look. “Nathaniel and his team took our multi-platform vision to heart and developed a broad range of inspired ideas,” says Don Wilcox, VP, multi-platform marketing and content at PBS.

“The design and animation play a supporting role, framing the content and delivering all the key information effortlessly within the new ‘digital-first’ brand architecture,” explains NHS founder/CD Howe, adding that he really enjoyed getting to work with people deeply connected to the PBS brand. “Some of our clients have been with PBS for over 20 years. It was rewarding to serve a brand that is so loved across this country, one that does so much good through storytelling, and to find the balance of respecting its history while subtly evolving for the future. In the process, we also got to meet so many diverse people across the country and help to solve their creative challenges.”

Howe explains that the PBS logo provided the perfect framework to keep the visual system focused while reinforcing the brand in a subtle yet unified way. “Its new flat design also lent itself well to the motion theory behind the package, which favors minimal design elements, gentle key frames and purposeful applications of accent colors to complement the hero PBS blue.”

NHS kicked off this massive project during the early phases of the rebrand strategy, working closely with PBS and Lippincott to help translate the updated identity for digital and broadcast screens. To address the unique needs of PBS’ member stations, a process that included multiple phases of testing and feedback, NHS delivered a customizable Adobe After Effects tool kit and led a nationwide on-boarding process. This included the production of video tutorials and webinars as well as in-studio training programs and presentations for PBS summits and conferences.

“Our greatest challenge was solving almost endless co-branding scenarios within an After Effects toolkit and maintaining balance between unification and local market self-expression,” explains Howe. “This project took place over the course of a year, so we had to keep the focus locked and the fire lit throughout. And we also had to fight off the challenge of adding extra design elements or complexity for the sake of it. Simplicity was the key here.”

According to Howe, the vitals of the PBS rebrand live within a master tool kit that is quick and easy to use for everyone. “The beauty of Lippincott’s minimalistic branding system came into play here as it enabled us to eliminate technical limitations, standardize the graphics creation process, and speed up workflows across the board.”

Howe is no stranger to PBS. For over a decade, he has collaborated with the channel on on-air graphics promos for several Ken Burns documentaries (Jackie Robinson, Prohibition), the Indian Summers series and the PBS Arts Fall Festival. He also helmed the brand refresh of PBS’ long-running anthology series, Great Performances, and sizzle reels for network summits.

“We simply wanted this package to generate excitement around PBS while honoring the integrity of the brand and the value it offers in our cluttered media landscape,” concludes Howe. “As a team, our hearts were aligned from the outset — and as a new father, I was personally inspired by the thought-provoking and educational nature of the content PBS offers to such a broad-reaching audience.”

Alkemy X adds all-female design collective Mighty Oak

Alkemy X has added animation and design collective Mighty Oak to its roster for US commercial representation. Mighty Oak has used its expertise in handmade animation techniques and design combined with live action for brands and networks, including General Electric, Netflix, Luna Bar, HBO, Samsung, NBC, Airbnb, Conde Nast, Adult Swim and The New York Times.

Led by CEO/EP Jess Peterson, head of creative talent Emily Collins and CD Michaela Olsen, the collective has garnered over 3 billion online views. Mighty Oak’s first original short film, Under Covers, premiered at the 2019 Sundance Film Festival. Helmed by Olsen, the quirky stop-motion short features handmade puppets and forced-perspective sets to glimpse into the unsuspecting lives and secrets that rest below the surface of a small town.

“I was immediately struck by the extreme care that Mighty Oak takes on each and every frame of their work,” notes Alkemy X EP Eve Ehrich. “Their handmade style and fresh approach really make for dynamic, memorable animation, regardless of the concept.”

Mighty Oak’s Peterson adds, “We are passionate about collaborating with our clients from the earliest stages, working together to craft original character designs and creating work that is memorable and fun.”

Missing Link, The Lion King among VES Award winners

The Visual Effects Society (VES), the industry’s global professional honorary society, held its 18th Annual VES Awards, the yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues.

Comedian Patton Oswalt served as host for the 9th time to the more than 1,000 guests gathered at the Beverly Hilton to celebrate VFX talent in 25 awards categories. The Lion King was named the photoreal feature winner, garnering three awards. Missing Link was named top animated film, winning two awards. The Mandalorian was named best photoreal episode and garnered two awards, with Game of Thrones and Stranger Things 3 also winning two awards each. Hennessy: The Seven Worlds topped the commercial field with two wins.

Andy Serkis presented the VES Award for Creative Excellence to visual effects supervisor Sheena Duggal. Joey King presented the VES Visionary Award to director-producer-screenwriter Roland Emmerich. VFX supervisor Pablo Helman presented the Lifetime Achievement Award to director/producer/screenwriter Martin Scorsese, who accepted via video from New York. Scorsese’s The Irishman also picked up two awards, including Outstanding Supporting Visual Effects in a Photoreal Feature.

Presenters also included: directors J.J. Abrams, Jon Favreau, Rian Johnson and Josh Cooley.

Winners of the 18th Annual VES Awards are as follows:

Outstanding Visual Effects in a Photoreal Feature

THE LION KING

Robert Legato

Tom Peitzman

Adam Valdez

Andrew R. Jones

 

Outstanding Supporting Visual Effects in a Photoreal Feature

THE IRISHMAN

Pablo Helman

Mitchell Ferm

Jill Brooks

Leandro Estebecorena

Jeff Brink

 

Outstanding Visual Effects in an Animated Feature

MISSING LINK

Brad Schiff

Travis Knight

Steve Emerson

Benoit Dubuc

 

Outstanding Visual Effects in a Photoreal Episode

THE MANDALORIAN; The Child

Richard Bluff

Abbigail Keller

Jason Porter

Hayden Jones

Roy K. Cancino

 

Outstanding Supporting Visual Effects in a Photoreal Episode

CHERNOBYL; 1:23:45

Max Dennison

Lindsay McFarlane

Clare Cheetham

Paul Jones

Claudius Christian Rauch

 

Outstanding Visual Effects in a Real-Time Project

Control

Janne Pulkkinen

Elmeri Raitanen

Matti Hämäläinen

James Tottman

 

Outstanding Visual Effects in a Commercial

Hennessy: The Seven Worlds

Carsten Keller

Selçuk Ergen

Kiril Mirkov

William Laban

 

Outstanding Visual Effects in a Special Venue Project

Star Wars: Rise of the Resistance

Jason Bayever

Patrick Kearney

Carol Norton

Bill George

 

Outstanding Animated Character in a Photoreal Feature

ALITA: BATTLE ANGEL; Alita

Michael Cozens

Mark Haenga

Olivier Lesaint

Dejan Momcilovic

 

Outstanding Animated Character in an Animated Feature

MISSING LINK; Susan

Rachelle Lambden

Brenda Baumgarten

Morgan Hay

Benoit Dubuc

 

Outstanding Animated Character in an Episode or Real-Time Project

STRANGER THINGS 3; Tom/Bruce Monster

Joseph Dubé-Arsenault

Antoine Barthod

Frederick Gagnon

Xavier Lafarge

 

Outstanding Animated Character in a Commercial

Cyberpunk 2077; Dex

Jonas Ekman

Jonas Skoog

Marek Madej

Grzegorz Chojnacki

 

Outstanding Created Environment in a Photoreal Feature

THE LION KING; The Pridelands

Marco Rolandi

Luca Bonatti

Jules Bodenstein

Filippo Preti

 

Outstanding Created Environment in an Animated Feature

TOY STORY 4; Antiques Mall

Hosuk Chang

Andrew Finley

Alison Leaf

Philip Shoebottom

 

Outstanding Created Environment in an Episode, Commercial, or Real-Time Project

GAME OF THRONES; The Iron Throne; Red Keep Plaza

Carlos Patrick DeLeon

Alonso Bocanegra Martinez

Marcela Silva

Benjamin Ross

 

Outstanding Virtual Cinematography in a CG Project

THE LION KING

Robert Legato

Caleb Deschanel

Ben Grossmann

AJ Sciutto

 

Outstanding Model in a Photoreal or Animated Project

THE MANDALORIAN; The Sin; The Razorcrest

Doug Chiang

Jay Machado

John Goodson

Landis Fields IV

 

Outstanding Effects Simulations in a Photoreal Feature

STAR WARS: THE RISE OF SKYWALKER

Don Wong

Thibault Gauriau

Goncalo Cababca

François-Maxence Desplanques

 

Outstanding Effects Simulations in an Animated Feature

FROZEN 2

Erin V. Ramos

Scott Townsend

Thomas Wickes

Rattanin Sirinaruemarn

 

Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project

STRANGER THINGS 3; Melting Tom/Bruce

Nathan Arbuckle

Christian Gaumond

James Dong

Aleksandr Starkov

 

Outstanding Compositing in a Feature

THE IRISHMAN

Nelson Sepulveda

Vincent Papaix

Benjamin O’Brien

Christopher Doerhoff

 

Outstanding Compositing in an Episode

GAME OF THRONES; The Long Night; Dragon Ground Battle

Mark Richardson

Darren Christie

Nathan Abbott

Owen Longstaff

 

Outstanding Compositing in a Commercial

Hennessy: The Seven Worlds

Rod Norman

Guillaume Weiss

Alexander Kulikov

Alessandro Granella

 

Outstanding Special (Practical) Effects in a Photoreal or Animated Project

THE DARK CRYSTAL: THE AGE OF RESISTANCE; She Knows All the Secrets

Sean Mathiesen

Jon Savage

Toby Froud

Phil Harvey

 

Outstanding Visual Effects in a Student Project

THE BEAUTY

Marc Angele

Aleksandra Todorovic

Pascal Schelbli

Noel Winzen

 

VFX-heavy Skyworth OLED TV spot via The-Artery

The-Artery created a spot for Skyworth’s latest version of its W81|W81 Pro Wallpaper OLED TV, which debuted last month at the “See the Wonder” event at CES 2020.

Created using The-Artery’s newly opened Resolve-based color room and expanded design capabilities — spearheaded by colorist Stephen Picano and design director Lauren Indovina — the commercial features a couple swimming through space-like waters, children battling origami dragons while floating in a paper boat and a traveler trekking through snowy tundras while glowing jellyfish float overhead. Publicis, Skyworth’s agency, wanted the ad to reflect “the wonder” of the company’s newest television model.

“The campaign, helmed by director Eli Sverdlov, was very director-led in a way that I’ve never seen before,” explains The-Artery’s EP/MD, Deborah Sullivan. “Of course, there was still ongoing dialogue with the client and agency, but the level of creative control that was entrusted is almost unheard of. Everything was open from start to finish, including the ideation phase, color grading and design — to name a few. Our team had a lot of fun jumping straight into the edit to develop and launch what we consider a high-end conceptual throwback to the nineties.”

Sverdlov agrees: “Our flexible creative process was in a condensed schedule and required a very unique collaboration. We were practically creating the ideas and visuals while editing and sourcing footage.”

Due to the production’s long shooting schedule and tight deadlines, the visual effects were designed via Autodesk Flame in realtime, all under one roof, while filming took place in Serbia. Additional footage was carefully curated as well as color graded and cut to fit the tone and flow of the rest of the piece. The creature imagery such as the jellyfish was done via CG.

In addition to Flame and Resolve, The-Artery called on SideFX Houdini, Autodesk Maya, Maxon’s Redshift, Otoy’s Octane, Autodesk’s Arnold, Adobe After Effects and Maxon’s Cinema 4D.

Framestore launches FPS preproduction services

VFX studio Framestore has launched FPS (Framestore Pre-production Services) for the global film and content production industries. An expansion of Framestore’s existing capability, FPS is available to clients in need of standalone preproduction support or an end-to-end production solution.

The move builds out and aligns the company’s previz, virtual production, techviz and postviz services with Framestore’s art department (which operates either as part of the Framestore workflow or as a stand-alone creative service), virtual production team and R&D unit, and integrates with the company’s VFX and animation teams. It builds on work on films such as Gravity and the knowledge gained during the company’s eight-year London joint venture with visualization company The Third Floor. FPS is working on feature film projects as part of an integrated offering and as a standalone visualization partner, with more projects slated in the coming months.

The new team is led by Alex Webster, who joins as FPS managing director after running The Third Floor London. He will report directly to Fiona Walkinshaw, Framestore’s global managing director, film.

“This work aligns Framestore’s singular VFX and animation craft with a granular understanding of the visualization industry,” says Webster. “It marries the company’s extraordinary legacy in VFX with established visualization and emergent virtual production processes, supported by bleeding-edge technology and dedicated R&D resource to inform the nimble approach which our clients need. Consolidating our preproduction services represents a significant creative step forward.”

“Preproduction is a crucial stage for filmmakers,” says chief creative officer Tim Webber. “From mapping out environments to developing creatures and characters to helping plot action sequences, it provides unparalleled freedom in terms of seeing how a story unfolds or how characters interact with the worlds we create. Bringing together our technical innovation with an understanding of filmmaking, we want to offer a bespoke service for each film and each individual to help tell compelling, carefully crafted stories.”

“Our clients’ needs are as varied as the projects they bring to us, with some needing a start-to-finish service that begins with concept art and ends in post while others want a bespoke, standalone solution to specific creative challenges, be that in early stage concepting, through layout and visualization or in final animation and VFX,” says Framestore CEO William Sargent. “It makes sense to bring all these services in-house — even more so when you consider how our work in adjacent fields like AR, VR and MR has helped the likes of HBO, Marvel and Warner Bros. bring their IP to new, immersive platforms. What we’ll ultimately deliver goes well beyond previz and beyond visualization.”

Main Image: (L-R) Tim Webber, Fiona Walkinshaw and Alex Webster.

Rob Legato talks The Lion King‘s Oscar-nominated visual effects

By Karen Moltenbrey

There was a lot of buzz before — and after — this summer’s release of Disney’s remake of the animated classic The Lion King. And what’s not to love? From the animals to the African savannas, Disney brought the fabled world of Simba to life in what is essentially a “live-action” version of the beloved 1994 2D feature of the same name. Indeed, the filmmakers used tenets of live-action filmmaking to create The Lion King, and themselves call it a visual effects film. However, there are those who consider this remake, like the original, an animated movie, as 2019’s The Lion King used cutting-edge CGI for the photoreal beasts and environments.

Rob Legato

Whether you call it “live action” or “animation,” one thing’s for sure. This is no ordinary film. And, it was made using no ordinary production process. Rather, it was filmed entirely in virtual reality. And it’s been nominated for a Best Visual Effects Oscar this year.

“Everything in it is a visual effect, created in the same way that we would make a visual effects-oriented film, where we augment or create the backgrounds or create computer-generated characters for a scene or sequence. But in this case, that spanned the entire movie,” says VFX supervisor Rob Legato. “We used a traditional visual effects pipeline and hired MPC, which is a visual effects studio, not an animation house.”

MPC, which created the animals and environments, crafted all of the CG elements and handled the virtual production, working with Magnopus to develop the necessary tools that would take the filmmakers from previz through shooting and, eventually, into postproduction. Even the location scouting occurred within VR, with Legato, director Jon Favreau and others, including cinematographer Caleb Deschanel, simultaneously walking through the sets and action by using HTC Vive headsets.

Caleb Deschanel (headset) and Rob Legato. Credit: Michael Legato

The Animations and Environments
MPC, known for its photorealistic animals and more, had worked with Disney and Favreau on the 2016 remake of The Jungle Book, which was shot within a total greenscreen environment and used realistic CG characters and sets with the exception of the boy Mowgli. (It also used VR, albeit for previsualization only.) The group’s innovative effort for that work won an Oscar for visual effects. Apparently that was just the tip of the spear, so to speak, as the team upped its game with The Lion King, making the whole production entirely CG and taking the total filmmaking process into virtual reality.

“It had to look as believable as possible. We didn’t want to exaggerate the performances or the facial features, which would make them less realistic,” says Legato of the animal characters in The Lion King.

The CG skeletons were built practically bone for bone to match their real-life counterparts, and the digital fur matched the hair variations of the various species found in nature. The animators, meanwhile, studied the motion of the real-life animals and moved the digital muscles accordingly.

“Your eye picks up when [the animal] is doing something that it can’t really do, like if it stretches its leg too far or doesn’t have the correct weight distribution that’s affecting the other muscles when it puts a paw down,” says Legato, contending that it is almost impossible to tell the CG version of the characters from the real thing in a non-talking shot or a still frame.

To craft the animals and environments, the MPC artists used Autodesk’s Maya as the main animation program, along with SideFX Houdini for water and fire simulations and Pixar’s RenderMan for rendering. MPC also used custom shaders and tools, particularly for the fur, mimicking that of the actual animal. “A lion has so many different types of hair — short hair around the body, the bushy mane, thick eyebrow hairs and whiskers. And every little nuance was recreated and faithfully reproduced,” Legato adds.

MPC artists brought to life dozens and dozens of animals for the film and then generated many more unique variations — from lions to mandrills to hyenas to zebras and more, even birds and bugs. And then the main cast and background animals were placed within a photoreal environment, where they were shot with virtual cameras that mimicked real cameras.

The world comprises expansive, open landscapes. “There were many, many miles of landscapes that were constructed,” says Legato. The filmmakers would film within pockets that were dressed and populated for different scenes, from Pride Rock to the interior of a cave to the savanna to the elephant graveyard — all built in CGI.

“Everything was simulated to be the real thing, so the sum total of the illusion is that it’s all real. And everything supports each other — the grounds, the characters, what they are physically doing. The sum total of that adds up to where your brain just says, ‘OK, this must be real. I’ll stop looking for flaws and will now just watch the story,’” says Legato. “That was the creative intent behind it.”

Virtual Production
All the virtual camera work was accomplished within Unity’s engine, so all the assets were ported in and out of that game engine. “Everyone would then know where our cameras were, what our camera moves were, how we were following the action, our lens choices, where the lights were placed … all those things,” says Legato.

Magnopus created the VR tools specific to the film, which ran on top of Unity to get the various work accomplished, such as the operation of the cameras. “We had a crane, dolly and other types of cameras encoded so that each one basically drove its mate in the computer. For instance, we created a dolly and then had a physical dolly with encoders on it, so everything was hand-operated, and we had a dolly grip and a camera assistant pulling focus. There was someone operating the cameras, and sometimes there was a crane operator. We did Steadicam as well through an actual Steadicam with a sensor on it to work with OptiTrack [motion capture that was used to track the camera],” explains Legato. “We built a little rig for the Steadicam as well as one for a drone we’d fly around the stage, and we’d create the illusion that it was a helicopter shot while flying around Africa.”
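The encoded-rig idea can be pictured as a simple per-frame sync loop: read the motion-capture pose of the physical rig and its encoder values, then copy them onto the rig’s virtual mate in the engine. The sketch below is a minimal, hypothetical illustration — the names, pose format and focus range are assumptions, not the production’s actual Unity/Magnopus tooling.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    # Position in meters; pan/tilt/roll in degrees.
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    pan: float = 0.0
    tilt: float = 0.0
    roll: float = 0.0

@dataclass
class VirtualCamera:
    pose: Pose
    focus_m: float = 3.0  # focus distance, driven by the encoded follow-focus wheel

def sync_from_rig(cam: VirtualCamera, tracked: Pose, focus_encoder: float) -> VirtualCamera:
    """One frame of the loop: copy the mocap-tracked pose of the physical
    rig onto its in-engine mate, and map the focus wheel encoder (0..1)
    onto a hypothetical 0.5 m to 50 m focus range."""
    cam.pose = tracked
    cam.focus_m = 0.5 + focus_encoder * (50.0 - 0.5)
    return cam

# Each mocap sample updates the virtual camera exactly as the grip moved the real one.
cam = sync_from_rig(VirtualCamera(Pose()), Pose(x=1.2, pan=45.0), 1.0)
```

Because the physical rig is the input device, the virtual camera move inherits the weight and small imperfections of real operators — the live-action quality Legato describes.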

Because the area within VR was so vast, a menu system was created so the filmmakers could locate one another within the virtual environment, making location scouting much easier. They also could take snapshots of different areas and angles and share them with the group. “We were standing next to each other [on stage], but within the virtual environment, we could be miles apart and not see each other because we’re maybe behind trees or rocks.”

As Legato points out, the menu tool is pretty robust. “We basically built a game of film production. Everything was customizable,” he says. Using iPads, the group could play the animation. As the camera was in operation, they could stop the animation, wind it backward, speed it forward, shoot it in slow motion or faster motion. “These options were all accessible to us,” he adds.

Legato provides the following brief step-by-step overview of how the virtual production occurred. First, the art department created the sets — Africa with the trees, ponds, rivers, mountains, waterfalls and so forth. “Based on the script, you know somewhat where you need to be [in the set],” he says. Production designer James Chinlund would make a composite background, and then they — along with Favreau, Deschanel and animation supervisor Andrew Jones — would go into VR.

“We had built these full-size stationary chess pieces of the animals, and in VR, we’d have these tools that let us grab a lion, for instance, or a meerkat, and position them, and then we’d look through the lens and start from there,” says Legato. “We would either move them by hand or puppeteer a simple walk cycle to get the idea of the blocking.”

Jones and his team would animate that tableau and port it back into the game engine as an animation cycle. “We’d find camera angles and augment them. We’d change some of the animation or slow it down or move the animals in slightly different positions. And then we’d shoot it like it’s on a live-action stage,” explains Legato. “We’d put a dolly track down, cover the action with various types of lenses, create full-coverage film dailies… We could shoot the same scene in as many different angles as we’d wish. We could then play it out to a video deck and start editing it right away.” The shots they liked might get rendered with more light or motion blur, but a lot of the time, they’d go right off the video tap.

Meanwhile, MPC recorded everything the filmmakers did and moved — every leaf, rock, tree, animal. Then, in post, all of that information would be reconverted back into Maya sets and the animation fine-tuned.
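Recording “everything the filmmakers did and moved” comes down to logging each object’s transform on every frame of a take so the scene can be rebuilt and fine-tuned in Maya afterward. A toy, hypothetical sketch of that bookkeeping (not MPC’s actual pipeline):

```python
def record_take(scene_objects, num_frames, sample_pose):
    """Log one transform per object per frame, as a virtual production
    recorder might. sample_pose(name, frame) -> (x, y, z, rx, ry, rz)
    stands in for the engine query; here the caller supplies it."""
    take = {name: [] for name in scene_objects}
    for frame in range(num_frames):
        for name in scene_objects:
            take[name].append(sample_pose(name, frame))
    return take

# A take logged this way can later be re-imported and keyframed in a DCC app.
log = record_take(["lion", "camera"], 3, lambda name, frame: (frame * 0.1, 0, 0, 0, 0, 0))
```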

“In a nutshell, the filmmakers were imparting a live-action quality to the process — by not faking it, but by actually doing it,” says Legato. “And we still have the flexibility of full CGI.”

The Same, But Different
According to Legato, it did not take the group long to get the hang of working in VR. And the advantages are many — chief among them, time savings when it comes to planning and creating the sequence editorially, and then instantly being able to reshoot or iterate the scene inexpensively. “There is literally no downside to exploring a bold choice or an alternate angle on the concept,” he points out.

Yes, virtual filmmaking is the future, contends Legato.

So, back to the original question: Is The Lion King a VFX film or an animated film? “It’s perhaps a hybrid,” says Legato. “But, if you didn’t know how we did it and if the animals didn’t talk, you’d think it was done in the traditional manner of a live-action film. Which it is, visually speaking. You wouldn’t necessarily describe it as looking like ‘an animated film’ because it doesn’t really look like an animated film, like a Pixar or DreamWorks movie. By labeling it as such, you’re putting it into a hole that it’s not. It’s truly just a movie. How we achieved it is immaterial, as it should be.”

Legato and his colleagues call it “live action,” which it truly is. But some, including the Golden Globes, categorized it as “animation.” (They also called 2015’s The Martian and 2010’s The Tourist “comedies.”)

Call it what you will; the bottom line is that the film is breathtaking and the storytelling is amazing. And the filmmaking is inventive and pushes traditional boundaries, perhaps making it difficult to fit into a traditional category. Therefore, “beautiful,” “riveting,” “creative” and “innovative” might be the only descriptions necessary.


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

Check out MPC’s VFX breakdown on the film:

Behind the Title: Design director Liron Eldar-Ashkenazi

NAME: Liron Eldar-Ashkenazi  (@iamlirona)

WHAT’S YOUR JOB TITLE?
Design Director

WHAT DOES THAT ENTAIL?
I help companies execute on their creative hopes and dreams, both hands-on and as a consultant and director.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Educating my clients about the lay of the land when it comes to getting what they want creatively. People typically think coming up with creative concepts is easy and quick. A big part of my job is helping companies see the full scope of taking a project from beginning to end with success while being mindful of timeline and budget.

HOW LONG HAVE YOU BEEN WORKING IN MOTION GRAPHICS?
I was accepted into the prestigious position of motion graphics artist in the Israel Defense Forces when I was 18 — in Israel, all women and men have to serve in the military. It’s now been about 12 years that I’ve been creating and animating.

HOW HAS THE INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING? WHAT’S BEEN GOOD, WHAT’S BEEN BAD?
I see a lot more women 3D artists and animators. It’s so refreshing! It used to be a man’s world, and I’m so thrilled to see the shift. Overall, it’s becoming a bit more challenging, as screens are changing so fast and there are so many of them. Everything you create has to suit a thousand different use cases, and coming up with the right strategy for that takes longer than it did when we were only thinking in :15s and :30s at 16:9.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I love that there are so many facets to my work under one title. Coming up with concepts, designing, animating, creating prints and artworks, working with typography — it’s just so much more rewarding than in the days when you had only one job: lighting, texturing, animating or designing. Now an artist is free to do multiple things, and it’s well appreciated.

WHAT’S YOUR LEAST FAVORITE?
Long rendering times. I think computers are becoming stronger, but we also demand more and more from them. I still hate sitting and waiting for a computer to show me what I’m working on.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
Morning! I’m a morning person who loves to start early and finish when there’s still light out.

WHY DID YOU CHOOSE THIS PROFESSION?
I didn’t really choose it; it chose me.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
At age 16 I knew I would never be great at sitting on my behind and just studying the text. I knew I needed to create in order to succeed. It’s my safe space and what I do best.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Some other form of visual artist, or a psychologist.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Right before I left The-Artery, where I’d been working as a design director for the past three years, we created visuals for a really interesting documentary. All the content was created in 3D using Cinema 4D and Octane. We produced about 18 different spots explaining different concepts. My team and I did everything from concept to rendering. It’ll be amazing to see it when it comes out.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
At The-Artery, I was in charge of a really interesting branding project for a fintech company. We created an entire visual language in 3D for their everyday marketing, website and blog use. All the content was designed and rendered using Cinema 4D, and it was so great combining a branding exercise with motion graphics to bring all the visuals to life.

YOU HAVE RECENTLY PRESENTED YOUR WORKFLOW AT TRADES SHOWS AND ROAD TOURS. TELL US ABOUT SHARING YOUR WORK PUBLICLY.
I was invited by Maxon, the developer of Cinema 4D, to give a live demo presentation at SIGGRAPH 2019. It was an exceptional experience, and I received really lovely responses from the community and from artists looking to combine more graphic design into their motion graphics and 3D pipelines. I shared some cool methods I’ve developed in Cinema 4D for creating fine-art looks for renders.

PRESENTLY, YOU ARE WORKING AS ARTIST IN RESIDENCE AT FACEBOOK. HOW DID THIS COME ABOUT AND WHAT KIND OF WORK ARE YOU DOING?
Facebook somehow found me. I assume it was through my Instagram account, where I share my wild, creative experiments. The program is a six-week residency at their New York office, where I get to flex my analog muscles and create prints at their Analog lab. In the lab, they have all the art supplies you can ask for along with an amazing Risograph printer. I’ve been creating posters and zines from my 3D rendered illustrations.

WHAT SOFTWARE TOOLS DO YOU USE DAY-TO-DAY?
Maxon Cinema 4D is my primary tool. I design almost everything I create in it, including work that seems to be flat and graphic.

WHERE DO YOU FIND INSPIRATION NOW?
I find talking to people and brainstorming has always been the thing that sparks the most creativity in me. Solving problems is another way I tackle every design assignment. I always need to figure out what needs to be fixed, be better or change completely, and that’s what I find most inspires me to create.

THIS IS A HIGH-STRESS JOB WITH DEADLINES AND CLIENT EXPECTATIONS. WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Planning is critical for me to feel confident about projects and helps me avoid stress in general. Giving my work 100% and not setting false expectations with my clients also helps limit stress. It’s key to be honest from the get-go if I think something won’t work in the timeline, or if late changes would hurt the final product. If I do get to a point where I’m really stressed, I find that running, going out dancing (or dancing to my favorite music at home) and generally listening to music are all helpful.

Picture Shop VFX acquires Denmark’s Ghost VFX

Burbank’s Picture Shop VFX has acquired Denmark’s Ghost VFX. The Copenhagen-based studio, founded in 1999, provides high-end visual effects work for film, television and several streaming platforms. The move helps Picture Shop “increase its services worldwide and broaden its talent and expertise,” according to Picture Shop VFX president Tom Kendall.

Over the years, Ghost has contributed to more than 70 feature films and titles, including Star Wars: The Rise of Skywalker, The Mandalorian, The Walking Dead, See, Black Panther and Star Trek: Discovery.

“As we continue to expand our VFX footprint into the international market, I am extremely excited to have Ghost join Picture Shop VFX,” says Bill Romeo, president of Picture Head Holdings.

Christensen says the studio occupies three floors and 13,000 square feet in a “vintage and beautifully renovated office building” in Copenhagen. Its main tools are Autodesk Maya, Foundry Nuke and SideFX Houdini.

“We are really looking forward to a tight-knit collaboration with all the VFX teams in the Picture Shop group,” says Christensen. “Right now Ghost will continue servicing current clients and projects, but we’re really looking forward to exploring the massive potential of being part of a larger and international family.”

Picture Shop VFX is a division of Picture Head Holdings, which has locations in Los Angeles, Vancouver, the United Kingdom and Denmark.

Main Image: Ghost artists at work.

Conductor Companion app targets VFX boutiques and freelancers

Conductor Technologies has introduced Conductor Companion, a desktop app designed to simplify the use of the cloud-based rendering service. Tailored for boutique studios and freelance artists, Companion streamlines the Conductor on-ramp and rendering experience, allowing users to easily manage and download files, write commands and handle custom submissions or plug-ins from their laptops or workstations. Along with this release, Conductor has added initial support for Blender creative software.

“Conductor was originally designed to meet the needs of larger VFX studios, focusing our efforts on maximizing efficiency and scalability when many artists simultaneously leverage the platform and optimizing how Conductor hooks into those pipelines,” explains CEO Mac Moore. “As Conductor’s user base has grown, we’ve been blown away by the number of freelance artists and small studios that have come to us for help, each of which has their own unique needs. Conductor Companion is a nod to that community, bringing all the functionality and massive render resource scale of Conductor into a user-friendly app, so that artists can focus on content creation versus pipeline management. And given that focus, it was a no-brainer to add Blender support, and we are eager to serve the passionate users of that product.”

Moore reports that this app will be the foundation of Conductor’s Intelligence Hub in the near future, “acting as a gateway to more advanced functionality like Shot Analytics and Intelligent Bid Assist. These features will leverage AI and Conductor’s cloud knowledge to help owners and freelancers make more informed business decisions as it pertains to project-to-project rendering financials.”

Conductor Companion is currently in public beta and can be downloaded from Conductor’s website.

In addition to Blender, applications currently supported by Conductor include Autodesk Maya and Arnold; Foundry’s Nuke, Cara VR, Katana, Modo and Ocula; Chaos Group’s V-Ray; Pixar’s RenderMan; Isotropix’s Clarisse; Golaem; Ephere’s Ornatrix; Yeti; and Miarmy.

The Mill opens boutique studio in Berlin

Technicolor’s The Mill has officially launched in Berlin. The new boutique studio sits in the heart of the city, in the creative hub of Mitte, near many of Germany’s agencies, production companies and brands.

The Mill has been working with German clients for years. Recent projects include the Mercedes’ Bertha Benz spot with director Sebastian Strasser; Netto’s The Easter Surprise, directed in-house by The Mill; and BMW The 8 with director Daniel Wolfe. The new studio will bring The Mill’s full range of creative services from color to experiential and interactive, as well as visual effects and design.

The Mill Berlin crew

Creative director Greg Spencer will lead the creative team. He is a multi-award-winning creative, having won several VES, Cannes Lions and British Arrow awards. His recent projects include Carlsberg’s The Lake, PlayStation’s This Could Be You and Eve Cuddly Toy. Spencer also played a role in some of Mill Film’s major titles: he was the 2D supervisor for Les Misérables and also worked on the Lord of the Rings trilogy. His resume also includes campaigns for brands such as Nike and Samsung.

Executive producer Justin Stiebel moves from The Mill London, where he has been since early 2014, to manage client relationships and new business. Since joining the company, Stiebel has produced spots such as Audi’s Next Level and Mini’s “The Faith of a Few” campaign. He has also collaborated with directors such as Sebastian Strasser, Markus Walter and Daniel Wolfe while working on brands like Mercedes, Audi and BMW.

Sean Costelloe is managing director of The Mill London and The Mill Berlin.

Main Image Caption: (L-R) Justin Stiebel and Greg Spencer

VES Awards: The Lion King and Alita earn five noms each

The Visual Effects Society (VES) has announced its nominees for the 18th Annual VES Awards, which recognize outstanding visual effects artistry and innovation in film, animation, television, commercials and video games, as well as the VFX supervisors, VFX producers and hands-on artists who bring this work to life. Alita: Battle Angel and The Lion King have five nominations each; Toy Story 4 is the top animated film contender, also with five; and Game of Thrones and The Mandalorian tie to lead the broadcast field with six nominations each.

Nominees in 25 categories were selected by VES members via events hosted by the organization’s 11 sections: Australia, the Bay Area, Germany, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington.

The VES Awards will be held on January 29 at the Beverly Hilton Hotel. The VES Lifetime Achievement Award will be presented to Academy, DGA and Emmy Award-winning director-producer-screenwriter Martin Scorsese. The VES Visionary Award will be presented to director-producer-screenwriter Roland Emmerich. And the VES Award for Creative Excellence will be given to visual effects supervisor Sheena Duggal. Award-winning actor-comedian-author Patton Oswalt will once again host the event.

The nominees for the 18th Annual VES Awards in 25 categories are:

 

Outstanding Visual Effects in a Photoreal Feature

 

ALITA: BATTLE ANGEL

Richard Hollander

Kevin Sherwood

Eric Saindon

Richard Baneham

Bob Trevino

 

AVENGERS: ENDGAME

Daniel DeLeeuw

Jen Underdahl

Russell Earl

Matt Aitken

Daniel Sudick

 

GEMINI MAN

Bill Westenhofer

Karen Murphy-Mundell

Guy Williams

Sheldon Stopsack

Mark Hawker

 

STAR WARS: THE RISE OF SKYWALKER

Roger Guyett

Stacy Bissell

Patrick Tubach

Neal Scanlan

Dominic Tuohy

 

THE LION KING

Robert Legato

Tom Peitzman

Adam Valdez

Andrew R. Jones

 

Outstanding Supporting Visual Effects in a Photoreal Feature

 

1917

Guillaume Rocheron

Sona Pak

Greg Butler

Vijay Selvam

Dominic Tuohy

 

FORD V FERRARI

Olivier Dumont

Kathy Siegel

Dave Morley

Malte Sarnes

Mark Byers

 

JOKER

Edwin Rivera

Brice Parker

Mathew Giampa

Bryan Godwin

Jeff Brink

 

THE AERONAUTS

Louis Morin

Annie Godin

Christian Kaestner

Ara Khanikian

Mike Dawson

 

THE IRISHMAN

Pablo Helman

Mitch Ferm

Jill Brooks

Leandro Estebecorena

Jeff Brink

 

Outstanding Visual Effects in an Animated Feature

 

FROZEN 2

Steve Goldberg

Peter Del Vecho

Mark Hammel

Michael Giaimo

 

KLAUS

Sergio Pablos

Matthew Teevan

Marcin Jakubowski

Szymon Biernacki

 

MISSING LINK

Brad Schiff

Travis Knight

Steve Emerson

Benoit Dubuc

 

THE LEGO MOVIE 2

David Burgess

Tim Smith

Mark Theriault

John Rix

 

TOY STORY 4

Josh Cooley

Mark Nielsen

Bob Moyer

Gary Bruins

 

Outstanding Visual Effects in a Photoreal Episode

 

GAME OF THRONES; The Bells

Joe Bauer

Steve Kullback

Ted Rae

Mohsen Mousavi

Sam Conway

 

HIS DARK MATERIALS; The Fight to the Death

Russell Dodgson

James Whitlam

Shawn Hillier

Robert Harrington

 

LADY AND THE TRAMP

Robert Weaver

Christopher Raimo

Arslan Elver

Michael Cozens

Bruno Van Zeebroeck

 

LOST IN SPACE; Ninety-Seven

Jabbar Raisani

Terron Pratt

Niklas Jacobson

Juri Stanossek

Paul Benjamin

 

STRANGER THINGS; Chapter Six: E Pluribus Unum

Paul Graff

Tom Ford

Michael Maher Jr.

Martin Pelletier

Andy Sowers

 

THE MANDALORIAN; The Child

Richard Bluff

Abbigail Keller

Jason Porter

Hayden Jones

Roy Cancinon

 

Outstanding Supporting Visual Effects in a Photoreal Episode

 

CHERNOBYL; 1:23:45

Max Dennison

Lindsay McFarlane

Clare Cheetham

Paul Jones

Claudius Christian Rauch

 

LIVING WITH YOURSELF; Nice Knowing You

Jay Worth

Jacqueline VandenBussche

Chris Wright

Tristan Zerafa

 

SEE; Godflame

Adrian de Wet

Eve Fizzinoglia

Matthew Welford

Pedro Sabrosa

Tom Blacklock

 

THE CROWN; Aberfan

Ben Turner

Reece Ewing

David Fleet

Jonathan Wood

 

VIKINGS; What Happens in the Cave

Dominic Remane

Mike Borrett

Ovidiu Cinazan

Tom Morrison

Paul Byrne

 

Outstanding Visual Effects in a Real-Time Project

 

Call of Duty: Modern Warfare

Charles Chabert

Chris Parise

Attila Zalanyi

Patrick Hagar

 

Control

Janne Pulkkinen

Elmeri Raitanen

Matti Hämäläinen

James Tottman

 

Gears 5

Aryan Hanbeck

Laura Kippax

Greg Mitchell

Stu Maxwell

 

Myth: A Frozen Tale

Jeff Gipson

Nicholas Russell

Brittney Lee

Jose Luis Gomez Diaz

 

Vader Immortal: Episode I

Ben Snow

Mike Doran

Aaron McBride

Steve Henricks

 

Outstanding Visual Effects in a Commercial

 

Anthem Conviction

Viktor Muller

Lenka Likarova

Chris Harvey

Petr Marek

 

BMW Legend

Michael Gregory

Christian Downes

Tim Kafka

Toya Drechsler

 

Hennessy: The Seven Worlds

Carsten Keller

Selcuk Ergen

Kiril Mirkov

William Laban

 

PlayStation: Feel The Power of Pro

Sam Driscoll

Clare Melia

Gary Driver

Stefan Susemihl

 

Purdey’s: Hummingbird

Jules Janaud

Emma Cook

Matthew Thomas

Philip Child

 

Outstanding Visual Effects in a Special Venue Project

 

Avengers: Damage Control

Michael Koperwas

Shereif Fattouh

Ian Bowie

Kishore Vijay

Curtis Hickman

 

Jurassic World: The Ride

Hayden Landis

Friend Wells

Heath Kraynak

Ellen Coss

 

Millennium Falcon: Smugglers Run

Asa Kalama

Rob Huebner

Khatsho Orfali

Susan Greenhow

 

Star Wars: Rise of the Resistance

Jason Bayever

Patrick Kearney

Carol Norton

Bill George

 

Universal Sphere

James Healy

Morgan MacCuish

Ben West

Charlie Bayliss

 

Outstanding Animated Character in a Photoreal Feature

 

ALITA: BATTLE ANGEL; Alita

Michael Cozens

Mark Haenga

Olivier Lesaint

Dejan Momcilovic

 

AVENGERS: ENDGAME; Smart Hulk

Kevin Martel

Ebrahim Jahromi

Sven Jensen

Robert Allman

 

GEMINI MAN; Junior

Paul Story

Stuart Adcock

Emiliano Padovani

Marco Revelant

 

THE LION KING; Scar

Gabriel Arnold

James Hood

Julia Friedl

Daniel Fotheringham

 

Outstanding Animated Character in an Animated Feature

 

FROZEN 2; The Water Nøkk

Svetla Radivoeva

Marc Bryant

Richard E. Lehmann

Cameron Black

 

KLAUS; Jesper

Yoshimishi Tamura

Alfredo Cassano

Maxime Delalande

Jason Schwartzman

 

MISSING LINK; Susan

Rachelle Lambden

Brenda Baumgarten

Morgan Hay

Benoit Dubuc

 

TOY STORY 4; Bo Peep

Radford Hurn

Tanja Krampfert

George Nguyen

Becki Rocha Tower

 

Outstanding Animated Character in an Episode or Real-Time Project

 

LADY AND THE TRAMP; Tramp

Thiago Martins

Arslan Elver

Stanislas Paillereau

Martine Chartrand

 

STRANGER THINGS 3; Tom/Bruce Monster

Joseph Dubé-Arsenault

Antoine Barthod

Frederick Gagnon

Xavier Lafarge

 

THE MANDALORIAN; The Child; Mudhorn

Terry Bannon

Rudy Massar

Hugo Leygnac

 

THE UMBRELLA ACADEMY; Pilot; Pogo

Aidan Martin

Craig Young

Olivier Beierlein

Laurent Herveic

 

Outstanding Animated Character in a Commercial

 

Apex Legends; Meltdown; Mirage

Chris Bayol

John Fielding

Derrick Sesson

Nole Murphy

 

Churchill; Churchie

Martino Madeddu

Philippe Moine

Clement Granjon

Jon Wood

 

Cyberpunk 2077; Dex

Jonas Ekman

Jonas Skoog

Marek Madej

Grzegorz Chojnacki

 

John Lewis; Excitable Edgar; Edgar

Tim van Hussen

Diarmid Harrison-Murray

Amir Bazzazi

Michael Diprose

 

Outstanding Created Environment in a Photoreal Feature

 

ALADDIN; Agrabah

Daniel Schmid

Falk Boje

Stanislaw Marek

Kevin George

 

ALITA: BATTLE ANGEL; Iron City

John Stevenson-Galvin

Ryan Arcus

Mathias Larserud

Mark Tait

 

MOTHERLESS BROOKLYN; Penn Station

John Bair

Vance Miller

Sebastian Romero

Steve Sullivan

 

STAR WARS: THE RISE OF SKYWALKER; Pasaana Desert

Daniele Bigi

Steve Hardy

John Seru

Steven Denyer

 

THE LION KING; The Pridelands

Marco Rolandi

Luca Bonatti

Jules Bodenstein

Filippo Preti

 

Outstanding Created Environment in an Animated Feature

 

FROZEN 2; Giants’ Gorge

Samy Segura

Jay V. Jackson

Justin Cram

Scott Townsend

 

HOW TO TRAIN YOUR DRAGON: THE HIDDEN WORLD; The Hidden World

Chris Grun

Ronnie Cleland

Ariel Chisholm

Philippe Brochu

 

MISSING LINK; Passage to India Jungle

Oliver Jones

Phil Brotherton

Nick Mariana

Ralph Procida

 

TOY STORY 4; Antiques Mall

Hosuk Chang

Andrew Finley

Alison Leaf

Philip Shoebottom

 

Outstanding Created Environment in an Episode, Commercial, or Real-Time Project

 

GAME OF THRONES; The Iron Throne; Red Keep Plaza

Carlos Patrick DeLeon

Alonso Bocanegra Martinez

Marcela Silva

Benjamin Ross

 

LOST IN SPACE; Precipice; The Trench

Philip Engström

Benjamin Bernon

Martin Bergquist

Xuan Prada

 

THE DARK CRYSTAL: AGE OF RESISTANCE; The Endless Forest

Sulé Bryan

Charles Chorein

Christian Waite

Martyn Hawkins

 

THE MANDALORIAN; Nevarro Town

Alex Murtaza

Yanick Gaudreau

Marco Tremblay

Maryse Bouchard

 

Outstanding Virtual Cinematography in a CG Project

 

ALITA: BATTLE ANGEL

Emile Ghorayeb

Simon Jung

Nick Epstein

Mike Perry

 

THE LION KING

Robert Legato

Caleb Deschanel

Ben Grossmann

AJ Sciutto

 

THE MANDALORIAN; The Prisoner; The Roost

Richard Bluff

Jason Porter

Landis Fields IV

Baz Idoine

 

TOY STORY 4

Jean-Claude Kalache

Patrick Lin

 

Outstanding Model in a Photoreal or Animated Project

 

LOST IN SPACE; The Resolute

Xuan Prada

Jason Martin

Jonathan Vårdstedt

Eric Andersson

 

MISSING LINK; The Manchuria

Todd Alan Harvey

Dan Casey

Katy Hughes

 

THE MAN IN THE HIGH CASTLE; Rocket Train

Neil Taylor

Casi Blume

Ben McDougal

Chris Kuhn

 

THE MANDALORIAN; The Sin; The Razorcrest

Doug Chiang

Jay Machado

John Goodson

Landis Fields IV

 

Outstanding Effects Simulations in a Photoreal Feature

 

DUMBO; Bubble Elephants

Sam Hancock

Victor Glushchenko

Andrew Savchenko

Arthur Moody

 

SPIDER-MAN: FAR FROM HOME; Molten Man

Adam Gailey

Jacob Santamaria

Jacob Clark

Stephanie Molk

 

STAR WARS: THE RISE OF SKYWALKER

Don Wong

Thibault Gauriau

Goncalo Cababca

Francois-Maxence Desplanques

 

THE LION KING

David Schneider

Samantha Hiscock

Andy Feery

Kostas Strevlos

 

Outstanding Effects Simulations in an Animated Feature

 

ABOMINABLE

Alex Timchenko

Domin Lee

Michael Losure

Eric Warren

 

FROZEN 2

Erin V. Ramos

Scott Townsend

Thomas Wickes

Rattanin Sirinaruemarn

 

HOW TO TRAIN YOUR DRAGON: THE HIDDEN WORLD; Water and Waterfalls

Derek Cheung

Baptiste Van Opstal

Youxi Woo

Jason Mayer

 

TOY STORY 4

Alexis Angelidis

Amit Baadkar

Lyon Liew

Michael Lorenzen

 

Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project

 

GAME OF THRONES; The Bells

Marcel Kern

Paul Fuller

Ryo Sakaguchi

Thomas Hartmann

 

Hennessy: The Seven Worlds

Selcuk Ergen

Radu Ciubotariu

Andreu Lucio

Vincent Ullmann

 

LOST IN SPACE; Precipice; Water Planet

Juri Bryan

Hugo Medda

Kristian Olsson

John Perrigo

 

STRANGER THINGS 3; Melting Tom/Bruce

Nathan Arbuckle

Christian Gaumond

James Dong

Aleksandr Starkov

 

THE MANDALORIAN; The Child; Mudhorn

Xavier Martin Ramirez

Ian Baxter

Fabio Siino

Andrea Rosa

 

Outstanding Compositing in a Feature

 

ALITA: BATTLE ANGEL

Adam Bradley

Carlo Scaduto

Hirofumi Takeda

Ben Roberts

 

AVENGERS: ENDGAME

Tim Walker

Blake Winder

Tobias Wiesner

Joerg Bruemmer

 

CAPTAIN MARVEL; Young Nick Fury

Trent Claus

David Moreno Hernandez

Jeremiah Sweeney

Yuki Uehara

 

STAR WARS: THE RISE OF SKYWALKER

Jeff Sutherland

John Galloway

Sam Bassett

Charles Lai

 

THE IRISHMAN

Nelson Sepulveda

Vincent Papaix

Benjamin O’Brien

Christopher Doerhoff

 

Outstanding Compositing in an Episode

 

GAME OF THRONES; The Bells

Sean Heuston

Scott Joseph

James Elster

Corinne Teo

 

GAME OF THRONES; The Long Night; Dragon Ground Battle

Mark Richardson

Darren Christie

Nathan Abbott

Owen Longstaff

 

STRANGER THINGS 3; Starcourt Mall Battle

Simon Lehembre

Andrew Kowbell

Karim El-Masry

Miklos Mesterhazy

 

WATCHMEN; Pilot; Looking Glass

Nathaniel Larouche

Iyi Tubi

Perunika Yorgova

Mitchell Beaton

 

Outstanding Compositing in a Commercial

 

BMW Legend

Toya Drechsler

Vivek Tekale

Guillaume Weiss

Alexander Kulikov

 

Feeding America; I Am Hunger in America

Dan Giraldo

Marcelo Pasqualino

Alexander Koester

 

Hennessy: The Seven Worlds

Rod Norman

Guillaume Weiss

Alexander Kulikov

Alessandro Granella

 

PlayStation: Feel the Power of Pro

Gary Driver

Stefan Susemihl

Greg Spencer

Theajo Dharan

 

Outstanding Special (Practical) Effects in a Photoreal or Animated Project

 

ALADDIN; Magic Carpet

Mark Holt

Jay Mallet

Will Wyatt

Dickon Mitchell

 

GAME OF THRONES; The Bells

Sam Conway

Terry Palmer

Laurence Harvey

Alastair Vardy

 

TERMINATOR: DARK FATE

Neil Corbould

David Brighton

Ray Ferguson

Keith Dawson

 

THE DARK CRYSTAL: AGE OF RESISTANCE; She Knows All the Secrets

Sean Mathiesen

Jon Savage

Toby Froud

Phil Harvey

 

Outstanding Visual Effects in a Student Project

 

DOWNFALL

Matias Heker

Stephen Moroz

Bradley Cocksedge

 

LOVE AND FIFTY MEGATONS

Denis Krez

Josephine Roß

Paulo Scatena

Lukas Löffler

 

OEIL POUR OEIL

Alan Guimont

Thomas Boileau

Malcom Hunt

Robin Courtoise

 

THE BEAUTY

Marc Angele

Aleksandra Todorovic

Pascal Schelbli

Noel Winzen

 

 

Recreating the Vatican and Sistine Chapel for Netflix’s The Two Popes

The Two Popes, directed by Fernando Meirelles, stars Anthony Hopkins as Pope Benedict XVI and Jonathan Pryce as current pontiff Pope Francis in a story about one of the most dramatic transitions of power in the Catholic Church’s history. The film follows a frustrated Cardinal Bergoglio (the future Pope Francis) who in 2012 requests permission from Pope Benedict to retire because of his issues with the direction of the church. Instead, facing scandal and self-doubt, the introspective Benedict summons his harshest critic and future successor to Rome to reveal a secret that would shake the foundations of the Catholic Church.

London’s Union was approached in May 2017 and supervised visual effects on location in Argentina and Italy over several months. A large proportion of the film takes place within the walls of Vatican City. The Vatican was not involved in the production and the team had very limited or no access to some of the key locations.

Under the direction of production designer Mark Tildesley, the production replicated parts of the Vatican at Rome’s Cinecittà Studios, including a life-size, open-ceiling Sistine Chapel that took two months to build.

The team LIDAR-scanned everything available and set about amassing as much reference material as possible — photographing from a permitted distance, scanning the set builds and buying every photographic book they could lay their hands on.

From this material, the team set about building 3D models — created in Autodesk Maya — of St. Peter’s Square, the Basilica and the Sistine Chapel. The environments team was tasked with texturing all of these well-known locations using digital matte painting techniques, including recreating Michelangelo’s masterpiece on the ceiling of the Sistine Chapel.

The story centers on two key changes of pope, in 2005 and 2013. Those events attracted huge attention, filling St. Peter’s Square with people eager to discover the identity of the new pope and celebrate his ascension, while news crews from around the world camp out to provide coverage for Catholics across the globe.

To recreate these scenes, the crew shot at a school in Rome (Ponte Mammolo) that has the same pattern on its floor. A cast of 300 extras was shot in blocks in different positions at different times of day, with costume tweaks including the addition of umbrellas to build a library that would provide enough flexibility during post to recreate these moments at different times of day and in different weather conditions.

Union also called on Clear Angle Studios to individually scan 50 extras to provide additional options for the VFX team. This was an ambitious crowd project: the team couldn’t shoot in the actual location, and the end result had to stand up at 4K in very close proximity to the camera. Union designed a Houdini-based system to handle the number of assets and clothing in such a way that the studio could easily art-direct the extras as individuals, allow the director to choreograph them and deliver a believable result.

Union conducted several motion capture shoots in-house to provide specific animation cycles matched to the occasions being recreated. This gave the post team even more authentic-looking crowds.

Union worked on a total of 288 VFX shots, including greenscreens, set extensions, window reflections, muzzle flashes, fog and rain, and a storm that included a lightning strike on the Basilica.

In addition, the team did a significant amount of de-aging work to accommodate the film’s eight-year main narrative timeline as well as a long period in Pope Francis’ younger years.

Maxon and Red Giant to merge

Maxon, developers of pro 3D software solutions, and Red Giant, makers of tools for editors, VFX artists, and motion designers, have agreed to merge under the media and entertainment division of Nemetschek Group. The transaction is expected to close in January 2020, subject to regulatory approval and customary closing conditions.

Maxon, best known for its 3D product Cinema 4D, was formed in 1986 to provide high-end yet accessible 3D software solutions. Artists across the globe rely on Maxon products to create high-end visuals. In April of this year, Maxon acquired Redshift, developer of the GPU-accelerated Redshift render engine.

Since 2002, Red Giant has built its brand through products such as Trapcode, Magic Bullet, Universe, PluralEyes and its line of visual effects software. Its tools are used in the fields of film, broadcast and advertising.

The two companies provide tools for companies including ABC, CBS, NBC, HBO, BBC, Sky, Fox Networks, Turner Broadcasting, NFL Network, WWE, Viacom, Netflix, ITV Creative, Discovery Channel, MPC, Digital Domain, VDO, Sony, Universal, The Walt Disney Company, Blizzard Entertainment, BMW, Facebook, Apple, Google, Vitra, Nike and many more.

Main Photo: (L-R) Maxon CEO Dave McGavran and Red Giant CEO Chad Bechert

Shape+Light VFX boutique opens in LA with Trent, Lehr at helm


Boutique visual effects and design studio Shape+Light has officially launched in Santa Monica. At the helm are managing director/creative director Rob Trent and executive producer Cara Lehr. Shape+Light provides visual effects, design and finishing services for agency and brand-direct clients. The studio, which has been quietly operating since this summer, has already delivered work for Nike, Apple, Gatorade, Lexus and Procter & Gamble.

Gatorade

Trent is no stranger to running VFX boutiques. An industry veteran, he began his career as a Flame artist, working at studios including Imaginary Forces and Digital Domain, and then at Asylum VFX as a VFX supervisor/creative director before co-founding The Mission VFX in 2010. In 2015, he established Saint Studio. During his career he has worked on big campaigns, including the launch of the Apple iPhone with David Fincher, celebrating the NFL with Nike and Michael Mann, and honoring moms with Alma Har’el and P&G for the Olympics. He has also contributed to award-winning feature films such as The Curious Case of Benjamin Button, Minority Report, X-Men and Zodiac.

Lehr is an established VFX producer with over 20 years of experience in both commercials and features. She has worked for many of LA’s leading VFX studios, including Zoic Studios, Asylum VFX, Digital Domain, Brickyard VFX and Psyop. She most recently served as EP at Method Studios, where she was on staff since 2012. She has worked on ad campaigns for brands including Apple, Microsoft, Nike, ESPN, Coca Cola, Taco Bell, AT&T, the NBA, Chevrolet and more.

Maya 2020 and Arnold 6 now available from Autodesk

Autodesk has released Autodesk Maya 2020 and Arnold 6 with Arnold GPU. Maya 2020 brings animators, modelers, riggers and technical artists a host of new tools and improvements for CG content creation, while Arnold 6 allows for production rendering on both the CPU and GPU.

Maya 2020 adds more than 60 new updates, as well as performance enhancements and new simulation features to Bifrost, the visual programming environment in Maya.

Maya 2020

Release highlights include:

— Over 60 animation features and updates to the graph editor and time slider.
— Cached Playback: New preview modes, layered dynamics caching and more efficient caching of image planes.
— Animation bookmarks: Mark, organize and navigate through specific events in time and frame playback ranges.
— Bifrost for Maya: Performance improvements, Cached Playback support and new MPM cloth constraints.
— Viewport improvements: Users can interact with and select dense geometry or a large number of smaller meshes faster in the viewport and UV editors.
— Modeling enhancements: New Remesh and Retopologize features.
— Rigging improvements: Matrix-driven workflows, nodes for precisely tracking positions on deforming geometry and a new GPU-accelerated wrap deformer.

The Arnold GPU is based on Nvidia’s OptiX framework and takes advantage of Nvidia RTX technology. Arnold 6 highlights include:

— Unified renderer — Toggle between CPU and GPU rendering.
— Lights, cameras and more — Support for OSL, OpenVDB volumes, on-demand texture loading, most LPEs, lights, shaders and all cameras.
— Reduced GPU noise — Comparable to CPU noise levels when using adaptive sampling, which has been improved to yield faster, more predictable results regardless of the renderer used.
— Optimized for Nvidia RTX hardware — Scale up rendering power when production demands it.
— New USD components — Hydra render delegate, Arnold USD procedural and USD schemas for Arnold nodes and properties are now available on GitHub.

— Performance improvements — Faster creased subdivisions, an improved Physical Sky shader and dielectric microfacet multiple scattering.

Arnold 6

Maya 2020 and Arnold 6 are available now as standalone subscriptions or with a collection of end-to-end creative tools within the Autodesk Media & Entertainment Collection. Monthly, annual and three-year single-user subscriptions of Arnold are available on the Autodesk e-store.

Arnold GPU is also available to try with a free 30-day trial of Arnold 6. Arnold GPU is available in all supported plug-ins for Autodesk Maya, Autodesk 3ds Max, SideFX Houdini, Maxon Cinema 4D and Foundry Katana.

Reallusion’s Headshot plugin for realistic digi-doubles via AI

Reallusion has introduced a plugin for Character Creator 3 to help create realistic-looking digital doubles. According to the company, the Headshot plugin uses AI technology to automatically generate a digital human in minutes from one single photo, and those characters are fully rigged for voice lipsync, facial expression and full body animation.

Headshot allows game developers and virtual production teams to quickly funnel a cast of digital doubles into iClone, Unreal, Unity, Maya, ZBrush and more. The idea is to allow the digital humans to go anywhere they like and give creators a solution to rapidly develop, iterate and collaborate in realtime.

The plugin has two AI modes: Auto Mode and Pro Mode. Auto Mode is a one-click solution for creating mid-rez digital human crowds. This process allows one-click head and hair creation for realtime 3D head models. It also generates a separate 3D hair mesh with alpha mask to soften edge lines. The 3D hair is fully compatible with Character Creator’s conformable hair format (.ccHair). Users can add them into their hair library, and apply them to other CC characters.

Headshot Pro Mode offers full control of the 3D head generation process through advanced features such as Image Matching, Photo Reprojection and Custom Mask, with texture resolution up to 4,096 pixels.

The Image Matching Tool overlays an image reference plane for advanced head shape refinement and lens correction. With Photo Reprojection, users can easily fix the texture-to-mesh discrepancies resulting from face morph change.

Using high-rez source images and Headshot’s 1,000-plus morphs, users can get a scan-quality digital human face in 4K texture details. Additional textures include normal, AO, roughness, metallic, SSS and Micro Normal for more realistic digital human rendering.

The 3D Head Morph System is designed to achieve the professional, detailed look of 3D scan models. Its sculpting design allows users to hover over a control area and use directional mouse drags to adjust the corresponding mesh shape, from full head and face sculpting to individual features — head contour, face, eyes, nose, mouth and ears — with more than 1,000 head morphs. The system is now free with purchase of the Headshot plugin.

The Headshot plugin for Character Creator is $199 and comes with the content pack Headshot Morph 1,000+ ($99). Character Creator 3 Pipeline costs $199.

Redshift integrates Cinema 4D noises, nodes and more

Maxon and Redshift Rendering Technologies have released Redshift 3.0.12, which has native support for Cinema 4D noises and deeper integration with Cinema 4D, including the option to define materials using Cinema 4D’s native node-based material system.

Cinema 4D noise effects have been in demand within other 3D software packages because of their flexibility, efficiency and look. Native support in Redshift means that users of other DCC applications can now access Cinema 4D noises by using Redshift as their rendering solution. Procedural noise allows artists to easily add surface detail and randomness to otherwise perfect surfaces. Cinema 4D offers 32 different types of noise and countless variations based on settings. Native support for Cinema 4D noises means Redshift can preserve GPU memory while delivering high-quality rendered results.

Redshift 3.0.12 provides content creators with deeper integration of Redshift within Cinema 4D. Redshift materials can now be defined using Cinema 4D’s nodal material framework, introduced in Release 20, and can also use the Node Space system introduced in Release 21, which combines the native nodes of multiple render engines into a single material. Redshift is the first render engine to take advantage of the new Cinema 4D API to implement its own Node Spaces. Users can now also use any Cinema 4D view panel as a Redshift IPR (interactive preview render) window, making it easier to work within compact layouts and interact with a scene while developing materials and lighting.

Redshift 3.0.12 is immediately available from the Redshift website.

Maxon acquired Redshift in April 2019.

London’s Freefolk beefs up VFX team

Soho-based visual effects studio Freefolk, which has seen growth in its commercials and longform work, has expanded its staff to meet that demand. As part of the uptick in work, Freefolk promoted Cheryl Payne from senior producer to head of commercial production. Additionally, Laura Rickets has joined as senior producer, and 2D artist Bradley Cocksedge has been added to the commercials VFX team.

Payne, who has been with Freefolk since the early days, has worked on some of the studio’s biggest commercials, including Warburtons for Engine, Peloton for Dark Horses and Cadburys for VCCP.

Rickets comes to Freefolk with over 18 years of production experience working at some of the biggest VFX houses in London, including Framestore, The Mill and Smoke & Mirrors, as well as agency side for McCann. Since joining the team, Rickets has VFX-produced work on the I’m A Celebrity IDs, a set of seven technically challenging and CG-heavy spots for the new series of the show as well as ads for the Rugby World Cup and Who Wants to Be a Millionaire?.

Cocksedge is a recent graduate who joins from Framestore, where he was working as an intern on Fantastic Beasts: The Crimes of Grindelwald. While in school at the University of Hertfordshire, he interned at Freefolk and is happy to be back in a full-time position.

“We’ve had an exciting year and have worked on some really stand-out commercials, like TransPennine for Engine and the beautiful spot for The Guardian we completed with Uncommon, so we felt it was time to add to the Freefolk family,” says Fi Kilroe, Freefolk’s co-managing director/executive producer.

Main Image: (L-R) Cheryl Payne, Laura Rickets and Bradley Cocksedge

Alkemy X adds Albert Mason as head of production

Albert Mason has joined VFX house Alkemy X as head of production. He comes to Alkemy X with over two decades of experience in visual effects and post production. He has worked on projects directed by such industry icons as Peter Jackson on the Lord of the Rings trilogy, Tim Burton on Alice in Wonderland and Robert Zemeckis on The Polar Express. In his new role at Alkemy X, he will use his experience in feature films to target the growing episodic space.

A large part of Alkemy X’s work has been for episodic visual effects, with credits that include Amazon Prime’s Emmy-winning original series, The Marvelous Mrs. Maisel, USA’s Mr. Robot, AMC’s Fear the Walking Dead, Netflix’s Maniac, NBC’s Blindspot and Starz’s Power.

Mason began his career at MTV’s on-air promos department, sharpening his production skills on top series promo campaigns and as a part of its newly launched MTV Animation Department. He took an opportunity to transition into VFX, stepping into a production role for Weta Digital and spending three years working globally on the Lord of the Rings trilogy. He then joined Sony Pictures Imageworks, where he contributed to features including Spider-Man 3 and Ghost Rider. He has also produced work for such top industry shops as Logan, Rising Sun Pictures and Greymatter VFX.

“[Albert’s] expertise in constructing advanced pipelines that embrace emerging technologies will be invaluable to our team as we continue to bolster our slate of VFX work,” says Alkemy X president/CEO Justin Wineburgh.

Motorola’s next-gen Razr gets a campaign for today

Many of us have fond memories of our Razr flip phone. At the time, it was the latest and greatest. Then new technology came along, and the smartphone era was born. Now Motorola is asking, “Why can’t you have both?”

Available as of November 13, the new Razr fits in a palm or pocket when shut and flips open to reveal an immersive, full-length touch screen. There is a display screen called the Quick View when closed and the larger Flex View when open — and the two displays are made to work together. Whatever you see on Quick View then moves to the larger Flex View display when you flip it open.

In order to help tell this story, Motorola called on creative shop Los York to help relaunch the Razr. Los York created the new smartphone campaign to tap into the Razr’s original DNA and launch it for today’s user.

Los York developed a 360 campaign that included films, social, digital, TV, print and billboards, with visuals in stores and on devices (wallpapers, ringtones, startup screens). Los York treated the Razr as a luxury item and a piece of art, letting the device reveal itself unencumbered by taglines and copy. The campaign showcases the Razr as a futuristic, high-end “fashion accessory” that speaks to new industry conversations, such as whether advancing tech points toward a utopian or dystopian future.

The campaign features a mix of live action and CG. Los York shot on a Panavision DXL with Primo 70 lenses. CG was created using Maxon Cinema 4D with Redshift and composited in Adobe After Effects. The piece was edited in-house on Adobe Premiere.

We reached out to Los York CEO and founder Seth Epstein to find out more:

How much of this is live action versus CG?
The majority is CG, but originally the piece was intended to be entirely CG. Early in the creative process, we defined the world in which the new Razr existed and who would belong there. As we worked on the project, we kept coming back to the idea of bringing our characters to life in live action and blending the two worlds. The live action was envisioned after the fact, which is somewhat unusual.

What were some of the most challenging aspects of this piece?
The most challenging aspect was that the project happened over a period of nine months. Wisely, the product release was pushed, and we continued to evolve the project over time, which was both a blessing and a curse.

How did it feel taking on a product with a lot of history and then rebranding it for the modern day?
We felt the key was to relaunch an iconic product like the Razr with an eye to the future. The trap of launching anything iconic is falling back on the obvious retro throwback references, which can come across as too obvious. We dove into the original product and campaigns to extract the brand DNA of 2004 using archetype exercises. We tapped into the attitude and voice of the Razr at that time — and used that attitude as a starting point. We also wanted to look forward and stand three years in the future and imagine what the tone and campaign would be then. All of this is to say that we wanted the new Razr to extract the power of the past but also speak to audiences in a totally fresh and new way.

Check out the campaign here.

Carbon New York grows with three industry vets

Carbon in New York has grown with two senior hires — executive producer Nick Haynes and head of CG Frank Grecco — and the relocation of existing ECD Liam Chapple, who joins from the Chicago office.

Chapple joined Carbon in 2016, moving from Mainframe in London to open Carbon’s Chicago facility. He brought in clients such as Porsche, Lululemon, Jeep, McDonald’s and Facebook. Says Chapple, “I’ve always looked to the studios, designers and directors in New York as the high bar, and now I welcome the opportunity to pitch against them. There is an amazing pool of talent in New York, and the city’s energy is a magnet for artists and creatives of all ilk. I can’t wait to dive into this and look forward to expanding upon our amazing team of artists and really making an impression in such a competitive and creative market.”

Chapple recently wrapped direction and VFX on films for Teflon and American Express (Ogilvy) and multiple live-action projects for Lululemon. The most recent shoot, conceived and directed by Chapple, was a series of eight live-action films focusing on Lululemon’s brand ambassadors and its new flagship store in Chicago.

Haynes joins Carbon from his former role as an EP at MPC, bringing over 20 years of experience earned at The Mill, MPC and Absolute. He recently wrapped the launch film for the Google Pixel phone and the Chromebook, as well as an epic Middle Earth: Shadow of War trailer for Monolith Games combining photoreal CGI elements with live action shot on the frozen Black Sea in Ukraine. Says Haynes, “We want to be there at the inception of the creative and help steer it — ideally, lead it — and be there the whole way through the process, from concept and shoot to delivery. Over the years, whether working for the world’s most creative agencies or directly with prestigious clients like Google, Guinness and IBM, I aim to be as close to the project as possible from the outset, allowing my team to add genuine value that will garner the best result for everyone involved.”

Grecco joins Carbon from Method Studios, where he most recently led projects for Google, Target, Microsoft, Netflix and Marvel’s Deadpool 2.  With a wide range of experience from Emmy-nominated television title sequences to feature films and Super Bowl commercials, Grecco looks forward to helping Carbon continue to push its visuals beyond the high bar that has already been set.

In addition to New York and Chicago, Carbon has a studio in Los Angeles.

Main Image: (L-R) Frank Grecco, Liam Chapple, Nick Haynes

Behind the Title: Compadre’s Jessica Garcia-Scharer

NAME: Jessica Garcia-Scharer

COMPANY: Culver City’s Compadre

CAN YOU DESCRIBE YOUR COMPANY?
We are a creative marketing agency. We make strategically informed branding and creative — and then help to get it out to the world in memorable ways. And we use strategy, design, planning and technology to do it.

WHAT’S YOUR JOB TITLE?
Head of Production

WHAT DOES THAT ENTAIL?
Head of production means different things at different companies. I’m the three-ring binder with the special zip pack that helps to hold everything together in an organized manner. Everything from hearing and understanding client needs, creating proposals, managing budget projections/actuals/contracts, getting in the right talent for the job, all the way to making sure that everyone in-house is happy, balanced and supported.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Probably the proposals and planning charts. I’m also “Snack Mom!”

WHAT’S YOUR FAVORITE PART OF THE JOB?
Snack Mom. Ha! My favorite part of the job is being part of a team and bringing something to the table that is useful. I like when my team feels like everything is being handled.

WHAT’S YOUR LEAST FAVORITE?
If and when there isn’t enough quiet time to get into the paperwork zone.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
At work: When I get in early and no one is in yet. I get the most work done during that time. Also lunch. I try to make it a point now to get out to lunch and take co-workers with me. It’s nice to be able to break up the day and be regular people for an hour.

Non-work-related: When the sun is just coming up and it’s still a little brisk outside, but the air is fresh and the birds are starting to wake up and chirp. Also, when the sun is starting to descend and it’s still a little warm as the cool ocean breeze starts to come in. The birds are starting to wind down after a hard day of being a bird, and families are coming together to make dinner and talk about their days (well… on the weekend anyway). I am obviously very lucky, and I know that. There are many that don’t get to experience that, and I think of them during that time as well.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
It depends on if I were independently wealthy or not, and where I had been previously. Before going to college, I wanted to be a VFX make-up artist, a marine biologist working with dolphins or a park ranger in Yosemite.

If I were independently wealthy, I would complete a painting collection and put up an art show, start a female/those-who-identify-as-female agency, open up a vegan restaurant and be a hardcore animal activist.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I wish people thought about their careers as more than one path. I have many paths, and I don’t think I’m done just yet. You never know where life will take you from one day to the next, so it’s important to live for today.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
CNN 2020 Election promo package, ESPN 40th Anniversary and another that is pretty neat and a big puzzle to figure out, but I can’t tell you just yet…

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I technically work on everything, so they’re all my babies, and I’m proud of all of them for different reasons. Most, if not all, of the projects that we work on start out with a complex puzzle to solve. I work with the team to figure it out and present the solution to the client. That is where I thrive, and those documents are what I’m most proud of as far as my own personal accomplishments and physical contributions to the company.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Water filtration systems, giant greenhouses and air conditioning will be vital because of global warming.

For work, it would be really hard to function without my mobile phone, laptop and headphones.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Mainly Instagram and Facebook. Facebook is where I learn about events/concerts/protests coming up, keep tabs on people’s birthdays, weddings, babies and share my thoughts on factory farming. Instagram is mindless eye candy for the most part, but I do love how close I feel to certain communities there.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
Usually binaural beats (for focus and clarity) and new age relaxation; but if I’m organizing and cleaning up, then The Cure, Bowie, Duran Duran, Radiohead and Bel Canto.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
As I mentioned before, it’s important to take a lunch break and bond with co-workers and old friends. Taking a step away and remembering that I am a human being living a life that needs to be enjoyed is key to a happy work-life balance. We aren’t saving lives here; we are making fun things for fun people, so as long as you have the systems and resources in place, the stress is the excitement of making things that exceed expectations.

But if I do let things get to me, the best de-stressor is getting home and into my PJs and snuggling up with my family and animals… drowning myself in the escape of love. Oh, and dark chocolate (vegan, of course).

A post engineer’s thoughts on Adobe MAX, new offerings

By Mike McCarthy

Last week, I had the opportunity to attend Adobe’s MAX conference at the LA Convention Center. Adobe showed me, and 15,000 of my closest friends, the newest updates to pretty much all of its Creative Cloud applications, as well as a number of interesting upcoming developments. From a post production perspective, the most significant pieces of news are the releases of Premiere Pro 14 and After Effects 17 (a.k.a. the 2020 releases of those Creative Cloud apps).

The main show ran from Monday to Wednesday, with a number of pre-show seminars and activities the preceding weekend. My experience started off with a screening of the new Terminator: Dark Fate film at LA Live, followed by a Q&A with the director and post team. The new Terminator was edited in Premiere Pro, with project assets shared among a large team of editors and assistants and with extensive use of After Effects, Adobe’s newly acquired Substance app and various other tools in the Creative Cloud.

The post team extolled the improvements in shared project support and project opening times since their last Premiere endeavor on the first Deadpool movie. Visual effects editor Jon Carr shared how they used the integration between Premiere and After Effects to facilitate rapid generation of temporary “postvis” effects. This helped the editors tell the story while they were waiting on the VFX teams to finish generating the final CGI characters and renders.

MAX
The conference itself kicked off with a keynote presenting all of Adobe’s new developments and releases. The 150-minute session covered all aspects of the company’s extensive line of applications. “Creativity for All” is Adobe’s primary message, and the keynote focused on the tension between creativity and time: Adobe is trying to improve its products in ways that give users more time to be creative.

The three prongs of that approach for this iteration of updates were:
– Faster, more powerful, more reliable — fixing time-wasting bugs, improving hardware use.
– Create anywhere, anytime, with anyone — adding functionality via the iPad, and shared Libraries for collaboration.
– Explore new frontiers — specifically in 3D with Adobe’s Dimension, Substance and Aero.

Education is another important focus for Adobe, with 15 million copies of CC in use in education around the world. The company is creating a platform for CC users to stream their working process, directly from within the applications, to viewers who want to learn from them; that will probably integrate with the new expanded Creative Cloud app released last month. Adobe has also released an integration that lets Office apps access assets in CC Libraries.

The first application updates they showed off were in Photoshop. They have made the new locked aspect ratio scaling a toggle-able behavior, improved the warp tool and added better ways to navigate deep layer stacks by showing which layers affect particular parts of an image. But the biggest improvement is AI-based object selection, which generates detailed masks from simple box selections or rough lassos. Illustrator now has GPU acceleration, improving performance on larger documents, and a path simplifying tool to reduce the number of anchor points.
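Adobe hasn’t said which algorithm Illustrator’s path simplifying tool uses, but the classic approach to reducing anchor points is Ramer-Douglas-Peucker: keep only the points that deviate from a straight chord by more than a tolerance. The sketch below is my own illustrative Python (hypothetical names, not Adobe code):

```python
import math

def _perp_dist(pt, a, b):
    """Perpendicular distance from pt to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = pt, a, b
    dx, dy = bx - ax, by - ay
    seg_len = math.hypot(dx, dy)
    if seg_len == 0.0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * (px - ax) - dx * (py - ay)) / seg_len

def simplify(points, tolerance):
    """Ramer-Douglas-Peucker: drop points within `tolerance` of the chord."""
    if len(points) < 3:
        return list(points)
    # Find the point farthest from the chord between the two endpoints.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = _perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= tolerance:
        return [points[0], points[-1]]  # everything in between is negligible
    # Otherwise recurse on both halves and join, dropping the shared midpoint.
    left = simplify(points[: index + 1], tolerance)
    right = simplify(points[index:], tolerance)
    return left[:-1] + right
```

A nearly straight four-point path collapses to its endpoints, while a genuine corner survives: `simplify([(0, 0), (1, 0.01), (2, -0.01), (3, 0)], 0.1)` returns just the two endpoints.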

They released Photoshop for the iPad and announced that Illustrator will be following that path as well. Fresco is headed the other direction and now available on Windows. That is currently limited to Microsoft Surface products, but I look forward to being able to try it out on my ZBook-X2 at some point. Adobe XD has new features, and apparently is the best way to move complex Illustrator files into After Effects, which I learned at one of the sessions later.

Premiere
Premiere Pro 14 has a number of new features, the most significant being AI-driven auto reframe, which automatically converts an edited project into other aspect ratios for various deliverables. While 16×9 is the obvious standard size, certain web platforms are optimized for square or tall videos. The feature can also be used to reframe content from 2.35 to 16×9 or 4×3, which are frequent delivery requirements for the feature films I work on. My favorite aspect of this new functionality is that the user has complete control over the results.

Unlike other automated features such as warp stabilizer, which offers only an on/off choice for applying its results, auto-reframe generates motion-effect keyframes that can be further edited and customized by the user once the initial AI pass is complete. It also has a nesting option that retains existing framing choices by creating a new single-layer source sequence. I can envision this being useful for a number of other workflow processes, such as preparing for external color grading or texturing passes.
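The geometry behind any such reframe is simple enough to sketch: fit a window of the target aspect ratio inside the source frame (full height when narrowing, full width when widening) and let keyframes slide it over time. Here is a minimal Python sketch of that crop math; this is my own illustration, not Adobe's implementation, and the `center_x` parameter stands in for a hypothetical subject tracker.

```python
def reframe_crop(src_w, src_h, target_aspect, center_x=0.5):
    """Compute a crop window of target_aspect inside a source frame.

    center_x (0..1) lets a hypothetical subject tracker bias the
    window horizontally; keyframing it over time pans the crop.
    Returns (x_offset, crop_w, crop_h) in pixels.
    """
    src_aspect = src_w / src_h
    if target_aspect < src_aspect:
        # Target is narrower than the source: keep full height, crop width.
        crop_h = src_h
        crop_w = round(src_h * target_aspect)
    else:
        # Target is wider than the source: keep full width, crop height.
        crop_w = src_w
        crop_h = round(src_w / target_aspect)
    # Center the window on the subject, clamped to the frame edges.
    x = round(center_x * src_w - crop_w / 2)
    x = max(0, min(x, src_w - crop_w))
    return x, crop_w, crop_h

# A 16x9 UHD source reframed to a square (1:1) deliverable:
print(reframe_crop(3840, 2160, 1.0))   # (840, 2160, 2160)
# ...and to a 9:16 vertical:
print(reframe_crop(3840, 2160, 9 / 16))
```

The same function covers the widescreen case: a 2.35 target against a 16×9 source keeps the full width and crops the height instead.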

They also added better support for multichannel audio workflows and effects, improved playback performance for many popular video formats, better HDR export options and a variety of changes that make the motion graphics tools more flexible and efficient for those who use them extensively. They also increased the range of values available for clip playback speed and volume and added support for new camera formats and derivations.

The brains behind After Effects have focused on improving playback and performance for this release and have made some significant improvements in that regard. The other big feature that may actually make a difference is content-aware fill for video. This was sneak-previewed at MAX last year and first implemented in the NAB 2019 release of After Effects, but it is greatly refined in this version and now twice as fast.

They also greatly improved support for OpenEXR frame sequences, especially those with multiple render-pass channels. Channels can be labeled, and After Effects creates a video contact sheet for viewing all the layers in thumbnail form. EXR playback performance is supposed to be greatly improved as well.

Character Animator is now at 3.0, and they have added keyframing of all editable values, trigger-able reposition “cameras” and trigger-able audio effects, among other new features. And Adobe Rush now supports publishing directly to TikTok.

Content Authenticity Initiative
Outside of individual applications, Adobe has launched the Content Authenticity Initiative in partnership with the NY Times and Twitter. It aims to fight fake news and restore consumer confidence in media. Its three main goals are trust, attribution and authenticity: presenting end users with who created an image, whether it was edited or altered and, if so, in what ways. Seemingly at odds with that, they also released a new mobile app that edits images upon capture, using AI-powered “lenses” for highly stylized looks, even providing a live view.

This opening keynote was followed by a selection of over 200 different labs and sessions available over the next three days. I attended a couple sessions focused on After Effects, as that is a program I know I don’t use to its full capacity. (Does anyone, really?)

Partners
A variety of other partner companies were showing off their products in the community pavilion. HP was pushing 3D printing and digital manufacturing tools that integrate with Photoshop and Illustrator. Dell has a new 27-inch color-accurate monitor with a built-in colorimeter, presumably to compete with HP’s top-end DreamColor displays. Asus also has some new HDR monitors that are Dolby Vision-compatible. One is designed to be portable and is as thin and lightweight as a laptop screen. I have always wondered why that wasn’t a standard approach for desktop displays.

Keynotes
Tuesday opened with a keynote presentation from a number of artists of different types, speaking or being interviewed. Jason Levine’s talk with M. Night Shyamalan was my favorite part, even though thrillers aren’t really my cup of tea. Later, I was able to sit down with Patrick Palmer, Adobe’s Premiere Pro product manager, to talk about where Premiere is headed and the challenges of developing HDR creation tools when there is no unified set of standards for final delivery. I am looking forward to being able to view my work in HDR while I am editing at some point in the future.

One of the highlights of MAX is the 90-minute Sneaks session on Tuesday night, where comedian John Mulaney “helped” a number of Adobe researchers demonstrate new media technologies they are working on. These will eventually improve audio quality, automate animation, analyze photographic authenticity and many other tasks once they are refined into final products at some point in the future.

This was only my second time attending MAX, and with Premiere Rush being released last year, video production was a key part of that show. This year, without that factor, it was much more apparent to me that I was an engineer attending an event catering to designers. Not that this is bad, but I mention it here because it is good to have a better idea of what you are stepping into when you are making decisions about whether to invest in attending a particular event.

Adobe focuses MAX on artists and creatives as opposed to engineers and developers, who have other events that are more focused on their interests and needs. I suppose that is understandable since it is not branded Creative Cloud for nothing. But it is always good to connect with the people who develop the tools I use, and the others who use them with me, which is a big part of what Adobe MAX is all about.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Behind the Title: Sarofsky EP Steven Anderson

This EP’s responsibilities run the gamut “from managing our production staff to treating clients to an amazing dinner.”

Company: Chicago’s Sarofsky

Can you describe your company?
We like to describe ourselves as a design-driven production company. I like to think of us as that but so much more. We can be a one-stop shop for everything from concept through finish, or we can partner with a variety of other companies and just be one piece of the puzzle. It’s like ordering from a Chinese menu — you get to pick what items you want.

What’s your job title, and what does the job entail?
I’m executive producer, and that means different things at different companies and industries. Here at Sarofsky, I am responsible for things that run the gamut from managing our production staff to treating clients to an amazing dinner.

Sarofsky

What would surprise people the most about what falls under that title?
I also run payroll, and I am damn good at it.

How has the VFX industry changed in the time you’ve been working?
It used to be that when you told someone, “This is going to take some time to execute,” that’s what it meant. But now, everyone wants everything two hours ago. On the flip side, the technology we now have access to has streamlined the production process and provided us with some terrific new tools.

Why do you like being on set for shoots? What are the benefits?
I always like being on set whenever I can because decisions are being made that are going to affect the rest of the production paradigm. It’s also a good opportunity to bond with clients and, sometimes, get some kick-ass homemade guacamole.

Did a particular film inspire you along this path in entertainment?
I have been around this business for quite a while, and one of the reasons I got into it was my love of film and filmmaking. I can’t say that one particular film inspired me to do this, but I remember being a young kid and my dad taking me to see The Towering Inferno in the movie theater. I was blown away.

What’s your favorite part of the job?
Choosing a spectacular bottle of wine for a favorite client and watching their face when they taste it. My least favorite has to be chasing down clients for past due invoices. It gets old very quickly.

What is your most productive time of the day?
It’s 6:30am with my first cup of coffee sitting at my kitchen counter before the day comes at me. I get a lot of good thinking and writing done in those early morning hours.

Original Bomb Pop via agency VMLY&R

If you didn’t have this job, what would you be doing instead?
I would own a combo bookstore/wine shop where people could come and enjoy two of my favorite things.

Why did you choose this profession?
I would say this profession chose me. I studied to be an actor and made my living at it for several years, but due to some family issues, I ended up taking a break for a few years. When I came back, I went for a job interview at FCB and the rest is history. I made the move from agency producing to post executive producer five years ago and have not looked back since.

Can you briefly explain one or more ways Sarofsky is addressing the issue of workplace diversity in its business?
We are a smallish women-owned business, and I am a gay man; diversity is part of our DNA. We always look out for the best talent but also try to ensure we are providing opportunities for people who may not have access to them. For example, one of our amazing summer interns came to us through a program called Kaleidoscope 4 Kids, and we all benefited from the experience.

Name some recent projects you have worked on, which are you most proud of, and why?
My first week here as EP, we went to LA for the friends-and-family screening of Guardians of the Galaxy, and I thought, what an amazing company I work for! Marvel Studios is a terrific production partner, and I would say there is something special about so many of our clients because they keep coming back. I do have a soft spot for our main title for Animal Kingdom just because I am a big Ellen Barkin fan.

Original Bomb Pop via agency VMLY&R

Name three pieces of technology you can’t live without.
I’d be remiss if I didn’t say my MacBook and iPhone, but I also wouldn’t want to live without my cooking thermometer, as I’ve learned how to make sourdough bread this year, and it’s essential.

What social media channels do you follow?
I am a big fan of Instagram; it’s just visual eye candy and provides a nice break during the day. I don’t really partake in much else unless you count NPR. They occupy most of my day.

Do you listen to music while you work? Care to share your favorite music to work to?
I go in waves. Sometimes I do but then I won’t listen to anything for weeks. But I recently enjoyed listening to “Ladies and Gentleman: The Best of George Michael.” It was great to listen to an entire album, a rare treat.

What do you do to de-stress from it all?
I get up early and either walk or do some type of exercise to set the tone for the day. It’s also so important to unplug; my partner and I love to travel, so we do that as often as we can. All that and a 2006 Chateau Margaux usually washes away the day in two delicious sips.

Bonfire adds Jason Mayo as managing director/partner

Jason Mayo has joined digital production company Bonfire in New York as managing director and partner. Industry veteran Mayo will be working with Bonfire’s new leadership lineup, which includes founder/Flame artist Brendan O’Neil, CD Aron Baxter, executive producer Dave Dimeola and partner Peter Corbett. Bonfire’s offerings include VFX, design, CG, animation, color, finishing and live action.

Mayo comes to Bonfire after several years building Postal, the digital arm of the production company Humble. Prior to that he spent 14 years at Click 3X, where he worked closely with Corbett as his partner. While there he also worked with Dimeola, who cut his teeth at Click as a young designer/compositor. Dimeola later went on to create The Brigade, where he developed the network and technology that now forms the remote, cloud-based backbone referred to as the Bonfire Platform.

Mayo says a number of factors convinced him that Bonfire was the right fit for him. “This really was what I’d been looking for,” he says. “The chance to be part of a creative and innovative operation like Bonfire in an ownership role gets me excited, as it allows me to make a real difference and genuinely effect change. And when you’re working closely with a tight group of people who are focused on a single vision, it’s much easier for that vision to be fully aligned. That’s harder to do in a larger company.”

O’Neil says that having Mayo join as partner/MD is a major move for the company. “Jason’s arrival is the missing link for us at Bonfire,” he says. “While each of us has specific areas to focus on, we needed someone who could both handle the day to day of running the company while keeping an eye on our brand and our mission and introducing our model to new opportunities. And that’s exactly his strong suit.”

For the most part, Mayo’s familiarity with his new partners means he’s arriving with a head start. Indeed, his connection to Dimeola, who built the Bonfire Platform — the company’s proprietary remote talent network, nicknamed the “secret sauce” — continued as Mayo tapped Dimeola’s network for overflow and outsourced work while at Postal. Their relationship, he says, was founded on trust.

“Dave came from the artist side, so I knew the work I’d be getting would be top quality and done right,” Mayo explains. “I never actually questioned how it was done, but now that he’s pulled back the curtain, I was blown away by the capabilities of the Platform and how it dramatically differentiates us.

“What separates our system is that we can go to top-level people around the world but have them working on the Bonfire Platform, which gives us total control over the process,” he continues. “They work on our cloud servers with our licenses and use our cloud rendering. The Platform lets us know everything they’re doing, so it’s much easier to track costs and make sure you’re only paying for the work you actually need. More importantly, it’s a way for us to feel connected – it’s like they’re working in a suite down the hall, except they could be anywhere in the world.”

Mayo stresses that while the cloud-based Platform is a huge advantage for Bonfire, it’s just one part of its profile. “We’re not a company riding on the backs of freelancers,” he points out. “We have great, proven talent in our core team who work directly with clients. What I’ve been telling my longtime client contacts is that Bonfire represents a huge step forward in terms of the services and level of work I can offer them.”

Corbett believes he and Mayo will continue to explore new ways of working now that he’s at Bonfire. “In the 14 years Jason and I built Click 3X, we were constantly innovating across both video and digital, integrating live action, post production, VFX and digital engagements in unique ways,” he observes. “I’m greatly looking forward to continuing on that path with him here.”

De-aging John Goodman 30 years for HBO’s The Righteous Gemstones

For HBO’s original series The Righteous Gemstones, VFX house Gradient Effects de-aged John Goodman using its proprietary Shapeshifter tool, an AI-assisted tool that can turn back the clock on any video footage. With Shapeshifter, Gradient sidestepped the uncanny valley to shave decades off Goodman for an entire episode, delivering nearly 30 minutes of film-quality VFX in six weeks.

In the show’s fifth episode, “Interlude,” viewers journey back to 1989, a time when the Gemstone empire was still growing and Eli’s wife, Aimee-Leigh, was still alive. But going back also meant de-aging Goodman for an entire episode, something never attempted before on television. Gradient accomplished it using Shapeshifter, which allows artists to “reshape” an individual frame and the performers in it and then extend those results across the rest of a shot.

Shapeshifter worked by first analyzing the underlying shape of Goodman’s face. It then extracted important anatomical characteristics, like skin details, stretching and muscle movements. With the extracted elements saved as layers to be reapplied at the end of the process, artists could start reshaping his face without breaking the original performance or footage. Artists could tweak additional frames in 3D down the line as needed, but they often didn’t need to, making the de-aging process nearly automated.
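Gradient hasn’t published Shapeshifter’s internals, but the extract-then-reapply process described above resembles a classic detail-layer decomposition: split the signal into a smooth base and a high-frequency detail layer, reshape the base, then add the detail back so fine skin texture survives the edit. A toy 1D Python sketch of that general principle (purely illustrative, not Gradient’s code):

```python
def smooth(signal, radius=1):
    """Box-filter a 1D signal, clamping the window at the edges."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def reshape_with_detail(signal, edit):
    """Separate the detail layer, edit the base, then reapply the detail."""
    base = smooth(signal)
    detail = [s - b for s, b in zip(signal, base)]  # fine "skin" detail
    new_base = [edit(b) for b in base]              # artist reshapes the base
    return [nb + d for nb, d in zip(new_base, detail)]

# A face "profile" whose underlying shape we shrink 20% while
# keeping the surface detail intact:
profile = [0.0, 1.0, 0.5, 1.5, 1.0]
result = reshape_with_detail(profile, lambda b: 0.8 * b)
```

The same separation is what lets a reshape propagate across frames without breaking the performance: the detail layer carries the per-frame texture, while only the base shape is edited.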

“Shapeshifter is an entirely new way to de-age people,” says Olcun Tan, owner and visual effects supervisor at Gradient Effects. “While most productions are limited by time or money, we can turn around award-quality VFX on a TV schedule, opening up new possibilities for shows and films.”

Traditionally, de-aging work for film and television has been done in one of two ways: through filtering (saves time, but hard to scale) or CG replacements (better quality, higher cost), which can take six months to a year. Shapeshifter introduces a new method that not only preserves the actor’s original performance, but also interacts naturally with other objects in the scene.

“One of the first shots of ‘Interlude’ shows stage crew walking in front of John Goodman,” describes Tan. “In the past, a studio would have recommended a full CGI replacement for Goodman’s character because it would be too hard or take too much time to maintain consistency across the shot. With Shapeshifter, we can just reshape one frame and the work is done.”

This is possible because Shapeshifter continuously captures the face, including all of its essential details, using the source footage as its guide. With the data being constantly logged, artists can extract movement information from anywhere on the face whenever they want, replacing expensive motion-capture stages, equipment and makeup teams.

Director Ang Lee: Gemini Man and a digital clone

By Iain Blair

Filmmaker Ang Lee has always pushed the boundaries in cinema, both technically and creatively. His film Life of Pi, which he directed and produced, won four Academy Awards — for Best Direction, Best Cinematography, Best Visual Effects and Best Original Score.

Lee’s Brokeback Mountain won three Academy Awards, including Best Direction, Best Adapted Screenplay and Best Original Score. Crouching Tiger, Hidden Dragon was nominated for 10 Academy Awards and won four, including Best Foreign Language Film for Lee, Best Cinematography, Best Original Score and Best Art Direction/Set Decoration.

His latest, Paramount’s Gemini Man, is another innovative film, this time disguised as an action-thriller. It stars Will Smith in two roles — first, as Henry Brogan, a former Special Forces sniper-turned-assassin for a clandestine government organization; and second (with the assistance of ground-breaking visual effects) as “Junior,” a cloned younger version of himself with peerless fighting skills who is suddenly targeting him in a global chase. The chase takes them from the estuaries of Georgia to the streets of Cartagena and Budapest.

Rounding out the cast is Mary Elizabeth Winstead as Danny Zakarweski, a DIA agent sent to surveil Henry; Golden Globe Award-winner Clive Owen as Clay Verris, a former Marine officer now seeking to create his own personal military organization of elite soldiers; and Benedict Wong as Henry’s longtime friend, Baron.

Lee’s creative team included director of photography Dion Beebe (Memoirs of a Geisha, Chicago), production designer Guy Hendrix Dyas (Inception, Indiana Jones and the Kingdom of the Crystal Skull), longtime editor Tim Squyres (Life of Pi and Crouching Tiger, Hidden Dragon) and composer Lorne Balfe (Mission: Impossible — Fallout, Terminator Genisys).

The groundbreaking visual effects were supervised by Bill Westenhofer, Academy Award-winner for Life of Pi as well as The Golden Compass, and Weta Digital’s Guy Williams, an Oscar-nominee for The Avengers, Iron Man 3 and Guardians of the Galaxy Vol. 2.

Will Smith and Ang Lee on set

I recently talked to Lee — whose directing credits include Taking Woodstock, Hulk, Ride With the Devil, The Ice Storm and Billy Lynn’s Long Halftime Walk — about making the film, which has already generated a lot of awards talk about its cutting-edge technology, the workflow and his love of editing and post.

Hollywood’s been trying to make this for over two decades now, but the technology just wasn’t there before. Now it’s finally here!
It was such a great idea, if you can visualize it. When I was first approached about it by Jerry Bruckheimer and David Ellison, they said, “We need a movie star who’s been around a long time to play Henry, and it’s an action-thriller and he’s being chased by a clone of himself,” and I thought the whole clone idea was so fascinating. I think if you saw a young clone version of yourself, you wouldn’t see yourself as special anymore. It would be, “What am I?” That also brought up themes like nature versus nurture and how different two people with the same genes can be. Then the whole idea of what makes us human? So there was a lot going on, a lot of great ideas that intrigued me. How does aging work and affect you? How would you feel meeting a younger version of yourself? I knew right away it had to be a digital clone.

You certainly didn’t make it easy for yourself as you also decided to shoot it in 120fps at 4K and in 3D.
(Laughs) You’re right, but I’ve been experimenting with new technology for the past decade, and it all started with Life of Pi. That was my first taste of 3D, and for 3D you really need to shoot digitally because of the need for absolute precision and accuracy in synchronizing the two cameras and your eyes. And you need a higher frame rate to get rid of the strobing effect and any strangeness. Then when you go to 120 frames per second, the image becomes so clear and far smoother. It’s like a whole new kind of moviemaking, and that’s fascinating to me.

Did you shoot native 3D?
Yes, even though it’s still so clumsy, and not easy, but for me it’s also a learning process on the set which I enjoy.

Junior

There’s been a lot of talk about digital de-aging use, especially in Scorsese’s The Irishman. But you didn’t use that technique for Will’s younger self, right?
Right. I haven’t seen The Irishman so I don’t know exactly what they did, but this was a total CGI creation, and it’s a lead character where you need all the details and performance. Maybe the de-aging is fine for a quick flashback, but it’s very expensive to do, and it’s all done manually. This was also quite hard to do, and there are two parts to it: Scientifically, it’s quite mind-boggling, and our VFX supervisor Bill Westenhofer and his team worked so hard at it, along with the Weta team headed by VFX supervisor Guy Williams. So did Will. But then the hardest part is dealing with audiences’ impressions of Junior, as you know in the back of your mind that a young Will Smith doesn’t really exist. Creating a fully digital believable human being has been one of the hardest things to do in movies, but now we can.

How early on did you start integrating post and all the VFX?
Before we even started anything, as we didn’t have unlimited money, a big part of the budget went to doing a lot of tests, new equipment, R&D and so on, so we had to be very careful about planning everything. That’s the only way you can reduce costs in VFX. You have to be a good citizen and very disciplined. It was a two-year process, and you plan and shoot layer by layer, and you have to be very patient… then you start making the film in post.

I assume you did a lot of previz?
(Laughs) A whole lot, and not only for all the obvious action scenes. Even for the non-action stuff, we designed and made the cartoons and did previz and had endless meetings and scouted and measured and so on. It was a lot of effort.

How tough was the shoot?
It was very tough and very slow. My last three movies have been like this since the technology’s all so new, so it’s a learning process as you’re figuring it all out as you go. No matter how much you plan, new stuff comes up all the time and equipment fails. It feels very fragile and very vulnerable sometimes. And we only had a budget for a regular movie, so we could only shoot for 80 days, and we were on three continents and places like Budapest and Cartagena as well as around Savannah in the US. Then I insist on doing all the second unit stuff as well, apart from a few establishing shots and sunsets. I have to shoot everything, so we had to plan very carefully with the sound team as every shot is a big deal.

Where did you post?
All in New York. We rented space at Final Frame, and then later we were at Harbor. The thing is, no lab could process our data since it was so huge, so when we were based in Savannah we just built our own technology base and lab so we could process all our dailies and so on — and we bought all our servers, computers and all the equipment needed. It was all in-house, and our technical supervisor Ben Gervais oversaw it all. It was too difficult to take all that to Cartagena, but we took it all to Budapest and then set it all up later in New York for post.

Do you like the post process?
I like the first half, but then it’s all about previews, getting notes, changing things. That part is excruciating. Although I have to give a lot of credit to Paramount as they totally committed to all the VFX quite early and put the big money there before they even saw a cut so we had time to do them properly.

Junior

Talk about editing with Tim Squyres. How did that work?
We sent him dailies. When I’m shooting, I just want to live in my dreams, unless something alarms me, and he’ll let me know. Otherwise, I prefer to work separately. But on this one, since we had to turn over some shots while we were shooting, he came to the set in Budapest, and we’d start post already, which was new to me. Before, I always liked to cut separately.

What were the big editing challenges?
Trying to put all the complex parts together, dealing with the rhythm and pace, going from quiet moments to things like the motorcycle chase scenes and telling the story as effectively as we could — all the usual things. In this medium, everything is more critical visually.

All the VFX play a big role. How many were there?
Over 1,000, but then Junior alone is a huge visual effect in every scene he’s in. Weta did all of him and complained that they got the hardest and most expensive part. (Laughs) The other, easier stuff was spread out to several companies, including Scanline and Clear Angle.

Ang Lee and Iain Blair

Talk about the importance of sound and music.
We did the mix at Harbor on its new stage, and it’s always so important. This time we did something new. Typically, you do Atmos at the final mix and mix the music along with all the rest, but our music editor did an Atmos mix on all the music first and then brought it to us for the final mix. That was very special.

Where did you do the DI and how important is it to you?
It’s huge on a movie like this. We set up our own DI suite in-house at Final Frame with the latest FilmLight Baselight, which is amazing. Our colorist Marcy Robinson had trained on it, and it was a lot easier than on the last film. Dion came in a lot and they worked together, and then I’d come in. We did a lot of work, especially on all the night scenes, enhancing moonlight and various elements.

I think the film turned out really well and looks great. When you have the combination of these elements like 3D, digital cinematography, high frame rate and high resolution, you really get “new immersive cinema.” So for me, it’s a new and different way of telling stories and processing them in your head. The funny thing is, personally I’m a very low-tech person, but I’ve been really pursuing this for the last few years.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: Title Designer Nina Saxon

For 40 years, Nina Saxon has been a pioneer in the area of designing movie titles. She is still one of the few women working in this part of the industry.

NAME: Nina Saxon

COMPANY: Nina Saxon Design

CAN YOU DESCRIBE YOUR COMPANY?
We design main and end titles for film and television as well as branding for still and moving images.

WHAT’S YOUR JOB TITLE?
Title Designer

WHAT DOES THAT ENTAIL?
Making a moving introduction for a film, like a book cover. Or it might be simple type over picture. Also watching a film and showing the director samples or storyboards of what I think should be used.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
That I’m one of only a few women in this field and have worked for 40 years, hiring others to help me only if necessary.

WHAT’S YOUR FAVORITE PART OF THE JOB?
When my project is done and I get to see my finished work up on the screen.

WHAT’S YOUR LEAST FAVORITE?
Waiting to be paid.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
Morning

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I’d probably be a psychologist.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
In 1975, I was in the film department at UCLA and became determined to work in the film business.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
The upcoming documentary on Paul McCartney called Here, There and Everywhere, and upcoming entertainment industry corporate logos that will be revealed in October. In the past, I did the movie Salt with Angelina Jolie and the movie Flight with Denzel Washington.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Working on the main title open for Forrest Gump.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My iPad, iPhone and computer

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I exercise a lot, five to six days a week; drink a nice glass of wine; try to get enough sleep; listen to music while meditating before sleep; and make sure I know what I need to do the next day before I go to bed.

Ziva VFX 1.7 helps simplify CG character creation


Ziva Dynamics has introduced Ziva VFX 1.7, designed to make CG character creation easier thanks to the introduction of Art Directable Rest Shapes (ADRS). This tool allows artists to make characters conform to any shape without losing their dynamic properties, opening up a faster path to cartoons and digi-doubles.

Users can now adjust a character’s silhouette with simple sculpting tools. Once the goal shape is established, Ziva VFX can morph to match it, maintaining all of the dynamics embedded before the change. Whether unnatural or precise, ADRS works with any shape, removing the difficulty of both complex setups and time-intensive corrective work.

The Art Directable Rest Shapes feature has been in development for over a year and was created in collaboration with several major VFX and feature animation studios. According to Ziva, while outputs and art styles differed, each group essentially requested the same thing: extreme accuracy and more control without compromising the dynamics that sell a final shot.

For feature animation characters not based on humans or nature, ADRS can rapidly alter and exaggerate key characteristics, allowing artists to be expressive and creative without losing the power of secondary physics. For live-action films, where the use of digi-doubles and other photorealistic characters is growing, ADRS can minimize the setup process when teams want to quickly tweak a silhouette or make muscles fire in multiple ways during a shot.

According to Josh diCarlo, head of rigging at Sony Pictures Imageworks, “Our creature team is really looking forward to the potential of Art Directable Rest Shapes to augment our facial and shot-work pipelines by adding quality while reducing effort. Ziva VFX 1.7 holds the potential to shave weeks of work off of both processes while simultaneously increasing the quality of the end results.”

To use Art Directable Rest Shapes, artists must duplicate a tissue mesh, sculpt their new shape onto the duplicate and add the new geometry as a Rest Shape over select frames. This process will intuitively morph the character, creating a smooth, novel deformation that adheres to any artistic direction a creative team can think up. On top of ADRS, Ziva VFX 1.7 will also include a new zRBFWarp feature, which can warp NURBS surfaces, curves and meshes.
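Ziva’s actual node names aren’t documented here, but the workflow described above, a sculpted duplicate applied as a rest shape over select frames, amounts to ramping a blend weight across a frame range and interpolating each vertex toward the target shape. A stand-alone Python sketch under those assumptions (function names are hypothetical, not Ziva’s API):

```python
def rest_shape_weight(frame, start, end):
    """Linear 0-to-1 ramp across [start, end], clamped outside the range."""
    if frame <= start:
        return 0.0
    if frame >= end:
        return 1.0
    return (frame - start) / (end - start)

def blend_shapes(rest, target, weight):
    """Move each vertex of the rest mesh toward the sculpted target."""
    return [tuple(r + weight * (t - r) for r, t in zip(rv, tv))
            for rv, tv in zip(rest, target)]

# Two vertices of a tissue mesh and its sculpted duplicate:
rest   = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
target = [(0.0, 0.5, 0.0), (1.0, 0.5, 0.0)]  # the sculpted rest shape

w = rest_shape_weight(frame=15, start=10, end=20)  # mid-ramp
morphed = blend_shapes(rest, target, w)
```

In the real system, the solver would then run its dynamics on top of the morphed rest state, which is what preserves the secondary physics through the shape change.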

Ziva VFX 1.7 is available now as an Autodesk Maya plugin for Windows and Linux users and can be purchased on a monthly or yearly basis, depending on user type. A free 60-day trial is available here.

According to Michael Smit, chief commercial officer at Ziva Dynamics, “Ziva is working towards a new platform that will more easily allow us to deploy the software into other software packages, operating systems, and different network architectures. As an example we are currently working on our integrations into iOS and Unreal, both of which have already been used in limited release for production settings. We’re hopeful that once we launch the new platform commercially there will be an opportunity to deploy tools for macOS users.”

Flavor adds Joshua Studebaker as CG supervisor

Creative production house Flavor has added CG supervisor Joshua Studebaker to its Los Angeles studio. For more than eight years, Studebaker has been a freelance CG artist in LA, specializing in design, animation, dynamics, lighting/shading and compositing via Maya, Cinema 4D, Vray/Octane, Nuke and After Effects.

A frequent collaborator with Flavor and its brand and agency partners, Studebaker has also worked with Alma Mater, Arsenal FX, Brand New School, Buck, Greenhaus GFX, Imaginary Forces and We Are Royale in the past five years alone. In his new role with Flavor, Studebaker oversees visual effects and 3D services across the company’s global operations. Flavor’s Chicago, Los Angeles and Detroit studios offer color grading, VFX and picture finishing using tools like Autodesk Lustre and Flame Premium.

Flavor creative director Jason Cook also has a long history of working with Studebaker and deep respect for his talent. “What I love most about Josh is that he is both technical and a really amazing artist and designer. Adding him is a huge boon to the Flavor family, instantly elevating our production capabilities tenfold.”

Flavor has always emphasized creativity as a key ingredient, and according to Studebaker, that’s what attracted him. “I see Flavor as a place to grow my creative and design skills, as well as help bring more standardization to our process in house,” he explained. “My vision is to help Flavor become more agile and more efficient and to do our best work together.”

Boris FX beefs up film VFX arsenal, buys SilhouetteFX, Digital Film Tools

Boris FX, a provider of integrated VFX and workflow solutions for video and film, has bought SilhouetteFX (SFX) and Digital Film Tools (DFT). The two companies have a long history of developing tools used on Hollywood blockbusters and experience collaborating with top VFX studios, including Weta Digital, Framestore, Technicolor and Deluxe.

This is the third acquisition by Boris FX in recent years, following Imagineer Systems (2014) and GenArts (2016), and it builds upon the company’s editing, visual effects and motion graphics solutions used by post pros working in film and television. Silhouette and Digital Film Tools join Boris FX’s tools Sapphire, Continuum and Mocha Pro.

Silhouette’s groundbreaking non-destructive paint and advanced rotoscoping technology was recognized earlier this year by the Academy of Motion Picture Arts and Sciences with a Technical Achievement Award. It first gained prominence after Weta Digital used its rotoscoping tools on King Kong (2005). Now a full-fledged GPU-accelerated node-based compositing app, Silhouette features over 100 VFX nodes and integrated Boris FX Mocha planar tracking. Over the last 15 years, feature film artists have used Silhouette on films including Avatar (2009), The Hobbit (2012), Wonder Woman (2017), Avengers: Endgame (2019) and Fast & Furious Presents: Hobbs & Shaw (2019).

Avengers: Endgame courtesy of Marvel

Digital Film Tools (DFT) emerged as an offshoot of an LA-based motion picture visual effects facility whose work included hundreds of feature films, commercials and television shows.

The Digital Film Tools portfolio includes standalone applications as well as professional plug-in collections for filmmakers, editors, colorists and photographers. The products offer hundreds of realistic filters for optical camera simulation, specialized lenses, film stocks and grain, lens flares, optical lab processes, color correction, keying and compositing, as well as natural light and photographic effects. DFT plug-ins support Adobe’s Photoshop, Lightroom, After Effects and Premiere Pro; Apple’s Final Cut Pro X and Motion; Avid’s Media Composer; and OFX hosts, including Foundry Nuke and Blackmagic DaVinci Resolve.

“This acquisition is a natural next step to our continued growth strategy and singular focus on delivering the most powerful VFX tools and plug-ins to the content creation market,” says Boris Yamnitsky, CEO/founder of Boris FX. “Silhouette fits perfectly into our product line with superior paint and advanced roto tools that highly complement Mocha’s core strength in planar tracking and object removal. Rotoscoping, paint, digital makeup and stereo conversion are some of the most time-consuming, labor-intensive aspects of feature film post. Sharing technology and tools across all our products will make Silhouette even stronger as the leader in these tasks. Furthermore, we are very excited to be working with such an accomplished team [at DFT] and look forward to collaborating on new product offerings for photography, film and video.”

Silhouette founders Marco Paolini, Paul Miller and Peter Moyer will continue in their current leadership roles and partner with the Mocha product development team to collaborate on delivering next-generation tools. “By joining forces with Boris FX, we are not only dramatically expanding our team’s capabilities, but we are also joining a group of like-minded film industry pros to provide the best solutions and support to our customers,” says Paolini, product designer. “The Mocha planar tracking option we currently license is extremely popular with Silhouette paint and roto artists, and more recently, through OFX, we’ve added support for Sapphire plug-ins. Working together under the Boris FX umbrella is our next logical step, and we are excited to add new features and continue advancing Silhouette for our user base.”

Both Silhouette and the Digital Film Tools plug-ins will continue to be developed and sold under the Boris FX brand. Silhouette will adopt the Boris FX commitment to agile development, with annual releases, annual support and subscription options.

Main Image: Silhouette

3D Design + Motion Tour to visit 26 cities, 3D artists presenting

The 3D Design + Motion Tour will hit the road September 2 through December 10, making stops in 26 cities throughout North America and Europe. Artists who want to break into high-end 3D digital production will have the chance to learn from VIPs in the motion design and visual effects industry, including Andrew Kramer from Video Copilot, Nick Campbell from Greyscalegorilla, EJ Hassenfratz with Eyedesyn, Chris Schmidt with Rocket Lasso and Thanassis Pozantzis with Noseman. The tour is sponsored by Maxon, Adobe, Nvidia and Dell and produced by Future Media Conferences with partnership support from NAB Show.

At each 3D Design + Motion Tour stop, attendees will learn how to create a state-of-the-art production company vanity logo animation. Event organizers have tapped the creative team at Perception — the motion graphics design studio recognized for redesigning the Marvel Studios logo and for opening animation and visual effects on Black Panther, Ant-Man and the Wasp, Thor: Ragnarok and Batman v Superman: Dawn of Justice — to design and execute an exclusive project for the tour. Industry-leading guest artists will break down key project elements and share motion design techniques that highlight the integration and performance of Maxon’s Cinema 4D, Adobe’s Creative Cloud apps and the Redshift GPU-accelerated renderer.

The $95 cost to attend the 3D Design + Motion Tour includes software apps to help artists sharpen their 3D skillsets:
• Adobe CC 30-day license
• Cinema 4D 90-day license
• Redshift 90-day license
• Sketchfab 90-day license
• Project files and resources
• Inclusion in event drawings

Tour registration also includes networking with local artists and VIP presenters, access to tutorials and project files, and follow-up webinars. Space is limited. Registration details are here.

Additional details on the 3D Design + Motion Tour, including tour stops and guest presenters, are available here.

Roger and Big Machine merge, operate as Roger

Creative agency Roger and full-service production company Big Machine have merged — a move that will expand the creative capabilities for their respective agency, brand and entertainment clients. The studios will retain the Roger name and operate at Roger’s newly renovated facility in Los Angeles.

The combined management team includes CD Terence Lee, CD Dane Macbeth, EP Josh Libitsky, director Steve Petersen, CD Ken Carlson and Sean Owolo, who focuses on business development.

Roger now offers expanded talent and resources for projects that require branding, design, animation, VFX, VR/AR, live action and content development. Roger uses Adobe Creative Cloud for most of its workflows. The tools vary from project to project, but outside of the Adobe suite, the studio also uses Maxon Cinema 4D, Autodesk Maya, Blackmagic DaVinci Resolve and Foundry Nuke.

Since the merger, the studio is already embarking on a number of projects, including major creative campaigns for Disney and Sony Pictures.

Roger’s new 6,500-square-foot studio includes four private offices, three editing suites, two conference rooms, an empty shooting space for greenscreen work, a kitchen and a lounge.

Technicolor adds Patrick Smith, Steffen Wild to prepro studio

Technicolor has added Patrick Smith to head its visualization department, partnering with filmmakers to help them realize their vision in a digital environment before they hit the set. By helping clients define lensing, set dimensions, asset placement and even precise on-set camera moves, Smith and his team will play a vital role in helping clients plan their shoots in the virtual environment in ways that feel completely natural and intuitive to them. He reports to Kerry Shea, who heads Technicolor’s Pre-Production Studio.

“By enabling clients to leverage the latest visualization technologies and techniques while using hardware similar to what they are already familiar with, Patrick and his team will empower filmmakers by ensuring their creative visions are clearly defined at the very start of their projects — and remain at the heart of everything they do from their first day on set to take their stories to the next level,” explains Shea. “Bringing visualization and the other areas of preproduction together under one roof removes redundancy from the filmmaking process which, in turn, reduces stress on the storytellers and allows them as much time as possible to focus on telling their story. Until now, the process of preproduction has been a divided and inefficient process involving different vendors and repeated steps. Bringing those worlds together and making it a seamless, start-to-finish process is a game changer.”

Smith has held a number of senior positions within the industry, including most recently as creative director/senior visualization supervisor at The Third Floor. He has worked on titles such as Bumblebee, Avengers: Infinity War, Spider-Man: Homecoming, Guardians of the Galaxy Vol. 2 and The Secret Life of Walter Mitty.

“Visualization used to involve deciding roughly what you plan to do on set. Today, you can plan out precisely how to achieve your vision on set down to the inch – from the exact camera lens to use, to exactly how much dolly track you’ll need, to precisely where to place your actors,” he says. “Visualization should be viewed as the director’s paint brush. It’s through the process of visualization that directors can visually explore and design their characters and breathe life into their story. It’s a sandbox where they can experiment, play and perfect their vision before the pressure of being on set.”

In other Technicolor news, last week the studio announced that Steffen Wild has joined as head of its virtual production department. “As head of virtual production, Wild will help drive the studio’s approach to efficient filmmaking by bringing previously separate departments together into a single pipeline,” says Shea. “We currently see what used to be separate departments merge together. For example, previz, techviz and postviz, which were all separate ways to find answers to production questions, are now in the process of collaborating together in virtual production.”

Wild has over 20 years of experience, including 10 years spearheading Jim Henson’s Creature Shop’s expanding efforts in innovative animation technologies, virtual studio productions and new ways of visual storytelling. As SVP of digital puppetry and visual effects at the Creature Shop, Wild crafted new production techniques using proprietary game engine technologies. He brings with him in-depth knowledge of global and local VFX and animation production, rapid prototyping and cloud-based entertainment projects. In addition to his role in the development of next-generation cinematic technologies, he has set up VFX/animation studios in the US, China and southeast Europe.

Main Image: (L-R) Patrick Smith and Steffen Wild

An artist’s view of SIGGRAPH 2019

By Andy Brown

While I’ve been lucky enough to visit NAB and IBC several times over the years, this was my first SIGGRAPH. Of course, there are similarities. There are lots of booths, lots of demos, lots of branded T-shirts, lots of pairs of black jeans and a lot of beards. I fit right in. I know we’re not all the same, but we certainly looked like it. (The stats regarding women and diversity in VFX are pretty poor, but that’s another topic.)

Andy Brown

You spend your whole career in one industry and I guess you all start to look more and more like each other. That’s partly the problem for the people selling stuff at SIGGRAPH.

There were plenty of compositing demos from all sorts of software vendors. (Blackmagic was running a hands-on class for 20 people at a time.) I’m a Flame artist, so I think that Autodesk’s offering is best, obviously. Everyone’s compositing tool can play back large files and color correct, composite, edit, track and deliver, so in the midst of a buzzy trade show, the differences feel far fewer than the similarities.

Mocap
Take the world of tracking and motion capture as another example. There were more booths demonstrating tracking and motion capture than anything in the main hall, and all that tech came in different shapes and sizes and an interesting mix of hardware and software.

The motion capture solution required for a Hollywood movie isn’t the same as the one to create a live avatar on your phone, however. That’s where it gets interesting. There are solutions that can capture and translate the movement of everything from your fingers to your entire body using hardware from an iPhone X to a full 360-camera array. Some solutions used tracking ball markers, some used strips in the bodysuit and some used tiny proximity sensors, but the results were all really impressive.

Vicon

Vicon

Some tracking solution companies had different versions of their software and hardware. If you don’t need all of the cameras and all of the accuracy, then there’s a basic version for you. But if you need everything to be perfectly tracked in real time, then go for the full-on pro version with all the bells and whistles. I had a go at live-animating a monkey using just my hands, and apart from ending with him licking a banana in a highly inappropriate manner, I think it worked pretty well.

AR/VR
AR and VR were everywhere, too. You couldn’t throw a peanut across the room without hitting someone wearing a VR headset. They’d probably be able to bat it away whilst thinking they were Joe Root or Max Muncy (I had to Google him), with the real peanut being replaced with a red or white leather projectile. Haptic feedback made a few appearances, too, so expect to be able to feel those virtual objects very soon. Some of the biggest queues were at the North stand, where the company was showing glasses that looked like the ones everyone was already wearing (like mine, obviously), except that they incorporated a heads-up display. I have mixed feelings about this. Google Glass didn’t last very long for a reason, although I don’t think North’s glasses have a camera in them, which makes things feel a bit more comfortable.

Nvidia

Data
One of the central themes for me was data, data and even more data. Whether you are interested in how to capture it, store it, unravel it, play it back or distribute it, there was a stand for you. This mass of data was being managed by really intelligent components and software. I was expecting to be writing all about artificial intelligence and machine learning from the show, and it’s true that there was a lot of software that used machine learning and deep neural networks to create things that looked really cool. Environments created using simple tools looked fabulously realistic because of deep learning. Basic pen strokes could be translated into beautiful pictures because of the power of neural networks. But most of that machine learning is in the background; it’s just doing the work that needs to be done to create the images, lighting and physical reactions that go to make up convincing and realistic images.

The Experience Hall
The Experience Hall was really great because no one was trying to sell me anything. It felt much more like an art gallery than a trade show. There were long waits for some of the exhibits (although not for the golf swing improver that I tried), and it was all really fascinating. I didn’t want to take part in the experiment that recorded your retina scan and made some art out of it, because, well, you know, it’s my retina scan. I also felt a little reluctant to check out the booth that made light-based animated artwork derived from your date of birth, time of birth and location of birth. But maybe all of these worries are because I’ve just finished watching the Netflix documentary The Great Hack. I can’t help but think that a better source of the data might be something a little less sinister.

The walls of posters back in the main hall described research projects that hadn’t yet made it into full production and gave more insight into what the future might bring. It was all about refinement, creating better algorithms, creating more realistic results. These uses of deep learning and virtual reality were applied to subjects as diverse as translating verbal descriptions into character design, virtual reality therapy for post-stroke patients, relighting portraits and haptic feedback anesthesia training for dental students. The range of the projects was wide. Yet everyone started from the same place, analyzing vast datasets to give more useful results. That brings me back to where I started. We’re all the same, but we’re all different.

Main Image Credit: Mike Tosti


Andy Brown is a Flame artist and creative director of Jogger Studios, a visual effects studio with offices in Los Angeles, New York, San Francisco and London.

Autodesk intros Bifrost for Maya at SIGGRAPH

At SIGGRAPH, Autodesk announced a new visual programming environment in Maya called Bifrost, which makes it possible for 3D artists and technical directors to create serious effects quickly and easily.

“Bifrost for Maya represents a major development milestone for Autodesk, giving artists powerful tools for building feature-quality VFX quickly,” says Chris Vienneau, senior director, Maya and Media & Entertainment Collection. “With visual programming at its core, Bifrost makes it possible for TDs to build custom effects that are reusable across shows. We’re also rolling out an array of ready-to-use graphs to make it easy for artists to get 90% of the way to a finished effect fast. Ultimately, we hope Bifrost empowers Maya artists to streamline the creation of anything from smoke, fire and fuzz to high-performance particle systems.”

Bifrost highlights include:

  • Ready-to-Use Graphs: Artists can quickly create state-of-the-art effects that meet today’s quality demands.
  • One Graph: In a single visual programming graph, users can combine nodes ranging from math operations to simulations.
  • Realistic Previews: Artists can see exactly how effects will look after lighting and rendering right in the Arnold Viewport in Maya.
  • Detailed Smoke, Fire and Explosions: New physically-based solvers for aerodynamics and combustion make it easy to create natural-looking fire effects.
  • The Material Point Method: The new MPM solver helps artists tackle realistic granular, cloth and fiber simulations.
  • High-Performance Particle System: A new particle system crafted entirely using visual programming adds power and scalability to particle workflows in Maya.
  • Artistic Effects with Volumes: Bifrost comes loaded with nodes that help artists convert between meshes, points and volumes to create artistic effects.
  • Flexible Instancing: High-performance, rendering-friendly instancing empowers users to create enormous complexity in their scenes.
  • Detailed Hair, Fur and Fuzz: Artists can now model things consisting of multiple fibers (or strands) procedurally.

Bifrost is available for download now and works with any version of Maya 2018 or later. It will also be included in the installer for Maya 2019.2 and later versions. Updates to Bifrost between Maya releases will be available for download from Autodesk AREA.

In addition to the release of Bifrost, Autodesk highlighted the latest versions of Shotgun, Arnold, Flame and 3ds Max. The company gave a tech preview of a new secure enterprise Shotgun that supports network segregation and customer-managed media isolation on AWS, making it possible for the largest studios to collaborate in a closed-network pipeline in the cloud. Shotgun Create, now out of beta, delivers a cloud-connected desktop experience, making it easier for artists and reviewers to see which tasks demand attention while providing a collaborative environment to review media and exchange feedback accurately and efficiently. Arnold 5.4 adds important updates to the GPU renderer, including OSL and OpenVDB support, while Flame 2020.1 introduces more uses of AI with new Sky Extraction tools and specialized image segmentation features. Also on display, the 3ds Max 2020.1 update features modernized procedural tools for 3D modeling.

Maxon intros Cinema 4D R21, consolidates versions into one offering

By Brady Betzel

At SIGGRAPH 2019, Maxon introduced the next release of its graphics software, Cinema 4D R21. Maxon also announced a subscription-based pricing structure as well as a very welcome consolidation of its Cinema 4D versions into a single version, aptly titled Cinema 4D.

That’s right, no more Studio, Broadcast or BodyPaint. It all comes in one package at one price, and that pricing will now be subscription-based — but don’t worry, the online anxiety over this change seems to have been misplaced.

The cost of Cinema 4D R21 has been reduced substantially, leading the way for what Maxon is calling its “3D for the Real World” initiative. Maxon wants Cinema 4D to be the tool you choose for your graphics needs.

If you plan on upgrading every year or two, the new subscription-based model seems to be a great deal:

– Cinema 4D subscription paid annually: $59.99/month
– Cinema 4D subscription paid monthly: $94.99/month
– Cinema 4D subscription with Redshift paid annually: $81.99/month
– Cinema 4D subscription with Redshift paid monthly: $116.99/month
– Cinema 4D perpetual pricing: $3,495 (upgradeable)
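
To put those numbers in context, here is a quick back-of-the-envelope comparison using the prices listed above (first-year totals only; it ignores any upgrade or crossgrade deals Maxon may offer):

```python
# Quick comparison of first-year cost under the pricing listed above.
annual_plan_monthly = 59.99   # subscription paid annually, per month
monthly_plan = 94.99          # subscription paid month-to-month
perpetual = 3495.00           # one-time perpetual license

year_on_annual = 12 * annual_plan_monthly   # ~720/year
year_on_monthly = 12 * monthly_plan         # ~1140/year

print(f"Annual plan, first year:  ${year_on_annual:.2f}")
print(f"Monthly plan, first year: ${year_on_monthly:.2f}")
# Years of the annual subscription before you match the perpetual price:
print(f"Break-even vs perpetual:  {perpetual / year_on_annual:.1f} years")
```

In other words, paying annually saves about $420 a year over the monthly plan, and the perpetual license only pays for itself if you would otherwise subscribe for roughly five years without upgrading.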

Maxon did mention that if you have previously purchased Cinema 4D, there will be subscription-based upgrade/crossgrade deals coming.

The Updates
Cinema 4D R21 includes some great updates that will be welcomed by many users, both new and experienced. The new Field Force dynamics object allows the use of dynamic forces in modeling and animation within the MoGraph toolset. Caps and bevels have an all-new system that not only allows the extrusion of 3D logos and text effects but also means caps and bevels are integrated on all spline-based objects.

Furthering Cinema 4D’s integration with third-party apps, there is an all-new Mixamo Control rig allowing you to easily control any Mixamo characters. (If you haven’t checked out the models from Mixamo, you should. It’s a great way to find character rigs fast.)

An all-new Intel Open Image Denoise integration has been added to R21 in what seems like part of a rendering revolution for Cinema 4D. From the acquisition of Redshift to this integration, Maxon is expanding its third-party reach and doesn’t seem scared.

There is a new Node Space, which shows what materials are compatible with chosen render engines, as well as a new API available to third-party developers that allows them to integrate render engines with the new material node system. R21 has overall speed and efficiency improvements, with Cinema 4D supporting the latest processor optimizations from both Intel and AMD.

All this being said, my favorite update — or map toward the future — was actually announced last week. Unreal Engine added Cinema 4D .c4d file support via the Datasmith plugin, which is featured in the free Unreal Studio beta.

Today, Maxon is also announcing its integration with yet another game engine: Unity. In my opinion, the future lies in this mix of real-time rendering alongside real-world television and film production as well as gaming. With Cinema 4D, Maxon is bringing all sides to the table with a mix of 3D modeling, motion-graphics-building support, motion tracking, integration with third-party apps like Adobe After Effects via Cineware, and now integration with real-time game engines like Unreal Engine. Now I just have to learn it all.

Cinema 4D R21 will be available on both macOS and Windows on Tuesday, Sept. 3. In the meantime, watch out for some great SIGGRAPH presentations, including one from my favorite, Mike Winkelmann, better known as Beeple. You can find some past presentations on how he uses Cinema 4D to create his “Everydays.”

Virtual Production Field Guide: Fox VFX Lab’s Glenn Derry

Just ahead of SIGGRAPH, Epic Games has published a resource guide called “The Virtual Production Field Guide” — a comprehensive look at how virtual production impacts filmmakers, from directors to the art department to stunt coordinators to VFX teams and more. The guide is workflow-agnostic.

The use of realtime game engine technology has the potential to impact every aspect of traditional filmmaking, and the trend is increasingly being used in productions ranging from films like Avengers: Endgame and the upcoming Artemis Fowl to TV series like Game of Thrones.

The Virtual Production Field Guide offers an in-depth look at different types of techniques from creating and integrating high-quality CG elements live on set to virtual location scouting to using photoreal LED walls for in-camera VFX. It provides firsthand insights from award-winning professionals who have used these techniques – including directors Kenneth Branagh and Wes Ball, producers Connie Kennedy and Ryan Stafford, cinematographers Bill Pope and Haris Zambarloukos, VFX supervisors Ben Grossmann and Sam Nicholson, virtual production supervisors Kaya Jabar and Glenn Derry, editor Dan Lebental, previs supervisor Felix Jorge, stunt coordinators Guy and Harrison Norris, production designer Alex McDowell, and grip Kim Heath.

As mentioned, the guide is dense with information, so we decided to run an excerpt to give you an idea of what it covers.

Glenn Derry

Here is an interview with Glenn Derry, founder and VP of visual effects at Fox VFX Lab, which offers a variety of virtual production services with a focus on performance capture. Derry is known for his work as a virtual production supervisor on projects like Avatar, Real Steel and The Jungle Book.

Let’s find out more.

How has performance capture evolved since projects such as The Polar Express?
In those earlier eras, there was no realtime visualization during capture. You captured everything as a standalone piece, and then you did what they called the director layout. After the fact, you would assemble the animation sequences from the motion data captured. Today, we’ve got a combo platter where we’re able to visualize in realtime.

When we bring a cinematographer in, he can start lining up shots with another device called the hybrid camera. It’s a tracked reference camera that he can hand-hold. I can immediately toggle between an Unreal overview or a camera view of that scene. The earlier process was minimal in terms of aesthetics. We did everything we could in MotionBuilder, and we made it look as good as it could. Now we can make a lot more mission-critical decisions earlier in the process because the aesthetics of the renders look a lot better.

What are some additional uses for performance capture?
Sometimes we’re working with a pitch piece, where the studio is deciding whether they want to make a movie at all. We use the capture stage to generate what the director has in mind tonally and how the project could feel. We could do either a short little pitch piece or, for something like Call of the Wild, we created 20 minutes and three key scenes from the film to show the studio we could make it work.

The second the movie gets greenlit, we flip over into preproduction. Now we’re breaking down the full script and working with the art department to create concept art. Then we build the movie’s world out around those concepts.

We have our team doing environmental builds based on sketches. Or in some cases, the concept artists themselves are in Unreal Engine doing the environments. Then our virtual art department (VAD) cleans those up and optimizes them for realtime.

Are the artists modeling directly in Unreal Engine?
The artists model in Maya, Modo, 3ds Max, etc. — we’re not particular about the application as long as the output is FBX. The look development, which is where the texturing happens, is all done within Unreal. We’ll also have artists working in Substance Painter and it will auto-update in Unreal. We have to keep track of assets through the entire process, all the way through to the last visual effects vendor.

How do you handle the level of detail decimation so realtime assets can be reused for visual effects?
The same way we would work on AAA games. We begin with high-resolution detail and then use combinations of texture maps, normal maps and bump maps. That allows us to get high-texture detail without a huge polygon count. There are also some amazing LOD [level of detail] tools built into Unreal, which enable us to take a high-resolution asset and derive something that looks pretty much identical unless you’re right next to it, but runs at a much higher frame rate.
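
The distance-driven switching Derry describes can be sketched in a few lines. This is a hypothetical illustration, not Unreal's actual LOD API — the thresholds and triangle counts below are invented:

```python
# Hypothetical sketch of engine-style LOD selection: as an asset recedes
# from the camera, swap in progressively cheaper meshes, letting normal
# and bump maps carry the fine detail at a fraction of the polygon cost.

# (triangle_count, max_distance) pairs, nearest/most detailed first.
LODS = [
    (200_000, 5.0),          # hero detail, only right next to the camera
    (50_000, 20.0),          # mid detail; texture maps do the heavy lifting
    (5_000, float("inf")),   # distant silhouette
]

def pick_lod(distance):
    """Return the triangle budget for a given camera distance."""
    for tris, max_dist in LODS:
        if distance <= max_dist:
            return tris
    return LODS[-1][0]

for d in (2.0, 10.0, 100.0):
    print(f"distance {d:>5}: render {pick_lod(d):,} triangles")
```

Real engines refine this by measuring projected screen size rather than raw distance, and by blending between levels to hide the pop, but the budget-per-distance trade-off is the same.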

Do you find there’s a learning curve for crew members more accustomed to traditional production?
We’re the team productions come to do realtime on live-action sets. That’s pretty much all we do. That said, it requires prep, and if you want it to look great, you have to make decisions. If you were going to shoot rear projection back in the 1940s or Terminator 2 with large rear projection systems, you still had to have all that material pre-shot to make it work.
It’s the same concept in realtime virtual production. If you want to see it look great in Unreal live on the day, you can’t just show up and decide. You have to pre-build that world and figure out how it’s going to integrate.

The visual effects team and the virtual production team have to be involved from day one. They can’t just be brought in at the last minute. And that’s a significant change for producers and productions in general. It’s not that it’s a tough pill to swallow; it’s just a very different methodology.

How does the cinematographer collaborate with performance capture?
There are two schools of thought: one is to work live with camera operators, shooting the tangible part of the action as it happens, because the camera is as much an actor in the scene as any of the people are. You can choreograph it all out live if you’ve got the performers and the suits. The other is to treat it more like a stage play and come back to do all the camera coverage later. I’ve seen DPs like Bill Pope and Caleb Deschanel pick this right up.

How is the experience for actors working in suits and a capture volume?
One of the harder problems we deal with is eye lines. How do we keep the actors immersed so that they’re not just looking around at a bunch of gray-box material on a set? On any modern visual effects movie, you’re going to be standing in front of a 50-foot-tall bluescreen at some point.

Performance capture is in some ways more actor-centric than a traditional set because a volume doesn’t have all the other distractions, such as complex lighting and camera setup time. The director gets to focus in on the actors. The challenge is getting the actors to interact with something unseen. We’ll project pieces of the set on the walls and use lasers for eye lines. The quality of today’s HMDs [head-mounted displays] is also excellent for showing the actors what they would be seeing.

How do you see performance capture tools evolving?
I think a lot of the stuff we’re prototyping today will soon be available to consumers, home content creators, YouTubers, etc. A lot of what Epic develops also gets released in the engine. Money won’t be the driver in terms of being able to use the tools; your creative vision will be.

My teenage son uses Unreal Engine to storyboard. He knows how to do fly-throughs and use the little camera tools we built — he’s all over it. As it becomes easier to create photorealistic visual effects in realtime with a smaller team and at very high fidelity, the movie business will change dramatically.

Something that used to cost $10 million to produce might be a million or less. It’s not going to take away from artists; you still need them. But you won’t necessarily need these behemoth post companies because you’ll be able to do a lot more yourself. It’s just like desktop video — what used to take hundreds of thousands of dollars’ worth of Flame artists, you can now do yourself in After Effects.

Do you see new opportunities arising as a result of this democratization?
Yes, there are a lot of opportunities. High-quality, good-looking CG assets are still expensive to produce and expensive to make look great. There are already stock sites like TurboSquid and CGTrader where you can purchase beautiful assets economically.

But with the final assembly and coalescing of environments and characters there’s still a lot of need for talented people to do it effectively. I can see companies emerging out of that necessity. We spend a lot of time talking about assets because it’s the core of everything we do. You need to have a set to shoot on and you need compelling characters, which is why actors won’t go away.

What’s happening today isn’t even the tip of the iceberg. There are going to be 50 more big technological breakthroughs along the way. There’s tons of new content being created for Apple, Netflix, Amazon, Disney+, etc. And they’re all going to leverage virtual production.
What’s changing is the role and methodology of previs in the overall scheme of production. Previs was previously conceived of as focused on the pre-production phase of a project and less integral to production itself; that conception shifts with a realtime engine. Previs has also typically been a hands-off collaboration: in a traditional pipeline, a previs artist receives creative notes and art direction, goes off to create animation, and presents it back to the creatives later for feedback.

In the realtime model, because the assets are directly malleable and rendering time is not a limiting factor, creatives can be much more directly and interactively involved in the process. This leads to higher levels of agency and creative satisfaction for all involved. This also means that instead of working with just a supervisor you might be interacting with the director, editor and cinematographer to design sequences and shots earlier in the project. They’re often right in the room with you as you edit the previs sequence and watch the results together in realtime.

Previs imagery has continued to increase in visual fidelity. This creates a closer relationship between previs and final-pixel image quality. When the assets you develop as a previs artist are of sufficient quality, they may form the basis of the final models for visual effects. The line between previs and final pixel will continue to blur.

The efficiency of modeling assets only once is evident to all involved. By spending the time early in the project to create models of a very high quality, post begins at the outset of a project. Instead of waiting until the final phase of post to deliver the higher-quality models, the production has those assets from the beginning. And the models can also be fed into ancillary areas such as marketing, games, toys and more.