

GTC: GPUs power The Mandalorian's in-camera VFX, realtime workflows

By Mike McCarthy

Each year, Nvidia hosts a series of conferences that focus on new developments in GPU-based computing. Originally, these were about graphics and visualization, which were the most advanced things being done with GPUs. Now they cover everything from supercomputing and AI to self-driving cars and VR. The first GTC conference I attended was in 2016, when Nvidia announced its Pascal architecture. While that announcement was targeted at supercomputing users, there was still a lot of graphics-based content to explore, especially with VR.

Over time, the focus has shifted from visual applications to AI applications that aren't necessarily graphics-based; they simply have parallel computing requirements similar to graphics processing and are well suited to acceleration on GPU hardware. This has made GTC more relevant to programmers and similar users, but the hardware developments that enable those capabilities also accelerate the more traditional graphics workflows — and new ways of using that power are constantly being developed.

I was looking forward to attending March's GTC to hear the details of what was expected to be the announcement of Nvidia's next generation of hardware architecture, and to see all of the other presentations about how others have been using current GPU technology. Then came the coronavirus, and the world changed. Nvidia canceled the online keynote; a few SDK updates were released, but all major product announcements have been deferred for the time being. What Nvidia did offer was a selection of talks and seminars that were remotely recorded and hosted as videos to watch. These are available to anyone who registers for the free online version of GTC, instead of paying the hundreds of dollars it would cost to attend in person.

One that really stood out to me was "Creating In-Camera VFX with Realtime Workflows." It highlighted Unreal Engine and what that technology made possible on The Mandalorian — and it was amazing. The basic premise is to replace greenscreen composites with VFX imagery projected behind the elements being photographed. This was done years ago for the exteriors of in-car scenes using flat prerecorded footage, but the technology has progressed dramatically since then. The main advances are in motion capture, 3D rendering and LED walls.

From the physical standpoint, LED video walls now have enough brightness not only to match the lit foreground subjects, but to light those subjects, giving accurate shadows and reflections without post compositing. And if that background imagery can be generated in real time — instead of played back from recordings or renders — it can respond to the movement of the camera as well. That is where Unreal comes in: a 3D game rendering engine repurposed to generate images corrected for the camera's perspective and projected onto the background. This allows live-action actors to be recorded in complex CGI environments as if they were real locations. Actors can see the CGI elements they are interacting with, and the crew can see it all working together in real time without having to imagine how it will look after VFX.

We looked at using this technology on the last film I worked on, but it wasn't quite there yet at the scale we needed, so we used greenscreens instead; it looks like this use of the technology has now arrived. And Nvidia should be happy, because it takes a lot more GPU power to render the whole environment in real time than it does to render just what the camera sees after filming. But the power is clearly available, and even more is coming.
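To make the perspective-correction idea concrete, here is a minimal sketch of the kind of math such a system applies. It is not Unreal's implementation and was not shown in the talk; it assumes a flat LED wall and a tracked camera position, and builds the off-axis (asymmetric) projection that makes the wall display exactly what the camera should see through it, following the well-known generalized perspective projection formulation. The function name, corner layout and near/far values are illustrative.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def wall_projection(eye, pa, pb, pc, near=0.1, far=1000.0):
    """Off-axis projection and view matrices for a flat LED wall.

    eye        -- tracked camera position in world space
    pa, pb, pc -- wall corners: lower-left, lower-right, upper-left
    Returns (projection, view) as 4x4 OpenGL-style matrices.
    """
    # Orthonormal basis of the wall plane.
    vr = normalize(pb - pa)           # right along the wall
    vu = normalize(pc - pa)           # up along the wall
    vn = normalize(np.cross(vr, vu))  # wall normal, pointing toward the camera

    # Vectors from the eye to the wall corners, and distance to the wall plane.
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)

    # Frustum extents, scaled onto the near plane.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    proj = np.array([
        [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
        [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,          -1.0,                    0.0],
    ])

    # Rotate the world into the wall's basis, then shift it by the camera position.
    rot = np.identity(4)
    rot[0, :3], rot[1, :3], rot[2, :3] = vr, vu, vn
    trans = np.identity(4)
    trans[:3, 3] = -eye

    return proj, rot @ trans

# Hypothetical setup: a 6m x 3m wall in the z=0 plane, camera 4m back and off-center.
proj, view = wall_projection(eye=np.array([0.5, 1.7, 4.0]),
                             pa=np.array([-3.0, 0.0, 0.0]),
                             pb=np.array([ 3.0, 0.0, 0.0]),
                             pc=np.array([-3.0, 3.0, 0.0]))
```

Recomputing this frustum every frame from the live camera track is what keeps the background parallax correct as the camera moves.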

While no new Nvidia technology has been announced, something is always right around the corner. The current Turing generation of GPUs, which has been available for over 18 months, brought dedicated RT Cores for realtime raytracing. The coming generation is expected to scale up the number of CUDA cores and the amount of memory by using smaller transistors than Turing's 12nm process. This should offer more processing power for less money, which is always a welcome development.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Technicolor adds Patrick Smith, Steffen Wild to prepro studio

Technicolor has added Patrick Smith to head its visualization department, partnering with filmmakers to help them realize their vision in a digital environment before they hit the set. By helping clients define lensing, set dimensions, asset placement and even precise on-set camera moves, Smith and his team will play a vital role in planning shoots in a virtual environment in ways that feel natural and intuitive. He reports to Kerry Shea, who heads Technicolor's Pre-Production Studio.

“By enabling clients to leverage the latest visualization technologies and techniques while using hardware similar to what they are already familiar with, Patrick and his team will empower filmmakers by ensuring their creative visions are clearly defined at the very start of their projects — and remain at the heart of everything they do from their first day on set to take their stories to the next level,” explains Shea. “Bringing visualization and the other areas of preproduction together under one roof removes redundancy from the filmmaking process which, in turn, reduces stress on the storytellers and allows them as much time as possible to focus on telling their story. Until now, the process of preproduction has been a divided and inefficient process involving different vendors and repeated steps. Bringing those worlds together and making it a seamless, start-to-finish process is a game changer.”

Smith has held a number of senior positions within the industry, including most recently as creative director/senior visualization supervisor at The Third Floor. He has worked on titles such as Bumblebee, Avengers: Infinity War, Spider-Man: Homecoming, Guardians of the Galaxy Vol. 2 and The Secret Life of Walter Mitty.

“Visualization used to involve deciding roughly what you plan to do on set. Today, you can plan out precisely how to achieve your vision on set down to the inch – from the exact camera lens to use, to exactly how much dolly track you’ll need, to precisely where to place your actors,” he says. “Visualization should be viewed as the director’s paint brush. It’s through the process of visualization that directors can visually explore and design their characters and breathe life into their story. It’s a sandbox where they can experiment, play and perfect their vision before the pressure of being on set.”

In other Technicolor news, last week the studio announced that Steffen Wild has joined as head of its virtual production department. “As head of virtual production, Wild will help drive the studio’s approach to efficient filmmaking by bringing previously separate departments together into a single pipeline,” says Shea. “We currently see what used to be separate departments merge together. For example, previz, techviz and postviz, which were all separate ways to find answers to production questions, are now in the process of collaborating together in virtual production.”

Wild has over 20 years of experience, including 10 years spearheading Jim Henson's Creature Shop's expanding efforts in innovative animation technologies, virtual studio productions and new ways of visual storytelling. As SVP of digital puppetry and visual effects at the Creature Shop, Wild crafted new production techniques using proprietary game engine technologies. He brings in-depth knowledge of global and local VFX and animation production, rapid prototyping and cloud-based entertainment projects. In addition to his role in the development of next-generation cinematic technologies, he has set up VFX/animation studios in the US, China and Southeast Europe.

Main Image: (L-R) Patrick Smith and Steffen Wild
