

GTC: GPUs power The Mandalorian’s in-camera VFX, realtime workflows

By Mike McCarthy

Each year, Nvidia hosts a series of conferences that focus on new developments in GPU-based computing. Originally, these were about graphics and visualization, which were the most advanced things being done with GPUs. Now they focus on everything from supercomputing and AI to self-driving cars and VR. The first GTC conference I attended was in 2016, when Nvidia announced its Pascal architecture. While that announcement was aimed at supercomputer users, there was still a lot of graphics-based content to explore, especially with VR.

Over time, the focus has shifted from visual applications to AI applications that aren’t necessarily graphics-based; they just have parallel computing requirements similar to graphics processing, which makes them ideal tasks to accelerate on GPU hardware. This has made GTC more relevant to programmers and similar users, but the hardware developments that enable those capabilities also accelerate the more traditional graphics workflows, and new ways of using that power are constantly being developed.

I was looking forward to attending March’s GTC to hear the details of what was expected to be the announcement of Nvidia’s next generation of hardware architecture, and to see the many other presentations about how others have been using current GPU technology. Then came the coronavirus, and the world changed. Nvidia canceled the online keynote; a few SDK updates were released, but all major product announcements were deferred for the time being. What Nvidia did offer was a selection of talks and seminars that were remotely recorded and hosted as videos to watch. These are available to anyone who registers for the free online version of GTC, instead of paying the hundreds of dollars it would cost to attend in person.

One that really stood out to me was “Creating In-Camera VFX with Realtime Workflows.” It highlighted the Unreal Engine and what that technology allowed on The Mandalorian — it was amazing. The basic premise is to replace greenscreen composites with VFX projections behind the elements being photographed. This was done years ago for exteriors of in-car scenes using flat prerecorded footage, but technology has progressed dramatically since then. The main advances are in motion capture, 3D rendering and LED walls.

From the physical standpoint, LED video walls have greater brightness, allowing them not only to match the lit foreground subjects, but to light those subjects for accurate shadows and reflections without post compositing. And if that background imagery can be generated in real time — instead of recordings or renders — it can respond to the movement of the camera as well. That is where Unreal comes in — as a 3D game rendering engine that is repurposed to generate images corrected for the camera’s perspective in order to project on the background. This allows live-action actors to be recorded in complex CGI environments as if they were real locations. Actors can see the CGI elements they are interacting with, and the crew can see it all working together in real time without having to imagine how it’s going to look after VFX. We looked at using this technology on the last film I worked on, but it wasn’t quite there yet at the scale we needed; we used greenscreens instead, but it looks like this use of the technology has arrived. And Nvidia should be happy, because it takes a lot more GPU power to render the whole environment in real time than it does to render just what the camera sees after filming. But the power is clearly available, and even more is coming.
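To make the camera-correction idea concrete, one common way to handle a flat LED panel is the generalized off-axis projection long used for CAVE-style tracked displays: compute an asymmetric frustum from the tracked camera position and the panel’s corners. The sketch below is a minimal illustration of that math in Python; the function name and NumPy setup are mine, not Unreal’s or ILM’s actual tooling.

```python
# A minimal sketch of perspective-corrected rendering for one flat LED panel,
# using the well-known generalized (off-axis) perspective projection. The
# function name and NumPy setup are illustrative, not Unreal's or ILM's API.
import numpy as np

def off_axis_projection(eye, pa, pb, pc, near=0.1, far=1000.0):
    """Build a view-projection matrix for a tracked camera position ('eye')
    and a flat screen defined by three world-space corners:
    pa = lower-left, pb = lower-right, pc = upper-left."""
    eye, pa, pb, pc = (np.asarray(p, dtype=float) for p in (eye, pa, pb, pc))

    # Orthonormal basis of the screen plane.
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal, toward the camera

    # Vectors from the camera to the screen corners, and distance to the plane.
    va, vb, vc = pa - eye, pb - eye, pc - eye
    dist = -np.dot(va, vn)

    # Frustum extents on the near plane (asymmetric, because the camera is
    # generally not centered on the screen).
    left   = np.dot(vr, va) * near / dist
    right  = np.dot(vr, vb) * near / dist
    bottom = np.dot(vu, va) * near / dist
    top    = np.dot(vu, vc) * near / dist

    # Standard OpenGL-style off-axis frustum matrix.
    proj = np.array([
        [2*near/(right-left), 0, (right+left)/(right-left), 0],
        [0, 2*near/(top-bottom), (top+bottom)/(top-bottom), 0],
        [0, 0, -(far+near)/(far-near), -2*far*near/(far-near)],
        [0, 0, -1, 0]])

    # Rotate the world into screen-aligned space, then move the camera to the origin.
    rot = np.eye(4); rot[:3, :3] = np.vstack([vr, vu, vn])
    trans = np.eye(4); trans[:3, 3] = -eye
    return proj @ rot @ trans
```

Each new tracked camera position fed into a function like this yields the matrix used to redraw the wall’s background for that frame, which is what keeps the parallax, shadows and reflections holding up on camera.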

While no new Nvidia technology has been announced, something is always right around the corner. The current Turing generation of GPUs, which has been available for over 18 months, brought dedicated RT cores for realtime raytracing. The coming generation is expected to scale up the number of CUDA cores and the amount of memory by using smaller transistors than Turing’s 12nm process. This should offer more processing power for less money, which is always a welcome development.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

ILM’s virtual production platform used on The Mandalorian

To bring The Mandalorian to life, Industrial Light & Magic (ILM) and Epic Games — along with production technology partners Fuse, Lux Machina, Profile Studios, Nvidia and ARRI — have introduced a new way to shoot VFX-heavy projects in collaboration with Jon Favreau’s Golem Creations.

The new virtual production workflow allows filmmakers to capture a significant amount of complex visual effects shots in-camera using realtime game engine technology (Epic’s Unreal Engine) and LED screens to represent dynamic photoreal digital landscapes and sets with creative flexibility previously unimaginable.

Also part of the news, ILM has made its new end-to-end virtual production solution, ILM StageCraft, available for use by filmmakers, agencies and showrunners worldwide.

Over 50 percent of The Mandalorian Season 1 was filmed using this new methodology, eliminating the need for location shoots entirely. Instead, the actors performed inside an immersive and massive LED volume: a 20-foot-high, 270-degree semicircular video wall and ceiling enclosing a 75-foot-diameter performance space, where practical set pieces were combined with digital extensions on the screens.
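For a sense of scale, some rough arithmetic from those published figures (assuming the wall stands at the edge of the 75-foot circle, and excluding the ceiling) puts the curved wall at roughly 177 feet long and around 3,500 square feet of LED surface:

```python
# Rough arithmetic from the published stage figures; assumes the wall sits
# at the edge of the 75-foot-diameter performance space (ceiling excluded).
import math

diameter_ft = 75.0
height_ft = 20.0
arc_degrees = 270.0

arc_length_ft = math.pi * diameter_ft * (arc_degrees / 360.0)  # ~176.7 ft of curved wall
wall_area_sqft = arc_length_ft * height_ft                     # ~3,534 sq ft of LEDs

print(f"wall length ~ {arc_length_ft:.0f} ft, wall area ~ {wall_area_sqft:.0f} sq ft")
```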

Digital 3D environments created by ILM played back interactively on the LED walls, edited in realtime during the shoot, which allowed for pixel-accurate tracking and perspective-correct 3D imagery rendered at high resolution via systems powered by Nvidia GPUs.

L-R: Jon Favreau and Richard Bluff

The environments were lit and rendered from the perspective of the camera to provide parallax in real time, as if the camera were really capturing the physical environment, with accurate interactive light on the actors and practical sets. This gave showrunner Favreau, executive producer/director Dave Filoni, visual effects supervisor Richard Bluff, cinematographers Greig Fraser and Barry “Baz” Idoine, and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and to achieve realtime in-camera composites on set.
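In practice, that camera-driven rendering amounts to a tight per-frame loop: read the tracked camera pose, rebuild the projection for the wall, and redraw the environment before the next frame. The sketch below assumes hypothetical tracker and renderer objects and reuses the off_axis_projection function sketched earlier; it is only an illustration of the loop, not ILM StageCraft’s or Profile Studios’ actual interface.

```python
# Hypothetical per-frame loop; 'tracker' and 'renderer' are stand-ins for a
# stage's motion-capture and engine interfaces, not ILM StageCraft's real API.
import time
import numpy as np

def run_wall(tracker, renderer, wall_corners, fps=24.0):
    """Each frame: read the tracked camera position, rebuild the off-axis
    projection for the LED wall, and redraw the environment with it."""
    pa, pb, pc = wall_corners                        # lower-left, lower-right, upper-left corners
    frame_time = 1.0 / fps
    while True:
        start = time.monotonic()
        eye = np.asarray(tracker.camera_position())  # world-space lens position (assumed method)
        view_proj = off_axis_projection(eye, pa, pb, pc)  # from the earlier sketch
        renderer.draw_environment(view_proj)         # perspective-correct background for this frame
        # Hold the target frame rate so wall playback stays in step with the camera.
        time.sleep(max(0.0, frame_time - (time.monotonic() - start)))
```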

The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of all the partners involved.

“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of realtime, in-camera rendering,” explains Favreau, adding, “We are proud of what was achieved and feel that the system we built was the most efficient way to bring The Mandalorian to life.”

“Merging our efforts in the space with what Jon Favreau has been working toward using virtual reality and game engine technology in his filmmaking finally gave us the chance to execute the vision,” says Rob Bredow, executive creative director and head of ILM. “StageCraft has grown out of the culmination of over a decade of innovation in the virtual production space at ILM. Seeing our digital sets fully integrated, in real time on stage, providing the kind of in-camera shots we’ve always dreamed of while also providing the majority of the lighting was really a dream come true.”

Bluff adds, “Working with Kim Libreri and his Unreal team, Golem Creations, and the ILM StageCraft team has opened new avenues to both the filmmakers and my fellow key creatives on The Mandalorian, allowing us to shoot principal photography on photoreal, virtual sets that are indistinguishable from their physical counterparts while incorporating physical set pieces and props as needed for interaction. It’s truly a game-changer.”

ILM StageCraft’s production tools combine traditional filmmaking equipment and methodologies with all of the advantages of a fully digital workflow. With ILM StageCraft, a production can acquire many in-camera finals, giving filmmakers immediate and complete creative control over work that is typically handed off and reinterpreted in post, improving the quality of visual effects shots with perfectly integrated elements, and reducing visual effects requirements in post, which is a major benefit considering today’s compressed schedules.
