
Battlesuit: Creating a pilot remotely, in realtime with Unreal Engine

By Randi Altman

Filmmaker Hasraf “HaZ” Dulull, who began his career as a visual effects artist and VFX supervisor, enjoys working on sci-fi projects. His reel is full of them, including his own films, such as The Beyond and 2036: Origin Unknown. Even the Disney show he directed, Fast Layne, had a futuristic aspect, thanks to a sophisticated talking car. He is even offering a master class on science-fiction filmmaking.

Hasraf “HaZ” Dulull

So it’s not surprising that Dulull’s most recent project also focuses on this genre. Dulull recently completed Battlesuit, a proof-of-concept pilot episode for a sci-fi animated series called The Theory, based on the graphic novel of the same name and in development at comic book publisher TPub Comics. The story centers on an astro-archaeologist who travels the universe looking for answers to help save humanity. In this particular story, she discovers the remains of a mech robot whose last memory will reveal what happened to the planet’s civilization.

Dulull used Epic’s realtime Unreal Engine to produce the pilot cost-effectively and remotely — just before COVID-19 hit. The series is currently in development and being shopped around to different networks with Dulull attached as director and executive producer. You can see the proof of concept here.

OK, let’s find out more from Dulull.

How early did you get involved in Battlesuit?
Neil Gibson, who is the creator of the graphic novel “The Theory,” reached out to me last October for a general coffee chat. He was a fan of my previous work and wanted to get some advice about moving TPub Comics’ IP into the world of film and TV. He gave me some graphic novels that he thought I would like, and “The Theory” was one of them. There was one story within the book that stood out for me — Battlesuit. We caught up once again, and Neil mentioned that they were looking at creating proof of concepts for some of their IP being developed for TV and asked if I was interested in directing one. I requested Battlesuit.

Was it always meant to be animated?
Originally, it was planned for a live-action proof of concept, but due to budget and schedule constraints we knew there was no way we would be able to do the vision justice. And I really wanted to stay true to the graphic novel’s story, so I went away to rethink how I would pull it off. That was around the time when Netflix was putting out a lot of animated projects like Love, Death & Robots and Castlevania, so there was a huge rise in that market. So I went back to them with a pitch to do it as a pilot episode for an animated series.

Naturally, their reaction was that animation is waaay too expensive — you’d need an animation house and tons of time, etc. But I had already come up with a way of executing it using a realtime animation approach. I did a quick test scene using existing free assets to get my point across about what it would look like, but also to know for myself whether it could be done — this was the test scene I did. Their response was, whoa, if you can do that for the budget and in a 12-minute pilot episode duration, then go for it. It also helped that I put the test scene online and got great reactions from it. That gave TPub Comics, who financed Battlesuit, the confidence to move ahead.

What was your team size and duration of production and remote production?
The team on the actual animation was three, including myself. As the director, I handled all the camera, layout, lighting and shot creation, which was great, as doing realtime animation gave me so much freedom and control. There was also Ronen Eytan, the technical Unreal Engine artist, who put together a cool animation pipeline using a live link with his iPad to capture the face performances of the actors. Lastly, there was Andrea Tedeschi, a CG artist/generalist responsible for assets and environments. He has collaborated with me on all my projects going back to 2015, when I made my short film Sync, and has since worked on my features and other projects with me.

Outside of the animation team, I brought on music composer and sound designer Edward Patrick White, who had just finished delivering the score for the latest Xbox title Gears Tactics. Our voice actors included Nigel Barber (Mission: Impossible – Rogue Nation, Spectre and my feature film The Beyond) and Kosha Engler, who has done performance for video games such as Terminator: Resistance and Star Wars: Battlefront.

How did you find your way to Unreal Engine specifically?
I had been using Unreal Engine since September last year doing previz for my live-action feature film Lunar, which was in soft prep while casting (due to COVID-19, production on that project is on hold), and I realized that the quality of the previz I was creating was very high with cinematic lighting. I thought with a bit of love and raytracing, this could end up being an animated film … but Lunar remains a 100% live-action movie.

There are other realtime engines out there, like Unity, which is great, but I had already been using Unreal Engine, so it made sense to push further with it. I also got some great support from the team at Epic in London to assist me in pushing the angle I was going for.

The big thing with using Unreal Engine is the “what you see is what you get” approach to creating scenes and shots, and as a filmmaker who is very hands-on (a control freak, really!), it was a pure joy to be able to create shots and then hand them over to Andrea and Ronen to build further.

But the other big point I want to make is the fact that we removed all the various pipeline steps you usually get with CGI animation projects (rendering, compositing, etc.) because everything was being rendered in real time. So all I was doing was exporting ProRes 444 QuickTimes (Rec 709 color space) and in some cases EXR frames directly out of Unreal Engine and into editorial; that was it.

Any challenges or lessons learned from working in realtime?
The big challenge is adjusting the way to think about what shots and scenes are in a realtime environment. Traditionally in CGI you have each 3D file as a scene or shot, but in Unreal Engine you have one big world called “The Level,” which lives in the main project. Then inside each level are cinematic sequences you create that use all the assets that live in the content part of your project. Once you get your head around that, it’s so much fun and you realize it’s actually way faster working this way.
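As a rough mental model (plain Python for illustration only, not the Unreal Engine API — all names here are made up), the shift he describes is from one scene file per shot to a single persistent Level whose cinematic sequences all draw on one shared content library:

```python
# Illustrative sketch only -- not Unreal Engine code.
# Traditional CG pipeline: each shot is its own 3D scene file,
# so shared assets are referenced (or duplicated) per file.
traditional_shots = {
    "shot_010": {"camera": "cam_010", "assets": ["mech", "warzone_env"]},
    "shot_020": {"camera": "cam_020", "assets": ["mech", "warzone_env"]},
}

# Unreal-style: one big world ("the Level") lives in the main project,
# and the cinematic sequences created inside it all pull from the same
# content pool, so nothing is duplicated shot by shot.
project = {
    "content": {
        "mech": "mech_asset",
        "warzone_env": "env_asset",
        "spaceship": "ship_asset",
    },
    "level": {
        "name": "Warzone",
        "sequences": {
            "seq_010": {"camera": "cam_010", "uses": ["mech", "warzone_env"]},
            "seq_020": {"camera": "cam_020", "uses": ["mech", "spaceship"]},
        },
    },
}

def assets_for(project, seq_name):
    """Resolve a sequence's assets from the single shared content pool."""
    seq = project["level"]["sequences"][seq_name]
    return [project["content"][name] for name in seq["uses"]]
```

Because every sequence resolves against the same content pool, updating one asset updates every shot that uses it, which is part of why working this way feels faster.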

The other challenge was that everything was coming out of Unreal Engine with no compositing at all to cheat and fix things. This ensured our assets all worked well and we were being smart with shot constructions. One thing to note is the fact that all the shots were created entirely on a laptop — the Razer Studio. Andrea and Ronen used desktop PCs for their work and then sent their packaged Unreal Engine files and assets to me via Dropbox, which I then migrated into the project.

HaZ working on the Razer

The Razer laptop comes with an Nvidia Quadro RTX 5000, and it was literally like having a beast of a desktop machine in my laptop. This was super-helpful because back in early January I was travelling to various CG conferences giving talks and keynotes, and this allowed me to keep working away on the project in a variety of hotel rooms.

Raytracing took the project’s visual look to another level as we were getting reflections, shadows and lighting at such a cinematic level… all in realtime. It was kind of mind blowing at times to be scrubbing back and forth in a sequence in Unreal Engine with explosions going off, spaceships flying, robots firing weapons as I was moving my camera around — again, all in real time.

What other tools did you use in this workflow?
For the war zone sequence, I wanted a visceral, gritty tone to the camera moves. I also knew it would take a lot of keyframe animation to do this, so I used a virtual camera solution called DragonFly from Glassbox Technology. Phillipa Carrol, whom I knew from The Foundry, reached out to me after seeing my early tests online and gave me a license to use, along with some great support from her team. I was able to shoot the action scenes using my iPad as a virtual production camera while the war zone action scenes played in realtime.

Virtual camera

The exported shots from Unreal Engine were brought into Blackmagic DaVinci Resolve 16 for editorial and color grading.

Do you think this is the future of filmmaking, especially in the world of COVID-19? How do you see it helping getting production working again?
I think virtual production in general is going to play a big part in content being made for films and TV. And it’s going to be used more and more as the rendering quality of CG in real time is getting so photoreal (I have seen the recent Unreal Engine 5 demo, and wow!) and you can play that back on LED screens and capture actors all in camera.

From my end, it’s allowed me to develop and create big, bold ideas for animated series content without a big studio or huge teams — with the entire production done remotely. Even the additional voice recording we needed during editorial was done remotely, with me directing Nigel Barber via iMessage on the iPhone. He would then email me the WAV files and, boom, we had our character voiced in the edit.

Realtime technology also removes a common reason for having everyone under one roof: speed and efficiency of communication. Thanks to Zoom or Skype screen sharing, I can direct artists as they make changes instantly in Unreal Engine, without them needing to upload versions for me to review, annotate and send back. Those Zoom/Skype dailies sessions are actually production sessions, because by the end of the call, all the changes have been implemented.

What’s next for you?
Battlesuit actually opened the doors for me as a filmmaker to tell stories using animation and broke down the various barriers and obstacles I had before when trying to get animated projects off the ground.

I recently signed on to direct an animated feature film based on a video game IP with producers in Hollywood. I can’t say much about it yet, but it’s using the same approach I did with Battlesuit (all in Unreal Engine). The details will be announced later this year.

You can watch the episode and the making of here:

ILM’s virtual production platform used on The Mandalorian

To bring The Mandalorian to life, Industrial Light & Magic (ILM) and Epic Games — along with production technology partners Fuse, Lux Machina, Profile Studios, Nvidia and ARRI — have introduced a new way to shoot VFX-heavy projects in collaboration with Jon Favreau’s Golem Creations.

The new virtual production workflow allows filmmakers to capture a significant amount of complex visual effects shots in-camera using realtime game engine technology (Epic’s Unreal Engine) and LED screens to represent dynamic photoreal digital landscapes and sets with creative flexibility previously unimaginable.

Also part of the news, ILM has made its new end-to-end virtual production solution, ILM StageCraft, available for use by filmmakers, agencies and showrunners worldwide.

Over 50 percent of The Mandalorian Season 1 was filmed using this new methodology, eliminating the need for location shoots entirely. Instead, the actors in The Mandalorian performed within an immersive, massive 20-foot-high, 270-degree semicircular LED video wall and ceiling enclosing a 75-foot-diameter performance space, where practical set pieces were combined with digital extensions on the screens.

Digital 3D environments created by ILM played back interactively on the LED walls, edited in realtime during the shoot, which allowed for pixel-accurate tracking and perspective-correct 3D imagery rendered at high resolution via systems powered by Nvidia GPUs.

L-R: Jon Favreau and Richard Bluff

The environments were lit and rendered from the perspective of the camera to provide parallax in real time, as if the camera were really capturing the physical environment, with accurate interactive light on the actors and practical sets. This gave showrunner Favreau; executive producer/director Dave Filoni; visual effects supervisor Richard Bluff; cinematographers Greig Fraser and Barry “Baz” Idoine; and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and achieve realtime in-camera composites on set.

The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of all the partners involved.

“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of realtime, in-camera rendering,” explains Favreau, adding, “We are proud of what was achieved and feel that the system we built was the most efficient way to bring The Mandalorian to life.”

“Merging our efforts in the space with what Jon Favreau has been working toward using virtual reality and game engine technology in his filmmaking finally gave us the chance to execute the vision,” says Rob Bredow, executive creative director and head of ILM. “StageCraft has grown out of the culmination of over a decade of innovation in the virtual production space at ILM. Seeing our digital sets fully integrated, in real time on stage, providing the kind of in-camera shots we’ve always dreamed of while also providing the majority of the lighting was really a dream come true.”

Bluff adds, “Working with Kim Libreri and his Unreal team, Golem Creations, and the ILM StageCraft team has opened new avenues to both the filmmakers and my fellow key creatives on The Mandalorian, allowing us to shoot principal photography on photoreal, virtual sets that are indistinguishable from their physical counterparts while incorporating physical set pieces and props as needed for interaction. It’s truly a game-changer.”

ILM StageCraft’s production tools give filmmakers the combination of traditional filmmaking equipment and methodologies with all of the advantages of a fully digital workflow. With ILM StageCraft, a production can capture many in-camera finals, giving filmmakers immediate and complete creative control of work that is typically handed off and reinterpreted in post. This improves the quality of visual effects shots with perfectly integrated elements and reduces visual effects requirements in post, a major benefit given today’s compressed schedules.