Academy Award-nominated director Neill Blomkamp (District 9) is directing the next installments in the ADAM franchise — ADAM: The Mirror and ADAM: The Prophet — using the latest version of Unity, which launches today.
Created and produced by Blomkamp’s Oats Studios, these short films show the power of working within an integrated realtime environment — allowing the team to build, texture, animate, light and render all in Unity to deliver high-quality graphics at a fraction of the cost and time of a normal film production cycle.
ADAM: The Mirror will premiere during the live stream of the Unite Austin 2017 keynote, which begins at 4pm Pacific tonight, and will be available on the Oats YouTube channel shortly after. ADAM: The Prophet will follow before the end of 2017.
“Ever since I started making films I’ve dreamed of a virtual sandbox that would let me build, shoot and edit photorealistic worlds all in one place. Today that dream came true thanks to the power of Unity 2017,” said Neill Blomkamp, founder of Oats Studios. “The fact that we could achieve near-photorealistic visuals in half the time of our average production cycle is astounding. The future is here, and I can’t wait to see what our fans think.”
The original ADAM was released in 2016 as a short film to demonstrate technical innovations in Unity. It won a Webby Award and was screened at several film festivals, including the Annecy Film Festival and the Nashville Film Festival. ADAM: The Mirror picks up after the events of ADAM, as the cyborg hero discovers a clue about what and who he is. ADAM: The Prophet gives viewers their first glimpse of one of the villains in the ADAM universe.
Harnessing the power of realtime rendering, Oats used Unity 2017 to create photorealistic graphics and lifelike digital humans. This was achieved through a combination of Unity’s advanced high-end graphics power, new materials built with the Custom Render Texture feature, advanced photogrammetry techniques, Alembic-streamed animations for facial and cloth movement, and Unity’s Timeline feature.
These innovations will be highlighted in the coming months via a series of behind-the-scenes videos and articles on the Unity website. They include:
• Lifelike digital humans in realtime: Oats created the best-looking human ever in Unity using custom sub-surface scattering shaders for skin, eyes and hair.
• Alembic-based realtime facial performance capture: Oats created a new facial performance capture technique that streams 30 scanned heads per second for lifelike animation, all without the use of morph targets or rigs.
• Virtual worlds via photogrammetry: Staying true to their live-action background, Oats shot more than 35,000 photos of environments and props; after the initial photogrammetry solve, they imported the results into Unity using the de-lighting tool. This allowed them to quickly create rich, complex materials without spending time hand-building high-resolution models.
• Rapid streamlined iteration in realtime: Working with realtime rendering lets artists and designers “shoot” the story as if on a set, with a live responsiveness that allows room to experiment and make creative decisions anywhere in the process.
• Unity’s Timeline backbone for collaboration: Unity’s Timeline feature, a visual sequencing tool that lets artists orchestrate scenes without additional programming, combined with Multi-Scene Authoring to allow a team of 20 artists to collaborate on the same shot simultaneously.
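The de-lighting idea mentioned in the photogrammetry bullet can be illustrated with a minimal sketch. This is a deliberately simplified, hypothetical model (a plain Lambertian assumption in NumPy), not the algorithm used by Unity's actual de-lighting tool: if a captured texture is treated as albedo multiplied by baked-in lighting, dividing by an estimate of that lighting recovers an approximate albedo map suitable for relighting.

```python
import numpy as np

def delight(captured: np.ndarray, irradiance: np.ndarray,
            eps: float = 1e-6) -> np.ndarray:
    """Recover an approximate albedo map from a photogrammetry texture.

    Assumes a simple Lambertian model: captured = albedo * irradiance.
    Illustrative simplification only, not Unity's de-lighting tool.
    """
    albedo = captured / np.maximum(irradiance, eps)  # guard divide-by-zero
    return np.clip(albedo, 0.0, 1.0)

# Synthetic check: a flat 0.5-gray albedo lit unevenly should come
# back flat once the lighting estimate is divided out.
true_albedo = np.full((4, 4, 3), 0.5)
lighting = np.linspace(0.2, 1.0, 16).reshape(4, 4, 1)
captured = true_albedo * lighting
recovered = delight(captured, lighting)
print(np.allclose(recovered, true_albedo))  # True
```

In practice the lighting estimate is the hard part; the appeal of a baked-in pipeline like the one described above is that this cleanup happens once at import, after which the material responds correctly to Unity's realtime lights.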