Category Archives: VFX

SIGGRAPH conference chair Roy C. Anthony: VR, AR, AI, VFX, more

By Randi Altman

Next month, SIGGRAPH returns to Vancouver after turns in Los Angeles and Anaheim. This gorgeous city, whose convention center offers a water view, is home to many visual effects studios providing work for film, television and spots.

As usual, SIGGRAPH will host many presentations, showcase artists’ work, display technology and offer a glimpse into what’s on the horizon for this segment of the market.

Roy C. Anthony

Leading up to the show — which takes place August 12-16 — we reached out to Roy C. Anthony, this year’s conference chair. For his day job, Anthony recently joined Ventuz Technology as VP, creative development. There, he leads initiatives to bring Ventuz’s realtime rendering technologies to creators of sets, stages and ProAV installations around the world.

SIGGRAPH is back in Vancouver this year. Can you talk about why it’s important for the industry?
There are 60-plus world-class VFX and animation studios in Vancouver. There are more than 20,000 film and TV jobs, and more than 8,000 VFX and animation jobs in the city.

So, Vancouver’s rich production-centric communities are leading the way in VFX production for television and feature films. They are also busy with new media content, games work and new workflows, including those for AR/VR/mixed reality.

How many exhibitors this year?
The conference and exhibition will play host to over 150 exhibitors on the show floor, showcasing the latest in computer graphics and interactive technologies, products and services. With all the new technology that has debuted in the computer graphics marketplace over the past year, almost one quarter of this year’s 150 exhibitors will be presenting at SIGGRAPH for the first time.

In addition to the traditional exhibit floor and conferences, what are some of the can’t-miss offerings this year?
We have increased the presence of virtual, augmented and mixed reality projects and experiences, and we are introducing our new Immersive Pavilion in the east convention center, which will be dedicated to this area. We’ve also incorporated immersive tech into our Computer Animation Festival with the inclusion of our VR Theater, back for its second year, and we’ve invited Ken Perlin, a legendary computer graphics professor at New York University, to present a special, curated experience.

We’ll be kicking off the week in a big VR way with a special session following the opening ceremony featuring Ivan Sutherland, considered by many as “the father of computer graphics.” That 50-year retrospective will present the history and innovations that sparked our industry.

We have also brought in Syd Mead, a legendary “visual futurist” (Blade Runner, Tron, Star Trek: The Motion Picture, Aliens, Timecop, Tomorrowland, Blade Runner 2049), who will display an arrangement of his art in a special collection called Progressions. This will be shown within our Production Gallery experience, which also returns for its second year. Progressions will exhibit more than 50 years of artwork by Syd, from his academic years to his most current work.

We will have an amazing array of guest speakers, including those featured within the Business Symposium, which is making a return to SIGGRAPH after an absence of a few years. Among these speakers are people from the Disney Technology Innovation Group, Unity and Georgia Tech.

On Tuesday, August 14, our SIGGRAPH Next series will present a keynote speaker each morning to kick off the day with an inspirational talk. These speakers are Tony DeRose, a senior scientist from Pixar; Daniel Szecket, VP of design for Quantitative Imaging Systems; and Bob Nicoll, dean of Blizzard Academy.

There will also be a 25th anniversary showing of the original Jurassic Park, hosted by Steve “Spaz” Williams, a digital artist who worked on that film.

Can you talk about this year’s keynote and why he was chosen?
We’re thrilled to have ILM head and senior VP/ECD Rob Bredow deliver the keynote address this year. Rob is all about innovation — pushing through scary new directions while maintaining the leadership of artists and technologists.

Rob is the ultimate modern-day practitioner, a digital VFX supervisor who has been disrupting ‘the way it’s always been done’ to move to new ways. He truly reflects the spirit of ILM, which was founded in 1975 and is just one year younger than SIGGRAPH.

A large part of SIGGRAPH is its slant toward students and education. Can you discuss how this came about and why this is important?
SIGGRAPH supports education in all sub-disciplines of computer graphics and interactive techniques, and it promotes and improves the use of computer graphics in education. Our Education Committee sponsors a broad range of projects, such as curriculum studies, resources for educators and SIGGRAPH conference-related activities.

SIGGRAPH has always been a welcoming and diverse community, one that encourages mentorship, and acknowledges that art inspires science and science enables advances in the arts. SIGGRAPH was built upon a foundation of research and education.

How are the Computer Animation Festival films selected?
The Computer Animation Festival has two programs, the Electronic Theater and the VR Theater. Because of the large volume of submissions for the Electronic Theater (over 400), there is a triage committee for the first phase. The CAF Chair then takes the high-scoring pieces to a jury made up of industry professionals. The jury’s selections then become the Electronic Theater show pieces.

The selections for the VR Theater are made by a smaller panel, comprised mostly of sub-committee members, who watch each film in a VR headset and vote.

Can you talk more about how SIGGRAPH is tackling AR/VR/AI and machine learning?
Since SIGGRAPH 2018 is built around the theme of “Generations,” we took a step back to look at how we got where we are today in terms of AR/VR, and where we are going with it. Much of what we know today wouldn’t have been possible without the research and creation of Ivan Sutherland’s 1968 head-mounted display. We have a fantastic panel celebrating the 50-year anniversary of his HMD, which is widely considered the first VR HMD.

AI tools are newer, and we created a panel that focuses on trends and the future of AI tools in VFX, called “Future Artificial Intelligence and Deep Learning Tools for VFX.” This panel gains insight from experts embedded in both the AI and VFX industries and gives attendees a look at how different companies plan to further their technology development.

What is the process for making sure that all aspects of the industry are covered in terms of panels?
Every year, new ideas for panels and sessions are submitted by contributors from all over the globe. Those submissions are then reviewed by a jury of industry experts, and it is through this process that panelists and cross-industry coverage are determined.

Each year, the conference chair oversees the program chairs, and each of the program chairs becomes part of a jury process — this helps ensure the best program with the most industries represented across all disciplines.

In the rare case that a program committee feels it is missing something key in the industry, it can try to curate a panel in, but we still require that the panel be reviewed by subject matter experts before it is considered for final acceptance.

 

Mark Thorley joins Mill Film Australia as MD

Mill Film in Australia, a Technicolor VFX studio, has named Mark Thorley as managing director. His appointment comes in the wake of the February launch of Mill Film in Adelaide, Australia.

Thorley brings with him more than 15 years of executive experience, working at such studios as Lucasfilm Singapore, where he oversaw studio operations and production strategies. Prior to that, Thorley spent nine years at Animal Logic, at both its Los Angeles and Sydney locations, as head of production. He also held senior positions at Screen Queensland and Omnicom.

Throughout his career, Thorley has received credits on numerous blockbuster feature films, including Kong: Skull Island, Rogue One, Jurassic World and Avengers: Age of Ultron. Thorley will drive all aspects of VFX production, client relations and business development for Australia, reporting to the global head of Mill Film, Lauren McCallum.


Disfiguring The Man in Black for HBO’s Westworld

If you watch HBO’s Westworld, you are familiar with The Man in Black, the once-good guy turned bad guy. He is ruthless and easy to hate, so when karma caught up with him, audiences were not too upset about it.

Westworld doesn’t shy away from violence. In fact, it plays a major role in the series. A notable example of an invisible effect depicting mutilation came during the show’s Season Two finale. CVD VFX, a boutique visual effects house based in Vancouver, was called on to create the intricate and gruesome result of what The Man in Black’s hand looked like after being blown to pieces.

During the long-awaited face-off between The Man in Black (Ed Harris) and Dolores Abernathy (Evan Rachel Wood), we see their long-simmering conflict culminate with his pistol pressed against her forehead, cocked and ready to fire. But when he pulls the trigger, the gun backfires and explodes in his hand, sending fingers flying into the sand and leaving horrifyingly bloody stumps.

CVD VFX’s team augmented the on-set footage to bring the moment to life in excruciating detail. Harris’ fingers were wrapped in blue in the original shot, and CVD VFX went to work removing his digits and replacing them with animated stubs, complete with the visceral details of protruding bone and glistening blood. The team used special effects makeup for reference on both blood and lighting, and were able to seamlessly incorporate the practical and digital elements.

The result was impressive, especially considering the short turnaround time that CVD had to create the effect.

“We were brought on a little late in the game, as we had a couple of weeks to turn it around,” explains Chris van Dyck, founder of CVD VFX, who worked with the show’s VFX supervisor, Jay Worth. “Our first task was to provide reference/style frames of what we’d be proposing. It was great to have relatively free rein to propose how the fingers were blown off. Ultimately, we had great direction, and once we put the shots together, everyone was happy pretty quickly.”

CVD used Foundry’s Nuke and Autodesk’s Maya to create the effect.

CVD VFX’s work on Westworld wasn’t the first time they worked with Worth. They previously worked together on Syfy’s The Magicians and Fox’s Wayward Pines.


Behind the Title: Steelhead MD Ted Markovic

NAME: Ted Markovic

COMPANY: LA-based Steelhead

CAN YOU DESCRIBE YOUR COMPANY?
We are a content studio and cross-platform production company. You can walk through our front door with a script and out the back with a piece of content. We produce everything from social to Super Bowl.

WHAT’S YOUR JOB TITLE?
Managing Director

WHAT DOES THAT ENTAIL?
I am responsible for driving the overall culture and financial health of the organization. That includes building strong client relationships, new business development, operational oversight, marketing, recruiting and retaining talent and managing the profits and losses of all departments.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
We all have a wide range of responsibilities and wear many hats. I occasionally find myself replacing the paper towels in the bathrooms because some days that’s what it takes.

WHAT’S YOUR FAVORITE PART OF THE JOB?
We are a very productive group that produces great work. I get a sense of accomplishment almost every day.

WHAT’S YOUR LEAST FAVORITE?
Replacing the paper towels in the bathrooms.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
I get a lot more done while everyone else is busy eating their lunch or driving home.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Solving the traffic problem in Los Angeles. I see a lot of opportunities there.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I am a third-generation post production executive, and essentially grew up in a film lab in New York. I suspect the profession chose me.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I am currently working on a Volkswagen Tier 2 project where we are shooting six cars over seven days on our stage at Steelhead. We’re incorporating dynamic camera shots of cars on a cyc with kinetic typography, motion graphics and VFX. It’s a great example of how we can do it all under one roof.

We recently worked with Nintendo and Interogate to bring the new Switch games to life in a campaign called Close Call. On set with rams, air mortars, lighting effects and lots of sawed-in-half furniture, we were able to create real weight in-camera to layer with our VFX. We augmented the practical effects with HDR light maps, fire and debris simulations, as well as procedurally generated energy beams, 3D models and 2D compositing to create a synergy between the practical and visual effects that really sells the proximity and sense of danger we were looking to create.

While the coordination of practical and post was no small chore, another interesting challenge we had to overcome was creating CG weapons that mesh with the live-action plates. We started with low-resolution models taken directly from the games themselves, converted them, added a good layer of detail and refined them to make them photoreal. We also had to conceptualize how some of the more abstract weapons would play with real-world physics.

Another project worth mentioning was a piece we created for Volkswagen called Strange Terrains. The challenge was to create a 360-degree timelapse video that transitions from day to night, something that had never been done before. In order to get this unique footage, we had to build an equally unique rigging system. We partnered with Supply Frame to design and build a custom-milled aluminum head to support four 50.6-megapixel Canon EOS 5DS cameras.

The “holy grail” of timelapse photography is getting the cameras to ramp the exposure over broad light changes. This was especially challenging to capture due to the massive exposure changes in the sky and the harshness of the white salt. After capturing approximately 2,000 frames per camera — 9TB of working storage — we spent countless hours stitching, compositing, computing and rendering to get a fluid final product.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
About eight years ago, I created a video for my parents’ 40th wedding anniversary. My mother still cries when she watches it.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
The wheel is a pretty essential piece of technology that I’m not sure I could live without. My smartphone, as expected, and my Sleepwell device for sleep apnea. That device changed my life.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
I can work listening to anything but reggae.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Exercise.


Framestore Chicago adds compositing lead Chris Beers

Framestore Chicago has added Chris Beers as compositing lead. He will be working across a variety of clients with the Chicago team, as well as diving into Nuke projects of his own.

Beers attended The Illinois Institute of Art, where he earned a BFA in visual effects and motion graphics. After graduation he honed his skills as a junior motion graphics artist at Leviathan in Chicago, and has since worked on projects of all sizes.

Beers’ career highlights include working as an After Effects artist on an expansive projection mapping project for Brazilian musician Amon Tobin’s ISAM world tour, and as a Nuke compositor for the title sequences on Marvel films Ant-Man, Captain America: Civil War and Doctor Strange. Beers was lead compositor on the series finale of Netflix science-fiction drama Sense8, having worked with the team as a Nuke compositor across both seasons of the show.

“As we recently celebrated our office’s official first year and continue to expand, it’s talent like Chris that makes our studio what it is: a creative hub with a strong sense of community, but the firepower of an integrated, global studio,” says Framestore Chicago’s MD Krystina Wilson.


Eli Rotholz joins Alkemy X as VP of biz dev

Creative content company Alkemy X has added Eli Rotholz as VP of business development. He will be based in the company’s New York headquarters.

Rotholz brings more than 12 years of sales/business development, strategy and production experience, having begun his career as an independent sales rep for Ziegler/Jakubowicz and Moustache NYC. From there, he worked in his first in-house position at Click3X, where he built and managed a diverse roster of directorial talent, as well as the company’s first integrated production offering focusing on live-action, VFX/design/animation and editorial.

Rotholz then founded Honor Society Films. He later joined Hone Production, a brand-direct-focused production company and consultancy, as director of business development/content EP.

“Very few companies in the industry can boast the strong directorial roster and VFX capabilities as Alkemy X,” says Rotholz. “In addition to the amazing entertainment work that Alkemy does, there’s definitely a trend in high-end ‘package’ productions where one company can do both live-action shoots with their directors, as well as editorial and VFX.”


The Orville VFX supervisor on mixing practical and visual effects

By Barry Goch

What do you get when you mix Family Guy and Ted creator Seth MacFarlane with science fiction? The most dysfunctional spaceship in the galaxy, that’s what. What is the Fox series The Orville? Well, it’s more Galaxy Quest/Spaceballs than it is Star Trek/Star Wars.

Set 400 years in the future, The Orville follows a spaceship captained by MacFarlane’s Ed Mercer, who has to work alongside his ex-wife as they wing their way through space on a science mission. As you might imagine with a show that is set in space, The Orville features a large number of visual and practical effects shots, including real and CG models of The Orville.

Luke McDonald

We reached out to the show’s VFX supervisor Luke McDonald to find out more.

How did the practical model of The Orville come about?
Jon Favreau was directing the pilot, and he and Seth MacFarlane had been kidding around about doing a practical model of The Orville. I jumped at the chance. In this day and age, a visual effects supervisor shooting models is an unheard-of thing to do, but it was something I was absolutely thrilled about.

Favreau’s visual effects supervisor is Rob Legato. I have worked with Rob on many projects, including Martin Scorsese’s Aviator, Shine a Light and Shutter Island, so I was very familiar with how Rob works. The only other chance that I had had to shoot models was with Rob during Shutter Island and Aviator, so in a sense, whenever Rob Legato shows up it’s model time (he laughs). It’s so amazing because it’s just something that the industry shies away from, but given the opportunity it was absolutely fantastic.

Who built the practical model of The Orville?
Glenn Derry made it. He’s worked with Rob Legato on a few things, including Aviator. Glenn is kind of fantastic. He basically does motion control, models and motion capture. Glenn would also look at all the camera moves and all the previz that we did to make sure the camera moves were not doing something that the motion control rig could not do.

How were you able to seamlessly blend the practical model and the CG version of The Orville?
Once we had the design for The Orville, we would then previz out the ships flying by camera, doing whatever, and work out these specific moves. Any move that was too technical for the motion control rig, we would do a CG link-up instead — meaning that it would go from model to a CG ship or vice versa — to get the exact camera move that we wanted. We basically shot all of the miniatures of The Orville at three frames a second. It was kind of like shooting in slow-mo with the motion control rig, and we did about 16 passes per shot — lights on, lights off, key light, fill light, back light, ambient, etc. So, when we got all the passes back, we composited them just like we would any kind of full CG shot.

From the model shoot, we ended up with about 25 individual shots of The Orville. It’s a very time-consuming process, but it’s very rewarding because of how many times you’re going to have to reuse these elements to achieve completely new shots, even though it’s from the same original motion control shoot.

How did the shots of The Orville evolve over the length of the season?
We started to get into more dynamic things, such as big space battles and specific action patterning, where it really wasn’t feasible to continue shooting the model itself. But now we have a complete match for our CG version of The Orville that we can use for our big space battles, where the ship’s flying and whipping around. I need to emphasize that previz on this project was crucial.

The Orville is a science vessel, but when it needs to throw down and fight, it has the capabilities to be quite maneuverable — it can barrel roll, flip and power slide around to get itself in position to get the best shot off. Seth was responding to these hybrid-type ship-to-ship shots and The Orville moving through space in a unique way when it’s in battle.
There was never a playbook. It was always, “Let’s explore, let’s figure out, and let’s see where we fit in this universe. Do we fit into the traditional Star Trek-y stuff, or do we fit into the Star Wars-type stuff?” I’m so pleased that we fit into this really unique world.

How was working with Seth MacFarlane?
Working with Seth has been absolutely amazing. He’s such a dedicated storyteller, even down to the most minute things. He’s such an encyclopedia of sci-fi knowledge, be it Star Trek, Star Wars, Battlestar Galactica or the old-school Buck Rogers and Flash Gordon. All of them are part of his creative repertoire. It’s very rare that he makes a reference that I don’t get, because I’m exactly the same way about sci-fi.

How different is creating VFX for TV versus film?
TV is not that new to me, but for the last 10 years I’ve been doing film work for Bad Robot and JJ Abrams. It was a strange awakening coming to TV, but it wasn’t horrifying. I had to approach things in a different way, especially from a budget standpoint.


Rachel Matchett brought on to lead Technicolor Visual Effects

Technicolor has hired Rachel Matchett to head the post production group’s newly formed VFX brand, Technicolor Visual Effects. Working side by side within the same facilities where post services are offered, Technicolor Visual Effects is expanding to a global offering with an integrated pipeline. Technicolor is growing its namesake VFX team apart from the company’s other visual effects brands: MPC, The Mill, Mr. X and Mikros.

A full-service creative VFX house with local studios in Los Angeles, Toronto and London, Technicolor Visual Effects’ recent credits include the feature films Avengers: Infinity War, Black Panther and Paddington 2, and episodic series such as This is Us, Anne With an E and Black Mirror.

Matchett joins Technicolor from her long-tenured position at MPC Film. Her background at MPC London includes nearly a decade of senior management positions at the studio. She most recently served as MPC London’s global head of production. In that role, her divisions at MPC Film oversaw and carried out visual effects on a number of films each year, including director Jon Favreau’s Academy Award-winning The Jungle Book and the critically acclaimed Blade Runner 2049.

“Technicolor Visual Effects is emerging from its position as one of the industry’s best-kept secrets. While continuing to support clients who do color finishing with us, we are excited to work with storytellers from script to screen,” says Matchett. “Having been at the heart of MPC Film’s rapid growth over the past decade, I feel that there is a great opportunity for Technicolor’s future role in VFX to forge a new path within the industry.”


Milk provides VFX for Adrift, adds new head of production Kat Mann

As it celebrates its fifth anniversary, Oscar-, Emmy- and BAFTA-winning VFX studio Milk has taken an additional floor at its London location on Clipstone Street. This visual effects house has worked on projects such as Annihilation, Altered Carbon and Fantastic Beasts and Where to Find Them.

Milk’s expansion increases its artist capacity to 250, and includes two 4K FilmLight Baselight screening rooms and a dedicated client area. The studio has upgraded its pipeline, with all its rendering requirements (along with additional storage and workstation capacity) now entirely in the cloud, allowing full scalability for its roster of film and TV projects.

Annihilation

Milk has just completed production as the main vendor on STXfilms’ new feature film Adrift, the Baltasar Kormákur-directed true story of survival at sea, starring Shailene Woodley and Sam Claflin. The Milk team created all the major water and storm sequences for the feature, which were rendered entirely in the cloud.

Milk has just begun work on new projects, including Four Kids And It, the upcoming Dan Films/Kindle Entertainment feature based on Jacqueline Wilson’s modern-day variation on E. Nesbit’s 1902 classic novel Five Children And It, for which the Milk team will create the protagonist CG sand fairy character. Milk is also in production as sole VFX vendor on the six-part TV adaptation of Neil Gaiman and Terry Pratchett’s Good Omens for Amazon/BBC.

In other news, Milk has brought on VFX producer Kat Mann as head of production. She will oversee all aspects of the studio’s production at its London and Cardiff locations. Mann has held senior production roles at ILM and Weta Digital, with credits including Jurassic World: Fallen Kingdom, Thor: The Dark World and Avatar. Milk’s former head of production, Clare Norman, has been promoted to business development director.

Milk was founded by a small team of VFX supervisors and producers in June 2013.

Behind the Title: Weta’s Head of Tech & Research Luca Fascione

NAME: Luca Fascione

COMPANY: Wellington, New Zealand’s Weta Digital

WHAT’S YOUR JOB TITLE?
Senior Head of Technology and Research

WHAT DOES THAT ENTAIL?
In my role, I lead the activities of Weta Digital that provide software technology to the studio and our partners. There are various groups that form technology and research: Production Engineering oversees the studio’s pipeline and infrastructure software, Software Engineering oversees our large plug-ins such as our hair system (Barbershop/Wig), our tree growth system (Lumberjack/Totara) and our environment construction system (Scenic Designer), to name a few.

Two more departments make up the technology and research group: Rendering Research and Simulation Research. These departments oversee our proprietary renderer, Manuka, and our physical simulation system, Synapse. Both groups have a strong applied research focus and, as well as producing software, are often involved in publishing scientific papers.

HOW DID YOU GET INTO THIS BUSINESS?
Cinema and computers have been favorites of mine (as well as music) since I was a little kid. We used to play a game when I was maybe 12 or so where we would watch about five seconds of a random movie on TV, turn it off, and recite the title. I was very good at that.

A couple of my friends and I watched all the movies we could find, from arthouse European material to more commercial, mainstream content. When it came time to find a job, I thought finding a way to merge my passion for cinema and my interest in computers into one would be great, if I could.

HOW LONG HAVE YOU BEEN WORKING IN THIS INDUSTRY?
I started at Weta Digital in 2004. Before that, I was part of the crew working on the feature animation movie Valiant, where I started in 2002. I guess this would make it 15 years.

HOW HAS THE VFX INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING? WHAT’S BEEN GOOD, WHAT’S BEEN BAD?
Everything got bigger, especially the content we want to work with relative to the machines we use to achieve our goals. As much as technology has improved, our ability to use it to drive the hardware extremely hard has grown faster, creating a need for technically creative, innovative solutions to our scaling problems.

Graphics is running out of “easy problems” that one can solve drawing inspiration from other fields of science, and it’s sometimes the case that our research has outpaced the advancements of similar problems in other fields, such as medicine, physics or engineering. At the same time, especially since the recent move toward deep learning and “big data” problems, the top brains in the field are all drawn away from graphics, making it harder than it used to be to get great talent.

DID A PARTICULAR FILM INSPIRE YOU ALONG THIS PATH IN ENTERTAINMENT?
I work in VFX because of Jurassic Park. Although I must also recognize Young Sherlock Holmes and Terminator 2, which also played a big role in this space. During my career in VFX, King Kong and Avatar have been life-shaping experiences.

DID YOU GO TO FILM SCHOOL?
Not at all. I studied mathematics in Rome, Italy. All I know about movies comes from personal study. Back in those days, nobody taught computer graphics at this level for VFX. The closest were degrees in engineering schools that maybe had a course or two in graphics. Things have changed massively since then in this area.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The variety. I run into a lot of extremely interesting problems, and I like being able to help people find good ways to solve them.

WHAT’S YOUR LEAST FAVORITE?
A role like mine necessarily entails having many difficult conversations with crew. I am extremely pleased to say the majority of these result in opportunities for growth and a deepening of our mutual understanding. I love working with our crew; they’re great people, and I do learn a lot every day.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I like my job, I don’t often think about doing something else. But I have on occasion wondered what it would be like to build guitars for a living.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
The latest War for the Planet of the Apes movie has been a fantastic achievement for the studio. The Technology and Research group has contributed a fair bit of software to the initiative, from our forest system Totara to a new lighting pipeline called PhysLight, a piece of work I was personally involved in and that I am particularly proud of.

During our work on The Jungle Book, we helped the production by reshaping our instancing system to address the dense forests in the movie. Great advancements in our destruction systems were also developed for Rampage.

WHAT IS THE PROJECT/S THAT YOU ARE MOST PROUD OF?
It turns out three of my early projects played a role of some importance in the making of Avatar: The facial solver, the sub-surface scattering system and PantaRay (our Spherical Harmonics occlusion system). After that, I’m extremely proud of my work on Manuka, Weta Digital’s in-house renderer.

WHERE DO YOU FIND INSPIRATION NOW?
All around me, it’s the people, listening to their experiences, problems and wishes. That’s how our job is done.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I play guitar and I build audio amplifiers. I have two daughters in primary school who are a lot of fun, and a little boy just joined our family last December. I do take the occasional picture as well.