
Behind the Title: Carousel’s Head of VFX/CD Jeff Spangler

This creative has been an artist for as long as he can remember. “I’ve always loved the process of creation and can’t imagine any career where I’m not making something,” he says.

Name: Jeff Spangler

Company: NYC’s Carousel

Can you describe your company?
Carousel is a “creative collective” that was a response to this rapidly changing industry we all know and love. Our offerings range from agency creative services to editorial, design, animation (including both motion design and CGI), retouching, color correction, compositing, music licensing, content creation, and pretty much everything that falls between.

We have created a flexible workflow that covers everything from concept to execution (and delivery), while also allowing for clients whose needs are less all-encompassing to step on or off at any point in the process. That’s just one of the reasons we called ourselves Carousel — our clients have the freedom to climb on board for as much of the ride as they desire. And with the different disciplines all living under the same roof, we find that a lot of the inefficiencies and miscommunications that can get in the way of achieving the best possible result are eliminated.

What’s your job title?
Head of VFX/Creative Director

What does that entail?
That’s a really good question. There is the industry standard definition of that title as it applies to most companies. But it’s quite different if you are talking about a collective that combines creative with post production, animation and design. So for me, the dual role of CD and head of VFX works in a couple of ways. Where we have the opportunity to work with agencies, I am able to bring my experience and talents as a VFX lead to bear, communicating with the agency creatives and ensuring that the different Carousel artists involved are all able to collaborate and communicate effectively to get the work done.

Alternatively, when we work direct-to-client, I get involved much earlier in the process, collaborating with the Carousel creative directors to conceptualize and pitch new ideas, design brand elements, visualize concept art, storyboard and write copy or even work with strategists to help hone the direction and target of a campaign.

That’s the true strength of Carousel — getting creatives from different backgrounds involved early on in the process where their experience and talent can make a much bigger impact in the long run. Most importantly, my role is not about dictating direction as much as it is about guiding and allowing for people’s talents to shine. You have to give artists the room to flourish if you really want to serve your clients and are serious about getting them something more than what they expected.

What would surprise people the most about what falls under that title?
I think that there is this misconception that it’s one creative sitting in a room who comes up with the “Big Idea” and just dictates that idea to everyone. My experience is that any good idea started out as a lot of different ideas that were merged, pruned, refined and polished until they began to resemble something truly great.

Then after 24 hours, you look at that idea again and tear it apart because all of the flaws have started to show and you realize it still needs to be pummeled into shape. That process is generally a collaboration within a group of talented people who all look at the world very differently.

What tools do you use?
Anything that I can get my hands on (and my brain wrapped around). My foundation is as a traditional artist and animator, and I find that those core skills are really the strength behind what I do every day. I started out after college as a broadcast designer and later transitioned into working as a Flame artist, spending many years as a beauty retouch artist and motion designer.

These days, I primarily use Adobe Creative Suite as my role has become more creative in nature. I use Photoshop for digital painting and concept art, Illustrator for design and InDesign for layouts and decks. I also have a lot of experience in After Effects and Autodesk Maya and will use those tools for any animation or CGI that requires me to be hands-on, even if just to communicate the initial concept or design.

What’s your favorite part of the job?
Coming up with new ideas at the very start. At that point, the gloves are off and everything is possible.

What’s your least favorite?
Navigating politics within the industry that can sometimes get in the way of people doing their best work.

What is your favorite time of the day?
I’m definitely more of a night person. But if I had to choose a favorite time of day, it would be early morning — before everything has really started and there’s still a ton of anticipation and potential.

If you didn’t have this job, what would you be doing instead?
Working as a full-time concept artist. Or a logo designer. While I frequently have the opportunity to do both of those things in my role at Carousel, they are, for me, the most rewarding expression of being creative.

A&E’s Scraps

How early on did you know this would be your path?
I’ve been an artist for as long as I can remember and never really had any desire (or ability) to set it aside. I’ve always loved the process of creation and can’t imagine any career where I’m not “making” something.

Can you name some recent projects you have worked on?
We are wrapping up Season 2 of an A&E food show titled Scraps that has allowed us to flex our animation muscles. We’ve also been doing some in-store work with Victoria’s Secret for some of their flagship stores that has been amazing in terms of collaboration and results.

What is the project that you are most proud of?
It’s always hard to pick a favorite and my answer would probably change if you asked me more than once. But I recently had the opportunity to work with an up-and-coming eSports company to develop their logo. Collaborating with their CD, we landed on a design and aesthetic that makes me smile every time I see it out there. The client has taken that initial work and continues to surprise me with the way they use it across print, social media, swag, etc. Seeing their ability to be creative and flexible with what I designed is just validation that I did a good job. That makes me proud.

Name pieces of technology you can’t live without.
My iPad Pro. It’s my portable sketch tablet and presentation device that also makes for a damn good movie player during long commutes.

What do you do to de-stress from it all?
Muay Thai. Don’t get me wrong. I’m no serious martial artist and have never had the time to dedicate myself properly. But working out by punching and kicking a heavy bag can be very cathartic.

Method Studios adds Bill Tlusty as global head of production

Method Studios has brought veteran production executive and features VFX producer Bill Tlusty on board in the new role of global head of production. Reporting to Erika Burton, EVP of global features VFX, Tlusty will oversee Method’s global feature film and episodic production operations, leading teams worldwide.

Tlusty’s career as both a VFX producer and executive spans two decades. Most recently, as an executive with Universal Pictures, he managed more than 30 features, including First Man and The Huntsman: Winter’s War. His new role marks a return to Method Studios, as he served as head of studio in Vancouver prior to his gig at Universal. Tlusty also spent eight years as a VFX producer and executive producer at Rhythm & Hues.

In this capacity he was lead executive on Snow White and the Huntsman and the VFX Oscar-winning Life of Pi. His other VFX producer credits include Night at the Museum: Battle of the Smithsonian, The Mummy: Tomb of the Dragon Emperor and Yogi Bear, and he served as production manager on Hulk and Peter Pan and coordinator on A.I. Artificial Intelligence. Early in his career, Tlusty worked as a production assistant at American Zoetrope, working for its iconic filmmaker founders, Francis Ford Coppola and George Lucas. His VFX career began at Industrial Light & Magic, where he worked in several capacities on the Star Wars prequel trilogy, first as a VFX coordinator and later as production manager on the series. He is a member of the Producers Guild of America.

“Method has pursued intelligent growth, leveraging the strength across all of its studios, gaining presence in key regions and building on that to deliver high quality work on a massive scale,” says Tlusty. “Coming from the client side, I understand how important it is to have the flexibility to grow as needed for projects.”

Tlusty is based in Los Angeles and will travel extensively among Method’s global studios.

Mortal Engines: Weta creates hell on wheels

By Karen Moltenbrey

Over the years, Weta Digital has made a name for itself, creating vast imaginative worlds for highly acclaimed feature film franchises such as The Lord of the Rings and The Hobbit. However, for the recently released Mortal Engines, not only did the studio have to construct wide swaths of land the size of countries, but the crew also had to build supercities that move at head-spinning speed.

Mortal Engines, produced by Universal Pictures and MRC, takes place centuries after a cataclysmic event known as the Sixty Minute War destroys civilization as we know it, leaving behind few resources. Eventually, survivors learn to adapt, and a deadly, mobile society emerges whereby gigantic moving cities roam the earth, preying on smaller towns they hunt down across a landscape called the Great Hunting Ground, basically the size of Europe. It is now a period of pre-revival, as the earth begins to renew itself, and the survivors become nomads on wheels.

Eventually, London, a traction city, emerges at the top of this vicious food chain, consuming resources from other cities and towns it devours, including fuel, food and human labor. It’s a dog-eat-dog world. But there are those who want to end this vicious cycle; they are members of the Anti-Traction League, who advocate for static, self-sustaining homelands.

Based on a book by Philip Reeve, the film is directed by Oscar-winning visual effects artist Christian Rivers (King Kong). Simon Raby (Elysium, District 9) served as cinematographer, while Weta created the visual effects, led by Ken McGaugh, Kevin Andrew Smith and Luke Millar, with Dennis Yoo as animation supervisor.

Ken McGaugh

A New World Order
In all, Weta delivered 1,682 VFX shots for the feature film, most of which pertained to the environments.

How did this work compare to some of Weta’s other world builds? “I can’t speak as to The Hobbit because I didn’t work on that. But on The Lord of the Rings, New Zealand’s landscape was used for Middle-earth, so there was a lot of location work, and most of the world building was all in camera,” says McGaugh. “On Mortal Engines, because earth has been destroyed and manipulated by these giant cities moving over it, there’s nothing left that resembles the earth that we know. So, there was no location for us to shoot; we had to build it from scratch.”

How does one go about building such a world — and then setting it in motion? “In a book there is a lot of metaphor, but film has to be fully literal,” says Rivers. Fortunately, he and the crew had the vast experience as well as the technological genius to get it done.

Such a goal, however, required new rules and workflows, even for a veteran studio like Weta, which has a history of breaking new ground, especially when it comes to animated characters and amazing landscapes. Here, those diverse elements would converge like never before.

“We have quite a bit of experience doing computer-generated vehicles as well as digital environments, but most of our workflows assume that an environment is not a vehicle, that it doesn’t move. So trying to bridge that gap was a challenge. We had nothing that would allow us to do that when we first started,” says McGaugh. “So, we had to invent some new workflows and technology internally to allow us to bridge that gap so the animators could animate the city as if it were a vehicle, but we could build the city and dress it as if it were an environment.”

The Land
The environments in Mortal Engines are CG — built and animated using Autodesk’s Maya and composited in Foundry’s Nuke — with practical set pieces used for filming embedded into them.

In addition to the unique cities, there are some large tracts of land, including the Great Hunting Ground, scarred with massive tread marks left by traction cities over the centuries. Here, the once-organic environment had been reshaped and now appears man-made, but life is establishing a foothold in this once-barren landscape.

“It has all these layered plateaus with hard edges and embedded track shapes that we placed everywhere,” says McGaugh. “Our rule of thumb was that the higher the level of the plain, the more foliage there was, since it had been a long time since it had been driven over. However, on the lower level, at the bottom of the trenches, it was also green, but more marshy and full of reeds, since that is where water accumulates.”

Some survivors of the war pushed into the mountains and founded settlements there, rather than living a nomadic existence. One such settlement is Shan Guo in the East on the Asian Steppes, protected from the mobile cities by mountain ranges. In addition, there is a massive two-kilometer shield wall (6,561 feet high) situated between two of the mountains that protects Shan Guo and the static cities in the Himalayas. This environment alone was daunting to create, as it covers 5,000 square kilometers (over 3,000 square miles).

On one side of the shield wall, the environment is very lush, fertile and green, and the buildings influenced by Bhutan monasteries. On the other side of the wall, there is a lack of foliage, with the landscape strewn with decayed ruins of traction cities that have unsuccessfully attacked the wall. And while the shield wall is massive, it had to appear smaller in comparison to the mountains surrounding it.

While constructing these mountains, the Weta artists used available geographical data, increasing the resolution through erosion simulations that would shape the mountains more naturally. “That gave us extra detail that we could use to make it look more organic,” McGaugh says. The simulation also was used to embed the ruined traction cities into the crater environment as well as situate the crater environment into the surrounding landscape.

Cities on the Move
The moving cities can cover a great deal of ground in very little time, gouging and scarring the earth in their wake with deep crevices; above, airships dot the skies. The relentless ploughing of the traction cities over the landscape has driven layers and layers of mud, debris and waste into the ground. Weta re-created this effect by starting with a precisely coupled fluid simulation with multiple viscosities; this could accurately simulate the combination of solid and liquid layers of the mud. They then began laying tracks and eroding them, then laying more tracks and eroding, repeating the process until the desired result was achieved.
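To make that repeated lay-then-erode cycle concrete, here is a minimal toy sketch, not Weta’s coupled multi-viscosity fluid setup: a 1D heightfield gets a tread trench stamped into it, a crude smoothing pass stands in for erosion, and the loop repeats. All function names and numbers are illustrative assumptions only.

```python
# Toy heightfield: lay a track, erode, repeat (illustrative stand-in only).
def stamp_tracks(heights, center, width, depth):
    """Press a tread trench of the given width and depth into the heightfield."""
    floor = heights[center] - depth
    for i in range(max(0, center - width), min(len(heights), center + width)):
        heights[i] = min(heights[i], floor)
    return heights

def erode(heights, passes=3):
    """Crude erosion stand-in: average neighbouring samples a few times."""
    for _ in range(passes):
        heights = [heights[0]] + [
            (heights[i - 1] + heights[i] + heights[i + 1]) / 3.0
            for i in range(1, len(heights) - 1)
        ] + [heights[-1]]
    return heights

ground = [10.0] * 200
for center in (40, 90, 150):   # repeat: lay a track, then erode
    ground = stamp_tracks(ground, center, width=8, depth=3.0)
    ground = erode(ground, passes=2)
print(min(ground), max(ground))
```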

In the film, there are numerous homelands, including Airhaven, a fantastical city in the clouds with a jellyfish look that is home to the Anti-Tractionists.

“Airhaven didn’t have a lot of movement, so we didn’t have to use our new layout puppet technology. But when it crashes, that had to be animation-driven, so we built a lightweight puppet with a large section of the city on each piece of the puppet, so animators could choreograph the crash,” explains McGaugh. “Then they handed that off to our effects department, and they would simulate all the individual pieces breaking apart and exploding, and then add the explosions, fire and all the dynamics on the balloons and the cloth.”

So many of the traction cities have multiple moving parts that it was impractical to animate them by hand. Instead, Weta developed a tool called Gumby, a vehicle-ground interaction toolset that allows animators to move a city from point A to point B over uneven terrain along a curve. The Gumby system then made sure that all the wheels stuck to the ground, thereby driving the suspension system that causes the infrastructure to move appropriately.
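Gumby itself is proprietary, but the basic idea — drive a city along a curve and let the terrain dictate where the wheels sit — can be sketched in a few lines. The following Python is a hypothetical illustration only; the height function, path curve and wheel offsets are made-up stand-ins, not Weta’s tool.

```python
# Minimal sketch: move a "city" along a curve and stick each wheel to uneven terrain.
import math

def terrain_height(x, y):
    # Stand-in for a Great Hunting Ground heightfield.
    return 2.0 * math.sin(0.01 * x) + 1.5 * math.cos(0.013 * y)

def path_point(t):
    # Stand-in animation curve from point A to point B (t in [0, 1]).
    return (2000.0 * t, 300.0 * math.sin(3.0 * t))

def place_city(t, wheel_offsets):
    """Return the chassis position plus per-wheel ground contact points."""
    cx, cy = path_point(t)
    nx, ny = path_point(min(t + 1e-3, 1.0))
    heading = math.atan2(ny - cy, nx - cx)          # facing along the curve
    cos_h, sin_h = math.cos(heading), math.sin(heading)
    wheels = []
    for ox, oy in wheel_offsets:
        wx = cx + ox * cos_h - oy * sin_h
        wy = cy + ox * sin_h + oy * cos_h
        wheels.append((wx, wy, terrain_height(wx, wy)))  # wheel stuck to the ground
    cz = sum(w[2] for w in wheels) / len(wheels)    # chassis follows average wheel height
    return (cx, cy, cz), wheels

chassis, wheels = place_city(0.25, [(40, 20), (40, -20), (-40, 20), (-40, -20)])
print(chassis, wheels[0])
```

A suspension rig would then use the difference between each wheel’s ground height and the chassis height to drive the secondary motion described above.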

A dynamic caching system allowed for secondary bounce and wobble to occur on various pieces of a city in response to the motion from the Gumby system. “It wasn’t perfect, but it allowed for very complex animation in the blocking stage and made the motion more believable and closer to what the final version would look like,” explains McGaugh. Once blocking was approved, then the animators would refine the motion as needed.

London Lives!
According to McGaugh, the single biggest challenge was constructing London and executing it in a way that maintains its enormous size while keeping it in the realm of believability. “Concept artist Nick Keller came up with a design for London that looked like it could be self-supporting and was scalable, so we could make it as big as it needed to be in order to house 200,000 people, and when it moved, we could sell that as believable, too,” he explains.

London is the largest of the traction cities. It incorporates approximately 17 live-action sets and is a mile wide, a mile and a half long and over a half-mile high. It is divided into seven tiers, with life aboard London progressively more desirable farther up each tier.

“This is a place where the glass is gone but stone statues have survived,” Rivers says. “We decided to make anything we see in our world today archaeological and then skew and twist things from there.” As a result, some iconic landmarks are recognizable but have an altered appearance.

“The design had to lend itself to believability for being so large and moving, but it also had to evoke a sense of contemporary London through recognizable features, such as the Trafalgar Square lions acting as sentinels on top of the outriggers, so they’re visible from a distance,” McGaugh points out. London was then crowned with a reconstructed St. Paul’s Cathedral.

A contemporary feel was evoked through the architectural style. As McGaugh notes, London is known for its diverse and contrasting architectural styles juxtaposed against each other. So, the designers followed that style when laying out the buildings atop the digital London. “That was also carried out through the front façade of London and at a much larger scale, so that from a distance, you could still feel that diversity where it’s kind of rusty and brutalist at the bottom with a layer of architecture that is reminiscent of the houses of Parliament, and then is topped with chrome and steel construction shaped like a coat of arms,” he adds.

Because of this diversity of architectural styles, the group was able to source from its library of existing buildings — whether Victorian, Georgian, contemporary office buildings, tower blocks, row houses, Buckingham Palace — and mix them together without having to maintain uniformity from building to building.

But with so much detail, it became prohibitively difficult to render, and that’s where Weta’s Cake technology came into play — which used an intelligent way of breaking down geometric and material detail into a format that could be streamed into the renderer, using just the level of detail required. “Before that, it wasn’t viable to render London,” says McGaugh. “But Cake allowed us to process all the data into a format that enabled us to render it, and render it quite efficiently.” Rendering was done within Weta’s proprietary Manuka renderer.
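Cake is in-house Weta technology, but the underlying principle — only stream the level of detail a shot actually needs — resembles a familiar screen-space LOD test. The sketch below is a hypothetical illustration of that principle, not Cake’s actual format or API; the camera values and pixel thresholds are assumptions.

```python
# Minimal sketch: pick a level of detail from an asset's projected screen size.
def pick_lod(bound_radius_m, distance_m, focal_mm=35.0, sensor_mm=24.0,
             image_height_px=2160, lod_thresholds_px=(400, 100, 20)):
    """Return an LOD index (0 = full detail) based on projected screen height."""
    if distance_m <= 0:
        return 0
    # Pinhole projection of the asset's bounding sphere onto the sensor.
    projected_mm = (2.0 * bound_radius_m) * focal_mm / distance_m
    projected_px = projected_mm / sensor_mm * image_height_px
    for lod, threshold in enumerate(lod_thresholds_px):
        if projected_px >= threshold:
            return lod
    return len(lod_thresholds_px)  # coarsest proxy

# Example: a 30m-wide building cluster seen from 2km away gets a coarse LOD.
print(pick_lod(bound_radius_m=15.0, distance_m=2000.0))
```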

Lighting was also tricky, as the team was following the lighting direction from Raby, who used backlighting — which is not easy to do in CGI when using hard edges, especially when there is shiny glass and metal involved. As a result, the CG lighters, using the studio’s Foundry Katana-based pipeline, had to do tests on almost every shot to find the appropriate angle that sold the backlighting and kept the visuals interesting and not too flat, while maintaining continuity with the camera shots.

London on the Move
A city constantly on the move, London can travel at approximately 300 kilometers (186 miles) per hour, bolstered by massive engines. While that speed sounds ridiculously fast according to real-world physics, it was necessary to hold audiences’ attention, as physics and cinema were often at odds on the film. “There was a lot of testing, and we tried 100 kilometers per hour when London is chasing down [the mining traction city of] Salthook across a vast landscape, but it looked like a couple of snails racing. It was too boring,” says McGaugh. “Indeed, 300 kilometers sounds ludicrous, and if you think about it, it is. But that is what allowed us to keep the chase exciting while constantly selling that there is movement.”

Indeed, London had to move faster than physics would allow, yet just how fast depended on the camera shot. Nevertheless, this wreaked havoc on the effects that simulated natural phenomena, such as dust. The key, however, was to use visual cues to make sure the cities felt massive and other cues to make sure audiences were not distracted by the fact that the cities are moving so fast.

When constructing the massive city of London, Weta devised the concept of so-called “lily pads,” representing 113 sections of London. Each was rigged and animated independently and contained millions of components that had to be tracked and moved. Each lily pad was constructed modularly, enabling artists to add clusters of buildings, parks, shops and so forth on each platform. More and more detail was then added to areas as needed.

These lily pads were supported by complex suspension systems for individual movement; at times there was some inter-movement among them, as well. “[The movement] was pretty subliminal at times, but if it wasn’t there, you’d have noticed it and everything would have felt static and locked,” McGaugh says.

Shrike
While Weta’s work on the film was heavily focused on environments, Mortal Engines does contain one digital character, Shrike, who had raised the movie’s heroine, Hester Shaw, after her mother’s murder. Half-man/half-machine, Shrike was a dead soldier resurrected by technology. He stands seven feet tall and weighs close to 1,000 pounds.

Shrike’s anatomy is not human — he has extra appendages and extra mechanical bits that had to be rigged to move differently than a typical human’s. “It was determined early on that we could not use motion capture because we needed him to be inhuman, so we had to invest quite a bit of effort into finding his motion through keyframe techniques,” McGaugh notes.

Shrike’s face comprises metal parts and human skin. To achieve a realistic tug and stretch of the skin against the metal, Weta developed a custom facial-muscle rig so animators could use the visible muscles and skin to allow him to emote in some particularly dramatic moments in the movie, inspired by the performance from actor Stephen Lang.

A New Day
While the scale of the world building for Mortal Engines was not at the level of The Hobbit, it was not without big challenges for the VFX veterans at Weta. Initially, the concept of massive cities on the move was difficult to wrap one’s head around. But, as always, Weta’s artists and animators were able to bring that unique visual to life in a realistic way.

Now with Mortal Engines in theaters, the studio remains on the move with a number of other mega projects in the works, including the Avatar sequels and more on the big screen as well as the final season of Game of Thrones for the small screen. All resulting in more expansive, unique worlds brought to cinematic life.


Karen Moltenbrey is a veteran VFX and post writer.

Behind the Title: We Are Royale CD Chad Howitt

NAME: Chad Howitt

COMPANY: We Are Royale in Los Angeles

CAN YOU DESCRIBE YOUR COMPANY?
We Are Royale is a creative and digital agency looking to partner with brands to create unique experiences across platforms. In the end, we make pretty things for lots of different applications, depending on the creative problem our clients are looking to solve. We’re a full-service production studio that directs live-action commercials, creates full 3D worlds, designs 2D character spots, and develops immersive AR and VR experiences.

WHAT’S YOUR JOB TITLE?
Creative Director

WHAT DOES THAT ENTAIL?
Anything and everything needed to get the job done. On the service side, I’ll work directly with clients and agencies to address their wide variety of needs. So whether that’s creating an idea from scratch or curating a look around an already developed script, I try to figure out how we can help.

Then in-house, I’ll work with our talented team of art directors, CG supervisors and producers to help execute those ideas.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
While the title stays the same, the responsibilities vary by location, person and company culture. So don’t think there’s a hard-and-fast rule about what a creative director is and does.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Seeing a finished project out in the world knowing the hard work the team put in to get it there. Whether it’s on TV, in a space as a part of an installation or online as a part of a pre-roll… it’s a proud moment whenever I see it in the wild.

WHAT’S YOUR LEAST FAVORITE?
Seeing the results of a job we lost or had to pass on knowing that the creative we were planning will never see the light of day.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I’d probably be in the video game industry, but that wasn’t really a feasible career path back then.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
As a little kid, I was obsessed with drawing and computers. So merging those into a profession always seemed like the most natural course. That said, as an LA native, working on film sets just seemed like what out-of-towners wanted to do. So I never saw that coming.

Under Armour spot for its UA HOVR running line

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I’ve wrapped a few projects with Under Armour, a trio of spots for NASDAQ, and a promo for Billy Bob Thornton’s series on Amazon called Goliath.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
It’d probably be the first project I worked on at We Are Royale, which was an Under Armour spot for its UA HOVR running shoe line. It allowed me to work with merging live-action, CG and beautiful type design.

NAME THREE THINGS YOU CAN’T LIVE WITHOUT.
Fire, indoor plumbing and animal husbandry

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
The last social media I had was MySpace, unless you count LinkedIn…which you really shouldn’t.

CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
Some of my current go-to tracks are “We Were So Young” by Hammock, “Galang” by Vijay Iyer Trio, “Enormous” by Llgl Tndr, “Almost Had to Start a Fight” by Parquet Courts and “Pray” by Jungle.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I stress eat. Cake, cookies and pizza make most problems go away. Diabetes could become a new problem, but that’s tomorrow.

Artifex provides VFX for Jordan Peele’s Weird City

Vancouver-based VFX house Artifex Studios was the primary visual effects vendor for Weird City, Oscar-winner Jordan Peele’s first foray into scripted OTT content. The dystopian sci-fi/comedy Weird City — from Peele and Charlie Sanders — premieres on YouTube Premium on February 13. A first trailer has been released, and it features a variety of Artifex’s visual effects work.

Artifex’s CG team created the trailer’s opening aerial shots of the futuristic city. Additionally, Artifex was tasked with the video/holographic screens, user interfaces, graphics, icons and other surfaces that the characters interact with.

Artifex’s team, led by VFX supervisor Rob Geddes, provided 250 visual effects shots in all, including Awkwafina’s and Yvette Nicole Brown’s outfit swapping (our main image), LeVar Burton’s tube traveling and a number of additional environment shots.

Artifex called on Autodesk Maya, V-Ray, Foundry’s Nuke and Adobe Photoshop, along with a mix of Dell, HP and generic PC workstations and Dell and HP render nodes. They also used Side Effects Houdini for procedural generation of the “below the line” buildings in the opening city shot. Qumulo was called on for storage.

 

VFX editor Warren Mazutinec on life, work and Altered Carbon

By Jeremy Presner

Long-time assistant editor Warren Mazutinec’s love of film began when he saw Star Wars as an eight-year-old in a small town in Edmonton, Alberta. Unlike many other Lucas-heads, however, this one got to live out his dream grinding away in cutting rooms from Vancouver to LA working with some of the biggest editors in the galaxy.

We met back in 1998 when he assisted me on the editing of the Martin Sheen “classic” Voyage of Terror. We remain friends to this day. One of Warren’s more recent projects was Netflix’s VFX-heavy Altered Carbon, which got a lot of love from critics and audiences alike.

My old friend, who is now based in Vancouver, has an interesting story to tell, moving from assistant editor to VFX editor working on films like Underworld 4, Tomorrowland, Elysium and Chappie, so I threw some questions at him. Enjoy!

Warren Mazutinec

How did you get into the business?
I always wanted to work in the entertainment industry, but that was hard to find in Alberta. No film school-type programs were even offered, so I took the closest thing at a local college: audiovisual communications. While there, I studied photography, audio and video, but nothing like actual filmmaking. After that I attended Vancouver Film School. After film school, and with the help of some good friends, I got an opportunity to be a trainee at Shavick Entertainment.

What was it like working at a “film factory” that cranked out five to six pictures a year?
It was fun, but the product ultimately became intolerable. Movies for nine-year-olds can only be so interesting… especially low-budget ones.

What do your parents think of your career option?
Being from Alberta, everyone thought it wasn’t a real job — just a Hollywood dream. It took some convincing; my dad still tells me to look for work between gigs.

How did you learn Avid? Were you self-taught?
I was handed the manual by a post supervisor on day one. I never read it. I just asked questions and played around on any machine available. So I did have a lot of help, but I also went into work during my free time and on weekends to sit and learn what I needed to do.

Over the years I’ve been lucky enough to have cool people to work with and to learn with and from. I did six movies before I had an email address, more before I even owned a computer.

As media strayed away from film into digital, how did your role change in the cutting room? How did you refine your techniques with a changing workflow?
My first non-film movie was Underworld 4. It was shot with a Red One camera. I pretty much lied and said I knew how to deal with it. There was no difference really; just had to say goodbye to lab rolls, Keykode, etc. It was also a 3D stereo project, so that was a pickle, but not too hard to figure out.

How did you figure out the 3D stereo post?
It was basically learning to do everything twice. During production we really only played back in 3D for the novelty. I think most shows are 3D-ified in post. I’m not sure though, I’ve only done the one.

Do you think VR/AR will be something you work with in the future?
Yes, I want to be involved in VR at some point. It’s going to be big. Even just doing sound design would be cool. I think it’s the next step, and I want in.

Who are some of your favorite filmmakers?
David Lynch is my number one, by far. I love his work in all forms. A real treasure for sure. David Fincher is great too. Scorsese, Christopher Nolan. There are so many great filmmakers working right now.

Is post in your world constantly changing, or have things more or less leveled off?
Both. But usually someone has dailies figured out, so Avid is pretty much the same. We cut in DNxHD 115 or DNxHD 36, so nothing like 4K-type stuff. Conform at the end is always fun, but there are tests we do at the start to figure it all out. We are rarely treading in new water.

What was it like transitioning to VFX editor? What tools did you need to learn to do that role?
FileMaker. And Jesus, son, I didn’t learn it. It’s a tough beast but it can do a lot. I managed to wrangle it to do what I was asked for, but it’s a hugely powerful piece of software. I picked up a few things on Tomorrowland and went from there.

I like the pace of the VFX editor. It’s different than assisting and is a nice change. I’d like to do more of it. I’d like to learn and use After Effects more. On the film I was VFX editor for, I was able to just use the Avid, as it wasn’t that complex. Mostly set extensions, etc.

How many VFX shot revisions would a typical shot go through on Elysium?
On Elysium, the shot version numbers got quite high, but part of that would be internal versioning by the vendor. Director Neil Blomkamp is a VFX guy himself, so he was pretty involved and knew what he wanted. The robots kept looking cooler and cooler as the show went on. Same for Chappie. That robot was almost perfect, but it took a while to get there.

You’ve worked with a vast array of editors, including Walter Murch, Lee Smith, Julian Clarke, Nancy Richardson and Bill Steinkamp. Can you talk about that, and have any of them let you cut material?
I’ll assemble scenes if asked to, just to help the editor out so he isn’t starting from scratch. If I get bored, I start cutting scenes as well. On Altered Carbon, when Julian (Clarke) was busy with Episodes 2 and 3, I’d try to at least string together a scene or two for Episode 8. Not fine-cutting, mind you, just laying out the framework.

Walter asked a lot of us — the workload was massive. Lee Smith didn’t ask for much. Everyone asks for scene cards that they never use, ha!

Walter hadn’t worked on the Avid for five years or so prior to Tomorrowland, so there was a lot of him walking out of his room asking, “How do I?” It was funny because a lot of the time I knew what he was asking, but I had to actually do it on my machine because it’s so second nature.

What is Walter Murch like in the cutting room? Was learning his organizational process something you carried over into future cutting rooms?
I was a bit intimidated prior to meeting him. He’s awesome though. We got along great and worked well together. There was Walter, a VFX editor and four assistants. We all shared in the process. Of course, Walter’s workflow is unlike any other so it was a huge adjustment, but within a few weeks we were a well-oiled machine.

I’d come in at 6:30am to get dailies sorted and would usually finish around lunch. Then we’d screen in our theater and make notes, all of us. I really enjoyed screening the dailies that way. Then he would go into his room and do his thing. I really wish all films followed his workflow. As tough as it is, it all makes sense and nothing gets lost.

I have seen photos with the colored boxes and triangles on the wall. What does all that mean, and how often was that board updated?
Ha. That’s Walter’s own version of scene cards. It makes way better sense. The colors and shapes mean a particular thing — the longer the card the longer the scene. He did all that himself, said it helps him see the picture. I would peek into his room and watch him do this. He seemed so happy doing it, like a little kid.

Do you always add descriptions and metadata to your shots in Avid Media Composer?
We add everything possible. Usually there is a codebook the studios want, so we generate that with FileMaker on almost all the bigger shows. Walter’s is the same, just way bigger and better. It made the VFX database look like a toy.

What is your workflow for managing/organizing footage?
A lot of times you have to follow someone else’s procedure, but if left to my own devices I try to make it the simplest it can be so anyone can figure out what was done.

How do you organize your timeline?
It’s specific to the editor, but I like to use as many audio tracks as possible and as few video tracks as possible, but when it’s a VFX-heavy show, that isn’t possible due to stacking various shot versions.

What did you learn from Lee Smith and Julian Clarke?
Lee Smith is a suuuuuper nice guy. He always had great stories from past films and he’s a very good editor. I’m glad he got the Oscar for Dunkirk; he’s done a lot of great work.

Julian is also great to work with. I’ve worked with him on Elysium, Chappie and Altered Carbon. He likes to cut with a lot of sound, so it’s fun to work with him. I love cutting sound, and on Altered Carbon we had over 60 tracks. It was an alternating stereo setup and we used all the tracks possible.

Altered Carbon

It was such a fun world to create sound for. Everything that could make a sound we put in. We also invented signature sounds for the tech we hoped they’d use in the final. And they did for some things.

Was that a 5.1 temp mix? Have you ever done one?
No. I want to do a 5.1 Avid mix. Looks fun.

What was the schedule like on Altered Carbon? How was that different than some of the features you’ve worked on?
It was six-day weeks and 12 hours a day. Usually one week per month I’d trade off with the 2nd assistant and she’d let me have an actual weekend. It was a bit of a grind. I worked on Episodes 2, 3 and 8, and the schedules for those were tight, but somehow we got through it all. We had a great team up here for Vancouver’s editorial. They were also cutting in LA as well. It was pretty much non-stop editing the whole way through.

How involved was Netflix in terms of the notes process? Were you working with the same editors on the episodes you assisted?
Yes, all episodes were with Julian. First it went through Skydance notes, then Netflix. Skydance usually had more as they were the first to see the cuts. There were many versions for sure.

What was it like working with Neil Blomkamp?
It was awesome. He makes cool films, and it’s great to see footage like that. I love shooting guns, explosions, swords and swearing. I beat him in ping-pong once. I danced around in victory and he demanded we play again. I retired. One of the best environments I’ve ever worked in. Elysium was my favorite gig.

What’s the largest your crew has gotten in post?
Usually one or two editors, up to four assistants, a PA, a post super — so eight or nine, depending.

Do you prefer working with a large team or do you like smaller films?
I like the larger team. It can all be pretty overwhelming, and the more people there are to help out, the easier it is to get through. The more the merrier!

Altered Carbon

How do you handle long-ass days?
Long days aren’t bad when you have something to do. On Altered Carbon I kept a skateboard in my car for those times. I just skated around the studio waiting for a text. Recently I purchased a Onewheel (a one-wheeled skateboard) and plan to use it to commute to work as much as possible.

How do you navigate the politics of a cutting room?
Politics can be tricky. I usually try to keep out of things unless I’m asked, but I do like to have a sit down or a discussion of what’s going on privately with the editor or post super. I like to be aware of what’s coming, so the rest of us are ready.

Do you prefer features to TV?
It doesn’t matter anymore because the good filmmakers work in both mediums. It used to be that features were one thing and TV was another, with less complex stories. Now that’s different and at times it’s the opposite. Features usually pay more though, but again that’s changing. I still think features are where it’s at, but that’s just vanity talking.

Sometimes your project posts in Vancouver but moves to LA for finishing. Why? Does it ever come back?
Mostly I think it’s because that’s where the director/producers/studio lives. After it’s shot everyone just goes back home. Home is usually LA or NY. I wish they’d stay here.

How long do you think you’ll continue being an AE? Until you retire? What age do you think that’ll be?
No idea; I just want to keep working on projects that excite me.

Would you ever want to be an editor or do you think you’d like to pivot to VFX, or are you happy where you are?
I only hope to keep learning and doing more. I like the VFX editing, I like assisting and I like being creative. As far as cutting goes, I’d like to get on a cool series as a junior editor or at least start doing a few scenes to get better. I just want to keep advancing, I’d love to do some VR stuff.

What’s next for you project wise?
I’m on a Disney show called Timmy Failure. I can’t say anything more at this point.

What advice do you have for other assistant editors trying to come up?
It’s going to take a lot longer than you think to become good at the job. Being the only assistant does not make you a qualified first assistant. It took me 10 years to get there. Also you never stop learning, so always be open to another approach. Everyone does things differently. With Murch on Tomorrowland, it was a whole new way of doing things that I had never seen before, so it was interesting to learn, although it was very intimidating at the start.


Jeremy Presner is an Emmy-nominated film and television editor residing in New York City. Twenty years ago, Warren was AE on his first film. Since then he has cut such diverse projects as Carrie, Stargate Atlantis, Love & Hip Hop and Breaking Amish.

VFX studio Electric Theatre Collective adds three to London team

London visual effects studio Electric Theatre Collective has added three to its production team: Elle Lockhart, Polly Durrance and Antonia Vlasto.

Lockhart brings with her extensive CG experience, joining from Touch Surgery where she ran the Johnson & Johnson account. Prior to that she worked at Analog as a VFX producer where she delivered three global campaigns for Nike. At Electric, she will serve as producer on Martini and Toyota.

Vlasto joins Electric working on clients such as Mercedes, Tourism Ireland and Tui. She joins from 750MPH where, over a four-year period, she served as producer on Nike, Great Western Railway, VW and Amazon to name but a few.

At Electric, Polly Durrance will serve as producer on H&M, TK Maxx and Carphone Warehouse. She joins from Unit, where she helped launch their in-house Design Collective and worked with clients such as Lush, Pepsi and Thatchers Cider. Prior to Unit, Polly was at Big Buoy, where she produced work for Jaguar Land Rover, giffgaff and Red Bull.

Recent projects at the studio, which also has an office in Santa Monica, California, include Tourism Ireland Capture Your Heart and Honda Palindrome.

Main Image: (L-R) Elle Lockhart, Antonia Vlasto and Polly Durrance.

Rodeo VFX supe Arnaud Brisebois on the Fantastic Beasts sequel

By Randi Altman

Fantastic Beasts: The Crimes of Grindelwald, directed by David Yates and written by J.K. Rowling, is a sequel to 2016’s Fantastic Beasts and Where to Find Them. It follows Newt Scamander (Eddie Redmayne) and a young Albus Dumbledore (Jude Law) as they attempt to take down the dark wizard Gellert Grindelwald (Johnny Depp).

Arnaud Brisebois

As you can imagine, the film features a load of visual effects, and once again the team at Rodeo FX was called on to help. Their work included establishing the period in which the film is set and helping with the history of the Obscurus, Credence Barebone, and more.

Rodeo FX visual effects supervisor Arnaud Brisebois and team worked with the film’s VFX supervisors — Tim Burke and Christian Manz — to create digital environments, including detailed recreations of Paris in the 1920s and iconic wizarding locations like the Ministry of Magic.

Beyond these settings, the Montreal-based Brisebois was also in charge of creating the set pieces of the Obscurus’ destructive powers and a scene depicting its backstory. In all, they produced approximately 200 shots over a dozen sequences. While Brisebois visited the film’s set in Leavesden to get a better feel of the practical environments, he was not involved in principal photography.

Let’s find out more…

How early did you get involved, and how much input did you have?
Rodeo got involved in May 2017, at the time mainly working on pre-production creatures, design and concept art. I had a few calls with the film’s VFX supervisors, Tim Burke and Christian Manz, to discuss creatures and main directive lines for us to play with. From there we tried various ideas.
At that moment in pre-production, the essence of what the creatures were was clear, but their visual representation could really swing between extremes. That was the time to invent, study and propose directions for design.

Can you talk about creating the Ministry of Magic, which was partially practical, yes?
Correct, the London Ministry of Magic was indeed partially practically built. The partial set in this case meant a simple incurved corridor with a ceramic tiled wall. We still had to build the whole environment in CG in order to directly extend that practical set, but, most importantly, we extended the environment itself, with its immense circular atrium filled with thousands of busy offices.

For this build, we were provided with original Harry Potter set plans from production designer Stuart Craig, as well as plan revisions meant specifically for Crimes of Grindelwald. We also had access to LIDAR scans and cross-polarized photography from areas of the Harry Potter tour in Leavesden, which was extremely useful.

Every single architectural element was precisely built as an individual unit, and each unit was composed of individual pieces. The single office variants were procedurally laid out on a flat grid over the set plan elevations and then wrapped as a cylinder using an expression.
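As a rough illustration of that flat-grid-to-cylinder wrap — a hypothetical stand-in, not Rodeo’s actual expression or set dimensions — the mapping amounts to treating each grid column as an angle around the atrium and each row as a floor height:

```python
# Minimal sketch: wrap a flat grid of office units onto a cylindrical atrium wall.
import math

def wrap_grid_to_cylinder(cols, rows, radius, floor_height):
    """Map a flat cols x rows grid of units onto a cylinder; one row per floor."""
    positions = []
    for row in range(rows):
        for col in range(cols):
            angle = 2.0 * math.pi * col / cols
            x = radius * math.cos(angle)
            y = radius * math.sin(angle)
            z = row * floor_height
            positions.append((x, y, z, angle))  # angle doubles as the unit's facing
    return positions

# Example numbers (assumed): 120 offices per floor, 40 floors, 60m radius, 4m floors.
units = wrap_grid_to_cylinder(cols=120, rows=40, radius=60.0, floor_height=4.0)
print(len(units), units[0])
```

Keeping the layout procedural like this is what allows a grid spacing or radius change to propagate through every unit at once, which is the kind of late-stage flexibility described below.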

The use of a procedural approach for this asset allowed for faster turnarounds and for changes to be made, even in the 11th hour. A crowd library was built to populate the offices and various areas of the Ministry, helping give it life and support the sense of scale.

So you were able to use assets from previous films?
What really links these movies together is production designer Stuart Craig. This is definitely his world, at least in visual terms. Also, as with all the Potter films, there are a large number of references and guidelines available for inspiration. This world has its own mythology, history and visual language. One does not need to look for long before finding a hint, something to link or ground a new effect in the wizarding world.

What about the scenes involving the Obscurus? Was any of the destruction it caused practical?
Apart from a few fans blowing a bit of wind on the actors, all destruction was full-frontal CG. A complex model of Irma’s house was built with precise architectural details required for its destruction. We also built a wide library of high-resolution hero debris, which was scattered on points and simulated for the very close-up shots. In the end, only the actors were preserved from live photography.

What was the most challenging sequence you worked on?
It was definitely Irma’s death. This sequence involved such a wide variety of effects — cloth and RBD levitation, tearing cloth, huge RBD simulations and, of course, the Obscurus itself, which is a very abstract and complex cloth setup driving FLIP simulations. The challenge also came from shot values, which meant everything we built or simulated had to hold up for tight close-ups, as well as wide shots.

Can you talk about the tools you used for VFX, management and review and approval?
All our tracking and review is done in Autodesk Shotgun. Artists worked up versions that they would then submit for dailies. All these submissions got in front of me at one point or another, and I then reviewed them and entered notes and directives to guide artists in the right direction.
For a project the size of Crimes of Grindelwald, over the course of 10 months, I reviewed and commented on approximately 6,000 versions for about 500 assets and 200 shots.

We are working on a mainly Maya-based pipeline, using it for modeling, rigging and shading. ZBrush is of course our main tool for organic modeling. We mostly use Mari and Substance Designer for textures. FX and CFX are handled in Houdini, and our lighting pipeline is Katana-based, using Arnold as the renderer. Our compositing pipeline is Nuke, with a little use of Flame/Flare for very specific cases. We obviously have proprietary tools that help us boost these great packages’ potential and offer custom solutions.

How did the workflow differ on this film from previous films?
It didn’t really differ. Working with the same team and the same crew, it really just felt like a continuation of our collaboration. These films are great to work on, not only because of their subject matter, but also thanks to the terrific people involved.

VFX Supervision: The Coens’ Western The Ballad of Buster Scruggs

By Randi Altman

The writing and directing duo of Joel and Ethan Coen have taken on the American Western with their new Netflix film, The Ballad of Buster Scruggs. This offering features six different vignettes that follow outlaws and settlers on the American frontier.

It stars the Coen brothers’ favorite Tim Blake Nelson as Buster, along with Liam Neeson, James Franco, Brendan Gleeson and many other familiar faces, even Tom Waits! It’s got dark humor and a ton of Coen quirkiness.

Alex Lemke (middle) on set with the Coen brothers.

For their visual effects needs, the filmmakers turned to New York-based East Side Effects co-founders and VFX supervisors Alexander Lemke and Michael Huber to help make things look authentic.

We reached out to visual effects supervisors Lemke and Huber to find out more about their process on the film and how they worked with these acclaimed filmmakers. East Side Effects created two-thirds of the visual effects in-house, while other houses, such as The Mill and Method, provided shots as well.

How many VFX shots were there in total?
Alexander Lemke: In the end, 704 shots had digital effects in them. This has to be a new record for the Coens. Joel at one point jokingly called it their “Marvel movie.”

How early did you get involved? Can you talk about that process?
Michael Huber: Alex and I were first approached in January 2017 and had our first meetings shortly thereafter. We went through the script with the Coens and designed what we call a “VFX bible,” which outlined how we thought certain effects could be achieved. We then started collecting references from other films or real-life footage.

Did you do previs? 
Lemke: The Coens have been doing movies for so long in their own way that previs never really became an issue. For the Indian battles, we tried to interest them in the Ncam virtual camera system in combination with pre-generated assets, but that is not their way of doing a film.

The whole project was storyboarded by J. Todd Anderson, who has been their go-to storyboard guy since Raising Arizona. These storyboards gave a pretty good indication of what to expect, but there were still a lot of changes due to the nature of the project, such as weather and shooting with animals.

What were some of the challenges of the process and can you talk about creating the digital characters that were needed?
Huber: Every story had its own challenge, ranging from straightforward paintouts and continuity fixes to CG animals and complex head replacements using motion control technology. In order to keep the work as close to the directors as possible, we assembled a group of artists to serve as an extended in-house team, creating the majority of shots while also acting as a hub for external vendor work.

In addition, a color workflow using ACES and FilmLight Baselight was established to match VFX shots seamlessly to the dailies look established by cinematographer Bruno Delbonnel and senior colorist Peter Doyle. All VFX pulls were handled in-house.

Lemke: The Coens like to keep things in-camera as much as possible, so animals like the owl in “All Gold Canyon” or the dog in “Gal” were real. Very early on it was clear that some horse falls wouldn’t be possible as a practical stunt, so Joel and Ethan had a reel compiled with various digital horse stunts — including the “Battle of the Bastards” from Game of Thrones, which was done by Iloura (now Method). We liked that so much that we decided to just go for it and reach out to these guys, and we were thrilled when we got them on board for this. They did the “dog-hole” horse falls in “The Gal Who Got Rattled,” as well as the carriage horses in “Mortal Remains.”

Huber: For the deer in “All Gold Canyon,” the long-time plan was to shoot a real deer against bluescreen, but it became clear that we might not get the very specific actions Joel and Ethan wanted to see. They were constantly referring to the opening of Shane, which has this great shot of the titular character appearing through the antlers of a deer. So, it became more and more clear it would have to be a digital solution, and we were very happy to get The Mill in New York to work on that for us. Eventually, they would also handle all the other critters in the opening sequence.

Can you talk about Meal Ticket’s “artist” character, who is missing limbs?
Lemke: The “Wingless Thrush” — as he is referred to on a poster in the film — was a combined effort of the art department, special effects, costume design, VFX and, of course, actor Harry Melling’s incredible stamina. He was performing this poetry while standing in a hole in the ground with his hands behind his back, and went for it take after take, sometimes in the freezing cold.

Huber: It was clear that 98% of all shots would be painting out his arms and legs, so SFX supervisor Steve Cremin had to devise a way to cut holes into the set and his chair to make it appear he was resting on his stumps. Our costume designer, Mary Zophres, had the great idea of having him wear a regular shirt where the long sleeves were just folded up, which helped with hiding his arms. He wasn’t wearing any blue garment, just black, which helped avoid any unnecessary color spill on the set.

Alex was on set to make sure we would shoot clean plates after each setup. Luckily, the Coen brothers’ approach to these shots was really focusing on Harry’s performance in long locked-off takes, so we didn’t have to deal with a lot of camera motion. We also helped Harry’s look by warping his shoulders closer to his body in some shots.

Was there a particular scene with this character that was most challenging or that you are most proud of?
Lemke: While most of the paintout shots were pretty straightforward — we just had to deal with the sheer amount of shots and edit changes — the most challenging part was when Liam Neeson carries Harry in a backpack up the stairs of a brothel. He then puts him on the ground and eventually turns him away from the “action” that is about to happen.

We talked about different approaches early on. At some point, a rig was considered to help with him being carried up the stairs, but this would have meant an enormous amount of paint work, not to mention the setup time on a very tight shooting schedule. A CG head might have worked for the stairs, but for the long close-up shots of Harry — both over a minute long, and only with very subtle facial expressions — it would have been cost prohibitive and maybe not successful in the end. So a head replacement seemed like the best solution, which comes with its own set of problems. In our case, that meant shooting a head element of Harry that would match exactly what the dummy on Liam’s back and on the ground was doing in the production plates.

We came up with a very elaborate setup, where we would track the backpack and a dummy in the live-action photography in 3DEqualizer. We then reengineered this data into Kuper move files that would drive a motion control/motion base combo.

Basically, Harry would sit on a computerized motion base that would do the turning motion so he could react to being pushed around. This happened while the motion control camera would take care of all the translations. This also meant our DP Bruno had to create animated lighting for the staircase shot to make the head element really sit in the plate.

We worked with Pacific Motion for the motion control; Mike Leben was our operator. Nic Nicholson took care of the NAC Effects motion base. Special thanks goes out to Christoph Gaudl for his camera and object tracking, Stefan Galleithner for taking on the task of converting all that data into something the camera and base would understand, and Kelly Chang and Mike Viscione for on-set Maya support.

Of course, you only get an element that works 80% of the way — the rest was laborious compositing work. Since we put the motion base to its speed limits on the staircase shot, we actually had to shoot it half speed and then speed it up in post. This meant a lot of warping/tracking was needed to make sure there was no slippage.

Michael Huber

The dummy we used for the live-action photography didn’t have any breathing movement in it, so we used parts of Harry’s bluescreen plates as a guideline of how his chest should move. These tricky tasks were expertly performed mainly by Danica Parry, Euna Kho and Sabrina Tenore.

Can you talk about how valuable it is being on set?
Huber: It is most valuable to be on set when the call sheet calls for a greenscreen while we really need a bluescreen! But joking aside, Joel and Ethan were very happy to have someone there all the time during the main shoot in case something came up, which happened a lot because we were shooting outdoors so much and we were dependent on the weather.

For the opening shot of Buster riding through Monument Valley, they were thinking of a very specific view — something they had seen in a picture on the Internet. Through Google Maps and research, Alex was able to find out the exact location where that picture was taken. So, on a weekend when we weren’t shooting, he packed up his family and drove up to the Valley to shoot photographs that would serve as the basis for the matte painting for the first shot of the film — instead of going there with a whole crew.

Another instance being on set helped would be the scene with Tom Waits in the tree — the backgrounds for these bluescreen shots were a mixture of B camera and Alex’s location photography while in Colorado. Same goes for the owl tree backgrounds.

What tools did East Side use on the film?
Huber: For software we called on Foundry Nuke (X & Studio), Boris FX Mocha Pro and Side Effects Houdini. For hardware we used HP and SuperMicro workstations running Linux. There were also proprietary tools, such as Houdini digital assets for blood simulations.

We were using Autodesk Shotgun with a proprietary connection to Nuke that handled all our artist interaction and versioning, including automatically applying the correct Baselight grade when creating a version. This also allowed us to use the RV-Shotgun integration for reviewing.

Can you talk about the turnaround times and deadlines?
Lemke: Working on a Coen brothers film means you don’t have a lot of things you normally have to deal with — studio screenings, trailers, and such. At the same time, they insisted on working through the stories chronologically, so that meant that the later segments would come in late in the schedule. But, it is always a great experience working with filmmakers who have a clear vision and know what they are doing.

Asahi beer spot gets the VFX treatment

A collaboration between The Monkeys Melbourne, In The Thicket and Alt, a newly released Asahi campaign takes viewers on a journey through landscapes built around surreal Japanese iconography. Watch Asahi Super Dry — Enter Asahi here.

From script to shoot — a huge operation that took place at Sydney’s Fox Studios — director Marco Prestini and his executive producer Genevieve Triquet (from production house In The Thicket) brought on the VFX team at Alt to help realize the creative vision.

The VFX team at Alt (which has offices in Sydney, Melbourne and Los Angeles) worked with Prestini to help design and build the complex “one shot” look, with everything from robotic geishas to a gigantic CG squid in the mix, alongside a seamless blend of CG set extensions and beautifully shot live-action plates.

“VFX supervisor Dave Edwards and the team at Alt, together with my EP Genevieve, have been there since the very beginning, and their creative input and expertise were key in every step of the way,” explains Prestini. “Everything we did on set was the result of weeks of endless back and forth on technical previz, a process that required pretty much everyone’s input on a daily basis and that was incredibly inspiring for me to be part of.”

Dave Edwards, VFX supervisor at Alt, shares: “Production designer Michael Iacono designed sets in 3D, with five huge sets built for the shoot. The team then worked out camera speeds for timings based on these five sets and seven plates. DP Stefan Duscio would suggest rigs and mounts, which our team was then able to test in previs to see if they would work with the set. During previs, we worked out that we couldn’t get the resolution and the required frame rate to shoot the high frame rate samurais, so we had to use the Alexa LF. Of course, that also helped Marco, who wanted minimal lens distortion, as it allowed a wide field of view without the distortion of normal anamorphic lenses.”

One complex scene involves a character battling a gigantic underwater squid, which was done via a process known as “dry for wet” — a film technique in which smoke, colored filters and/or lighting effects are used to simulate a character being underwater while filming on a dry stage. The team at Alt did a rough animation of the squid to help drive the actions of the talent and the stunt team on the day, before spending the final weeks perfecting the look of the photoreal monster.

In terms of tools, for concept design/matte painting Alt used Adobe Photoshop while previs/modeling/texturing/animation was done in Autodesk Maya. All of the effects/lighting/look development was via Side Effects Houdini; the compositing pipeline was built around Foundry Nuke; final online was completed in Autodesk Flame; and for graphics, they used Adobe After Effects.
The final edit was done by The Butchery.

Here is the VFX breakdown:

Enter Asahi – VFX Breakdown from altvfx on Vimeo.