
VFX studios Mr. X and Mill Film merge to target post-COVID world

Technicolor visual effects companies Mr. X and Mill Film have merged under the Mr. X name. Mr. X now becomes a VFX studio crossing four time zones, spanning Canada, the United States, Australia and India. This does not impact The Mill, which continues to operate as a separate entity.

The newly combined, expanded studio will service clients across both features and episodic. Laura Fitzpatrick, MD of Mill Film, will move into a managing director role at Mr. X, based in Montreal. Dennis Berardi, founder of Mr. X, assumes the role of creative director for the studio.

Technicolor acknowledges that COVID-19 is changing the entertainment industry, with the theatrical market being reimagined and many projects currently on hold indefinitely. The company says the merger is a direct and necessary response, aligning the studio with the changing needs of the industry and its creative partners as productions begin again and the entertainment industry looks to move forward.

The combined studio offers the flexibility to serve productions resuming at different times in different parts of the world, and the capacity to handle an anticipated increase in VFX work in response to changes required for live-action filming.

“As the entertainment landscape has continued to evolve, both studios were naturally overlapping into each other’s spaces,” says Berardi. “Merging both brands allows us to build the perfect team for each and every client.”

With 20 years in the business, Mr. X has built collaborative partnerships with directors such as Guillermo del Toro and Paul W.S. Anderson. The studio has worked on The Shape of Water, Roma and Shazam!, to name a few.

Mill Film has delivered projects such as Gladiator, which won the Academy Award for Best Visual Effects in 2001, Harry Potter and the Philosopher’s Stone, plus more recent releases Maleficent: Mistress of Evil and Dora and the Lost City of Gold.

“Our aim is to partner with clients to realize their ideas and exceed visual expectations,” says Fitzpatrick. “With our merged brand we can pitch global expertise in all creative areas: original design and art direction, on-set supervision, environment creation, FX simulations, creature and character work.”

All facilities remain open in Toronto, Montreal, Los Angeles, Adelaide and Bangalore. The merger is effective immediately, with a period of transition for employees.

Main Image: Laura Fitzpatrick and Dennis Berardi.

Jellyfish Pictures uses cloud to grow global talent pool

Animation and VFX studio Jellyfish Pictures has expanded its operating model to access talent across the world. The move is the company’s next stage of development after opening a large virtual studio at the end of last year.

This new way of working allows Jellyfish Pictures to access talent anywhere in the world without having to invest in brick-and-mortar facilities or on-premises hardware. Artists can work from their own homes and have the same experience as teammates located 6,000 miles away, thanks to Teradici Cloud Access Software and Microsoft Azure. The new model has already been put to work, with artists joining the company from Israel, India, North America, Finland, Canada, Spain and Réunion.

With Jellyfish Pictures’ IT infrastructure already housed off site and completely virtual, the company uses Azure’s backbone to set up hubs all over the world, which connect back to Jellyfish Pictures’ tier-one data center in the UK.

Cristina Ortega working from home in the UK.

All content resides on PixStor, Pixit Media’s software-defined storage solution. Using Pixit Media’s dynamic data manager, Ngenea, integrated with pipeline tools and Azure, Jellyfish Pictures distributes files across creative hubs quickly and securely. Artists access their content from PixStor running in the cloud hub, which guarantees their performance requirements are always met. When completed, files automatically move back to the UK data center.

Data never leaves the secure Azure hub, with pixels streamed to artists’ monitors via an encrypted streaming session over Teradici PCoIP technology. Data cannot be downloaded, shared or accessed, remaining fully compliant with TPN protocols and the stringent security measures upheld in the physical studios.

To further strengthen the global operation, Jellyfish Pictures’ review tool, which extends to the public cloud, allows clients to review content seamlessly in 4K. No matter where they are based in the world, both client and artist can share the same screen, updating and annotating in real time.

According to Jellyfish CEO Phil Dobree, “From the very beginning, when I first started looking at cloud and virtual technologies with Jellyfish CTO Jeremy Smith, it was always my vision to be able to go to where the artists are. We introduced cloud rendering and virtual desktops so we could break out of our four walls. Now in 2020, with events no one could have foreseen, we have over 280 artists working from home with no loss in productivity. Moving our staff to this environment was relatively simple; connecting to the data center from home is the same as if they were connecting from the studio.

“It was always our intention to roll out this way of working on a global scale. We have merely accelerated our plan due to current circumstances.”

Main Image: Art director Katri Valkamo working out of her home in Finland. 

Alkemy X: VFX supervisors share work from home process

By Bilali Mack and Erin Nash

We had barely joined Alkemy X’s VFX team when a global crisis upended what we expected of our first few weeks. After getting to know the company and settling in, we were tasked with responding to the COVID-19 pandemic and transitioning the staff to remote work as quickly and efficiently as possible. The headcount: 42 artists, three supervisors, three pipeline engineers, three in editorial and the I/O department, and eight production management personnel.

Erin Nash’s WFH setup

We were fortunate that Alkemy X already had systems and processes in place and ready for these virtual workflows. It was just a matter of making the decision to get ahead of state mandates and make the shift early to set ourselves up for success. Our pivot to a remote workflow was structured and executed the week prior to March 16. We began to build our plan starting Tuesday, March 10, and by that Friday, the engineering and pipeline team had built on its pre-existing security-compliant processes to roll out to the entire staff of artists and production.

The company uses HP RGS (Remote Graphics Software) to connect artists to a low-latency screen-sharing session on their work computers. Since remote artists are working off the computers they normally use at work, they still have access to all of the software, licenses and tools they have when at the office. Agile and innovative responses have made our jobs easier, despite these circumstances.

Alkemy X built an OpenVPN server to allow secure, encrypted remote access to our internal network, protected by multi-factor authentication. By working remotely, we are able to maintain security and keep assets contained within our secure network. Artists have access to their files via high-speed file servers, with no need for time-consuming file transfers.

Bilali Mack working from home

Alkemy X uses Shotgun to manage our shows and workflow, but we are leaning on it more heavily now as a first-line review tool before heading to high-resolution reviews through HP RGS. Our traditional dailies have been replaced by rolling spot checks in Shotgun followed by more exhaustive reviews of full-resolution media.
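As a rough illustration of that first-line review step, here is how pending Versions might be pulled with the Shotgun Python API (shotgun_api3). The site URL, script credentials, project ID and status code below are hypothetical placeholders, not Alkemy X’s actual setup.

```python
# A minimal sketch of querying "pending review" Versions with the Shotgun
# Python API. All credentials and IDs below are hypothetical placeholders.
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://yourstudio.shotgunstudio.com",  # hypothetical site URL
    script_name="remote_dailies",            # hypothetical script user
    api_key="REPLACE_WITH_API_KEY",
)

# Find the newest Versions still awaiting review on one project.
versions = sg.find(
    "Version",
    filters=[
        ["project", "is", {"type": "Project", "id": 123}],  # hypothetical ID
        ["sg_status_list", "is", "rev"],  # "rev" = pending review in the default schema
    ],
    fields=["code", "user", "sg_uploaded_movie", "created_at"],
    order=[{"field_name": "created_at", "direction": "desc"}],
)

for v in versions:
    print(v["code"], v["created_at"])
```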

We use Google Meet for meetings, screen sharing, video chat and telephone calls; all regular company meetings and Friday night happy hours happen there as well. We use Slack extensively on non-networked computers for team communication, keeping everyone connected and up to date and letting anyone quickly get assistance with any technical problems.

Priority is still placed on building and maintaining the company’s culture in addition to the quality of creative work, but now we’re doing so from dining room tables and bedroom desks, just steps from our kitchens.

Erin Nash

As we move from our former posts, here’s how we are individually navigating working from home:

Erin Nash: Although managing a team remotely is a new experience for me, I can’t say I have found it very difficult to transition. While the team as a whole is new to me, I have known many of the artists for years. Being able to guide their creative process and help them solve difficult technical problems from afar isn’t as different as I would have expected. Now instead of saying “Can I drive your box?” it has become “Let’s do a screen share.”

People by and large do all the same things from home that they would do in the office, with the main difference being that now nobody can tell if I’ve gone for a workout over lunch.

Bilali Mack: Starting out at any company takes time to get up to speed. Add something like a global pandemic, and you would think it would be nearly impossible not only to get up to speed, but also to manage teams, collaborate on creative and retain our company’s culture. We adapted by preparing remote onboarding documents for artists and production and deploying the necessary hardware and software to any and all artists on our team.

On a cultural note, we’re still holding company happy hours and open Google Meet “office” hours, just because it’s nice to be able to jump on and chat with each other about how things are different now.

Bilali Mack


Main Image: Bilali Mack WFH.


VFX supervisor Bilali Mack comes to Alkemy X from MPC, where he supervised and executed VFX for brands including Adidas, Google and BMW. Erin Nash joined the team from FuseFX, where he was head of 2D and a VFX supervisor, and brings experience across television, film and commercial work.

Invisible VFX on Hulu’s Big Time Adolescence

By Randi Altman

Hulu’s original film Big Time Adolescence is a coming-of-age story that follows 16-year-old Mo, who is befriended by his sister’s older and sketchy ex-boyfriend, Zeke. This aimless college dropout happily introduces the innocent-but-curious Mo to drink, drugs and a poorly thought-out tattoo.

Big Time Adolescence stars Pete Davidson (Zeke), Griffin Gluck (Mo) and Machine Gun Kelly (Nick) and features Jon Cryer as Mo’s dad. This irony will not be lost on those who know Cryer from his own role as disenfranchised teen Duckie in Pretty in Pink.

Shaina Holmes

While this film doesn’t scream visual effects movie, they are there — 29 shots — and they are invisible, created by Syracuse, New York-based post house Flying Turtle. We recently reached out to Flying Turtle’s Shaina Holmes to find out about her work on the film and her process.

Holmes served as VFX supervisor, VFX producer and lead VFX artist on Big Time Adolescence, creating things like flying baseballs, adding smoke to a hotboxed car, removals, replacements and more. In addition to owning Flying Turtle Post, she is a teacher at Syracuse University, where she mentors students who often end up working at her post house.

She has over 200 film and television credits, including The Notebook, Tropic Thunder, Eternal Sunshine of the Spotless Mind, Men in Black 3, Swiss Army Man and True Detective.

Let’s find out more…

How early did you get involved on Big Time Adolescence?
This was our fifth project in a year with production company American High. With all projects overlapping in various stages of production, we were in constant contact with the client to help answer any questions that arose in the early stages of pre-production and production.

Once the edit was picture-locked, we bid all the VFX shots in October/November 2018, VFX turnovers were received in November, and we had a few short weeks to complete all VFX in time for the premiere at the Sundance Film Festival in January 2019.

What direction were you given from your client?
Because this was our fifth feature with American High and each project has similar basic needs, we already had plans in place for how to shoot certain elements.

For example, most of the American High projects deal with high school, so cell phones and computer screens are a large part of how the characters communicate. Production has been really proactive about hiring an on-set graphics artist to design and create phone and computer screen graphics that can be used either during the shoot or provided to my team to add in VFX.

Having these graphics prebuilt has saved a lot of design time in post. While we still need to occasionally change times and dates, remove the carrier, change photos, replace text and other editorial changes, we end up only needing to do a handful of shots instead of all the screen replacements. We really encourage communication during the entire process to come up with alternatives and solutions that can be shot practically, and that usually makes our jobs more efficient later on.

Were you on set?
I was not physically needed on set for this film; however, after filming completed, we realized in post that we were missing some footage from the batting cages scene. The post supervisor and I, along with my VFX coordinator, rented a camera and braved the freezing Syracuse, New York, winter to go to the same batting cages and shoot the missing elements. These plates became essential, as production had turned off the pitching machine during filming.

Before and After: Digital baseballs

To recreate the baseball in CG, we needed more information for modeling, texture and animation within this space to create more realistic interaction with the characters and environment in VFX. After shoveling snow and ice, we were able to set the camera up at the batting cage and create the reference footage we needed to match our CG baseball animation. Luckily, since the film shot so close to where we all live and work, this was not a problem… besides our frozen fingers!

What other effects did you provide?
We aren’t reinventing the wheel here in the work we do. We work on features where invisible VFX play a supporting role: removing distracting technical imperfections and revising graphics so the story can unfold seamlessly for the audience. I work with the production team to advise on ways to shoot that save on costs in post production, and I use creative problem solving to cut down costs in VFX to satisfy their budget and achieve their intended vision.

That being said, we were able to do some fun sequences including CG baseballs, hotboxing a car, screen replacements, graphic animation and alterations, fluid morphs and artifact cleanup, intricate wipe transitions, split screens and removals (tattoos, equipment, out-of-season nature elements).

Can you talk about some of those more challenging scenes/effects?
Besides the CG baseball, the most difficult shots are the fluid morphs. These usually consist of split screens where one side of the split has a speed change effect to editorially cut out dialogue or revise action/reactions.

They seem simple, but to seamlessly morph two completely different actions together over a few frames and create all the in-betweens takes a lot of skill. These are often more advanced than our entry-level artists can handle, so they usually end up on my plate.

What was the review and approval process like?
All the work starts with me receiving plates from the clients and ends with me delivering final versions to the clients. As I am the compositing supervisor, we go through many internal reviews and versions before I approve shots to send to the client for feedback, which is a role I’ve done for the bulk of my career.

For most of the American High projects, the clients are spread out between Syracuse, LA and NYC. No reviews were done in person, although if needed, I could go to Syracuse Studios at any time to review dailies if there was any footage I thought could help with some fix-it-in-post VFX requests.

All shots were sent online for review and final delivery. We worked closely with the executive producer, post supervisor, editor and assistant editor for feedback, notes, design and revisions. Most review sessions were collaborative as far as feedback and what’s possible.

What tools did you use on the film?
Blackmagic’s Fusion is the main compositing software. Artists were trained on Fusion by me when they were in college, so it is an easy and affordable transition for them to use for professional-quality work. Since everyone has their own personal computer setup at home, it’s been fairly easy for artists to send comp files back to me and I render on my end after relinking. That has been a much quicker process for internal feedback and deliveries as we’re working on UHD and 4K resolutions.
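Fusion .comp files are plain text, so a relink-and-render handoff like the one Holmes describes can be scripted. Here is a minimal, hypothetical sketch of swapping an artist’s local media paths for studio paths before rendering; the path mapping and file name are invented for illustration, and a real pipeline might rely on Fusion’s built-in Path Maps instead.

```python
# A minimal, hypothetical relink step for a Fusion comp: rewrite the
# artist's local footage paths to the studio's paths before rendering.
from pathlib import Path

PATH_MAP = {
    "C:/Users/artist/BTA/plates/": "/mnt/studio/BTA/plates/",    # hypothetical
    "C:/Users/artist/BTA/renders/": "/mnt/studio/BTA/renders/",  # hypothetical
}

def relink_comp(comp_path: str) -> None:
    """Rewrite local media paths in a plain-text Fusion .comp file."""
    comp = Path(comp_path)
    text = comp.read_text()
    for local, studio in PATH_MAP.items():
        text = text.replace(local, studio)
    comp.write_text(text)

relink_comp("BTA_010_0040_comp_v012.comp")  # hypothetical shot name
```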

For Big Time Adolescence specifically, we also needed to use Adobe After Effects for some of the fluid morph shots, plus some final clean-up in Fusion. For the CG baseball shots, we used Autodesk Maya and Substance Painter, rendered with Arnold and comped in Fusion.

Your company is female-owned and based in Syracuse, New York. Not something you hear about every day.
Yes, we are definitely set up in a great up-and-coming area here in Ithaca and Syracuse. I went to film school at Ithaca College. From there, I worked in LA and NYC for 20 years as a VFX artist and producer. In 2016, I was offered the opportunity to teach VFX back at Ithaca College, so I came back to the Central New York area to see if teaching was the next chapter for me.

Timing worked out perfectly: some of my former co-workers were helping create American High, using the Central New York tax incentives, and they were prepping to shoot feature films in Syracuse. They brought me on as the local VFX support, since we had already been working together off and on since 2010 in NYC. When I found myself both teaching and working on feature films, that gave me the idea to create a company to combine forces.

Teaching at Syracuse University and focusing on VFX and post for live-action film and TV, I am based at The Newhouse School, which is very closely connected with American High and Syracuse Studios. I was already integrated into their productions, so this was just a really good fit all around to bring our students into the growing Central New York film industry, aiming to create a sustainable local talent pool.

Our team is made up of artists who started with me in post mentorship groups I created at both Ithaca College (Park Post) and Syracuse University (SU Post). I teach them in class, they join these post group collaborative learning spaces for peer-to-peer mentorship, and then a select few continue to grow at Flying Turtle Post.

What haven’t I asked that’s important?
When most people hear visual effects, they think of huge blockbusters, but that was never my thing. I love working on invisible VFX and the fact that it blows people’s minds — how so much attention is paid to every single shot, let alone frame, to achieve complete immersion for the audience, so they’re not picking out the boom mic or dead pixels. So much work goes on to create this perfect illusion. It’s odd to say, but there is such satisfaction when no one noticed the work you did. That’s the sign of doing your job right!

Every show relies on invisible VFX these days, even the smallest indie film with a tiny budget. These are the projects I really like to be involved in as that’s where creativity and innovation are at their best. It’s my hope that up-and-coming filmmakers who have amazing stories to tell will identify with my company’s mentorship-focused approach and feel they also are able to grow their vision with us. We support female and underrepresented filmmakers in their pursuit to make change in our industry.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Arch platform launches for cloud-based visual effects

Arch Platform Technologies, a provider of cloud-based infrastructure for content creation, has made its secure, scalable, cloud-based visual effects platform available commercially. The Arch platform is designed for movie studios, productions and VFX companies and enables them to leverage a VFX infrastructure in the cloud from anywhere in the world.

An earlier iteration of the Arch platform was only available to companies already working with Hollywood-based Vitality VFX, where the technology was created by Guy Botham. Now, Arch is making the next-generation version of its “rent vs. own” cloud-based VFX platform broadly available to movie studios, productions and VFX companies. This version was well along in its development when COVID-19 arrived, making it a timely offering.

By moving VFX to the cloud, the platform lets VFX teams scale up and down quickly from anywhere and build and manage capacity with cloud-based workstations, renderfarms, storage and workflow management – all in a secure environment.

“We engineered a robust Infrastructure as a Service (IaaS), which now enables a group of VFX artists to collaborate on the same infrastructure as if they were using an on-premises system,” says Botham. “Networked workstations can be added in minutes nearly anywhere in the world, including at an artist’s home, to create a small to large VFX studio environment running all the industry-standard software and plugins.”

Recently, Solstice Studios, a Hollywood distribution and production studio, used the Arch platform for the VFX work on the studio’s upcoming first movie, Unhinged. The platform has also been used by VFX companies Track VFX and FatBelly VFX and is now commercially available to the industry.

Due to COVID, The Blacklist turns to live-action/animation season finale

By Daniel Restuccio

When the COVID-19 crisis shut down production in New York City, necessity became the mother of invention for The Blacklist showrunners. They took the 21 minutes of live-action footage they had shot for Episode 19, “The Kazanjian Brothers,” and combined it with 21 minutes of graphic-novel-style animation to give viewers the season finale they deserved.

Adam Coglan

Thanks to previs/visual effects company Proof, the producers were able to transition from scenes that were shot traditionally to a world where FBI agent Elizabeth Keen and wanted fugitive Raymond Reddington lived as animated characters.

The Blacklist team reached out to Proof the week everyone at the studio was asked to start working from home. In London, artists were given workstations as needed; in the US, the computers stayed set up in the office and the team remoted into those workstations, both to satisfy proprietary and confidentiality rules and to keep everything on the same servers.

Over six weeks, 29 people in London, including support staff and asset people, worked on the show, while in the US the numbers varied between 10 and 15 people. As you can imagine, it was a big undertaking.

Patrice Avery

We reached out to Adam Coglan and Matt Perrin, Proof animation supervisors based in London, and Patrice Avery, Proof’s global head of production, about the production and post workflow.

How did you connect with the producers on the show?
Patrice Avery: Producers Jon Bokenkamp and John Eisendrath knew Proof’s owner and president, Ron Frankel. After The Blacklist shut down, they brainstormed ideas, and animation was one they thought might make sense.

Adam Coglan: The Proof US offices tend to work using toon shaders on models, and the producers had seen our previs work on The Hunger Games, Guardians of the Galaxy and A Wrinkle in Time.

Can you walk us through the workflow?
Coglan: Everybody was working in parallel. There was no time to wait for the models to get built, then start texturing, then start rigging, then start animating. Animation had to start from the beginning, so we started out using proxy geometry for the characters.

Matt Perrin

Character build and animation was going on at the same time. We were building the sequences, blocking them out, staging them. We had a client call every day, so we’d be getting notes every day. The clients wouldn’t be looking at anything that resembled their main actors until a good three or four weeks into the actual project.

Avery: It meant they had to run blind a bit with their animation. They got scratch dialogue, and then they got final. They didn’t get the real dialogue until almost the end, because they were still trying to figure out how to get the best-quality dialogue recorded from their actors.

Obviously, the script had been written, so you essentially animated the existing script?
Coglan: Yes. We were given the script early on and it did evolve a little bit, but not wildly. We managed to stick to the sequences that we’d actually blocked out from the start. There were no storyboards; we basically just interpreted what their script gave us.

Can you talk a bit about some of the scenes you enhanced?
Coglan: There’s a helicopter sequence at the end of the show that they hadn’t planned to shoot in live action because of safety issues with the helicopter. They brought it back when they realized they could do all of those shots in animation. So there are big aerials over the helicopter landing pad, with the rotors going as the main villain approaches.

Matt Perrin: There are shots peppered throughout the whole thing that would have been tricky to fit into the timescales that they normally shoot the show in. Throughout the show, there are angles and camera work that were easier in animation. In the pilot episode, they visited Washington Mall and the Capitol building, so we got to return to that.

You used Autodesk Maya as your main tool? What else was used?
Coglan: Yes, predominantly Maya. Character heads were built in ZBrush and then brought into Maya and textured using Substance and Photoshop. The toon shader is a set of proprietary shaders that Proof developed in Maya, with filters on the textures to give them the toon-shaded look.
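Proof’s shaders are proprietary, so as only a loose illustration of the general cel-shading idea, here is how a bare-bones two-band toon look might be set up with Maya’s stock rampShader node from Python. The colors, band positions and naming are invented, not Proof’s setup.

```python
# A hypothetical, minimal cel look using Maya's built-in rampShader node.
# Run inside Maya with an object selected. Values are invented.
import maya.cmds as cmds

shader = cmds.shadingNode("rampShader", asShader=True, name="celSketch")
cmds.setAttr(shader + ".specularity", 0.0)  # flat, non-glossy response

# Two hard color bands: a shadow tone below 50% lighting, a lit tone above.
cmds.setAttr(shader + ".color[0].color_Position", 0.0)
cmds.setAttr(shader + ".color[0].color_Color", 0.22, 0.22, 0.30, type="double3")
cmds.setAttr(shader + ".color[0].color_Interp", 0)  # 0 = None (hard step)
cmds.setAttr(shader + ".color[1].color_Position", 0.5)
cmds.setAttr(shader + ".color[1].color_Color", 0.80, 0.80, 0.88, type="double3")
cmds.setAttr(shader + ".color[1].color_Interp", 0)

# Wire the shader into a shading group and assign it to the selection.
sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True,
               name=shader + "SG")
cmds.connectAttr(shader + ".outColor", sg + ".surfaceShader")
cmds.sets(cmds.ls(selection=True), edit=True, forceElement=sg)
```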

Your networks are obviously connected?
Coglan: Absolutely. We’ve been using Teradici, which has really saved our skin on this show. It’s been a godsend and offers really good remote access.

Aside from the truncated production schedule, what were some of the other challenges that you had?
Coglan: Working completely remotely with a team of 20-odd people was a big challenge.

Perrin: Yes. Everything slows down. Coordinating the work from home with all the artists is harder; the face-to-face communication you have when the team is in the same room as you is, obviously, stretched. We would communicate over Zoom chats daily, multiple times a day, with the team and with the producers.

On the flip side, it felt like we had more access to the producers of the show than we might under normal circumstances, because we had a scheduled meeting with them every day as well. It was great to tap directly into their taste and get their feedback so immediately.

Can you describe how the animation was done? Keyframe, rotoscoping, motion capture, or some combination of those?
Perrin: We started with very simple blockouts of action and cameras for each scene. This allowed us to get the layout and timing approved fast, saving as much time as possible for the keyframe animation stage. There are a couple of transitions between live action and animation that required some rotoanimation. We also did a little mocap (mostly for background character motions). On the whole, though, it was a lot of keyframe animation.

How were the editors involved?
Perrin: Chris Brookshire and Elyse Holloway “got it” from the beginning. They gave us the cut of the live-action show with placeholders slugged in for the scenes we would be animating. Between watching that and the script, which was already pretty tight, it gave us an idea of what the scope of our role was going to be.

We decided to go straight into a very basic blocking pass rendered in gray scale 3D so they could see the space and start testing angles with faster iterations. It allowed them to start cutting earlier and give us those edits back. They never got an excess of footage from us.

When they shoot the show, they’ve got reels of footage to go through, whereas with us they get the shots we created and not many spare. But the editors and showrunners got the idea that they could actually start calling out for shots too. They’d ask us to change the layout in some instances because they want to shuffle the shots around from what we’d initially intended.

From that point, our asset makers and R&D teams could look into what the characters should look like in the environments and build those in parallel with us. Then we were ready to go into animation.

How did you set the style for the final look of the piece?
Perrin: The client had a strong idea. They already had done a spinoff comic book series. We’d seen what they’ve done with that, and they talked about the kind of styling of The Blacklist being quite noir.

Coglan: They gave us the current episode and past episodes, so they could always reference scenes that were similar to other episodes. As soon as one of the showrunners started talking about leaning into the graphic novel styling of things, a light went off and we thought, okay, we know exactly what they’re after now. It gave us a really good creative direction.

That was the biggest development on the project — getting the fidelity of the toon shaders to stand up to broadcast quality, better than we’ve been used to in the past, because normally these don’t go past producers and directors when we work in previs.

When you say better quality, what does that actually mean?
Coglan: There are some extreme close-ups in this where we were right on the main characters’ faces. They had detail hand-painted in Photoshop and then Substance. A lot of the lines that define features were painted by hand.

Avery: When using the toon shading in previs, we didn’t really do much to the backgrounds; it was all character toon shading. On this one, we created a process for the background sets to make them look toon shaded as well.

Did you recreate any of the existing sets?
Coglan: They gave us blueprints for a lot of their set builds, so, yes, some of the sets were straight from the show.

Perrin: One set, a medical facility, we’d built already from their blueprints, so that when we transition out of the live-action into the animation it’s kind of seamless.

What other things did you accomplish that you’re proud of that you haven’t mentioned yet?
Coglan: For me, the amount of work that we did in such a compressed amount of time was the big takeaway. Dealing with people totally remotely, I just didn’t know whether that could work, and we made it work.

Perrin: The whole way through was very exciting, because of the current situation everybody’s in, and the time constraints. It was very liberating for us. We didn’t have the multi-tiered approval stages or the normal infrastructure. It was immediate feedback and fast results.

Avery: What was cool for me was watching the creative discussions. There was a point a few weeks in when the client was giving more notes about the comic book style and leaning into that. Our teams are so used to the constraints of live action and the rules that they need to follow. There was this switch when they finally said, “Oh, we can do really cool comic book angles. Oh, we could do this and that.” Seeing the team embrace that, untethered a bit, and just go for it was really cool.

What would you do if you got a call, “Hey, we’ve got an entire series that wants to go this route?”
Perrin: I think I’d jump at it, really, although I don’t think I could do it in the same timescale for everything. There would need to be slightly more planning involved, but it’s been pretty enjoyable.


Dan Restuccio is a writer/director with Realwork Entertainment and part of the Visual Arts faculty at California Lutheran University. He is a former Disney Imagineer. You can reach him at dansweb451@gmail.com.

Color grading Togo with an Autochrome-type look

Before principal photography began on the Disney+ period drama Togo, the film’s director and cinematographer, Ericson Core, asked Company 3 senior colorist Siggy Ferstl to help design a visual approach for the color grade that would give the 1920s-era drama a unique look. Based on a true story, Togo is named for the lead sled dog on Leonhard Seppala’s (Willem Dafoe) team and tells the story of their life-and-death relay through Alaska’s tundra to deliver diphtheria antitoxin to the desperate citizens of Nome.

Siggy Ferstl

Core wanted a look that was reminiscent of the early color photography process called Autochrome, as well as an approach that evoked an aged, distressed feel. Ferstl, who recently colored Lost in Space (Netflix) and The Boys (Amazon), spent months — while not working on other projects — developing new ways of building this look using Blackmagic’s Resolve 16.

Many of Ferstl’s ideas were realized using the new Fusion VFX tab in Resolve 16. It allowed him to manipulate images in ways that took his work beyond the normal realm of color grading and into the arena of visual effects.

By the time he got to work grading Togo, Ferstl had already created looks that had some of the visual qualities of Autochrome melded with a sense of age, almost as if the images were shot in that antiquated format. Togo “reflects the kind of style that I like,” explains Ferstl. “Ericson, as both director and cinematographer, was able to provide very clear input about what he wanted the movie to look like.”

In order for this process to succeed, it needed to go beyond the appearance of a color effect seemingly just placed “on top” of the images. It had to feel organic and interact with the photography, to seem embedded in the picture.

A Layered Approach
Ferstl started this large task by dividing the process into a series of layers that would work together to affect the color, of course, but also to create lens distortion, aging artifacts and all the other effects. A number of these operations would traditionally be sent to Company 3’s VFX department or to an outside vendor to be created by their artists and returned as finished elements. But that kind of workflow would have added an enormous amount of time to the post process. And, just as importantly, all these effects and color corrections needed to work interactively during grading sessions at Company 3 so Ferstl and Core could continuously see and refine the overall look. Even a slight tweak to a single layer could affect how other layers performed, so Ferstl needed complete, realtime control of every layer for every fine adjustment.

Likewise, the work of Company 3 conform artist Paul Carlin could not be done in the way conform typically has been. It couldn’t be sent out of Resolve into a different conform/compositing tool, republished to the company network and then returned to Ferstl’s Resolve timeline. That would have taken too long and wouldn’t have allowed for the interactivity required in grading sessions.

Carlin needed to be able to handle the small effects that are part of the conform process — split screens, wire removals, etc. — quickly, and that meant working from the same media Ferstl was accessing. Carlin worked entirely in Resolve using Fusion for any cleanup and compositing effects — a practice becoming more and more common among conform artists at Company 3. “He could do his work and return it to our shared timeline,” Ferstl says. “We both had access to all the original material.”


Most of the layers actually consisted of multiple sublayers. Here is some detail:
Texture: This group of sublayers was based on overlaid textures that Ferstl created to have a kind of “paper” feel to the images. There were sublayers based on photographs of fabrics and surfaces that all play together to form a texture over the imagery.
Border: This was an additional texture that darkened portions of the edges of the frame, lending a subtle sense of a vignette or age artifact framing the image. It isn’t consistent throughout; it continually changes. Sublayers bring to the images a bit of edge distortion that resembles the diffraction that can happen to lenses, particularly lenses from the early 20th century, under various circumstances.
Lens effects: DP Core shot with modern lenses built with very evolved coatings, but Ferstl was interested in achieving the look of uncoated and less-refined optics of the day. This involved the creation of sublayers of subtle distortion and defocus effects.
Stain: Ferstl applied a somewhat sepia-colored stain to parts of the image to help with the aging effect. He added a hint of additional texture and brought some sepia to some of the very bluish exterior shots, introducing hints of warmth into the images.
Grain-like effect: “We didn’t go for something that exactly mimicked the effect of film grain,” Ferstl notes. “That just didn’t suit this film. But we wanted something that has that feel, so using Resolve’s Grain OFX, I generated a grain pattern, rendered it out and then brought it back into Resolve and experimented with running the pattern at various speeds. We decided it looked best slowed to 6fps, but then it had a steppiness to it that we didn’t like. So I went back and used the tool’s Optical Flow in the process of slowing it down. That blends the frames together, and the result provided just a hint of old-world filmmaking. It’s very subtle and more part of the overall texture.” (A rough stand-in for this retiming idea is sketched after this list.)
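Resolve’s Grain OFX and Optical Flow retiming did the real work here; purely as a hypothetical stand-in for the idea, the sketch below holds a 24fps grain sequence at a 6fps cadence and cross-blends neighboring grain frames so the four-frame steps aren’t hard jumps (simple frame blending rather than optical flow).

```python
# A rough, hypothetical stand-in for the 6fps grain retime described above:
# each pair of neighboring 24fps grain frames is cross-blended across four
# output frames, so the grain fully refreshes 6 times per second.
import numpy as np

def retime_grain(grain_frames, hold=4):
    """grain_frames: list of float32 arrays at 24fps; hold=4 -> 6fps cadence."""
    out = []
    for i in range(len(grain_frames) - 1):
        a, b = grain_frames[i], grain_frames[i + 1]
        for j in range(hold):
            t = j / hold                       # 0.0, 0.25, 0.5, 0.75
            out.append(a * (1.0 - t) + b * t)  # linear cross-blend
    return out
```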

Combining Elements
“It wasn’t just a matter of stacking one layer on top of the other and applying a regular blend. I felt it needed to be more integrated and react subtly with the footage in an organic-looking way,” Ferstl recalls.

One toolset he used for this was a series of customized lens flares built with Resolve’s OFX, used not for their usual purpose but as the basis of a matte. “The effect is generated based on highlight detail in the shot,” explains Ferstl. “So I created a matte shape from the lens flare effect and used that shape as the basis to integrate some of the texture layers into the shots. The textures become more or less pronounced based on the highlight details in the photography, and that lets the textures breathe more.”
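For a concrete picture of that technique, here is a rough sketch (outside Resolve, in plain Python/NumPy) of keying a matte off a shot’s highlights and using it to modulate how strongly a texture blends in. The threshold and strength values are invented for illustration.

```python
# A rough sketch of a highlight-keyed texture blend. Values are invented.
import numpy as np

def highlight_keyed_blend(plate, texture, threshold=0.7, strength=0.4):
    """plate, texture: float32 RGB arrays in [0, 1] with matching shapes."""
    # Rec. 709 luma as a stand-in for "highlight detail."
    luma = (0.2126 * plate[..., 0] + 0.7152 * plate[..., 1]
            + 0.0722 * plate[..., 2])
    matte = np.clip((luma - threshold) / (1.0 - threshold), 0.0, 1.0)
    weight = (strength * matte)[..., None]  # texture shows most in highlights
    return plate * (1.0 - weight) + texture * weight
```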

Ferstl also made use of the Tilt-Shift effect in Fusion that alters the image in the way movements within a tilt/shift lens would. He could have used a standard Power Window to qualify the portion of the image to apply blur to, but that method applied the effect more evenly and gave a diffused look, which Ferstl felt wasn’t like a natural lens effect. Again, the idea was to avoid having any of these effects look like some blanket change merely sitting on top of the image.

“You can adjust a window’s softness,” he notes, “but it just didn’t look like something that was optical… it looked too digital. I was desperate to have a more optical feel, so I started playing around with the Tilt-Shift OFX and applying that just to the defocus effect.

“But that only affected the top and bottom of the frame, and I wanted more control than that,” he continues. “I wanted to draw shapes to determine where and how much the tilt/shift effect would be applied. So I added the Tilt-Shift in Fusion and fed a poly mask into it as an external matte. I had the ability to use the mask like a depth map to add dimensionality to the effect.”
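As a simplified, hypothetical stand-in for that mask-driven defocus (in Python with OpenCV): the frame is pre-blurred at a few strengths, and a grayscale mask acts like a depth map to pick between them per pixel. The radii are invented.

```python
# A hypothetical mask-driven defocus: blend between pre-blurred copies of
# the frame, using a grayscale mask as a per-pixel "depth map."
import cv2
import numpy as np

def masked_defocus(frame, mask, max_radius=15):
    """frame: uint8 BGR image; mask: float32 in [0, 1], 1 = fully defocused."""
    levels = [frame.astype(np.float32)]
    for r in (max_radius // 3, 2 * max_radius // 3, max_radius):
        k = 2 * r + 1  # Gaussian kernel size must be odd
        levels.append(cv2.GaussianBlur(frame, (k, k), 0).astype(np.float32))
    idx = np.clip(mask, 0.0, 1.0) * (len(levels) - 1)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, len(levels) - 1)
    frac = (idx - lo)[..., None]
    stack = np.stack(levels)                  # (levels, H, W, 3)
    ys, xs = np.indices(mask.shape)
    out = stack[lo, ys, xs] * (1 - frac) + stack[hi, ys, xs] * frac
    return out.astype(np.uint8)
```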

As Ferstl moved forward with the look development, the issue that continually came up was that while he and Core were happy with the way these processes affected any static image in the show, “as soon as the camera moves,” Ferstl explains, “you’d feel like the work went from being part of the image to just a veil stuck on top.”

He once again made use of Fusion’s compositing capabilities: The delivery spec was UHD, and he graded the actual photography in that resolution. But he built all the effects layers at the much larger 7K. “With the larger layers,” he says, “if the camera moved, I was able to use Fusion to track and blend the texture with it. It didn’t have to just seem tacked on. That really made an enormous difference.”

Firepower
Fortunately for Ferstl, Company 3’s infrastructure provided the enormous throughput, storage and graphics/rendering capabilities needed to work with all these elements (some of which were extremely GPU-intensive) playing back in concert in a color grading bay. “I had all these textured elements and external mattes all playing live off the [studio’s custom-built] SAN and being blended in Resolve. We had OpenFX plugins for border and texture and flares generated in real time, with the swing/tilt effect running on every shot. That’s a lot of GPU power!”

Ferstl found the entire experience artistically rewarding and looks forward to similar challenges. “It’s always great when a project involves exploring the tools I have to work with and being able to create new looks that push the boundaries of what my job as a colorist entails.”

Faceware Studio uses ML to create facial animation in realtime

Faceware Technologies, which provides markerless 3D facial motion capture solutions, has released Faceware Studio, a new platform for creating high-quality facial animation in realtime. Faceware Studio is built from the ground up to be a complete replacement for the company’s former Live product.

According to the company, Studio reimagines the realtime streaming workflow with a modern and intuitive approach to creating instant facial animation. Using single-click calibration, Studio can track and animate a face in real time by using machine learning and the latest neural network techniques. Artists can then tune and tailor the animation to an actor’s unique performance and build additive logic with Motion Effects. The data can then be streamed to Faceware-supported plugins in Unreal Engine, Unity, MotionBuilder and soon Maya for live streaming or recording in engine on an avatar.

Faceware Studio is available now. Pricing starts at $195 per month or $2,340 annually, which includes support. Trial versions are available on the company’s site.

New features include:
Realtime jaw positioning using deep learning: Faceware’s improved jaw positioning tech, which is currently used in Faceware Retargeter, is now available in Studio, giving users the ability to create fast and accurate lipsync animation in realtime.

Motion Effects and Animation Tuning: Studio offers users direct control over their final animation. They can visualize and adjust actor-specific profiles with Animation Tuning and build powerful logic into the realtime data stream using Motion Effects.

Realtime Animation Viewport and Media Timeline: Users can see their facial animation from any angle with Studio’s 3D animation viewport and use the timeline and media controls to pause, play and scrub through their media to find suitable frames for calibration and to focus on specific sections of their video.

Dockable, customizable interface: A customizable user interface with docking panels and saveable workspaces.

The Embassy opens in Culver City with EP Kenny Solomon leading charge

Vancouver-based visual effects and production studio The Embassy is opening an LA office in Culver City, with EP Kenny Solomon heading up the operation. The move follows the studio’s growth in film, advertising and streaming, and a successful 2019. The LA office will give The Embassy a direct connection and point of contact with its growing US client base and provide front-end project support and creative development, while Vancouver — offering pipeline and technology infrastructure — remains the heart of operations.

New studio head Solomon has worked in the film, TV and streaming industries for the past 20 years, launching and operating a number of companies, the most recent of which was Big Block Media Holdings — an Emmy-, Cannes-, Webby-, Promax- and Clio-winning integrated media company he founded nine years ago.

“We have a beautiful studio in Culver City with infrastructure to quickly staff up to 15 artists across 2D, 3D and design, a screening room, conference room, edit bay and wonderful outdoor space for a late-night ping-pong match and a local Golden Road beer or two,” says Solomon. “Obviously, everyone is WFH right now, but at a moment’s notice we are able to scale accordingly. And Vancouver will always be our heartbeat and main production hub.”

“We have happily been here in Vancouver for the past 17-plus years,” says The Embassy president Winston Helgason. “I’ve seen the global industry go through its ups and downs, and yet we continue to thrive. The last few months have been a difficult period of uncertainty and business interruption and, while we are operating successfully out of current WFH restrictions, I can’t wait to open up to our full potential once the world is a little more back to normal.”

In 2020, The Embassy reunited with Area 23/FCB and RSA director Robert Stromberg (Maleficent) to craft a series of fantastical VFX environments for Emgality’s new campaign. The team has also been in full production for the past 16 months on all VFX work for Warrior Nun, an upcoming 10-episode series for Netflix. The Embassy was responsible for providing everything from concept art to pre-production, on-set supervision and almost 700 VFX shots for the show. The team in Vancouver is working both remotely and in the studio to deliver the full 10 episodes.

Solomon is excited to get to work, saying that he always respected The Embassy’s work, even while competing with them at CafeFX/The Syndicate and Big Block.

As part of the expansion, The Embassy has also added a number of new reps to the team — Sarah Gitersonke joins for Midwest representation, and Kelly Flint and Sarah Lange join for East Coast.

Vegas Post upgrades for VFX, compositing and stills

Vegas Creative Software, in partnership with FXhome, has added new versions of Vegas Effects and Vegas Image to the Vegas Post suite of editing, VFX, compositing and imaging tools for video professionals, editors and VFX artists.

The Vegas Post workflow centers on Vegas Pro for editing and adds Vegas Effects and Vegas Image for VFX, compositing and still-image editing.

Vegas Effects is a full-featured visual effects and compositing tool that provides a variety of high-quality effects, presets and correction tools. With over 800 effects and filters to tweak, combine, pull apart and put back together, Vegas Effects provides users with a powerful library of effects including:
• Particle generators
• Text and titling
• Behavior effects
• 3D model rendering
• A unified 3D space
• Fire and lightning generators
• Greenscreen removal
• Muzzle flash generators
• Picture in picture
• Vertical video integration

Vegas Image is a non-destructive raw image compositor that enables video editors to work with still-image and graphical content and incorporate it directly into their final productions — all directly integrated with Vegas Post. This new version of Vegas Image contains feature updates including:
• Brush masks: A new mask type that allows the user to brush in/out effects or layers and includes basic brush settings like radius, opacity, softness, spacing and smoothing
• Multiple layer transform: Gives the ability to move, rotate and scale a selection of layers
• Multi-point gradient effect: An effect that enables users to create colored gradients using an unlimited amount of colored points
• Light rays effect: An effect that uses bright spots to cast light rays in scenes, e.g., light rays streaming through trees
• Raw denoise: Bespoke denoise step for raw images, which can remove defective pixels and large noise patterns
• Lens distortion effect: Can be used to perform lens-based adjustments, such as barrel/pincushion distortion or chromatic aberration
• Halftone effect: Produces a halftone look, like a newspaper print or pop art
• Configurable mask overlay color: Users can now pick what color is overlaid when the mask overlay render option is enabled

Vegas Post is available now for $999 or as a subscription starting at $21 per month.

Dolores McGinley heads Goldcrest London’s VFX division

London’s Goldcrest Post, a picture and audio post studio, has launched a visual effects division at its Lexington Street location. It will be led by VFX vet Dolores McGinley, whose first task is to assemble a team of artists that will provide services for both new and existing clients.

During the COVID-19 crisis, all Goldcrest staff is working from home except the colorists, who are coming in as needed and working alone in the grading suites. McGinley and her team will move into the Goldcrest facility when lockdown has ended.

“Having been immersed in such a diverse range of projects over the past five years, we identified the need to expand into VFX some time ago,” explains Goldcrest MD Patrick Malone. “We know how essential an integrated VFX service is to our continued success as a leading supplier of creative post solutions to the film and broadcast community.

“As a successful VFX artist in her own right, Dolores is positioned to interpret the client’s brief and offer constructive creative input throughout the production process. She will also draw upon her considerable experience working with colorists to streamline the inclusion of VFX into the grade and guarantee we are able to meet the specific creative requirements of our clients.”

With over two decades of creative experience, McGinley joins Goldcrest having held various senior roles within the London VFX community. Recent examples of her work include The Crown, Giri/Haji and Good Omens.

VFX turn Minnesota into Alabama for indie film Tuscaloosa

By Randi Altman

Director Philip Harder’s Tuscaloosa is a 1970s coming-of-age story that follows recent college graduate Billy Mitchell as he falls in love with a psychiatric patient from his dad’s mental hospital. As you can imagine, the elder Mitchell is not okay with the relationship or the interest his son is taking in the racial tensions that are heating up in Alabama.

As a period piece, Tuscaloosa required a good amount of visual effects work, and Minneapolis-based Splice served as the picture’s main post and VFX house. Splice called on newly launched local boutique Nocturnal Robot to handle overspill and to help turn present-day Minnesota, where the film was shot, into 1970s Tuscaloosa, Alabama.

Jeremy Wanek

Nocturnal Robot owner, editor and VFX artist Jeremy Wanek and artist Conrad Flemming provided a variety of effects, from removing foliage to adding store signs and period cars to rebuilding a Tuscaloosa street. Let’s find out more.

How early did you get involved?
Nocturnal Robot got involved as the edit was nearing picture lock. Splice was bidding on the project’s VFX at the time and it became apparent that they were going to need some support due to the volume of complex shots and constrained budget.

Splice was the main VFX house on the film, and they provided editing as well?
Yes, Splice handled the edit and was the main hub for the VFX work. Clayton Condit edited the movie, along with Kyle Walczak as additional editor. The VFX team was led by Ben Watne. Splice handled around 50 shots, while my team handled around 20, and then Rude Cock Productions (led by the now LA-based Jon Julsrud) jumped in toward the end to finish up some miscellaneous shots, and finally, The Harbor Picture Company tackled some last-minute effects during finishing — so lots of support on the VFX front!

What direction were you given from the client?
Philip Harder and I met at Splice and went through the shots that concerned him most. Primarily, these discussions centered on details that would lend themselves well to the transformation of modern-day Minnesota, where the movie was shot, into 1970s Alabama, where the movie takes place.

Were you on set?
We were brought in well after the movie had been shot, which is usually the case on a lot of the indie films we work on.

Before and After: Period car addition

Can you talk about prepro? Did you do any and if so in what tool?
No prepro, just the discussion I had with the director before we started working. As far as tools, he loved using his laser pointer to point out details (laughs).

Speaking of tools, what did you use on the show, and can you talk about review and approvals?
Our team was very small for this project, since my company had just officially launched. It was just me, as VFX supervisor/VFX artist along with VFX artist Conrad Flemming. We did our compositing in Adobe After Effects, sometimes using Red Giant tools as well. Digital cleanup was via Adobe Photoshop, and planar tracking was done using BorisFX Mocha Pro. We did 3D work in Maxon Cinema 4D, as well as Video Copilot’s Element 3D plugin for Adobe After Effects.

I would stop by Splice, where Kyle Walczak (who was doing some additional editing at the time) would transfer footage over to a hard drive for me. From there, it was a simple workflow between me and Conrad: I worked on my Mac Pro trash can, while Conrad worked on his PC, and I sent him shots via my Google Drive. For review and final delivery, we used Splice’s FTP site.

Before and After: Foliage removal

A lot of the review process was sending emails back and forth with Phil. This worked out okay because we were able to get most shots approved quickly. Right after this project, we started using Frame.io, and I wish I had been using it on this one — it’s much cleaner and more efficient.

Can you talk about what types of VFX you provided, and did they pick Minnesota because the buildings more closely resembled Tuscaloosa of the ‘70s?
Phil picked Minnesota because it’s where he lives, and he realized that present-day Alabama doesn’t look much like it did in the ’70s. In Minnesota, he could pull in the resources he had access to and stretch his budget further. They shot at a lot of great timeless locations and brought in some period cars and lots of wardrobe to really sell it. They did an incredible job during production, so VFX-wise, it was mostly just enhancing here and there.

Can you talk about some of those more challenging scenes?
There were two shots in particular that were challenging, and we handled each case very differently. I had lengthy discussions with Phil about how to handle them. If you watch our VFX reel, they are the first and last shots shown.

In the first shot, we see our two lead characters pull up to a restaurant. Phil wanted to change the environment and add a parking lot of period-appropriate cars. He had some images that were photographed in the ’70s and he wanted to composite them into the live-action plate the crew had shot. It was really interesting trying to blend something that was nearly 50 years old into a high-quality Alexa shot. It took a lot of cleanup work on the old images since they were littered with artifacts, as well as cutting them up to fit into the shot more seamlessly. It also involved adding some CG period cars. It was a fun challenge, and once it was all put together it created a unique look.

Before and After: Bama Theater

In the second challenging shot, the live-action plate featured a modern-day Minnesota street with a few period vehicles driving down it. We needed to transform this, as you’d expect, into a ’70s Alabama street — this time featuring the Bama Theater. This involved a lot of detailed work. I had Conrad focus most of his attention on this shot because I knew he could pull it off in the limited time we had, and his attention to the period’s details would go a long way. There wasn’t a lot of reference material from the ’70s taken on that particular street, so we did our best looking at other images we could find from the time and area.

Phil had a lot of notes and details to help us along. We had live-action plates shot on the Red camera to build upon — some buildings, period cars, the extras walking around and a handful of other small objects. But because so much had to be reconstructed, the shot had to be put together from scratch.

Working from those ’70s reference images, we removed lots of the foliage and trees, added the fancy signs above stores and added the stoplights that hung on wires, among other details. We also added lots of CG cars to the environment to fill out the street and add some movement to the foreground and background.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Maxon plugin allows for integration of Cinema 4D assets into Unity


Maxon is now a Unity Technologies Verified Solutions Partner and is distributing a plugin for Unity called Cineware by Maxon. The new plugin provides developers and creatives with seamless integration of Cinema 4D assets into Unity. Artists can easily create models and animations in Cinema 4D for use in realtime 3D (RT3D), interactive 2D, 3D, VR and AR experiences. The Cineware by Maxon plugin is now available free of charge on the Unity Asset Store.

The plugin is compatible with Cinema 4D Release 21, the latest version of the software, and Unity’s latest release, 2019.3. The plugin does not require a license of Cinema 4D as long as Cinema 4D scenes have been “Saved for Cineware.” By default, imported assets will appear relative to the asset folder or imported asset. The plugin also supports user-defined folder hierarchies.
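
For pipelines that batch-prepare scenes for this handoff, the save step can be scripted from Cinema 4D’s own Python API. The sketch below is a minimal guess at that step, assuming the SAVEDOCUMENTFLAGS_SAVECACHES flag (which bakes object caches for the Melange exchange, the library Cineware grew out of) corresponds to the “Saved for Cineware” option; the output path is purely illustrative.

import c4d

# Save the active scene with baked object caches so tools built on the
# Cineware/Melange exchange can read it without a Cinema 4D license.
doc = c4d.documents.GetActiveDocument()
ok = c4d.documents.SaveDocument(
    doc,
    "/projects/unity_assets/hero_prop.c4d",  # illustrative output path
    c4d.SAVEDOCUMENTFLAGS_SAVECACHES,        # assumed match for "Save for Cineware"
    c4d.FORMAT_C4DEXPORT,                    # standard .c4d format
)
if not ok:
    raise RuntimeError("SaveDocument failed")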

Cineware by Maxon currently supports the following:

Geometry:
• Vertex Position, Normals, UV, Skinning Weight, Color
• Skin and Binding Rig
• Pose Morphs as Blend Shapes
• Lightmap UV2 Generation on Import

Materials:
• PBR Reflectance Channel Materials conversion
• Albedo/Metal/Rough
• Normal Map
• Bump Map
• Emission

Animated Materials:
• Color including Transparency
• Metalness
• Roughness
• Emission Intensity, Color
• Alpha Cutout Threshold

Lighting:
• Spot, Directional, Point
• Animated properties supported:
o Cone
o Intensity
o Color

Cameras:
• Animated properties
• Field of View (FOV)

Main Image: Courtesy of Cornelius Dämmrich

The-Artery sees red, creates VFX for Huawei’s AppGallery

The-Artery recently worked on a global campaign for Israeli agency LH and AppGallery, the official app distribution platform of consumer electronics brand Huawei.

The campaign — set to an original musical track called Explore It by artist Tomer Biran — is meant to present the AppGallery not just as a mobile app store, but as a gateway to an endless world of digital content that comes with data protection and privacy.

Each scene features the platform’s signature red square logo, shown in a variety of creative ways thanks to The-Artery’s visual effects work: floating Tetris-like cubes that change with the beat of the music, shifting camera focus, red-seated subway cars with a floating red cube and more.

“Director Eli Sverdlov, editor Noam Weissman and executive producer Kobi Hoffman all have distinct artistic processes that are unforgiving to conventional storytelling,” explains founder/executive creative director Vico Sharabani. “We had ongoing conversations about how to create a deeper connection between the brand and audiences. The agency, LH, gave us the freedom to really explore the fun, convenience and security behind downloading apps on the Huawei AppGallery.”

Filming took place halfway across the globe in Kiev, Ukraine, via production company Jiminy Creative Tel Aviv, while editing, design, animation, visual effects and color grading were all done under one roof in The-Artery’s New York studio. The entire production was completed in only 16 days.

The studio used Autodesk Flame and 3ds Max, SideFX Houdini and Adobe After Effects and Photoshop for the visual effects and graphics. Colorist Steve Picano called on Blackmagic’s DaVinci Resolve, and Asaf Bitton provided sound design.

The Call of the Wild director Chris Sanders on combining live-action, VFX

By Iain Blair

The Fox family film The Call of the Wild, based on the Jack London tale, tells the story of a big-hearted dog named Buck who is stolen from his California home and transported to the Canadian Yukon during the Gold Rush. Director Chris Sanders called on the latest visual effects and animation technology to bring the animals in the film to life. The film stars Harrison Ford and is based on a screenplay by Michael Green.

Sanders’ crew included two-time Oscar–winning cinematographer Janusz Kaminski; production designer Stefan Dechant; editors William Hoy, ACE, and David Heinz; composer John Powell; and visual effects supervisor Erik Nash.

I spoke with Sanders — who has helmed the animated films Lilo & Stitch, The Croods and How to Train Your Dragon — about making the film, which features a ton of visual effects.

You’ve had a very successful career in animation but wasn’t this a very ambitious project to take on for your live-action debut?
It was. It’s a big story, but I felt comfortable because it has such a huge animated element, and I felt I could bring a lot to the party. I also felt up to the task of learning — and having such an amazing crew made all of that as easy as it could possibly be.

Chris Sanders on set.

What sort of film did you set out to make?
As true a version as we could tell in a family-friendly way. No one’s ever tried to do the whole story. This is the first time. Before, people just focused on the last 30 pages of the novel and focused on the relationship between Buck and John Thornton, played by Harrison. And that makes perfect sense, but what you miss is the whole origin story of how they end up together — how Buck has to learn to become a sled dog, how he meets the wolves and joins their world. I loved all that, and also all the animation needed to bring it all alive.

How early on did you start integrating post and all the visual effects?
Right away, and we began with previs.

Your animation background must have helped with all the previs needed on this. Did you do a lot of previs, and what was the most demanding sequence?
We did a ton. In animation it’s called layout, a rough version, and on this we didn’t arrive on set without having explored the sequence many times in previs. It helped us place the cameras and block it all, and we also improvised and invented on set. But previs was a huge help with any heavy VFX element, like when Thornton’s going down river. We had real canoes in a river in Canada with inertial measurement devices and inertial recorders, and that was the most extensive recording we had to do. Later in post, we had to replace the stuntman in the canoe with Thornton and Buck in an identical canoe with identical movements. That was so intensive.


How was it working with Harrison Ford?
The devotion to his craft and professionalism… he really made me understand what “preparing for a role” really means, and he really focused on Thornton’s back story. The scene where he writes the letter to his wife? Harrison dictated all of that to me and I just wrote it down on top of the script. He invented all that. He did that quite a few times. He made the whole experience exciting and easy.

The film has a sort of retro look. Talk about working with DP Janusz Kaminski.
We talked about the look a lot, and we both wanted to evoke those old Disney films we saw as kids — something very rich with a magical storybook feel to it. We storyboarded a lot of the film, and I used all the skills I’d learned in animation. I’d see sequences a certain way, draw them out, and sometimes we’d keep them and cut them into editorial, which is exactly what you do in animation.

How tough was the shoot? It must have been quite a change of pace for you.
You’re right. It was about 50 days, and it was extremely arduous. It’s the hardest thing I’ve ever done physically, and I was not fully prepared for how exhausted you get — and there’s no time to rest. I’d be driving to set by 4:30am every day, and we’d be shooting by 6am. And we weren’t even in the Yukon — we shot here in California, a mixture of locations doubling for the Yukon and stage work.


Where did you post?
All on the Fox lot, and MPC Montreal did all the VFX. We cut it in relatively small offices. I’m so used to post, as all animation is basically post. I wish it was faster, but you can’t rush it.

You had two editors — William Hoy and David Heinz. How did that work?
We sent them dailies, and they divided up the work since we had so much material. Having two great voices is a real asset, as long as everyone’s making the same movie.

What were the big editing challenges?
The creative process in editorial is very different from animation, and I was floored by how malleable this thing was. I wasn’t prepared for that. You could change a scene completely in editorial, and I was blown away at what they could accomplish. It took a long time because we came back with over three hours of material in the first assembly, and we had to crush that down to 90 minutes. So we had to lose a huge amount, and what we kept had to be really condensed, and the narrative would shift a lot. We’d take comedic bits and make them more serious and vice versa.

Visual effects play a key role. Can you talk about working on them with VFX supervisor Erik Nash?
I love working with VFX, and they were huge in this. I believe there are fewer than 30 shots in the whole film that don’t have some VFX. And apart from creating Buck and most of the other dogs and animals, we had some very complex visual effects scenes, like the avalanche and the sledding sequence.

L-R: Director Chris Sanders and writer Iain Blair

We had VFX people on set at all times. Erik was always there supervising the reference. He’d also advise us on camera angles now and then, and we’d work very closely with him all the time. The cameras were hooked up to send data to our recording units so that we always knew what lens was on what camera at what focal length and aperture, so later the VFX team knew exactly how to lens the scenes with all the set extensions and how to light them.

The music and sound also play a key role, especially for Buck, right?
Yes, because music becomes Buck’s voice. The dogs don’t talk like they do in Lion King, so it was critical. John Powell wrote a beautiful score that we recorded on the Newman Stage at Fox, and then we mixed at 5 Cat Studios.

Where did you do the DI, and how important is it to you?
We did it at Technicolor with colorist Mike Hatzer, and I’m pretty involved. Janusz did the first pass and set the table, and then we fine-tuned it, and I’m very happy with the rich look we got.

Do you want to direct another live-action film?
Yes. I’m much more comfortable with the idea now that I know what goes into it. It’s a challenge, but a welcome one.

What’s next?
I’m looking at all sorts of projects, and I love the idea of doing another hybrid like this.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: BlueBolt VFX supervisor Richard Frazer

“If we have done our job well, the viewers should never notice the work and instead just be enjoying the storytelling.”

Name: Richard Frazer

Company: London’s BlueBolt

Can you describe your company?
For the last four years, I’ve worked at BlueBolt, a Soho-based visual effects company in London. We work on high-end TV and feature films, and our main area of specialty is creating CG environments and populating them. BlueBolt is a privately owned company run by two women, which is pretty rare. They believe in nurturing good talent and training them up to help break through the glass ceiling, if an artist is up for it.

What’s your job title?
I joined as a lead compositor with a view to becoming a 2D supervisor, and now I am one of the studio’s core VFX supervisors.

What does that entail?
It means I oversee all of the visual effects work for a specific TV show or movie — from script stage to final delivery. That includes working with the director and DP in preproduction to determine what they would like to depict on the screen. We then work out what is possible to shoot practically, or if we need to use visual effects to help out.

I’ll then often be on the set during the shoot to make sure we correctly capture everything we need for post work. I’ll work with the VFX producer to calculate the costs and time scales of the VFX work. Finally, I will creatively lead our team of talented artists to create those rendered images and make sure it all fits in with the show in a visually seamless way.

What would surprise people the most about what falls under that title?
The staggering amount of time and effort involved by many talented people to create something that an audience should be totally unaware exists. If we have done our job well, the viewers should never notice the work and instead just be enjoying the storytelling.

How long have you been working in VFX?
For around a decade. I started out as a rotoscope artist in 2008 and then became a compositor. I did my first supervisor job back in 2012.

How has the VFX industry changed in the time you’ve been working?
A big shift has been just how much more visual effects work there is on TV shows and how much the standard of that work has improved. It used to be that TV work was looked down on as the poor cousin of feature film work. But shows like Game of Thrones have set audience expectations so much higher now. I worked on nothing but movies for the first part of my career, but the majority of my work now is on TV shows.

Did a particular film inspire you along this path in entertainment?
I grew up on ‘80s sci-fi and horror, so movies like Aliens and The Thing were definitely inspirations. This was back when effects were almost all done practically, so I wanted to get into model-making or prosthetics. The first time I remember being blown away by digital VFX work was seeing Terminator 2 at the cinema. I’ve ended up doing the type of work I dreamed of as a kid, just in a digital form.

Did you go to film school?
No, I actually studied graphic design. I worked for some time doing animation, video editing and motion graphics. I taught myself compositing for commercials using After Effects. But I always had a love of cinema and decided to try specializing in this area. Almost all of what I’ve learned has been on the job. I think there’s no better training than just throwing yourself at the work, absorbing everything you can from the people around you and just being passionate about what you do.

What’s your favorite part of the job?
Each project has its own unique set of challenges, and every day involves creative problem-solving. I love the process of translating what only exists in someone’s imagination and the journey of creating those images in a way that looks entirely real.

I also love the mix of being at the offices one day creating things that only exist in a virtual world, while the next day I might be on a film set shooting things in the real world. I get to travel to all kinds of random places and get paid to do so!

What’s your least favorite?
There are so many moving parts involved in creating a TV show or movie — so many departments all working together trying to complete the task at hand, as well as factors that are utterly out of your control. You have to have a perfectly clear idea of what needs to be done, but also be able to completely scrap that and come up with another idea at a moment’s notice.

If you didn’t have this job, what would you be doing instead?
Something where I can be creative and make things that physically exist. I’m always in awe of people who build and craft things with their hands.

Can you name some recent projects you have worked on?
Recent work has included Peaky Blinders, The Last Kingdom and Jamestown, as well as a movie called The Rhythm Section.

What is the project that you are most proud of?
I worked on a movie called Under the Skin a few years ago, which was a very technically and creatively challenging project. It was a very interesting piece of sci-fi that people seem to either love or hate, and everyone I ask seems to have a slightly different interpretation of what it was actually about.

What tools do you use day to day?
Almost exclusively Foundry Nuke. I use it for everything from drawing up concepts to reviewing artists’ work. If there’s functionality that I need from it that doesn’t exist, I’ll just write Python code to add features.
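
As a flavor of the kind of small addition he’s describing — a sketch only, not BlueBolt pipeline code — here’s how a custom command might be bolted onto Nuke with its Python API; the node choices and menu path are invented for the example:

import nuke

def review_setup():
    # Append a Grade and a soft Blur to the selected node for quick look checks.
    src = nuke.selectedNode()        # raises an error if nothing is selected
    grade = nuke.createNode('Grade')
    grade.setInput(0, src)
    blur = nuke.createNode('Blur')
    blur['size'].setValue(2.5)
    blur.setInput(0, grade)

# Register the command under a Custom menu in Nuke's menu bar.
nuke.menu('Nuke').addCommand('Custom/Review Setup', review_setup)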

Where do you find inspiration now?
In the real world, if you just spend the time observing it in the right way. I often find myself distracted by how things look in certain light. And Instagram — it’s the perfect social media for me, as it’s just beautiful images, artwork and photography.

What do you do to de-stress from it all?
The job can be quite mentally and creatively draining and you spend a lot of time in dark rooms staring at screens, so I try to do the opposite of that. Anything that involves being outdoors or doing something physical — I find cycling or boxing are good ways to unwind.

I recently went on a paragliding trip in the French Alps, which was great, but I found myself looking at all these beautiful views of sunsets over mountains and just analyzing how the sunlight was interacting with the fog and the atmospheric hazing. Apparently, I can never entirely turn off that part of my brain.

ILM’s virtual production platform used on The Mandalorian

To bring The Mandalorian to life, Industrial Light & Magic (ILM) and Epic Games — along with production technology partners Fuse, Lux Machina, Profile Studios, Nvidia and ARRI — have introduced a new way to shoot VFX-heavy projects in collaboration with Jon Favreau’s Golem Creations.

The new virtual production workflow allows filmmakers to capture a significant amount of complex visual effects shots in-camera using realtime game engine technology (Epic’s Unreal Engine) and LED screens to represent dynamic photoreal digital landscapes and sets with creative flexibility previously unimaginable.

Also part of the news, ILM has made its new end-to-end virtual production solution, ILM StageCraft, available for use by filmmakers, agencies and showrunners worldwide.

Over 50 percent of The Mandalorian Season 1 was filmed using this new methodology, eliminating the need for location shoots entirely. Instead, actors in The Mandalorian performed within a massive, immersive LED video wall and ceiling — 20 feet high, wrapping 270 degrees around a 75-foot-diameter performance space — where practical set pieces were combined with digital extensions on the screens.

Digital 3D environments created by ILM played back interactively on the LED walls, edited in realtime during the shoot, which allowed for pixel-accurate tracking and perspective-correct 3D imagery rendered at high resolution via systems powered by Nvidia GPUs.

L-R: Jon Favreau and Richard Bluff

The environments were lit and rendered from the perspective of the camera to provide parallax in real time, as if the camera were really capturing the physical environment, with accurate interactive light on the actors and practical sets. This gave showrunner Favreau; executive producer/director Dave Filoni; visual effects supervisor Richard Bluff; cinematographers Greig Fraser and Barry “Baz” Idoine; and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and to achieve realtime in-camera composites on set.
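
The geometric core of that perspective-correct rendering is a standard technique: an off-axis (generalized) perspective projection, rebuilt every frame from the tracked camera position and the fixed corners of the LED panel. The following is a generic sketch of that math in the spirit of Kooima’s well-known formulation — an illustration of the principle, not ILM StageCraft code:

import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def frustum(l, r, b, t, n, f):
    # OpenGL-style off-axis perspective matrix.
    return np.array([
        [2*n/(r-l), 0.0,        (r+l)/(r-l),  0.0],
        [0.0,       2*n/(t-b),  (t+b)/(t-b),  0.0],
        [0.0,       0.0,       -(f+n)/(f-n), -2*f*n/(f-n)],
        [0.0,       0.0,       -1.0,          0.0],
    ])

def panel_projection(pa, pb, pc, pe, near=0.1, far=2000.0):
    # pa, pb, pc: world-space lower-left, lower-right, upper-left corners of a
    # flat LED panel; pe: tracked camera (eye) position, updated every frame.
    vr = normalize(pb - pa)             # panel right axis
    vu = normalize(pc - pa)             # panel up axis
    vn = normalize(np.cross(vr, vu))    # panel normal, pointing at the camera
    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)                 # camera-to-panel distance
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    P = frustum(l, r, b, t, near, far)
    R = np.eye(4); R[0, :3], R[1, :3], R[2, :3] = vr, vu, vn  # world -> panel basis
    T = np.eye(4); T[:3, 3] = -pe                             # move eye to origin
    return P @ R @ T

# Example: a 10m-wide, 5m-tall wall segment, camera 4m back and off-center.
M = panel_projection(np.array([-5.0, 0.0, 0.0]), np.array([5.0, 0.0, 0.0]),
                     np.array([-5.0, 5.0, 0.0]), np.array([1.0, 1.5, 4.0]))

Because the frustum is re-derived from the tracked eye point on every frame, the imagery on the physically fixed wall shifts with correct parallax as the camera moves.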

The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of all the partners involved.

“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of realtime, in-camera rendering,” explains Favreau, adding, “We are proud of what was achieved and feel that the system we built was the most efficient way to bring The Mandalorian to life.”

“Merging our efforts in the space with what Jon Favreau has been working toward using virtual reality and game engine technology in his filmmaking finally gave us the chance to execute the vision,” says Rob Bredow, executive creative director and head of ILM. “StageCraft has grown out of the culmination of over a decade of innovation in the virtual production space at ILM. Seeing our digital sets fully integrated, in real time on stage, providing the kind of in-camera shots we’ve always dreamed of while also providing the majority of the lighting was really a dream come true.”

Bluff adds, “Working with Kim Libreri and his Unreal team, Golem Creations, and the ILM StageCraft team has opened new avenues to both the filmmakers and my fellow key creatives on The Mandalorian, allowing us to shoot principal photography on photoreal, virtual sets that are indistinguishable from their physical counterparts while incorporating physical set pieces and props as needed for interaction. It’s truly a game-changer.”

ILM StageCraft’s production tools combine traditional filmmaking equipment and methodologies with all of the advantages of a fully digital workflow. With ILM StageCraft, a production can acquire many in-camera finals, giving filmmakers immediate and complete creative control of work typically handed off and reinterpreted in post. That improves the quality of visual effects shots with perfectly integrated elements and reduces visual effects requirements in post — a major benefit considering today’s compressed schedules.

Destin Daniel Cretton talks directing Warner’s Just Mercy

By Iain Blair

An emotionally powerful and thought-provoking true story, Just Mercy is the latest film from award-winning filmmaker Destin Daniel Cretton (The Glass Castle, Short Term 12), who directed the film from a screenplay he co-wrote. Based on famed lawyer and activist Bryan Stevenson’s memoir, “Just Mercy: A Story of Justice and Redemption,” which details his crusade to defend, among others, wrongly accused prisoners on death row, it stars Michael B. Jordan and Oscar winners Jamie Foxx and Brie Larson.

The story starts when, after graduating from Harvard, Stevenson (Jordan) — who had his pick of lucrative jobs — instead heads to Alabama to defend those wrongly condemned or who were not afforded proper representation, with the support of local advocate Eva Ansley (Larson).

One of his first cases is that of Walter McMillian (Foxx), who, in 1987, was sentenced to die for the murder of an 18-year-old girl, despite evidence proving his innocence. In the years that follow, Stevenson becomes embroiled in a labyrinth of legal and political maneuverings as well as overt racism as he fights for Walter, and others like him, with the odds — and the system — stacked against them.

This case becomes the main focus of the film, whose cast also includes Rob Morgan as Herbert Richardson, a fellow prisoner who also sits on death row; Tim Blake Nelson as Ralph Myers, whose pivotal testimony against Walter McMillian is called into question; Rafe Spall as Tommy Chapman, the DA who is fighting to uphold Walter’s conviction and sentence; O’Shea Jackson Jr. as Anthony Ray Hinton, another wrongly convicted death row inmate whose cause is taken up by Stevenson; and Karan Kendrick as Walter’s wife, Minnie McMillian.

Cretton’s behind-the-scenes creative team included DP Brett Pawlak, co-writer Andrew Lanham, production designer Sharon Seymour, editor Nat Sanders and composer Joel P. West, all of whom previously collaborated with the director on The Glass Castle.

Destin Daniel Cretton

I spoke with the director about making the film, his workflow and his love of post.

When you read Brian’s book, did you feel compelled to take this on?
I did. His voice and the way he tells the story about these characters, who seem so easy to judge at first. Then he starts peeling off all the layers, and the way he uses humor in certain areas and devastation in others. Somehow it still makes you feel hopeful and inspired to do something about all the injustice – all of it just hit me so hard, and I felt I had to be involved in it some way.

Did you work very closely with him on the film?
I did. Before we even began writing a word, we went to meet him in Montgomery, and he introduced us to the real Anthony Ray Hinton and a bunch of lawyers working on cases. Brian was with us through the whole writing process, filling in the blanks and helping us piece the story together. We did a lot of research, and we had the book, but it obviously couldn’t include everything. Brian gave us all the transcripts of all the hearings, and a lot of the lines were taken directly from those.

This is different from most other courtroom dramas, as the trial’s already happened when the movie begins. What sort of film did you set out to make?
We set out to make the book in as compelling a way as possible. And it’s a story about this young lawyer who’s trying to convince the system and state they made a terrible mistake, with all the ups and downs, and just how long it takes him to succeed. That’s the drama.

What were the main challenges in pulling it all together?
Telling a very intense, true story about people, many of whom are still alive and still doing the work they were doing then. So accuracy was a huge thing, and we all really felt the burden and responsibility to get it right. I felt it more so than any film I’ve ever done because I respect Brian’s work so much. We’re also telling stories about people who were very vulnerable.

Trying to figure out how to tell a narrative that still moved at the right pace and gave you an emotional ride, but which stayed completely accurate to the facts and to a legal process that moves incredibly slowly, was very challenging. A big moment for me was when Brian first saw the film and gave me a big hug and a thank you; he told me it was not for how he was portrayed, but for how we took care of his clients. That was his big concern.

What did Jamie and Michael bring to their roles?
They’ve been friends for a long time, so they already had this great natural chemistry, and they were able to play through scenes like two jazz musicians and bring a lot of stuff that wasn’t there on the page.

I heard you actually shot in the south. How tough was the shoot?
Filming in some of the real locations really helped. We were able to shoot in Montgomery — such as the scenes where Brian’s doing his morning jogs, the Baptist church where MLK Jr. was the pastor, and then the cotton fields and places where Walter and his family actually lived. Being there and feeling the weight of history was very important to the whole experience. Then we shot the rest of the film in Atlanta.

Where did you post?
All in LA on the Warner lot.

Do you like the post process?
I love post and I hate it (laughs). And it depends on whether you’re finding a solution to a problem or you’re realizing you have a big problem. Post, of course, is where you make the film and where all the problems are exposed… the problems with all the choices I made on set. Sometimes things are working great, but usually it’s the problems you’re having to face. But working with a good post team is so fulfilling, and you’re doing the final rewrite, and we solved so many things in post on this.

Talk about editing with your go-to Nat Sanders, who got an Oscar nom for his work (with co-editor Joi McMillon) on Moonlight and also cut If Beale Street Could Talk.
Nat wasn’t on set. He began cutting material here in LA while we shot on location in Atlanta and Alabama, and we talked a lot on the phone. He did the first assembly, which was just over three hours long. All the elements were there, but shaping and fine-tuning all the material took nearly a year as we went through every scene, talking them out.

Finding the correct emotional ride and balance was a big challenge, as this has so many emotional highs and lows, and you can easily tire an audience out. We had to cut some storylines that were working on their own but sent the audience on another emotional down when it needed something lighter. The other part of it was performance, and you can craft so much of that in the edit; our leads gave us so many takes and options to play with. Dealing with that is one of Nat’s big strengths. Both of us are meticulous, and we did a lot of test screenings and kept making adjustments.

Writer Iain Blair (left) and director Destin Daniel Cretton.

Nat and I both felt the hardest scene to cut and get right was Herb’s execution scene, because of the specific tone needed. If you went too far in one direction, it felt too much, but if you went too far the other way, it didn’t quite hit the emotional beat it needed. So that took a lot of time, playing around with all the cross-cutting and the music and sound to create the right balance.

All period films need VFX. What was entailed?
Crafty Apes did them, and we did a lot of fixes, added period stuff and did a lot of wig fixes — more than you’d think (laughs). We weren’t allowed to shoot at the real prison, so we had to create all the backdrops and set extensions for the death row sequences.

Can you talk about the importance of sound and music?
It’s always huge for me, and I’ve worked with my composer, Joel, and supervising sound editor/re-recording mixer Onnalee Blank, who was half of the sound team, since the start. For both of them, it was all about finding the right tone to create just the right amount of emotion that doesn’t overdo it, and Joel wrote the score in a very stripped-down way and then got all these jazz musicians to improvise along with the score.

Where did you do the DI and how important is it to you?
That’s huge too, and we did it at Light Iron with colorist Ian Vertovec. He’s worked with my DP on almost every project I’ve done, and he’s so good at grading and giving you a very subtle palette.

What’s next?
We’re currently in preproduction on Shang-Chi and the Legend of the Ten Rings, featuring Marvel’s first Asian superhero. It’s definitely a change of pace after this.



Kevin Lau heads up advertising, immersive at Digital Domain

Visual effects studio Digital Domain has brought on Kevin Lau as executive creative director of advertising, games and new media. In this newly created position, Lau will oversee all short-form projects and act as a creative partner for agencies and brands.

Lau brings over 18 years of ad-based visual effects and commercial production experience, working on campaigns for brands such as Target, Visa and Sprint.

Most recently, he was the executive creative director and founding partner at Timber, an LA-based studio focused on ads (GMC, Winter Olympics) and music videos (Kendrick Lamar’s Humble). Prior to that, he held creative director positions at Mirada, Brand New School and Superfad. Throughout his career, his work has been honored with multiple awards including Clios, AICP Awards, MTV VMAs and a Cannes Gold Lion for Sprint’s “Now Network” campaign via Goodby.

Lau, who joins Digital Domain EPs Nicole Fina and John Canning as they continue to build the studio’s short-form business, will help unify the vision for the advertising, games and new media/experiential groups, promoting a consistent voice across campaigns.

Lau joins the team as the new media group prepares to unveil its biggest project to date: Time’s The March, a virtual reality recreation of the 1963 March on Washington for Jobs and Freedom. Digital Domain’s experience with digital humans will play a major role in the future of both groups as they continue to build on the photoreal cinematics and in-game characters previously created for Activision, Electronic Arts and Ubisoft.

VFX-heavy Skyworth OLED TV spot via The-Artery

The-Artery created a spot for the latest version of Skyworth’s W81|W81 Pro Wallpaper OLED TV, which debuted last month at the “See the Wonder” event at CES 2020.

Created using The-Artery‘s newly opened Resolve-based color room and expanded design capabilities — spearheaded by colorist Stephen Picano and design director Lauren Indovina — the commercial features a couple swimming through space-like waters, children battling origami dragons while floating in a paper boat and a traveler trekking through snowy tundra while glowing jellyfish float overhead. Publicis, Skyworth’s agency, wanted the ad to reflect “the wonder” of the company’s newest television model.

“The campaign, helmed by director Eli Sverdlov, was very director-led in a way that I’ve never seen before,” explains The-Artery’s EP/MD, Deborah Sullivan. “Of course, there was still ongoing dialogue with the client and agency, but the level of creative control that was entrusted is almost unheard of. Everything was open from start to finish, including the ideation phase, color grading and design — to name a few. Our team had a lot of fun jumping straight into the edit to develop and launch what we consider as a high-end conceptual throwback to the nineties.”

Sverdlov agrees: “Our flexible creative process was in a condensed schedule and required a very unique collaboration. We were practically creating the ideas and visuals while editing and sourcing footage.”

Due to the production’s long shooting schedule and tight deadlines, the visual effects were designed via Autodesk Flame in realtime, all under one roof, while filming took place in Serbia. Additional footage was carefully curated as well as color graded and cut to fit the tone and flow of the rest of the piece. The creature imagery such as the jellyfish was done via CG.

In addition to Flame and Resolve, The-Artery called on SideFX Houdini, Autodesk Maya, Maxon’s Redshift, Otoy’s Octane, Autodesk’s Arnold, Adobe After Effects and Maxon’s Cinema 4D.

Framestore launches FPS preproduction services

VFX studio Framestore has launched FPS (Framestore Pre-production Services) for the global film and content production industries. An expansion of Framestore’s existing capability, FPS is available to clients in need of standalone preproduction support or an end-to-end production solution.

The move builds out and aligns the company’s previz, virtual production, techviz and postviz services with Framestore’s art department (which operates either as part of the Framestore workflow or as a standalone creative service), virtual production team and R&D unit, and integrates with the company’s VFX and animation teams. It builds on work on films such as Gravity and on the knowledge gained during the company’s eight-year London joint venture with visualization company The Third Floor. FPS is working on feature film projects as part of an integrated offering and as a standalone visualization partner, with more projects slated in the coming months.

The new team is led by Alex Webster, who joins as FPS managing director after running The Third Floor London. He will report directly to Fiona Walkinshaw, Framestore’s global managing director, film.

“This work aligns Framestore’s singular VFX and animation craft with a granular understanding of the visualization industry,” says Webster. “It marries the company’s extraordinary legacy in VFX with established visualization and emergent virtual production processes, supported by bleeding-edge technology and dedicated R&D resource to inform the nimble approach which our clients need. Consolidating our preproduction services represents a significant creative step forward.”

“Preproduction is a crucial stage for filmmakers,” says chief creative officer Tim Webber. “From mapping out environments to developing creatures and characters to helping plot action sequences, it provides unparalleled freedom in terms of seeing how a story unfolds or how characters interact with the worlds we create. Bringing together our technical innovation with an understanding of filmmaking, we want to offer a bespoke service for each film and each individual to help tell compelling, carefully crafted stories.”

“Our clients’ needs are as varied as the projects they bring to us, with some needing a start-to-finish service that begins with concept art and ends in post, while others want a bespoke, standalone solution to specific creative challenges, be that in early-stage concepting, through layout and visualization or in final animation and VFX,” says Framestore CEO William Sargent. “It makes sense to bring all these services in-house — even more so when you consider how our work in adjacent fields like AR, VR and MR has helped the likes of HBO, Marvel and Warner Bros. bring their IP to new, immersive platforms. What we’ll ultimately deliver goes well beyond previz and beyond visualization.”

Main Image: (L-R) Tim Webber, Fiona Walkinshaw and Alex Webster.

FXhome’s HitFilm Express 14, ‘Pay What You Want’ option

FXhome has a new “Pay What You Want” goodwill program, inspired by requests from the HitFilm Express community to help pay for development of the historically free video editing and VFX software. Pay What You Want gives users the option to contribute financially, with those funds allocated to future development and improvements to HitFilm.

Additionally, FXhome will contribute a percentage of the proceeds of Pay What You Want to organizations dedicated to global causes important to the company and its community. At its launch, the FXhome Pay What You Want initiative will donate a portion of its proceeds to the WWF and the Australia Emergency Bushfire Fund. The larger the contribution from customers, the more FXhome will donate.

HitFilm Express remains a free download; however, first-time customers will now have the option to “Pay What You Want” for the software. They’ll also receive exclusive discounts on HitFilm add-on packs and effects.

Coinciding with the release of Pay What You Want, FXhome is releasing HitFilm Express 14, the first version of HitFilm Express to be eligible for the Pay What You Want initiative. HitFilm Express 14 features a new and simplified export process, new text controls, a streamlined UI and a host of new features.

For new customers who would like to download HitFilm Express 14 and also contribute to the Pay What You Want program, there are three options available:

• Starter Pack Level: With a contribution as little as $9, new HitFilm Express 14 customers will also receive a free Starter Pack of software and effects that includes:
o Professional dark mode interface
o Edit tools including Text, Split Screen Masking, PiP, Vertical Video, Action Cam Crop
o Color tools including Exposure, Vibrance, Shadows and Highlights, Custom Gray, Color Phase, Channel Mixer and 16-bit color
o Additional VFX packs including Shatter, 3D Extrusion, Fire, Blood Spray and Animated Lasers
• Content Creator Level: With contributions of $19 or more, users will receive everything included in the Starter Pack, as well as:
o Edit: Repair Pack with Denoise, Grain Removal and Rolling Shutter
o Color: LUT Pack with LUTs and Grading Transfer
o Edit: Beautify Pack with Bilateral Blur and Pro Skin Retouch
• VFX Artist Level: Users who contribute from $39 to $99 get everything in the Starter Pack and Content Creator levels plus:
o Composite Toolkit Pack with Wire Removal, Projector, Clone and Channel Swapper
o Composite Pro-Keying Pack for Chroma Keying
o Motion Audio Visual Pack with Atomic Particles, Audio Spectrum and Audio Waveform
o VFX Neon Lights Pack with Lightsword Ultra (2-Point Auto), Lightsword Ultra (4-Point Manual), Lightsword Ultra (Glow Only) and Neon Path
o VFX Lighting Pack with Anamorphic Lens Flares, Gleam, Flicker and Auto Volumetrics

What’s new in HitFilm Express 14
HitFilm Express 14 adds a number of VFX workflow enhancements to enable even more sophisticated effects for content creators, including a simplified export workflow that allows users to export content directly from the timeline and comps, new text controls and a streamlined UI. Updates include:

• Video Textures for 3D Models: Creators who already have the 3D: Model Render Pack can now use a video layer as a texture on a 3D model to add animated bullet holes, cracked glass or changing textures.
• Improvements to the Export Process: In HitFilm Express 14, the Export Queue is now an Export Panel and is much easier to use. Exporting can also now be done from the timeline and from comps. These “in-context” exports will export the content between the In and Out points set, or the entire timeline, using the current default preset (which can be changed from the menu).
• Additional Text Controls: Customizing text in HitFilm Express 14 is now even simpler, with Text panel options for All Caps, Small Caps, Subscript and Superscript. Users can also change the character spacing, horizontal or vertical scale, as well as baseline shift (for that Stranger-Things-style titling).
• Usability and Workflow Enhancements: In addition to the new export process, FXhome has made changes to the interface to further simplify the entire post production process, including a new composite button in the media panel plus double-click and keyboard shortcuts. A new Masking feature adds automation to the workflow; when users double-click the Rectangle or Ellipse tools, a centered mask is automatically placed to fill the center of the screen. Masks are also automatically assigned colors, which can be changed to more easily identify different masks.
• Effects: Users can now double-click the effects panel to apply an effect to the selected layer and drop 2D effects directly onto layers in the viewer. Some effects — such as the Chroma Key and Light Flares — can be dropped on a specific point, or users can select a specific color to key by. Users can also now favorite effects, with quick access to their five most recently used effects from the Effects menu in the toolbar.
• Additional Improvements: Users can now use Behavior effects from the editor timeline, click-drag across multiple layers to toggle “solo,” “locked” or “visibility” settings in one action, and access templates directly from the media panel with the new Templates button. Menus have also been added to the tab of each panel to make customization of the interface easier.
• Open Imerge Pro files in HitFilm: Imerge Pro files can now be opened directly from HitFilm as image assets. Any changes made in the Imerge Pro project will be automatically updated with any save, making it easier to change image assets in real time.
• Introducing Light Mode: The HitFilm Express interface is now available in Light Mode and will open in Light Mode the first time you open the software. Users with a pre-existing HitFilm Express license can easily change back to the dark theme if desired.

HitFilm Express 14 is available immediately and is a free download. Customers downloading HitFilm Express 14 for the first time are eligible to participate in the new Pay What You Want initiative. Free effects and software packs offered in conjunction with Pay What You Want are only available at initial download of HitFilm Express 14.

Rob Legato talks The Lion King‘s Oscar-nominated visual effects

By Karen Moltenbrey

There was a lot of buzz before — and after — this summer’s release of Disney’s remake of the animated classic The Lion King. And what’s not to love? From the animals to the African savannas, Disney brought the fabled world of Simba to life in what is essentially a “live-action” version of the beloved 1994 2D feature of the same name. Indeed, the filmmakers used tenets of live-action filmmaking to create The Lion King, and themselves call it a visual effects film. However, there are those who consider this remake, like the original, an animated movie, as 2019’s The Lion King used cutting-edge CGI for the photoreal beasts and environments.

Rob Legato

Whether you call it “live action” or “animation,” one thing’s for sure. This is no ordinary film. And, it was made using no ordinary production process. Rather, it was filmed entirely in virtual reality. And it’s been nominated for a Best Visual Effects Oscar this year.

“Everything in it is a visual effect, created in the same way that we would make a visual effects-oriented film, where we augment or create the backgrounds or create computer-generated characters for a scene or sequence. But in this case, that spanned the entire movie,” says VFX supervisor Rob Legato. “We used a traditional visual effects pipeline and hired MPC, which is a visual effects studio, not an animation house.”

MPC, which created the animals and environments, crafted all the elements, which were CG, and handled the virtual production, working with Magnopus to develop the necessary tools that would take the filmmakers from previz through shooting and, eventually, into post production. Even the location scouting occurred within VR, with Legato, director Jon Favreau and others, including cinematographer Caleb Deschanel, simultaneously walking through the sets and action by using HTC Vive headsets.

Caleb Deschanel (headset) and Rob Legato. Credit: Michael Legato

The Animations and Environments
MPC, known for its photorealistic animals and more, had worked with Disney and Favreau on the 2016 remake of The Jungle Book, which was shot within a total greenscreen environment and used realistic CG characters and sets with the exception of the boy Mowgli. (It also used VR, albeit for previsualization only.) The group’s innovative effort for that work won an Oscar for visual effects. Apparently that was just the tip of the spear, so to speak, as the team upped its game with The Lion King, making the whole production entirely CG and taking the total filmmaking process into virtual reality.

“It had to look as believable as possible. We didn’t want to exaggerate the performances or the facial features, which would make them less realistic,” says Legato of the animal characters in The Lion King.

The CG skeletons were built practically bone for bone to match their real-life counterparts, and the digital fur matched the hair variations of the various species found in nature. The animators, meanwhile, studied the motion of the real-life animals and moved the digital muscles accordingly.

“Your eye picks up when [the animal] is doing something that it can’t really do, like if it stretches its leg too far or doesn’t have the correct weight distribution that’s affecting the other muscles when it puts a paw down,” says Legato, contending that it is almost impossible to tell the CG version of the characters from the real thing in a non-talking shot or a still frame.

To craft the animals and environments, the MPC artists used Autodesk’s Maya as the main animation program, along with SideFX Houdini for water and fire simulations and Pixar’s RenderMan for rendering. MPC also used custom shaders and tools, particularly for the fur, mimicking that of the actual animal. “A lion has so many different types of hair — short hair around the body, the bushy mane, thick eyebrow hairs and whiskers. And every little nuance was recreated and faithfully reproduced,” Legato adds.

MPC artists brought to life dozens and dozens of animals for the film and then generated many more unique variations — from lions to mandrills to hyenas to zebras and more, even birds and bugs. And then the main cast and background animals were placed within a photoreal environment, where they were shot with virtual cameras that mimicked real cameras.

The world comprises expansive, open landscapes. “There were many, many miles of landscapes that were constructed,” says Legato. The filmmakers would film within pockets that were dressed and populated for different scenes, from Pride Rock to the interior of a cave to the savanna to the elephant graveyard — all built in CGI.

“Everything was simulated to be the real thing, so the sum total of the illusion is that it’s all real. And everything supports each other — the grounds, the characters, what they are physically doing. The sum total of that adds up to where your brain just says, ‘OK, this must be real. I’ll stop looking for flaws and will now just watch the story,’” says Legato. “That was the creative intent behind it.”

Virtual Production
All the virtual camera work was accomplished within Unity’s engine, so all the assets were ported in and out of that game engine. “Everyone would then know where our cameras were, what our camera moves were, how we were following the action, our lens choices, where the lights were placed … all those things,” says Legato.

Magnopus created the VR tools specific for the film, which ran on top of Unity to get the various work accomplished, such as the operation of the cameras. “We had a crane, dolly and other types of cameras encoded so that it basically drove its mate in the computer. For instance, we created a dolly and then had a physical dolly with encoders on it, so everything was hand operated, and we had a dolly grip and a camera assistant pulling focus. There was someone operating the cameras, and sometimes there was a crane operator. We did SteadiCam as well through an actual SteadiCam with a sensor on it to work with OptiTrack [motion capture that was used to track the camera],” explains Legato. “We built a little rig for the SteadiCam as well as one for a drone we’d fly around the stage, and we’d create the illusion that it was a helicopter shot while flying around Africa.”
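
To make the encoder idea concrete, here is a hypothetical sketch of the mapping it implies — raw counts from a wheel encoder on the physical dolly converted each frame into a position along a virtual track. The constants, read_encoder and the camera call are all invented for illustration:

COUNTS_PER_REV = 4096          # encoder resolution (assumed)
WHEEL_CIRCUMFERENCE_M = 0.40   # dolly wheel circumference in meters (assumed)

def counts_to_meters(counts):
    # Accumulated encoder counts -> meters travelled along the track.
    return counts / COUNTS_PER_REV * WHEEL_CIRCUMFERENCE_M

class VirtualDolly:
    # Mirrors the physical dolly as an offset along a straight virtual track.
    def __init__(self, track_origin, track_direction):
        self.origin = track_origin        # (x, y, z) start of the track
        self.direction = track_direction  # unit vector along the track

    def camera_position(self, encoder_counts):
        s = counts_to_meters(encoder_counts)
        return tuple(o + s * d for o, d in zip(self.origin, self.direction))

# Each frame: poll the hardware, then move the in-engine camera to match.
# dolly = VirtualDolly((0.0, 1.5, 0.0), (1.0, 0.0, 0.0))
# camera.set_position(dolly.camera_position(read_encoder()))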

Because the area within VR was so vast, a menu system was created so the filmmakers could locate one another within the virtual environment, making location scouting much easier. They also could take snapshots of different areas and angles and share them with the group. “We were standing next to each other [on stage], but within the virtual environment, we could be miles apart and not see each other because we’re maybe behind trees or rocks.”

As Legato points out, the menu tool is pretty robust. “We basically built a game of film production. Everything was customizable,” he says. Using iPads, the group could play the animation. As the camera was in operation, they could stop the animation, wind it backward, speed it forward, shoot it in slow motion or faster motion. “These options were all accessible to us,” he adds.
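
Conceptually, those playback controls reduce to mapping wall-clock time onto animation time at an adjustable rate. A bare-bones sketch of that idea — an illustration only, not the Magnopus tool itself:

class PlaybackClock:
    # Maps wall-clock time to animation time at a controllable rate.
    def __init__(self):
        self.anim_time = 0.0   # seconds into the animation
        self.rate = 1.0        # 1.0 realtime, 0.5 slow motion, -1.0 reverse

    def tick(self, dt):
        # Advance by one frame's wall-clock delta (dt, in seconds).
        self.anim_time += dt * self.rate

    def pause(self):
        self.rate = 0.0

    def reverse(self):
        self.rate = -self.rate if self.rate else -1.0

    def set_speed(self, rate):
        self.rate = rate

Winding backward, slow motion and overcranking are then just different values of the rate applied on every engine tick.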

Legato provides the following brief step-by-step overview of how the virtual production occurred. First, the art department created the sets — Africa with the trees, ponds, rivers, mountains, waterfalls and so forth. “Based on the script, you know somewhat where you need to be [in the set],” he says. Production designer James Chinlund would make a composite background, and then they — along with Favreau, Deschanel and animation supervisor Andrew Jones — would go into VR.

“We had built these full-size stationary chess pieces of the animals, and in VR, we’d have these tools that let us grab a lion, for instance, or a meerkat, and position them, and then we’d look through the lens and start from there,” says Legato. “We would either move them by hand or puppeteer a simple walk cycle to get the idea of the blocking.”

Jones and his team would animate that tableau and port it back into the game engine as an animation cycle. “We’d find camera angles and augment them. We’d change some of the animation or slow it down or move the animals in slightly different positions. And then we’d shoot it like it’s on a live-action stage,” explains Legato. “We’d put a dolly track down, cover the action with various types of lenses, create full-coverage film dailies… We could shoot the same scene in as many different angles as we’d wish. We could then play it out to a video deck and start editing it right away.” The shots they liked might get rendered with more light or motion blur, but a lot of the time, they’d go right off the video tap.

Meanwhile, MPC recorded everything the filmmakers did and moved — every leaf, rock, tree, animal. Then, in post, all of that information would be reconverted back into Maya sets and the animation fine-tuned.

“In a nutshell, the filmmakers were imparting a live-action quality to the process — by not faking it, but by actually doing it,” says Legato. “And we still have the flexibility of full CGI.”

The Same, But Different
According to Legato, it did not take the group long to get the hang of working in VR. And the advantages are many — chief among them, time savings when it comes to planning and creating the sequence editorially, and then instantly being able to reshoot or iterate the scene inexpensively. “There is literally no downside to exploring a bold choice or an alternate angle on the concept,” he points out.

Yes, virtual filmmaking is the future, contends Legato.

So, back to the original question: Is The Lion King a VFX film or an animated film? “It’s perhaps a hybrid,” says Legato. “But, if you didn’t know how we did it and if the animals didn’t talk, you’d think it was done in the traditional manner of a live-action film. Which it is, visually speaking. You wouldn’t necessarily describe it as looking like ‘an animated film’ because it doesn’t really look like an animated film, like a Pixar or DreamWorks movie. By labeling it as such, you’re putting it into a hole that it’s not. It’s truly just a movie. How we achieved it is immaterial, as it should be.”

Legato and his colleagues call it “live action,” which it truly is. But some, including the Golden Globes, categorized it as “animation.” (They also called 2015’s The Martian and 2010’s The Tourist “comedies.”)

Call it what you will; the bottom line is that the film is breathtaking and the storytelling is amazing. And the filmmaking is inventive and pushes traditional boundaries, making it difficult to perhaps fit into a traditional category. Therefore, “beautiful,” “riveting,” “creative” and “innovative” might be the only descriptions necessary.


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

Check out MPC’s VFX breakdown on the film:

Director James Mangold on Oscar-nominated Ford v Ferrari

By Iain Blair

Filmmaker James Mangold has been screenwriting, producing and directing for years. He has made films about country legends (Walk the Line), cowboys (3:10 to Yuma), superheroes (Logan) and cops (Cop Land), and has tackled mental illness (Girl, Interrupted) as well.

Now he’s turned his attention to race car drivers and endurance racing with his movie Ford v Ferrari, which has earned Mangold an Oscar nomination for Best Picture. The film also received nods for its editing, sound editing and sound mixing.

James Mangold (beard) on set.

The high-octane drama was inspired by a true-life friendship that forever changed racing history. In 1959, Carroll Shelby (Matt Damon) is on top of the world after winning the most difficult race in all of motorsports, the 24 Hours of Le Mans. But his greatest triumph is followed quickly by a crushing blow — the fearless Texan is told by doctors that a grave heart condition will prevent him from ever racing again.

Endlessly resourceful, Shelby reinvents himself as a car designer and salesman working out of a warehouse space in Venice Beach with a team of engineers and mechanics that includes hot-tempered test driver Ken Miles (Christian Bale). A champion British race car driver and a devoted family man, Miles is brilliant behind the wheel, but he’s also blunt, arrogant and unwilling to compromise.

After Shelby’s vehicles make a strong showing at Le Mans against Italy’s venerable Enzo Ferrari, Ford Motor Company recruits the firebrand visionary to design the ultimate race car, a machine that can beat even Ferrari on the unforgiving French track. Determined to succeed against overwhelming odds, Shelby, Miles and their ragtag crew battle corporate interference, the laws of physics and their own personal demons to develop a revolutionary vehicle that will outshine every competitor. The film culminates in the historic showdown between the US and Italy at the grueling 1966 24 Hours of Le Mans.

Mangold’s below-the-line talent, many of whom have collaborated with the director before, includes Academy Award-nominated director of photography Phedon Papamichael; film editors Michael McCusker, ACE, and Andrew Buckland; visual effects supervisor Olivier Dumont; and composers Marco Beltrami and Buck Sanders.

L-R: Writer Iain Blair and Director James Mangold

I spoke with Mangold — whose other films include Logan, The Wolverine and Knight and Day — about making the film and his workflow.

You obviously love exploring very different subject matter in every film you make.
Yes, and I do every movie like a sci-fi film — meaning inventing a new world that has its own rules, customs, language, laws of physics and so on, and you need to set it up so the audience understands and they get it all. It’s like being a world-builder, and I feel every film should have that, as you’re entering this new world, whether it’s Walk the Line or The French Connection. And the rules and behavior are different from our own universe, and that’s what makes the story and characters interesting to me.

What sort of film did you set out to make?
Well, given all that, I wanted to make an exciting racing movie about that whole world, but it’s also that it was a moment when racing was free of all things that now turn me off about it. The cars were more beautiful then, and free of all the branding. Today, the cars are littered with all the advertising and trademarks — and it’s all nauseating to me. I don’t even feel like I’m watching a sport anymore.

When this story took place, it was also a time when all the new technology was just exploding. Racing hasn’t changed that much over the past 20 years. It’s just refining and tweaking to get that tiny edge, but back in the ‘60s they were still inventing the modern race car, and discovering aerodynamics and alternate building materials and methods. It was a brand-new world, so there was this great sense of discovery and charm along with all that.

What were the main technical challenges in pulling it all together?
Trying to do what I felt all the other racing movies hadn’t really done — taking the driving out of the CG world and putting it back in the real world, so you could feel the raw power and the romanticism of racing. A lot of that’s down to the particulates in the air, the vibrations of the camera, the way light moves around the drivers — and the reality of behavior when you’re dealing with incredibly powerful machines. So right from the start, I decided we had to build all the race cars; that was a huge challenge right there.

How early on did you start integrating post and all the VFX?
Day one. I wanted to use real cars and shoot the Le Mans and other races in camera rather than using CGI. But this is a period piece, so we did use a lot of CGI for set extensions and all the crowds. We couldn’t afford 50,000 extras, so just the first six rows or so were people in the stands; the rest were digital.

Did you do a lot of previz?
A lot, especially for Le Mans, as it was such a big, three-act sequence with so many moving parts. We used far less for Daytona. We did a few storyboards, and then my second unit director, Darrin Prescott — who has choreographed car chases and races in such movies as Drive, Deadpool 2, Baby Driver and The Bourne Ultimatum — and I planned it out using matchbox cars.

I didn’t want that “previzy” feeling. Even when I do a lot of previz, whether it’s a Marvel movie or like this, I always tell my previz team “Don’t put the camera anywhere it can’t go.” One of the things that often happens when you have the ability to make your movie like a cartoon in a laboratory — which is what previz is — is that you start doing a lot of gimmicky shots and flying the camera through keyholes and floating like a drone, because it invites you to do all that crazy shit. It’s all very show-offy as a director — “Look at me!” — and a turnoff to me. It takes me out of the story, and it’s also not built off the subjective experience of your characters.

This marks your fifth collaboration with DP Phedon Papamichael, and I noticed there are none of the big swooping camera moves or the beauty-shot approach you see in all the car commercials.
Yes, we wanted it to look beautiful, but in a real way. There’s so much technology available now, like gyroscopic setups and arms that let you chase the cars in high-speed vehicles down tracks. You can do so much, so why do you need to do more? I’m conservative that way. My goal isn’t to brand myself through my storytelling tricks.

How tough was the shoot?
It was one of the most fun shoots I’ve ever had, with my regular crew and a great cast. But it was also very grueling, as we were outside a lot, often in 115-degree heat in the desert on blacktop. And locations were big challenges. The original Le Mans course doesn’t exist anymore like it used to be, so we used several locations in Georgia to double for it. We shot the races wide-angle anamorphic with a team of a dozen professional drivers, and with anamorphic you can shoot the cars right up into the lens — just inches away from camera, while they’d be doing 150 mph or 160 mph.

Where did you post?
All on the Fox lot at my offices. We scored at Capitol Records and mixed the score in Malibu at my composer’s home studio. I really love the post, and for me it’s all part of the same process — the same cutting and pasting I do when I’m writing, and even when I’m directing. You’re manipulating all these elements and watching it take form — and particularly in this film, where all the sound design and music and dialogue are all playing off one another and are so key. Take the races. By themselves, they look like nothing. It’s just a car whipping by. The power of it all only happens with the editing.

You had two editors — Michael McCusker and Andrew Buckland. How did that work?
Mike’s been with me for 20 years, so he’s kind of the lead. Mike and Drew take and trade scenes, and they’re good friends so they work closely together. I move back and forth between them, which also gives them each some space. It’s very collaborative. We all want it to look beautiful and elegant and well-designed, but no one’s a slave to any pre-existing ideas about structure or pace. (Check out postPerspective‘s interview with the editing duo here.)

What were the big editing challenges?
It’s a car racing movie with drama, so we had to hit you with adrenalin and then hold you with what’s a fairly procedural and process-oriented film about these guys scaling the corporate wall to get this car built and on the track. Most of that’s dramatic scenes. The flashiest editing is the races, which was a huge, year-long effort. Mike was cutting the previz before we shot a foot, and initially we just had car footage, without the actors, so that was a challenge. It all transformed once we added the actors.

Can you talk about working on the visual effects with Method’s VFX supervisor Olivier Dumont?
He did an incredible job; no one realizes just how many shots there are. They’re really invisible, and that’s what I love — the film feels 100% analog, but of course it isn’t. It’s impossible to build giant race tracks as they were in the ’60s. But having real foregrounds really helped. We had very few scenes where actors were wandering around in a green void, like on so many movies now. So you’re always anchored in the real world, and then all the set extensions were in softer focus or backlit.

This film really lends itself to sound.
Absolutely, as every car has its own signature sound, and we cut rapidly from interiors to exteriors, from cars to pits and so on. The shifts in aural perspective are exciting, but we also tried to keep it simple and not lose the dramatic identity of the story. We even removed sounds in the mix if they weren’t important, so we could focus on what was.

Where did you do the DI, and how important is it to you?
At Efilm with Skip Kimball, working on Blackmagic DaVinci Resolve. The DI was huge on this film: dealing with the 24-hour race, the changing light, rain and night scenes, and having to match five different locations was a nightmare. So we worked on all of that, and on the overall look, from early on in the edit.

What’s next?
Don’t know. I’ve got two projects I’m working on. We’ll see.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Talking with Franki Ashiruka of Nairobi’s Africa Post Office

By Randi Altman

After two decades of editing award-winning film and television projects for media companies throughout Kenya and around the world, Franki Ashiruka opened Africa Post Office, a standalone post house in Nairobi, Kenya. The studio provides color grading, animation, visual effects, motion graphics, compositing and more. In addition, it maintains a database of the Kenyan post production community that allows it to ramp up with the right artists when the need arises.

Here she talks about the company, its workflow and being a pioneer in Nairobi’s production industry.

When did you open Africa Post Office, and what was your background prior to starting this studio?
Africa Post Office (APO) opened its doors in February 2017. Prior to starting APO, I was a freelance editor with plenty of experience working with well-established media houses such as Channel 4 (UK), Fox International Channels (UK), 3D Global Leadership (Nigeria), PBS (USA), Touchdown (New Zealand), Greenstone Pictures (New Zealand) and Shadow Films (South Africa).

In terms of Kenya-based projects, I’ve worked with a number of production houses including Quite Bright Films, Fat Rain Films, Film Crew in Africa, Mojo Productions, Multichoice, Zuku, Content House and Ginger Ink Films.

I imagine female-run, independent studios in Africa are rare?
On the contrary, Kenya has reached a point where more and more women are emerging as leaders of their own companies. I actually think there are more women-led film production companies than male-led. The real challenge was that before APO, there was nothing quite like it in Nairobi. Historically, video production here was very vertical — if you shot something, you’d need to also manage post within whatever production house you were working in. There were no standalone post houses until us. That said, with my experience, even though hugely daunting, I never thought twice about starting APO. It is what I have always wanted to do, and if being the first company of our kind didn’t intimidate me, being female was never going to be a hindrance.

L-R: Franki Ashiruka, Kevin Kyalo, Carole Kinyua and Evans Wenani

What is the production and post industry like in Nairobi? 
When APO first opened, the workload was commercial-heavy, but in the last two years that has steadily declined. We’re seeing this gap filled by documentary films, corporate work and television series. Feature films are also slowly gaining traction and becoming the focus of many up-and-coming filmmakers.

What services do you provide, and what types of projects do you work on?
APO has a proven track record of successful delivery on hundreds of film and video projects for a diverse range of clients and collaborators, including major corporate entities, NGOs, advertising and PR agencies, and television stations. We also have plenty of experience mastering according to international delivery standards. We’re proud to house a complete end-to-end post ecosystem of offline and online editing suites.

Most importantly, we maintain a very thorough database of the post production community in Kenya.
This is of great benefit to our clients who come to us for a range of services including color grading, animation, visual effects, motion graphics and compositing. We are always excited to collaborate with the right people and get additional perspectives on the job at hand. One of our most notable collaborators is Ikweta Arts (Avatar, Black Panther, Game of Thrones, Hacksaw Ridge), owned and run by Yvonne Muinde. They specialize in providing VFX services with a focus on quality matte painting/digital environments, art direction, concept and post visual development art. We also collaborate with Keyframe (L’Oréal, BMW and Mitsubishi Malaysia) for motion graphics and animation.

Can you name some recent projects and the work you provided?
We are incredibly fortunate to be able to select projects that align with our beliefs and passions.

Our work on the short film Poacher (directed by Tom Whitworth) won us three global Best Editing Awards from the Short to the Point Online Film Festival (Romania, 2018), Feel the Reel International Film Festival (Glasgow, 2018) and Five Continents International Film Festival (Venezuela, 2019).

Other notable work includes three feature documentaries for the Big Story segment on China Global Television Network, directed by Juan Reina (director of the Netflix Original film Diving Into the Unknown); Lion’s Den (Quite Bright Films), an adaptation of ABC’s Shark Tank; and The Great Kenyan Bake Off (Showstopper Media), adapted from the BBC series The Great British Bake Off. We also worked on Disconnect, a feature film produced by Kenya’s Tosh Gitonga (Nairobi Half Life), a director who is passionate about taking Africa’s budding film industry to the next level. And we have worked on a host of television commercials for clients across East Africa, including Kenya, Rwanda, South Sudan and Uganda.

What APO is most proud of, though, is our clients’ ambition and determination to contribute toward the growth of the African film industry. This truly resonates with APO’s mantra.

You recently added a MAM and some other gear. Can you talk about the need to upgrade?
Bringing on the EditShare EFS 200 nodes has significantly improved the collaborative possibilities of APO. We reached a point where we were quickly growing, and the old approach just wasn’t going to cut it.

Prior to centralizing our content, projects lived on individual hard disks. This meant that if I was editing and needed my assistant to find me a scene or a clip, or I needed VFX on something, I would have to export individual clips to different workstations. This created workflow redundancies and increased potential for versioning issues, which is something we couldn’t afford to be weighed down with.

The remote capabilities of the EditShare system were very appealing as well. Our color grading collaborator, Nic Apostoli of Comfort and Fame, is based in Cape Town, South Africa. From there, he can access the footage on the server and grade it while the client reviews with us in Nairobi. Flow media asset management also helps in this regard. We’re able to effectively organize and index clips, graphics, versions, etc. into clearly marked folders so there is no confusion about what media should be used. Collaboration among the team members is now seamless regardless of their physical location or tools used, which include the Adobe Creative Suite, Foundry Nuke, Autodesk Maya and Maxon Cinema 4D.

Any advice for others looking to break out on their own and start a post house?
Know what you want to do, and just do it! Thanks Nike …


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Directing bookend sequences for Portals, a horror anthology film

By Hasraf “HaZ” Dulull

Portals is a genre-bending feature film anthology focusing on a series of worldwide blackouts — after which millions of mysterious objects appear everywhere across the planet. While many flee from the sentient objects, some people are drawn toward and into them with horrifying consequences.

Portals

The film was in the final stages of post when writer/director Liam O’Donnell (Beyond Skyline and the upcoming Skylines film) called to see if I would like to get involved and direct some bookend sequences to add more scope and setup, which the producers felt was very much needed. I loved the premise and the world of the anthology, so I said yes. I pitched an idea for an ending that quickly evolved into an extra segment at the end of the film, which I directed. That’s why there are officially four directors on the show, with me getting executive producer and “end-segment created by” credits.

Two of the other sequences are around 20 to 25 minutes each, and O’Donnell’s sequence was around 35 minutes. The film is 85 minutes long. Eduardo Sanchez and Gregg Hale (The Blair Witch Project) co-directed their segments. So the anthology feature is really three long segments plus my bookend sequences. The only connections among the stories are the objects that appear, the event itself and the actual “portal”; everything else was unique to each segment’s story. In terms of production, the only consistencies throughout the anthology were the camera language — that slight handheld feel — and, of course, the music and sound.

I had to watch the latest cut of the entire anthology film to get my head into that world, but I was given freedom to bring my own style to my sequences. That is exactly the point of an anthology — for each director to bring his or her own sensibilities to the individual segments. Besides Liam, the main producers I worked closely with on this project were Alyssa Devine and Griffin Devine from Pigrat Productions. They are fans of my first feature film, The Beyond, so they really encouraged the grounded tone I had demonstrated in that film.

The portal in Portals.

I’ve been a huge advocate of Blackmagic cameras and technology for a long time. Additionally, I knew I had a lot to shoot in a very short time span (two days!), so I needed a camera that was light and flexible yet able to shoot 4K. I brought on cinematographer Colin Emerson, who shoots in a very loose way but always makes his stuff look cinematic. We watched the cut of the film and noticed the consistently loose nature of the cinematography on all the segments. Colin uses the Fig Rig a lot, and I love the way that rig works; the BMD Pocket Cinema 4K fits nicely on it, along with the DSLR lenses he likes to use. The other reason was to be able to use Blackmagic’s new BRaw format.

We also shot the segment using a skeleton crew, which comprised me as director/producer; VFX supervisor/1st AD John Sellings, who also did some focus pulling; James De Taranto (sound recording); DP/camera op Colin Emerson; FX makeup artists Kate Griffith and Jay James; and our two actors, Georgina Blackledge and Dare Emmanuel. I worked with both of them on my feature film The Beyond.

The Post
One thing I wanted to make sure of was that the post team at The Institution in LA could take my Resolve files and literally work from them for the picture post. One of the things I did during prep (before we even cast) was shoot some tests to show what I had in mind in terms of look and feel. We also tested the BRaw and color workflow between my setup in London and the LA team; Colin and I did this during the location recce. This proved extremely useful in ensuring we set our camera to the exact specs the post house wanted. So we shot at 23.98fps in 4K (4096×1716), cropped to 2.39:1, in Blackmagic Design log color space.

HaZ’s segments were captured with the Blackmagic Pocket Cinema Camera.

During the test, I did some quick color passes to show the producers in LA the tone and mood I was going for and to make sure everyone was on board before I shot it. The look was very post-apocalyptic, as the story is set after the main events have happened. I wanted the locations to contrast with each other: one interior and one exterior with greens.

Colin is used to shooting most of his stuff on the Panasonic GH, but he had the Pocket Cinema Camera and was looking for the right project to use it on. He found he could use all of his usual lenses because the Pocket Cinema Camera has the same mount. Lenses used were the Sigma 18-35mm f/1.8 with a Metabones Speed Booster, the Olympus 12mm f/2 and the Lumix 35-100mm f/2.8.

Colin used the onboard monitor screen on the Pocket Cinema Camera, while I used a tethered external monitor — the Ikan DH5e — for directing. We used a 1TB Samsung external SSD securely attached to the rig cage along with a 64GB CFast card. The resolution we shot in was determined by the tests we did. We set up the rushes for post after each of the two days of the shoot, so during the day we would swap out drives and back things up. At the end of the day, we would bring in all the picture and sound rushes and use the amazing autosync feature in Blackmagic DaVinci Resolve to set it all up. This way, when I headed back home I could start editing right away inside Resolve.
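For readers who haven’t used it, autosync works by matching the timecode ranges of the picture and sound rushes. Below is a minimal, self-contained sketch of that underlying idea; it is an illustration only, not Resolve’s actual implementation, and the clip names, timecodes and integer frame rate are hypothetical.

```python
# Sketch of timecode-based audio/video matching -- the idea behind an
# autosync pass, not Resolve's implementation. All data is hypothetical.

FPS = 24  # a real show at 23.98 needs fractional-rate handling

def tc_to_frames(tc: str) -> int:
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

def overlaps(a0: int, a1: int, b0: int, b1: int) -> bool:
    """True if two frame ranges share at least one frame."""
    return a0 < b1 and b0 < a1

video_rushes = [
    {"name": "A001_C001", "start": "10:12:00:00", "end": "10:12:45:12"},
    {"name": "A001_C002", "start": "10:15:30:00", "end": "10:16:10:00"},
]
audio_rushes = [
    {"name": "SND_0001", "start": "10:11:58:00", "end": "10:12:50:00"},
    {"name": "SND_0002", "start": "10:15:29:00", "end": "10:16:12:00"},
]

# Pair every video clip with any audio clip whose timecode range overlaps it.
for v in video_rushes:
    v0, v1 = tc_to_frames(v["start"]), tc_to_frames(v["end"])
    for a in audio_rushes:
        a0, a1 = tc_to_frames(a["start"]), tc_to_frames(a["end"])
        if overlaps(v0, v1, a0, a1):
            offset = a0 - v0  # where the audio starts relative to the picture
            print(f'{v["name"]} <- {a["name"]} (audio offset {offset} frames)')
```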

Resolve

I have to admit, we were hesitant at first because I was shooting and capturing log in QuickTime ProRes 4:4:4:4, and I had always avoided DNG raw because of the huge file sizes and data transfer. But the team at Blackmagic has always been so supportive, and they provided us with support right up until the end of the shoot, so after testing BRaw I was impressed. We had so much control, as all that information is accessible within Resolve. I was able to set the temp look during editing, and the colorist worked from there. Skin tones were of utmost importance; because of the intimate nature of the drama, I wanted a natural look to the skin tones. I am really happy with the way they came out in the end.

The LA team couldn’t believe how cinematic the footage was when we told them we shot using the Pocket Cinema Camera, since the other segments were shot on cameras like Red. We delivered to the same 4K spec as the other segments in the film.

HaZ on set, second from right.

I used the AMD Radeon RX Vega 56 version of the Blackmagic eGPU, because I wanted to edit on my MacBook Pro (late 2017) and needed the power to run 4K in realtime. I was impressed with how much power it provided; it was like having a new MacBook Pro without having to buy one. The eGPU also gave me all the connectivity I needed (two Thunderbolt and four USB 3 ports), which is where the MacBook Pro on its own is limited.

The beauty of keeping everything native was that there wasn’t much work to do when porting; it’s just plug and play, and Resolve detects the eGPU, which you can then set as the default. The BRaw format makes it all manageable to preview and play back in real time. Also, since it’s native, Resolve doesn’t need to do any transcoding in the background. I have always been a huge fan of the tracking in Resolve, and I was able to do eye effects very easily without them being budgeted or done as VFX shots. I was able to get the VFX render assets from the visual effects artist (Justin Martinez) in LA and do quick slap comps during editing. I also love that I can set looks and store them as memories, which I can then recall very quickly to apply to a bunch of shots. This allowed me to have a slick-looking preview rough cut of the film.

Portals

I sent a hard drive containing all the organized rushes to the team in LA while I was doing the final tweaks to the edit. Once the edit was signed off, or if any last-minute notes came in, I would address them and email the team my updated Resolve file. It was super simple, and the colorist (Oliver Ojeil) and post team (Chad Van Horn and Danny Barone) in LA appreciated the straightforward workflow: there really wasn’t any conforming for them to do apart from a one-click relink of the media location. They would just take my Resolve file and start working away with it.

We used practical effects to keep the horror as real and grounded as possible, and used VFX to augment them further. We were fortunate to get special effects makeup artist Kate Griffiths. Even given the tight schedule, she was able to create a terrifying effect, which I won’t give away. You need to watch the film to see it! We had to shoot those makeup-FX-heavy shots at the end of the day, which meant we had to be smart about how we scheduled the shoot given the hours-long makeup process. Kate was also on hand to provide effects like the liquid coming out of the eyes, the sweat and so on — every detail of which the camera picked up for us so we could bring it out in the grade.

The Skype-style shots at the start of the film (phone and computer monitor shots) had their VFX screen elements placed as a separate layer so the post team in LA could grade them separately and control the filters applied on them. For some of the wide shots showing our characters entering and leaving the portal, we keyframed some movement of the 4K shot along with motion blur to give the effect of in-camera movement. I also used the camera shake within Resolve, which comes with so many options to create bespoke movement on static frames.

Portals is now available on iTunes and other VOD platforms.


HaZ Dulull is known for his sci-fi feature films The Beyond and 2036 Origin Unknown, as well as for his television work directing the pilot and episodes of Disney’s Fast Layne. He is currently busy on projects at various stages of development and production at his production company, hazfilm.com.

Conductor Companion app targets VFX boutiques and freelancers

Conductor Technologies has introduced Conductor Companion, a desktop app designed to simplify the use of the cloud-based rendering service. Tailored for boutique studios and freelance artists, Companion streamlines the Conductor on-ramp and rendering experience, allowing users to easily manage and download files, write commands and handle custom submissions or plug-ins from their laptops or workstations. Along with this release, Conductor has added initial support for Blender creative software.

“Conductor was originally designed to meet the needs of larger VFX studios, focusing our efforts on maximizing efficiency and scalability when many artists simultaneously leverage the platform and optimizing how Conductor hooks into those pipelines,” explains CEO Mac Moore. “As Conductor’s user base has grown, we’ve been blown away by the number of freelance artists and small studios that have come to us for help, each of which has their own unique needs. Conductor Companion is a nod to that community, bringing all the functionality and massive render resource scale of Conductor into a user-friendly app, so that artists can focus on content creation versus pipeline management. And given that focus, it was a no-brainer to add Blender support, and we are eager to serve the passionate users of that product.”

Moore reports that this app will be the foundation of Conductor’s Intelligence Hub in the near future, “acting as a gateway to more advanced functionality like Shot Analytics and Intelligent Bid Assist. These features will leverage AI and Conductor’s cloud knowledge to help owners and freelancers make more informed business decisions as it pertains to project-to-project rendering financials.”

Conductor Companion is currently in public beta. You can download the app here.

In addition to Blender, applications currently supported by Conductor include Autodesk Maya and Arnold; Foundry’s Nuke, Cara VR, Katana, Modo and Ocula; Chaos Group’s V-Ray; Pixar’s RenderMan; Isotropix’s Clarisse; Golaem; Ephere’s Ornatrix; Yeti; and Miarmy.

The Mill opens boutique studio in Berlin

Technicolor’s The Mill has officially launched in Berlin. This new boutique studio is located in the heart of Berlin, situated in the creative hub of Mitte, near many of Germany’s agencies, production companies and brands.

The Mill has been working with German clients for years. Recent projects include Mercedes’ Bertha Benz spot with director Sebastian Strasser; Netto’s The Easter Surprise, directed in-house by The Mill; and BMW’s The 8 with director Daniel Wolfe. The new studio will bring The Mill’s full range of creative services, from color to experiential and interactive, as well as visual effects and design.

The Mill Berlin crew

Creative director Greg Spencer will lead the creative team. He is a multi-award-winning creative, having won several VES, Cannes Lions and British Arrow awards. His recent projects include Carlsberg’s The Lake, PlayStation’s This Could Be You and Eve’s Cuddly Toy. Spencer also played a role in some of Mill Film’s major titles: he was the 2D supervisor for Les Misérables and also worked on the Lord of the Rings trilogy. His resume also includes campaigns for brands such as Nike and Samsung.

Executive producer Justin Stiebel moves from The Mill London, where he has been since early 2014, to manage client relationships and new business. Since joining the company, Stiebel has produced spots such as Audi’s Next Level and Mini’s “The Faith of a Few” campaign. He has also collaborated with directors such as Sebastian Strasser, Markus Walter and Daniel Wolfe while working on brands like Mercedes, Audi and BMW.

Sean Costelloe is managing director of The Mill London and The Mill Berlin.

Main Image Caption: (L-R) Justin Stiebel and Greg Spencer

Directing Olly’s ‘Happy Inside Out’ campaign

How do you express how vitamins make you feel? Well, production company 1stAveMachine partnered with independent creative agency Yard NYC to develop the stylized “Happy Inside Out” campaign for Olly multivitamin gummies to show just that.

Beauty

The directing duo of Erika Zorzi and Matteo Sangalli, known as Mathery, highlighted the brand’s products and benefits by using rich textures, colors and lighting. They shot on an ARRI Alexa Mini. “Our vision was to tell a cohesive narrative, where each story of the supplements spoke the same visual language,” Mathery explains. “We created worlds where everything is possible and sometimes took each product’s concept to the extreme and other times added some romance to it.”

Each spot imagines a different benefit of taking Olly products. The side-scrolling Energy, which features a green palette, shows a woman jumping and doing flips through life’s everyday challenges, from her home to work, doing laundry and going to the movies. Beauty, with its pink color palette, features another woman “feeling beautiful” while turning the heads of a parliament of owls. Meanwhile, Stress, with its purple/blue palette, features a woman tied up in a giant ball of yarn; as she unspools herself, the things that were tying her up spin away. In the purple-shaded Sleep, a woman lies in bed pulling off layer after layer of sleep masks until she just happily sleeps.

Sleep

The spots were shot with minimal VFX, other than a few greenscreen moments, and the team found itself making decisions on the fly, constantly managing logistics for stunt choreography, animal performances and wardrobe. Jogger Studios provided the VFX using Autodesk Flame for conform, cleanup and composite work. Adobe After Effects was used for all of the end tag animation. Cut+Run edited the campaign.

According to Mathery, “The acrobatic moves and obstacle pieces in the Energy spot were rehearsed on the same day of the shoot. We had to be mindful because the action was physically demanding on the talent. With the Beauty spot, we didn’t have time to prepare with the owls. We had no idea if they would move their heads on command or try to escape and fly around the whole time. For the Stress spot, we experimented with various costume designs and materials until we reached a look that humorously captured the concept.”

The campaign marks Mathery’s second collaboration with Yard NYC and Olly, who brought the directing team into the fold very early on, during the initial stages of the project. This familiarity gave everyone plenty of time to let the ideas breathe.

VES Awards: The Lion King and Alita earn five noms each

The Visual Effects Society (VES) has announced its nominees for the 18th Annual VES Awards, which recognize outstanding visual effects artistry and innovation in film, animation, television, commercials and video games, as well as the VFX supervisors, VFX producers and hands-on artists who bring this work to life. Alita: Battle Angel and The Lion King lead the features with five nominations each; Toy Story 4 is the top animated film contender with five nominations; and Game of Thrones and The Mandalorian tie to lead the broadcast field with six nominations each.

Nominees in 25 categories were selected by VES members via events hosted by 11 VES sections, including Australia, the Bay Area, Germany, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington.

The VES Awards will be held on January 29 at the Beverly Hilton Hotel. The VES Lifetime Achievement Award will be presented to Academy, DGA and Emmy Award-winning director-producer-screenwriter Martin Scorsese. The VES Visionary Award will be presented to director-producer-screenwriter Roland Emmerich, and the VES Award for Creative Excellence will be given to visual effects supervisor Sheena Duggal. Award-winning actor-comedian-author Patton Oswalt will once again host the event.

The nominees for the 18th Annual VES Awards in 25 categories are:

 

Outstanding Visual Effects in a Photoreal Feature

 

ALITA: BATTLE ANGEL

Richard Hollander

Kevin Sherwood

Eric Saindon

Richard Baneham

Bob Trevino

 

AVENGERS: ENDGAME

Daniel DeLeeuw

Jen Underdahl

Russell Earl

Matt Aitken

Daniel Sudick

 

GEMINI MAN

Bill Westenhofer

Karen Murphy-Mundell

Guy Williams

Sheldon Stopsack

Mark Hawker

 

STAR WARS: THE RISE OF SKYWALKER

Roger Guyett

Stacy Bissell

Patrick Tubach

Neal Scanlan

Dominic Tuohy

 

THE LION KING

Robert Legato

Tom Peitzman

Adam Valdez

Andrew R. Jones

 

Outstanding Supporting Visual Effects in a Photoreal Feature

 

1917

Guillaume Rocheron

Sona Pak

Greg Butler

Vijay Selvam

Dominic Tuohy

 

FORD V FERRARI

Olivier Dumont

Kathy Siegel

Dave Morley

Malte Sarnes

Mark Byers

 

JOKER

Edwin Rivera

Brice Parker

Mathew Giampa

Bryan Godwin

Jeff Brink

 

THE AERONAUTS

Louis Morin

Annie Godin

Christian Kaestner

Ara Khanikian

Mike Dawson

 

THE IRISHMAN

Pablo Helman

Mitch Ferm

Jill Brooks

Leandro Estebecorena

Jeff Brink

 

Outstanding Visual Effects in an Animated Feature

 

FROZEN 2

Steve Goldberg

Peter Del Vecho

Mark Hammel

Michael Giaimo

 

KLAUS

Sergio Pablos

Matthew Teevan

Marcin Jakubowski

Szymon Biernacki

 

MISSING LINK

Brad Schiff

Travis Knight

Steve Emerson

Benoit Dubuc

 

THE LEGO MOVIE 2

David Burgess

Tim Smith

Mark Theriault

John Rix

 

TOY STORY 4

Josh Cooley

Mark Nielsen

Bob Moyer

Gary Bruins

 

Outstanding Visual Effects in a Photoreal Episode

 

GAME OF THRONES; The Bells

Joe Bauer

Steve Kullback

Ted Rae

Mohsen Mousavi

Sam Conway

 

HIS DARK MATERIALS; The Fight to the Death

Russell Dodgson

James Whitlam

Shawn Hillier

Robert Harrington

 

LADY AND THE TRAMP

Robert Weaver

Christopher Raimo

Arslan Elver

Michael Cozens

Bruno Van Zeebroeck

 

LOST IN SPACE; Ninety-Seven

Jabbar Raisani

Terron Pratt

Niklas Jacobson

Juri Stanossek

Paul Benjamin

 

STRANGER THINGS; Chapter Six: E Pluribus Unum

Paul Graff

Tom Ford

Michael Maher Jr.

Martin Pelletier

Andy Sowers

 

THE MANDALORIAN; The Child

Richard Bluff

Abbigail Keller

Jason Porter

Hayden Jones

Roy Cancinon

 

Outstanding Supporting Visual Effects in a Photoreal Episode

 

CHERNOBYL; 1:23:45

Max Dennison

Lindsay McFarlane

Clare Cheetham

Paul Jones

Claudius Christian Rauch

 

LIVING WITH YOURSELF; Nice Knowing You

Jay Worth

Jacqueline VandenBussche

Chris Wright

Tristan Zerafa

 

SEE; Godflame

Adrian de Wet

Eve Fizzinoglia

Matthew Welford

Pedro Sabrosa

Tom Blacklock

 

THE CROWN; Aberfan

Ben Turner

Reece Ewing

David Fleet

Jonathan Wood

 

VIKINGS; What Happens in the Cave

Dominic Remane

Mike Borrett

Ovidiu Cinazan

Tom Morrison

Paul Byrne

 

Outstanding Visual Effects in a Real-Time Project

 

Call of Duty Modern Warfare

Charles Chabert

Chris Parise

Attila Zalanyi

Patrick Hagar

 

Control

Janne Pulkkinen

Elmeri Raitanen

Matti Hämäläinen

James Tottman

 

Gears 5

Aryan Hanbeck

Laura Kippax

Greg Mitchell

Stu Maxwell

 

Myth: A Frozen Tale

Jeff Gipson

Nicholas Russell

Brittney Lee

Jose Luis Gomez Diaz

 

Vader Immortal: Episode I

Ben Snow

Mike Doran

Aaron McBride

Steve Henricks

 

Outstanding Visual Effects in a Commercial

 

Anthem Conviction

Viktor Muller

Lenka Likarova

Chris Harvey

Petr Marek

 

BMW Legend

Michael Gregory

Christian Downes

Tim Kafka

Toya Drechsler

 

Hennessy: The Seven Worlds

Carsten Keller

Selcuk Ergen

Kiril Mirkov

William Laban

 

PlayStation: Feel the Power of Pro

Sam Driscoll

Clare Melia

Gary Driver

Stefan Susemihl

 

Purdey’s: Hummingbird

Jules Janaud

Emma Cook

Matthew Thomas

Philip Child

 

Outstanding Visual Effects in a Special Venue Project

 

Avengers: Damage Control

Michael Koperwas

Shereif Fattouh

Ian Bowie

Kishore Vijay

Curtis Hickman

 

Jurassic World: The Ride

Hayden Landis

Friend Wells

Heath Kraynak

Ellen Coss

 

Millennium Falcon: Smugglers Run

Asa Kalama

Rob Huebner

Khatsho Orfali

Susan Greenhow

 

Star Wars: Rise of the Resistance

Jason Bayever

Patrick Kearney

Carol Norton

Bill George

 

Universal Sphere

James Healy

Morgan MacCuish

Ben West

Charlie Bayliss

 

Outstanding Animated Character in a Photoreal Feature

 

ALITA: BATTLE ANGEL; Alita

Michael Cozens

Mark Haenga

Olivier Lesaint

Dejan Momcilovic

 

AVENGERS: ENDGAME; Smart Hulk

Kevin Martel

Ebrahim Jahromi

Sven Jensen

Robert Allman

 

GEMINI MAN; Junior

Paul Story

Stuart Adcock

Emiliano Padovani

Marco Revelant

 

THE LION KING; Scar

Gabriel Arnold

James Hood

Julia Friedl

Daniel Fotheringham

 

Outstanding Animated Character in an Animated Feature

 

FROZEN 2; The Water Nøkk

Svetla Radivoeva

Marc Bryant

Richard E. Lehmann

Cameron Black

 

KLAUS; Jesper

Yoshimishi Tamura

Alfredo Cassano

Maxime Delalande

Jason Schwartzman

 

MISSING LINK; Susan

Rachelle Lambden

Brenda Baumgarten

Morgan Hay

Benoit Dubuc

 

TOY STORY 4; Bo Peep

Radford Hurn

Tanja Krampfert

George Nguyen

Becki Rocha Tower

 

Outstanding Animated Character in an Episode or Real-Time Project

 

LADY AND THE TRAMP; Tramp

Thiago Martins

Arslan Elver

Stanislas Paillereau

Martine Chartrand

 

STRANGER THINGS 3; Tom/Bruce Monster

Joseph Dubé-Arsenault

Antoine Barthod

Frederick Gagnon

Xavier Lafarge

 

THE MANDALORIAN; The Child; Mudhorn

Terry Bannon

Rudy Massar

Hugo Leygnac

 

THE UMBRELLA ACADEMY; Pilot; Pogo

Aidan Martin

Craig Young

Olivier Beierlein

Laurent Herveic

 

Outstanding Animated Character in a Commercial

 

Apex Legends; Meltdown; Mirage

Chris Bayol

John Fielding

Derrick Sesson

Nole Murphy

 

Churchill; Churchie

Martino Madeddu

Philippe Moine

Clement Granjon

Jon Wood

 

Cyberpunk 2077; Dex

Jonas Ekman

Jonas Skoog

Marek Madej

Grzegorz Chojnacki

 

John Lewis; Excitable Edgar; Edgar

Tim van Hussen

Diarmid Harrison-Murray

Amir Bazzazi

Michael Diprose

 

 

Outstanding Created Environment in a Photoreal Feature

 

ALADDIN; Agrabah

Daniel Schmid

Falk Boje

Stanislaw Marek

Kevin George

 

ALITA: BATTLE ANGEL; Iron City

John Stevenson-Galvin

Ryan Arcus

Mathias Larserud

Mark Tait

 

MOTHERLESS BROOKLYN; Penn Station

John Bair

Vance Miller

Sebastian Romero

Steve Sullivan

 

STAR WARS: THE RISE OF SKYWALKER; Pasaana Desert

Daniele Bigi

Steve Hardy

John Seru

Steven Denyer

 

THE LION KING; The Pridelands

Marco Rolandi

Luca Bonatti

Jules Bodenstein

Filippo Preti

 

 

Outstanding Created Environment in an Animated Feature

 

FROZEN 2; Giants’ Gorge

Samy Segura

Jay V. Jackson

Justin Cram

Scott Townsend

 

HOW TO TRAIN YOUR DRAGON: THE HIDDEN WORLD; The Hidden World

Chris Grun

Ronnie Cleland

Ariel Chisholm

Philippe Brochu

 

MISSING LINK; Passage to India Jungle

Oliver Jones

Phil Brotherton

Nick Mariana

Ralph Procida

 

TOY STORY 4; Antiques Mall

Hosuk Chang

Andrew Finley

Alison Leaf

Philip Shoebottom

 

 

Outstanding Created Environment in an Episode, Commercial, or Real-Time Project

 

GAME OF THRONES; The Iron Throne; Red Keep Plaza

Carlos Patrick DeLeon

Alonso Bocanegra Martinez

Marcela Silva

Benjamin Ross

 

LOST IN SPACE; Precipice; The Trench

Philip Engström

Benjamin Bernon

Martin Bergquist

Xuan Prada

 

THE DARK CRYSTAL: AGE OF RESISTANCE; The Endless Forest

Sulé Bryan

Charles Chorein

Christian Waite

Martyn Hawkins

 

THE MANDALORIAN; Nevarro Town

Alex Murtaza

Yanick Gaudreau

Marco Tremblay

Maryse Bouchard

 

Outstanding Virtual Cinematography in a CG Project

 

ALITA: BATTLE ANGEL

Emile Ghorayeb

Simon Jung

Nick Epstein

Mike Perry

 

THE LION KING

Robert Legato

Caleb Deschanel

Ben Grossmann

AJ Sciutto

 

THE MANDALORIAN; The Prisoner; The Roost

Richard Bluff

Jason Porter

Landis Fields IV

Baz Idoine

 

 

TOY STORY 4

Jean-Claude Kalache

Patrick Lin

 

Outstanding Model in a Photoreal or Animated Project

 

LOST IN SPACE; The Resolute

Xuan Prada

Jason Martin

Jonathan Vårdstedt

Eric Andersson

 

MISSING LINK; The Manchuria

Todd Alan Harvey

Dan Casey

Katy Hughes

 

THE MAN IN THE HIGH CASTLE; Rocket Train

Neil Taylor

Casi Blume

Ben McDougal

Chris Kuhn

 

THE MANDALORIAN; The Sin; The Razorcrest

Doug Chiang

Jay Machado

John Goodson

Landis Fields IV

 

Outstanding Effects Simulations in a Photoreal Feature

 

DUMBO; Bubble Elephants

Sam Hancock

Victor Glushchenko

Andrew Savchenko

Arthur Moody

 

SPIDER-MAN: FAR FROM HOME; Molten Man

Adam Gailey

Jacob Santamaria

Jacob Clark

Stephanie Molk

 

STAR WARS: THE RISE OF SKYWALKER

Don Wong

Thibault Gauriau

Goncalo Cababca

Francois-Maxence Desplanques

 

THE LION KING

David Schneider

Samantha Hiscock

Andy Feery

Kostas Strevlos

 

Outstanding Effects Simulations in an Animated Feature

 

ABOMINABLE

Alex Timchenko

Domin Lee

Michael Losure

Eric Warren

 

FROZEN 2

Erin V. Ramos

Scott Townsend

Thomas Wickes

Rattanin Sirinaruemarn

 

HOW TO TRAIN YOUR DRAGON: THE HIDDEN WORLD; Water and Waterfalls

Derek Cheung

Baptiste Van Opstal

Youxi Woo

Jason Mayer

 

TOY STORY 4

Alexis Angelidis

Amit Baadkar

Lyon Liew

Michael Lorenzen

 

Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project

 

GAME OF THRONES; The Bells

Marcel Kern

Paul Fuller

Ryo Sakaguchi

Thomas Hartmann

 

Hennessy: The Seven Worlds

Selcuk Ergen

Radu Ciubotariu

Andreu Lucio

Vincent Ullmann

 

LOST IN SPACE; Precipice; Water Planet

Juri Bryan

Hugo Medda

Kristian Olsson

John Perrigo

 

STRANGER THINGS 3; Melting Tom/Bruce

Nathan Arbuckle

Christian Gaumond

James Dong

Aleksandr Starkov

 

THE MANDALORIAN; The Child; Mudhorn

Xavier Martin Ramirez

Ian Baxter

Fabio Siino

Andrea Rosa

 

Outstanding Compositing in a Feature

 

ALITA: BATTLE ANGEL

Adam Bradley

Carlo Scaduto

Hirofumi Takeda

Ben Roberts

 

AVENGERS: ENDGAME

Tim Walker

Blake Winder

Tobias Wiesner

Joerg Bruemmer

 

CAPTAIN MARVEL; Young Nick Fury

Trent Claus

David Moreno Hernandez

Jeremiah Sweeney

Yuki Uehara

 

STAR WARS: THE RISE OF SKYWALKER

Jeff Sutherland

John Galloway

Sam Bassett

Charles Lai

 

THE IRISHMAN

Nelson Sepulveda

Vincent Papaix

Benjamin O’Brien

Christopher Doerhoff

 

 

Outstanding Compositing in an Episode

 

GAME OF THRONES; The Bells

Sean Heuston

Scott Joseph

James Elster

Corinne Teo

 

GAME OF THRONES; The Long Night; Dragon Ground Battle

Mark Richardson

Darren Christie

Nathan Abbott

Owen Longstaff

 

STRANGER THINGS 3; Starcourt Mall Battle

Simon Lehembre

Andrew Kowbell

Karim El-Masry

Miklos Mesterhazy

 

WATCHMEN; Pilot; Looking Glass

Nathaniel Larouche

Iyi Tubi

Perunika Yorgova

Mitchell Beaton

 

Outstanding Compositing in a Commercial

 

BMW Legend

Toya Drechsler

Vivek Tekale

Guillaume Weiss

Alexander Kulikov

 

Feeding America; I Am Hunger in America

Dan Giraldo

Marcelo Pasqualino

Alexander Koester

 

Hennessy: The Seven Worlds

Rod Norman

Guillaume Weiss

Alexander Kulikov

Alessandro Granella

 

PlayStation: Feel the Power of Pro

Gary Driver

Stefan Susemihl

Greg Spencer

Theajo Dharan

 

Outstanding Special (Practical) Effects in a Photoreal or Animated Project

 

ALADDIN; Magic Carpet

Mark Holt

Jay Mallet

Will Wyatt

Dickon Mitchell

 

GAME OF THRONES; The Bells

Sam Conway

Terry Palmer

Laurence Harvey

Alastair Vardy

 

TERMINATOR: DARK FATE

Neil Corbould

David Brighton

Ray Ferguson

Keith Dawson

 

THE DARK CRYSTAL: AGE OF RESISTANCE; She Knows All the Secrets

Sean Mathiesen

Jon Savage

Toby Froud

Phil Harvey

 

Outstanding Visual Effects in a Student Project

 

DOWNFALL

Matias Heker

Stephen Moroz

Bradley Cocksedge

 

LOVE AND FIFTY MEGATONS

Denis Krez

Josephine Roß

Paulo Scatena

Lukas Löffler

 

OEIL POUR OEIL

Alan Guimont

Thomas Boileau

Malcom Hunt

Robin Courtoise

 

THE BEAUTY

Marc Angele

Aleksandra Todorovic

Pascal Schelbli

Noel Winzen

 

 

Recreating the Vatican and Sistine Chapel for Netflix’s The Two Popes

The Two Popes, directed by Fernando Meirelles, stars Anthony Hopkins as Pope Benedict XVI and Jonathan Pryce as current pontiff Pope Francis in a story about one of the most dramatic transitions of power in the Catholic Church’s history. The film follows a frustrated Cardinal Bergoglio (the future Pope Francis) who in 2012 requests permission from Pope Benedict to retire because of his issues with the direction of the church. Instead, facing scandal and self-doubt, the introspective Benedict summons his harshest critic and future successor to Rome to reveal a secret that would shake the foundations of the Catholic Church.

London’s Union was approached in May 2017 and supervised visual effects on location in Argentina and Italy over several months. A large proportion of the film takes place within the walls of Vatican City, but the Vatican was not involved in the production, and the team had limited or no access to some of the key locations.

Under the direction of production designer Mark Tildesley, the production replicated parts of the Vatican at Rome’s Cinecittà Studios, including a life-size, open-ceiling Sistine Chapel, which took two months to build.

The team LIDAR-scanned everything available and set about amassing as much reference material as possible — photographing from a permitted distance, scanning the set builds and buying every photographic book they could lay their hands on.

From this material, the team set about building 3D models — created in Autodesk Maya — of St. Peter’s Square, the Basilica and the Sistine Chapel. The environments team was tasked with texturing all of these well-known locations using digital matte painting techniques, including recreating Michelangelo’s masterpiece on the ceiling of the Sistine Chapel.

The story centers on two key changes of pope, in 2005 and 2013. Those events attracted huge attention, filling St. Peter’s Square with people eager to discover the identity of the new pope and celebrate his ascension. News crews from around the globe also camped out to provide coverage for Catholics worldwide.

To recreate these scenes, the crew shot at a school in Rome (Ponte Mammolo) that has the same pattern on its floor. A cast of 300 extras was shot in blocks, in different positions at different times of day and with costume tweaks (including the addition of umbrellas), to build a library flexible enough for post to recreate these moments in any light or weather.

Union also called on Clear Angle Studios to individually scan 50 extras to provide additional options for the VFX team. This was an ambitious crowd project: the team couldn’t shoot in the actual location, and the end result had to stand up at 4K in very close proximity to the camera. Union designed a Houdini-based system to manage the number of assets and costumes in such a way that the studio could easily art-direct the extras as individuals, allow the director to choreograph them and deliver a believable result.
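As a rough illustration of how a crowd setup like this can stay art-directable, here is a small Python sketch of deterministic per-agent variation with hand-authored overrides. It is a toy under stated assumptions, not Union’s actual Houdini system; the asset names, wardrobe options and override table are all hypothetical.

```python
# Toy sketch of deterministic, art-directable crowd variation -- an
# illustration of the general idea, not Union's Houdini system.
import random

SCANS = [f"extra_{i:02d}" for i in range(50)]            # 50 scanned extras
COSTUMES = ["raincoat", "suit", "cassock", "casual"]     # hypothetical wardrobe
CYCLES = ["idle", "cheer", "wave_flag", "open_umbrella"]

# Hand-authored overrides let an artist (or the director) pin down specific
# agents without disturbing the rest of the crowd.
OVERRIDES = {
    17: {"cycle": "open_umbrella"},   # hero agent near camera
    23: {"costume": "cassock"},
}

def build_agent(agent_id: int) -> dict:
    # Seeding by agent id makes every run reproducible, so re-cooking the
    # crowd after a note only changes the agents that were overridden.
    rng = random.Random(agent_id)
    agent = {
        "id": agent_id,
        "scan": rng.choice(SCANS),
        "costume": rng.choice(COSTUMES),
        "cycle": rng.choice(CYCLES),
        "cycle_offset": rng.uniform(0.0, 1.0),  # desynchronize the loops
    }
    agent.update(OVERRIDES.get(agent_id, {}))
    return agent

crowd = [build_agent(i) for i in range(300)]
print(crowd[17])  # keeps its art-directed umbrella cycle on every run
```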

Union also conducted several in-house motion capture shoots to provide specific animation cycles that matched the occasions being recreated. This produced even more authentic-looking crowds for the post team.

Union worked on a total of 288 VFX shots, including greenscreens, set extensions, window reflections, muzzle flashes, fog and rain, and a storm that included a lightning strike on the Basilica.

In addition, the team did a significant amount of de-aging work to accommodate the film’s eight-year main narrative timeline as well as a long period in Pope Francis’ younger years.

ILM’s Pablo Helman on The Irishman‘s visual effects

By Karen Moltenbrey

When a film stars Robert De Niro, Joe Pesci and Al Pacino, well, expectations are high. These are no ordinary actors, and Martin Scorsese is no ordinary director. These are movie legends. And their latest project, Netflix’s The Irishman, is no ordinary film. It features cutting-edge de-aging technology from visual effects studio Industrial Light & Magic (ILM) and earned the film’s VFX supervisor, Pablo Helman, an Oscar nomination.

The Irishman, adapted from the book “I Heard You Paint Houses,” tells the story of an elderly Frank “The Irishman” Sheeran (De Niro), whose life is nearing the end, as he looks back on his earlier years as a truck driver-turned-mob hitman for Russell Bufalino (Pesci) and family. While reminiscing, he recalls the role he played in the disappearance of his longtime friend, Jimmy Hoffa (Al Pacino), former president of the Teamsters, who famously disappeared in 1975 at the age of 62, and whose body has never been found.

The film contains 1,750 visual effects shots, most of which involve the de-aging of the three actors. In the film, the actors are depicted at various stages of their lives — mostly younger than their present age. Pacino is the least aged of the three actors, since he enters the story about a third of the way through — from the 1940s to his disappearance three decades later. He was 78 at the time of filming, and he plays Hoffa at various ages, from age 44 to 62. De Niro, who was 76 at the time of filming, plays Sheeran at certain points from age 20 to 80. Pesci plays Bufalino between age 53 and 83.

For the significantly older Sheeran, during his introspection, makeup was used. However, making the younger versions of all three actors was much more difficult. Indeed, current technology makes it possible to create believable younger digital doubles. But, it typically requires actors to perform alone on a soundstage wearing facial markers and helmet cameras, or requires artists to enhance or create performances with CG animation. That simply would not do for this film. Neither the actors nor Scorsese wanted the tech to interfere with the acting process in any way. Recreating their performances was also off the table.

“They wanted a technology that was non-intrusive and one that would be completely separate from the performances. They didn’t want markers on their faces, they did not want to wear helmet cams and they did not want to wear the gray [markered] pajamas that we normally use,” says VFX supervisor Helman. “They also wanted to be on set with theatrical lighting, and there wasn’t going to be any kind of re-shoots of performances outside the set.”

In a nutshell, ILM needed a markerless approach that occurred on-set during filming. To this end, ILM spent two years developing Flux, a new camera system and software, whereby a three-camera rig would extract performance data from lighting and textures captured on set and translate that to 3D computer-generated versions of the actors’ younger selves.

The camera rig was developed in collaboration with The Irishman’s DP, Rodrigo Prieto, and camera maker ARRI. It included two high-resolution (3.8K) Alexa Mini witness cameras that were modified with infrared rings; the two cameras were attached to and synched up with the primary sensor camera (the director’s Red Helium 8K camera). The infrared light from the two cameras was necessary to help neutralize any shadows on the actors’ faces, since Flux does not handle shadows well, yet remained “unseen” by the production camera.

Flux, meanwhile, used that camera information and translated it into a deformable geometry mesh. “Flux takes that information from the three cameras and compares it to the lighting on set, deforms the geometry and changes the geometry and the shape of the actors on a frame-by-frame basis,” says Helman.

In fact, ILM continued to develop the software as it was working on the film. “It’s kind of like running the Grand Prix while you’re building the Ferrari,” Helman adds. “Then, you get better and better, and faster and faster, and your software gets better, and you are solving problems and learning from the software. Yes, it took a long time to do, but we knew we had time to do it and make it work.”

Pablo Helman (right) on The Irishman set.

At the beginning of the project, prior to the filming, the actors were digitally scanned performing a range of facial movements using ILM’s Medusa system, as well as on a light stage, which captured texture info under different lighting conditions. All that data was then used to create a 3D contemporary digital double of each of the actors. The models were sculpted in Autodesk’s Maya and with proprietary tools running on ILM’s Zeno platform.

ILM applied the 3D models to the exact performance data of each actor captured on set with the special camera rig, so the physical performances were now digital. No keyframe animation was used. However, the characters were still contemporary to the actors’ ages.

As Helman explains, after the performance, the footage was returned to ILM, where an intense matchmove was done of the actors’ bodies and heads. “The first thing that got matchmoved was the three cameras that were documenting what the actor was doing in the performance, and then we matchmoved the lighting instruments that were lighting the actor because Flux needs that lighting information in order to work,” he says.

Helman likens Flux to a black box full of little drawers where various aspects are inserted, like the layout, the matchimation, the lighting information and so forth, and it combines all that information to come up with the geometry for the digital double.

The actual de-aging occurs in modeling, using a combination of libraries that were created for each actor and connected to and referenced by Flux. Modelers created the age variations, starting with the youngest version of each character; variants were then generated gradually, using a slider to move through life’s timeline. This process was labor-intensive, as artists also had to erase the effects of time, such as wrinkles and age spots.
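To make the slider idea concrete, here is a toy Python sketch that blends vertex positions between sculpted age variants sharing the same topology. This is a deliberately simplified illustration, not ILM’s Flux; the sculpt data and ages are hypothetical.

```python
# Toy "age slider": linearly blending vertex positions between sculpted
# age variants with matching topology. Illustrative only.

def blend(a, b, t):
    """Linear interpolation between two matching vertex lists."""
    return [
        tuple(pa * (1.0 - t) + pb * t for pa, pb in zip(va, vb))
        for va, vb in zip(a, b)
    ]

def mesh_at_age(variants, age):
    """variants: list of (age, vertices) sorted by age. Interpolates
    between the two sculpts that bracket the requested age."""
    if age <= variants[0][0]:
        return variants[0][1]
    if age >= variants[-1][0]:
        return variants[-1][1]
    for (a0, m0), (a1, m1) in zip(variants, variants[1:]):
        if a0 <= age <= a1:
            t = (age - a0) / (a1 - a0)
            return blend(m0, m1, t)

# Hypothetical sculpts at ages 30, 50 and 80 (same four vertices each).
sculpts = [
    (30, [(0, 0, 0), (1.0, 0, 0), (0, 1.0, 0), (0, 0, 1.00)]),
    (50, [(0, 0, 0), (1.1, 0, 0), (0, 0.9, 0), (0, 0, 1.05)]),
    (80, [(0, 0, 0), (1.2, 0, 0), (0, 0.8, 0), (0, 0, 1.10)]),
]
print(mesh_at_age(sculpts, 41))  # a character in his early 40s
```

A production system would of course blend textures, wrinkle maps and shading alongside the geometry, but the bracket-and-interpolate structure of an age slider is the same.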

Since The Irishman is not an action movie, creating motion for decades-younger versions of the characters was not an issue. However, a motion analyst was on set to work with the actors as they played the younger versions of their characters, and some visual effects work helped thin out the younger characters.

Helman points out that Scorsese stressed that he did not want to see a younger version of the actors playing roles from the past; he wanted to see younger versions of these particular characters. “He did not want to rewind the clock and see Robert De Niro as Jimmy Conway in 1990’s Goodfellas. He wanted to see De Niro as a 30-year-younger Frank Sheeran,” he explains.

When asked which actor posed the most difficulty to de-age, Helman explains that once you crack the code of capturing the performance and then retargeting the performance to a younger variation of the character, there’s little difference. Nevertheless, De Niro had the most screen time and the widest age range.

Performance capture began about 15 years ago, and Helman sees this achievement as a natural evolution of the technology. “Eventually those [facial] markers had to go away because for actors, that’s a very interesting way to work, if you really think about it. They have to try to ignore the markers and not be distracted by all the other intrusive stuff going on,” Helman says. “That time is now gone. If you let the actors do what they do, the performances will be so much better and the shots will look so much better because there is eye contact and context with another actor.”

While this technology is a quantum leap forward, there are still improvements to be made. The camera rig needs to get smaller and the software faster — and ILM is working on both aspects, Helman says. Nevertheless, the accomplishment made here is impressive and groundbreaking — the first markerless system that captures performance on set with theatrical lighting, thanks to more than 500 artists working around the world to make this happen. As a result, it opens up the door for more storytelling and acting options — not only for de-aging, but for other types of characters too.

Commenting on his Oscar nomination, Helman said, “It was an incredible, surreal experience to work with Scorsese and the actors, De Niro, Pacino and Pesci, on this movie. We are so grateful for the trust and support we got from the producers and from Netflix, and the talent and dedication of our team. We’re honored to be recognized by our colleagues with this nomination.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

Maxon and Red Giant to merge

Maxon, developers of pro 3D software solutions, and Red Giant, makers of tools for editors, VFX artists, and motion designers, have agreed to merge under the media and entertainment division of Nemetschek Group. The transaction is expected to close in January 2020, subject to regulatory approval and customary closing conditions.

Maxon, best known for its 3D product Cinema 4D, was formed in 1986 to provide high-end yet accessible 3D software solutions. Artists across the globe rely on Maxon products to create high-end visuals. In April of this year, Maxon acquired Redshift, developer of the GPU-accelerated Redshift render engine.

Since 2002, Red Giant has built its brand through products such as Trapcode, Magic Bullet, Universe, PluralEyes and its line of visual effects software. Its tools are used in the fields of film, broadcast and advertising.

The two companies provide tools for clients including ABC, CBS, NBC, HBO, BBC, Sky, Fox Networks, Turner Broadcasting, NFL Network, WWE, Viacom, Netflix, ITV Creative, Discovery Channel, MPC, Digital Domain, VDO, Sony, Universal, The Walt Disney Company, Blizzard Entertainment, BMW, Facebook, Apple, Google, Vitra, Nike and many more.

Main Photo: L-R: Maxon CEO Dave McGavran and Red Giant CEO Chad Bechert

Shape+Light VFX boutique opens in LA with Trent, Lehr at helm


Visual effects and design boutique Shape+Light has officially launched in Santa Monica. At the helm are managing director/creative director Rob Trent and executive producer Cara Lehr. Shape+Light provides visual effects, design and finishing services for agency and brand-direct clients. The studio, which has been quietly operating since this summer, has already delivered work for Nike, Apple, Gatorade, Lexus and Procter & Gamble.

Gatorade

Trent is no stranger to running VFX boutiques. An industry veteran, he began his career as a Flame artist, working at studios including Imaginary Forces and Digital Domain, and then at Asylum VFX as a VFX supervisor/creative director before co-founding The Mission VFX in 2010. In 2015, he established Saint Studio. During his career he has worked on big campaigns, including the launch of the Apple iPhone with David Fincher, celebrating the NFL with Nike and Michael Mann, and honoring moms with Alma Har’el and P&G for the Olympics. He has also contributed to award-winning feature films such as The Curious Case of Benjamin Button, Minority Report, X-Men and Zodiac.

Lehr is an established VFX producer with over 20 years of experience in both commercials and features. She has worked for many of LA’s leading VFX studios, including Zoic Studios, Asylum VFX, Digital Domain, Brickyard VFX and Psyop. She most recently served as EP at Method Studios, where she was on staff since 2012. She has worked on ad campaigns for brands including Apple, Microsoft, Nike, ESPN, Coca Cola, Taco Bell, AT&T, the NBA, Chevrolet and more.

Maya 2020 and Arnold 6 now available from Autodesk

Autodesk has released Autodesk Maya 2020 and Arnold 6 with Arnold GPU. Maya 2020 brings animators, modelers, riggers and technical artists a host of new tools and improvements for CG content creation, while Arnold 6 allows for production rendering on both the CPU and GPU.

Maya 2020 adds more than 60 new updates, as well as performance enhancements and new simulation features to Bifrost, the visual programming environment in Maya.

Maya 2020

Release highlights include:

— Over 60 animation features and updates to the graph editor and time slider.
— Cached Playback: New preview modes, layered dynamics caching and more efficient caching of image planes.
— Animation bookmarks: Mark, organize and navigate through specific events in time and frame playback ranges.
— Bifrost for Maya: Performance improvements, Cached Playback support and new MPM cloth constraints.
— Viewport improvements: Users can interact with and select dense geometry or a large number of smaller meshes faster in the viewport and UV editors.
— Modeling enhancements: New Remesh and Retopologize features.
— Rigging improvements: Matrix-driven workflows, nodes for precisely tracking positions on deforming geometry and a new GPU-accelerated wrap deformer.

The Arnold GPU is based on Nvidia’s OptiX framework and takes advantage of Nvidia RTX technology. Arnold 6 highlights include:

— Unified renderer: Toggle between CPU and GPU rendering (see the sketch at the end of this section).
— Lights, cameras and more: Support for OSL, OpenVDB volumes, on-demand texture loading, most LPEs, lights, shaders and all cameras.
— Reduced GPU noise: Comparable to CPU noise levels when using adaptive sampling, which has been improved to yield faster, more predictable results regardless of the renderer used.
— Optimized for Nvidia RTX hardware: Scale up rendering power when production demands it.
— New USD components: Hydra render delegate, Arnold USD procedural and USD schemas for Arnold nodes and properties are now available on GitHub.

Arnold 6

— Performance improvements: Faster creased subdivisions, an improved Physical Sky shader and dielectric microfacet multiple scattering.

Maya 2020 and Arnold 6 are available now as standalone subscriptions or with a collection of end-to-end creative tools within the Autodesk Media & Entertainment Collection. Monthly, annual and three-year single-user subscriptions of Arnold are available on the Autodesk e-store.

Arnold GPU is also available to try with a free 30-day trial of Arnold 6. Arnold GPU is available in all supported plug-ins for Autodesk Maya, Autodesk 3ds Max, SideFX Houdini, Maxon Cinema 4D and Foundry Katana.
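
For a sense of what the CPU/GPU toggle looks like in a pipeline script, here is a minimal sketch using Arnold’s Python bindings to set the options node’s render_device parameter. The scene file is hypothetical, and the calls reflect the Arnold 6-era API; treat the sketch as illustrative rather than authoritative.

```python
from arnold import *  # Arnold SDK Python bindings

AiBegin()
AiASSLoad("shot_010.ass", AI_NODE_ALL)         # load a scene exported to .ass

options = AiUniverseGetOptions()
AiNodeSetStr(options, "render_device", "GPU")  # flip to "CPU" to compare

AiRender(AI_RENDER_MODE_CAMERA)                # render through the scene camera
AiEnd()
```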

Storage for Visual Effects

By Karen Moltenbrey

When creating visual effects for a live-action film or television project, the artist digs right in. But not before the source files are received and backed up. Of course, during the process, storage again comes into play, as the artist’s work is saved and composited into the live-action file and then saved (and stored) yet again. At mid-sized Artifex Studios and the larger Jellyfish Pictures, two visual effects studios, storage might not be the sexiest part of the work they do, but it is vital to a successful outcome nonetheless.

Artifex Studios
An independent studio in Vancouver, BC, Artifex Studios is a small- to mid-sized visual effects facility producing film and television projects for networks, film studios and streaming services. Founded in 1997 by VFX supervisor Adam Stern, the studio has grown over the years from a one- to two-person operation to one staffed by 35 to 45 artists. During that time it has built up a lengthy and impressive resume, from Charmed, Descendants 3 and The Crossing to Mission to Mars, The Company You Keep and Apollo 18.

To handle its storage needs, Artifex uses the Qumulo QC24 four-node storage cluster for its main storage system, along with G-Tech and LaCie portable RAIDs and Angelbird Technologies and Samsung portable SSD drives. “We’ve been running [Qumulo] for several years now. It was a significant investment for us because we’re not a huge company, but it has been tremendously successful for us,” says Stern.

“The most important things for us when it comes to storage are speed, data security and minimal downtime. They’re pretty obvious things, but Qumulo offered us a system that eliminated one of the problems we had been having with the [previous] system bogging down as concurrent users were moving the files around quickly between compositors and 3D artists,” says Stern. “We have 40-plus people hitting this thing, pulling in 4K, 6K, 8K footage from it, rendering and [creating] 3D, and it just ticks along. That was huge for us.”

Of course, speed is of utmost importance, but so is maintaining the data’s safety. To this end, the new system self-monitors, taking its own snapshots to maintain its own health and making sure there are constantly rotating levels of backups. Having the ability to monitor everything about the system is a big plus for the studio as well.

Because data safety and security are non-negotiable, Artifex pairs Qumulo with Google Cloud, incrementally backing up to the cloud every night. “So while Qumulo is doing its own snapshots incrementally, we have another hard-drive system from Synology, which is more of a prosumer NAS system, whose only job is to do a local current backup,” Stern explains. “So in-house, we have two local backups between Qumulo and Synology, and then we have a third backup going to the cloud every night that’s off-site. When a project is complete, we archive it onto two sets of local hard drives, and one leaves the premises and the other is stored here.” At this point, the material is taken off the Qumulo system, and seven days later, the last of the so-called snapshots is removed.
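
As a rough illustration of that nightly off-site layer, a scheduled job using gsutil’s rsync mode might look like the sketch below. The mount point and bucket name are hypothetical, and this stands in for whatever tooling Artifex actually runs.

```python
import subprocess

SRC = "/mnt/qumulo/projects"        # hypothetical production storage mount
DST = "gs://studio-offsite-backup"  # hypothetical Cloud Storage bucket

# gsutil rsync only copies files that changed since the last run, which is
# what makes a nightly job like this incremental; -m parallelizes transfers.
subprocess.run(["gsutil", "-m", "rsync", "-r", SRC, DST], check=True)
```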

As soon as data comes into Artifex — either via Aspera, Signiant’s Media Shuttle or hard disks — the material is immediately transferred to the Qumulo system, then cataloged and placed into ftrack, the database the studio uses for shot tracking. Then, as Stern says, the floodgates open, and all the artists, compositors, 3D team members and admin coordination team members access the material residing on the Qumulo system.
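
For a sense of what that cataloging step can look like, here is a minimal sketch against ftrack’s Python API. The project and shot names are hypothetical, and scripting ingest this way is an assumption for illustration, not a description of Artifex’s actual pipeline.

```python
import ftrack_api

# Reads FTRACK_SERVER, FTRACK_API_USER and FTRACK_API_KEY from the environment.
session = ftrack_api.Session()

# Attach a newly ingested plate's shot to its project for tracking.
project = session.query("Project where name is 'afraid_of_the_dark'").one()
shot = session.create("Shot", {"name": "ep101_0042", "parent": project})
session.commit()
```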

Desktops at the studio have local storage, generally an SSD built into the machine, but as Stern points out, that is a temporary solution used by the artists while working on a specific shot, not to hold studio data.

Artifex generally works on a handful of projects simultaneously, including the Nickelodeon horror anthology Are You Afraid of the Dark? “Everything we do here requires storage, and we’re always dealing with high-resolution footage, and that project was no exception,” says Stern. For instance, the series required Artifex to simulate 10,000 CG cockroaches spilling out of every possible hole in a room — work that required a lot of high-speed caching.

“FX artists need to access temporary storage very quickly to produce those simulations. In terms of the Qumulo system, we need it to retrieve files at the speed our effects artists can simulate and cache, and make sure they are able to manage what can be thousands and thousands of files generated just within a few hours.”

Similarly, for Netflix’s Wu Assassins, the studio generated multiple simulations of CG smoke and fog within SideFX Houdini and again had to generate thousands and thousands of cache files for all the particles and volume information. Just as it did with the caching for the CG cockroaches, the current system handled caching for the smoke and fog quite efficiently.

At this point, Stern says the vendor is doing some interesting things that his company has not yet taken advantage of. For instance, today one of the big pushes is working in the cloud and integrating that with infrastructures and workflows. “I know they are working on that, and we’re looking into that,” he adds. There are also some new equipment features, “bleeding-edge stuff” Artifex has not explored yet. “It’s OK to be cutting-edge, but bleeding-edge is a little scary for us,” Stern notes. “I know they are always playing with new features, but just having the important foundation of speed and security is right where we are at the moment.”

Jellyfish Pictures
When it comes to big projects with big storage needs, Jellyfish Pictures is no fish out of water. The studio works on myriad projects, from Hollywood blockbusters like Star Wars to high-end TV series like Watchmen to episodic animation like Floogals and Dennis & Gnasher: Unleashed! Recently, it has embarked on an animated feature for DreamWorks and has a dedicated art department that works on visual development for substantial VFX projects and children’s animated TV content.

To handle all this work, Jellyfish has five studios across the UK: four in London and one in Sheffield, in the north of England. What’s more, in early December, Jellyfish expanded further with a brand-new virtual studio in London seating over 150 artists — increasing its capacity to over 300 people. In line with this expansion, Jellyfish is removing all on-site infrastructure from its existing locales and moving everything to a co-location. This means that all five present locations will be wholly virtual as well, making Jellyfish the largest VFX and animation studio in the world operating this way, contends CTO Jeremy Smith.

“We are dealing with shows that have very large datasets, which, therefore, require high-performance computing. It goes without saying, then, that we need some pretty heavy-duty storage,” says Smith.

Not only must the storage solution be able to handle Jellyfish’s data needs, it must also fit into its operational model. “Even though we work across multiple sites, we don’t want our artists to feel that. We need a storage system that can bring together all locations into one centralized hub,” Smith explains. “As a studio, we do not rely on one storage hardware vendor; therefore, we need to work with a company that is hardware-agnostic in addition to being able to operate in the cloud.”

Also, Jellyfish is a TPN-assessed studio and thus has to work with vendors that are TPN compliant — another serious, and vital, consideration when choosing its storage solution. TPN, the Trusted Partner Network, is an initiative between the Motion Picture Association of America (MPAA) and the Content Delivery and Security Association (CDSA) that provides a set of requirements and best practices for preventing leaks, breaches and hacks of pre-release, high-value media content.

With all those factors in mind, Jellyfish uses PixStor from Pixit Media for its storage solution. PixStor is a software-defined storage solution that allows the studio to use various hardware storage from other vendors under the hood. With PixStor, data moves seamlessly through many tiers of storage — from fast flash and disk tiers to cost-effective, high-capacity object storage to the cloud. In addition, the studio uses NetApp storage within a different part of the same workflow on Dell R740 hardware and alternates between SSD and spinning disks, depending on the purpose of the data and the file size.

“We’ve future-proofed our studio with the Mellanox SN2100 switch for the heavy lifting, and for connecting our virtual workstations to the storage, we are using several servers from the Dell N3000 series,” says Smith.

As a wholly virtual studio, Jellyfish has no storage housed locally; it all sits in a co-location, which is accessed through remote workstations powered by Teradici’s PCoIP technology.

According to Smith, becoming a completely virtual studio is a new development for Jellyfish. Nevertheless, the facility has been working with Pixit Media since 2014 and launched its first virtual studio in 2017, “so the building blocks have been in place for a while,” he says.

Prior to moving all the infrastructure off-site, Jellyfish ran its storage system out of its Brixton and Soho studios locally. Its own private cloud from Brixton powered Jellyfish’s Soho and Sheffield studios. Both PixStor storage solutions in Brixton and Soho were linked with the solution’s PixCache. The switches and servers were still from Dell and Mellanox but were an older generation.

“Way back when, before we adopted this virtual world we are living in, we still worked with on-premises and inflexible storage solutions. It limited us in terms of the work we could take on and where we could operate,” says Smith. “With this new solution, we can scale up to meet our requirements.”

Now, however, using Mellanox SN2100, which has 100GbE, Jellyfish can deal with obscene amounts of data, Smith contends. “The way the industry is moving with 4K and 8K, even 16K being thrown around, we need to be ready,” he says.

Before the co-location, the different sites were connected through PixCache; now the co-location and public cloud are linked via Ngenea, which pre-caches files locally to the render node before the render starts. Furthermore, the studio is able to unlock true multi-tenancy with a single storage namespace, rapidly deploying logical TPN-accredited data separation and isolation and scaling up services as needed. “Probably two of the most important facets for us in running a successful studio: security and flexibility,” says Smith.

Artists access the storage via their Teradici Zero Clients, which, through the Dell switches, connect users to the standard Samba SMB network. Users who are working on realtime clients or in high resolution are connected to the Pixit storage through the Mellanox switch, where PixStor Native Client is used.

“Storage is a fundamental part of any VFX and animation studio’s workflow. Implementing the correct solution is critical to the seamless running of a project, as well as the security and flexibility of the business,” Smith concludes. “Any good storage system is invisible to the user. Only the people who build it will ever know the precision it takes to get it up and running — and that is the sign you’ve got the perfect solution.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

Reallusion’s Headshot plugin for realistic digi-doubles via AI

Reallusion has introduced a plugin for Character Creator 3 to help create realistic-looking digital doubles. According to the company, the Headshot plugin uses AI technology to automatically generate a digital human in minutes from one single photo, and those characters are fully rigged for voice lipsync, facial expression and full body animation.

Headshot allows game developers and virtual production teams to quickly funnel a cast of digital doubles into iClone, Unreal, Unity, Maya, ZBrush and more. The idea is to allow the digital humans to go anywhere they like and give creators a solution to rapidly develop, iterate and collaborate in realtime.

The plugin has two AI modes: Auto Mode and Pro Mode. Auto Mode is a one-click solution for creating mid-rez digital human crowds, generating both head and hair for realtime 3D head models. It also generates a separate 3D hair mesh with an alpha mask to soften edge lines. The 3D hair is fully compatible with Character Creator’s conformable hair format (.ccHair), so users can add the generated hairstyles to their hair library and apply them to other CC characters.

Headshot Pro Mode offers full control of the 3D head generation process with advanced features such as Image Matching, Photo Reprojection and Custom Mask, with texture resolution up to 4,096 pixels.

The Image Matching Tool overlays an image reference plane for advanced head shape refinement and lens correction. With Photo Reprojection, users can easily fix the texture-to-mesh discrepancies that result from face morph changes.

Using high-rez source images and Headshot’s 1,000-plus morphs, users can get a scan-quality digital human face in 4K texture details. Additional textures include normal, AO, roughness, metallic, SSS and Micro Normal for more realistic digital human rendering.

The 3D Head Morph System is designed to achieve the professional, detailed look of 3D scan models. The sculpting design allows users to hover over a control area and use directional mouse drags to adjust the corresponding mesh shape, from full head and face sculpting down to individual features (head contour, face, eyes, nose, mouth and ears), with more than 1,000 head morphs. The system is now free with purchase of the Headshot plugin.

The Headshot plugin for Character Creator costs $199 and includes the Headshot Morph 1,000+ content pack (a $99 value). Character Creator 3 Pipeline costs $199.

Framestore VFX will open in Mumbai in 2020

Oscar-winning creative studio Framestore will open a full-service visual effects studio in Mumbai in 2020 to target India’s booming creative industry. The studio will be located in the Nesco IT Park in Goregaon, in the center of Mumbai’s technology district. The news underscores Framestore’s continued interest in India, following its major 2017 investment in Jesh Krishna Murthy’s VFX studio, Anibrain.

“Mumbai represents a rolling of wheels that were set in motion over two years ago,” says Framestore founder/CEO William Sargent. “Our investment in Anibrain has grown considerably, and we continue in our partnership with Jesh Krishna Murthy to develop and grow that business. Indeed, they will become a valued production partner to our Mumbai offering.”

Framestore plans to make considerable hires in the coming months, aiming to build an initial 500-strong team that combines existing Framestore talent with the best of local Indian expertise. Mumbai will work alongside the global network, including London and Montreal, to create a cohesive virtual team delivering high-quality international work.

“Mumbai has become a center of excellence in digital filmmaking. There’s a depth of talent that can deliver to the scale of Hollywood with the color and flair of Bollywood,” Sargent continues. “It’s an incredibly vibrant city and its presence on the international scene is holding us all to a higher standard. In terms of visual effects, we will set the standard here as we did in Montreal almost eight years ago.”


London’s Freefolk beefs up VFX team

Soho-based visual effects studio Freefolk, which has seen growth in its commercials and longform work, has expanded its staff to meet demand. As part of the uptick in work, Freefolk promoted Cheryl Payne from senior producer to head of commercial production. Additionally, Laura Rickets has joined as senior producer, and 2D artist Bradley Cocksedge has been added to the commercials VFX team.

Payne, who has been with Freefolk since the early days, has worked on some of the studio’s biggest commercials, including Warburtons for Engine, Peloton for Dark Horses and Cadburys for VCCP.

Rickets comes to Freefolk with over 18 years of production experience at some of the biggest VFX houses in London, including Framestore, The Mill and Smoke & Mirrors, as well as agency-side for McCann. Since joining the team, Rickets has VFX-produced the I’m A Celebrity IDs, a set of seven technically challenging and CG-heavy spots for the new series of the show, as well as ads for the Rugby World Cup and Who Wants to Be a Millionaire?

Cocksedge is a recent graduate who joins from Framestore, where he was working as an intern on Fantastic Beasts: The Crimes of Grindelwald. While in school at the University of Hertfordshire, he interned at Freefolk and is happy to be back in a full-time position.

“We’ve had an exciting year and have worked on some really stand-out commercials, like TransPennine for Engine and the beautiful spot for The Guardian we completed with Uncommon, so we felt it was time to add to the Freefolk family,” says Fi Kilroe, Freefolk’s co-managing director/executive producer.

Main Image: (L-R) Cheryl Payne, Laura Rickets and Bradley Cocksedge

Behind the Title: MPC’s CD Morten Vinther

This creative director/director still jumps on the Flame and also edits from time to time. “I love mixing it up and doing different things,” he says.

NAME: Morten Vinther

COMPANY: Moving Picture Company, Los Angeles

CAN YOU DESCRIBE YOUR COMPANY?
From original ideas all the way through to finished production, we are an eclectic mix of hard-working and passionate artists, technologists and creatives who push the boundaries of what’s possible for our clients. We aim to move the audience through our work.

WHAT’S YOUR JOB TITLE?
Creative Director and Director

WHAT DOES THAT ENTAIL?
I guide our clients through challenging shoots and post. I try to keep us honest in terms of making sure that our casting is right and the team is looked after and has the appropriate resources available for the tasks ahead, while ensuring that we go above and beyond on quality and experience. In addition to this, I direct projects, pitch on new business and develop methodology for visual effects.

American Horror Story

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I still occasionally jump on Flame and comp a job — right now I’m editing a commercial. I love mixing it up and doing different things.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Writing treatments. The moments where everything is crystal clear in your head and great ideas and concepts are rushing onto paper like an unstoppable torrent.

WHAT’S YOUR LEAST FAVORITE?
Writing treatments. Staring at a blank page, writing something and realizing how contrived it sounds before angrily deleting everything.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
Early mornings. A good night’s sleep and freshly ground coffee create a fertile breeding ground for pure clarity, ideas and opportunities.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I would be carefully malting barley for my next small batch of artisan whisky somewhere on the Scottish west coast.

Adidas Creators

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I remember making a spoof commercial at my school when I was about 13 years old. I became obsessed with operating cameras and editing, and I began to study filmmakers like Scorsese and Kubrick. After a failed career as a shopkeeper, a documentary production company in Copenhagen took mercy on me, and I started as an assistant editor.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
American Horror Story, Apple Unlock, directed by Dougal Wilson, and Adidas Creators, directed by Stacy Wall.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
If I had to single one out, it would probably be Apple’s Unlock commercial. The spot looks amazing, and the team was incredibly creative on this one. We enjoyed a great collaboration between several of our offices, and it was a lot of fun putting it together.

Apple’s Unlock

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My phone, laptop and PlayStation.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Some say social media rots your brains. That’s probably why I’m an Instagram addict.

CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
Odesza, SBTRKT, Little Dragon, Disclosure and classic reggae.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I recently bought a motorbike, and I spin around LA and Southern California most weekends. Concentrating on how to survive the next turn is a great way for me to clear the mind.

Director Robert Eggers talks about his psychological thriller The Lighthouse

By Iain Blair

Writer/director Robert Eggers burst onto the scene when his feature film debut, The Witch, won the Directing Award in the US Dramatic category at the 2015 Sundance Film Festival. He followed up that success by co-writing and directing another supernatural, hallucinatory horror film, The Lighthouse, which is set in the maritime world of the late 19th century.

L-R: Director Robert Eggers and cinematographer Jarin Blaschke on set.

The story begins when two lighthouse keepers (Willem Dafoe and Robert Pattinson) arrive on a remote island off the coast of New England for their month-long stay. But that stay gets extended as they’re trapped and isolated due to a seemingly never-ending storm. Soon, the two men engage in an escalating battle of wills, as tensions boil over and mysterious forces (which may or may not be real) loom all around them.

The Lighthouse has the power of an ancient myth. To tell this tale, which was shot in black and white, Eggers called on many of those who helped him create The Witch, including cinematographer Jarin Blaschke, production designer Craig Lathrop, composer Mark Korven and editor Louise Ford.

I recently talked to Eggers, who got his professional start directing and designing experimental and classical theater in New York City, about making the film, his love of horror and the post workflow.

Why does horror have such an enduring appeal?
My best argument is that there’s darkness in humanity, and we need to explore that. And horror is great at doing that, from the Gothic to a bad slasher movie. While I may prefer authors who explore the complexities in humanity, others may prefer schlocky films with jump scares that make you spill your popcorn, which still give them that dose of darkness. Those films may not be seriously probing the darkness, but they can relate to it.

This film seems more psychological than simple horror.
We’re talking about horror, but I’m not even sure that this is a horror film. I don’t mind the label, even though most wannabe auteurs are like, “I don’t like labels!” It started with an idea my brother Max had for a ghost story set in a lighthouse, which is not what this movie became. But I loved the idea, which was based on a true story. It immediately evoked a black and white movie on 35mm negative with a boxy aspect ratio of 1.19:1, like the old movies, and a fusty, dusty, rusty, musty atmosphere — the pipe smoke and all the facial hair — so I just needed a story that went along with all of that. (Laughs) We were also thinking a lot about influences and writers from the time — like Poe, Melville and Stevenson — and soaking up the jargon of the day. There were also influences like Prometheus and Proteus and God knows what else.

Casting the two leads was obviously crucial. What did Willem and Robert bring to their roles?
Absolute passion and commitment to the project and their roles. Who else but Willem can speak like a North Atlantic pirate stereotype and make it totally believable? Robert has this incredible intensity, and together they play so well against each other and are so well suited to this world. And they both have two of the best faces ever in cinema.

What were the main technical challenges in pulling it all together, and is it true you actually built the lighthouse?
We did. We built everything, including the 70-foot tower — a full-scale working lighthouse, along with its house and outbuildings — on Cape Forchu in Nova Scotia, which is this very dramatic outcropping of volcanic rock. Production designer Craig Lathrop and his team did an amazing job, and the reason we did that was because it gave us far more control than if we’d used a real lighthouse.

We scouted a lot but just couldn’t find one that suited us, and the few that did were far too remote to access. We needed road access and a place with the right weather, so in the end it was better to build it all. We also shot some of the interiors there as well, but most of them were built on soundstages and warehouses in Halifax since we knew it’d be very hard to shoot interiors and move the camera inside the lighthouse tower itself.

Your go-to DP, Jarin Blaschke, shot it. Talk about how you collaborated on the look and why you used black and white.
I love the look of black and white, because it’s both dreamlike and also more realistic than color in a way. It really suited both the story and the way we shot it, with the harsh landscape and a lot of close-ups of Willem and Robert. Jarin shot the film on the Panavision Millennium XL2, and we also used vintage Baltar lenses from the 1930s, which gave the film a great look, as they make the sea, water and sky all glow and shimmer more. He also used a custom cyan filter by Schneider Filters that gave us that really old-fashioned look. Then by using black and white, it kept the overall look very bleak at all times.

How tough was the shoot?
It was pretty tough, and all the rain and pounding wind you see onscreen is pretty much real. Even on the few sunny days we had, the wind was just relentless. The shoot was about 32 days, and we were out in the elements in March and April of last year, so it was freezing cold and very tough for the actors. It was very physically demanding.

Where did you post?
We did it all in New York at Harbor Post, with some additional ADR work at Goldcrest in London with Robert.

Do you like the post process?
I love post, and after the very challenging shoot, it was such a relief to just get in a warm, dry, dark room and start cutting and pulling it all together.

Talk about editing with Louise Ford, who also cut The Witch. How did that work?
She was with us on the shoot at a bed and breakfast, so I could check in with her at the end of the day. But it was so tough shooting that I usually waited until the weekends to get together and go over stuff. Then when we did the stage work at Halifax, she had an edit room set up there, and that was much easier.

What were the big editing challenges?
The DP and I developed such a specific and detailed cinema language without a ton of coverage and with little room for error that we painted ourselves into a corner. So that became the big challenge… when something didn’t work. It was also about getting the running time down but keeping the right pace since the performances dictate the pace of the edit. You can’t just shorten stuff arbitrarily. But we didn’t leave a lot of stuff on the cutting room floor. The assembly was just over two hours and the final film isn’t much shorter.

All the sound effects play a big role. Talk about the importance of sound and working on them with sound designer Damian Volpe, whose credits include Can You Ever Forgive Me?, Leave No Trace, Mudbound, Drive, Winter’s Bone and Margin Call.
It’s hugely important in this film, and Louise and I did a lot of work in the picture edit to create temps for Damian to inspire him. And he was so relentless in building up the sound design, and even creating weird sounds to go with the actual light, and to go with the score by Mark Korven, who did The Witch, and all the brass and unusual instrumentation he used on this. So the result is both experimental and also quite traditional, I think.

There are quite a few VFX shots. Who did them, and what was involved?
We had MELS and Oblique in Quebec, and Brainstorm Digital in New York also did some. The big one was that the movie is set on an island, but we shot on a peninsula that had another lighthouse further north, which unfortunately didn’t look at all correct, so we framed it out a lot, but we had to erase it some of the time. And our period-correct sea ship broke down and had to be towed around by other ships, so there was a lot of cleanup, along with all the safety cables we had to use for cliff shots with the actors.

Where did you do the DI, and how important is it to you?
We did it at Harbor with colorist Joe Gawler, and it was hugely important although it was fairly simple because there’s very little latitude on the Double-X film stock we used. We did a lot of fine detail work to finesse it, but it was a lot quicker than if it’d been in color.

Did the film turn out the way you hoped?
No, they always change and surprise you, but I’m very proud of what we did.

What’s next?
I’m prepping another period piece, but it’s not a horror film. That’s all I can say.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Alkemy X adds Albert Mason as head of production

Albert Mason has joined VFX house Alkemy X as head of production. He comes to Alkemy X with over two decades of experience in visual effects and post production. He has worked on projects directed by such industry icons as Peter Jackson on the Lord of the Rings trilogy, Tim Burton on Alice in Wonderland and Robert Zemeckis on The Polar Express. In his new role at Alkemy X, he will use his experience in feature films to target the growing episodic space.

A large part of Alkemy X’s work has been for episodic visual effects, with credits that include Amazon Prime’s Emmy-winning original series, The Marvelous Mrs. Maisel, USA’s Mr. Robot, AMC’s Fear the Walking Dead, Netflix’s Maniac, NBC’s Blindspot and Starz’s Power.

Mason began his career at MTV’s on-air promos department, sharpening his production skills on top series promo campaigns and as a part of its newly launched MTV Animation Department. He took an opportunity to transition into VFX, stepping into a production role for Weta Digital and spending three years working globally on the Lord of the Rings trilogy. He then joined Sony Pictures Imageworks, where he contributed to features including Spider-Man 3 and Ghost Rider. He has also produced work for such top industry shops as Logan, Rising Sun Pictures and Greymatter VFX.

“[Albert’s] expertise in constructing advanced pipelines that embrace emerging technologies will be invaluable to our team as we continue to bolster our slate of VFX work,” says Alkemy X president/CEO Justin Wineburgh.

2019 HPA Award winners announced

The industry came together on November 21 in Los Angeles to celebrate its own at the 14th annual HPA Awards. Awards were given to individuals and teams working in 12 creative craft categories, recognizing outstanding contributions to color grading, sound, editing and visual effects for commercials, television and feature film.

Rob Legato receiving Lifetime Achievement Award from presenter Mike Kanfer. (Photo by Ryan Miller/Capture Imaging)

As was previously announced, renowned visual effects supervisor and creative Robert Legato, ASC, was honored with this year’s HPA Lifetime Achievement Award; Peter Jackson’s They Shall Not Grow Old was presented with the HPA Judges Award for Creativity and Innovation; acclaimed journalist Peter Caranicas was the recipient of the very first HPA Legacy Award; and special awards were presented for Engineering Excellence.

The winners of the 2019 HPA Awards are:

Outstanding Color Grading – Theatrical Feature

WINNER: “Spider-Man: Into the Spider-Verse”
Natasha Leonnet // Efilm

“First Man”
Natasha Leonnet // Efilm

“Roma”
Steven J. Scott // Technicolor

Natasha Leonnet (Photo by Ryan Miller/Capture Imaging)

“Green Book”
Walter Volpatto // FotoKem

“The Nutcracker and the Four Realms”
Tom Poole // Company 3

“Us”
Michael Hatzer // Technicolor


Outstanding Color Grading – Episodic or Non-theatrical Feature

WINNER: “Game of Thrones – Winterfell”
Joe Finley // Sim, Los Angeles

“The Handmaid’s Tale – Liars”
Bill Ferwerda // Deluxe Toronto

“The Marvelous Mrs. Maisel – Vote for Kennedy, Vote for Kennedy”
Steven Bodner // Light Iron

“I Am the Night – Pilot”
Stefan Sonnenfeld // Company 3

“Gotham – Legend of the Dark Knight: The Trial of Jim Gordon”
Paul Westerbeck // Picture Shop

“The Man in The High Castle – Jahr Null”
Roy Vasich // Technicolor


Outstanding Color Grading – Commercial  

WINNER: Hennessy X.O. – “The Seven Worlds”
Stephen Nakamura // Company 3

Zara – “Woman Campaign Spring Summer 2019”
Tim Masick // Company 3

Tiffany & Co. – “Believe in Dreams: A Tiffany Holiday”
James Tillett // Moving Picture Company

Palms Casino – “Unstatus Quo”
Ricky Gausis // Moving Picture Company

Audi – “Cashew”
Tom Poole // Company 3


Outstanding Editing – Theatrical Feature

Once Upon a Time… in Hollywood

WINNER: “Once Upon a Time… in Hollywood”
Fred Raskin, ACE

“Green Book”
Patrick J. Don Vito, ACE

“Rolling Thunder Revue: A Bob Dylan Story by Martin Scorsese”
David Tedeschi, Damian Rodriguez

“The Other Side of the Wind”
Orson Welles, Bob Murawski, ACE

“A Star Is Born”
Jay Cassidy, ACE


Outstanding Editing – Episodic or Non-theatrical Feature (30 Minutes and Under)

VEEP

WINNER: “Veep – Pledge”
Roger Nygard, ACE

“Russian Doll – The Way Out”
Todd Downing

“Homecoming – Redwood”
Rosanne Tan, ACE

“Withorwithout”
Jake Shaver, Shannon Albrink // Therapy Studios

“Russian Doll – Ariadne”
Laura Weinberg


Outstanding Editing – Episodic or Non-theatrical Feature (Over 30 Minutes)

WINNER: “Stranger Things – Chapter Eight: The Battle of Starcourt”
Dean Zimmerman, ACE, Katheryn Naranjo

“Chernobyl – Vichnaya Pamyat”
Simon Smith, Jinx Godfrey // Sister Pictures

“Game of Thrones – The Iron Throne”
Katie Weiland, ACE

“Game of Thrones – The Long Night”
Tim Porter, ACE

“The Bodyguard – Episode One”
Steve Singleton


Outstanding Sound – Theatrical Feature

WINNER: “Godzilla: King of the Monsters”
Tim LeBlanc, Tom Ozanich, MPSE // Warner Bros.
Erik Aadahl, MPSE, Nancy Nugent, MPSE, Jason W. Jennings // E Squared

“Shazam!”
Michael Keller, Kevin O’Connell // Warner Bros.
Bill R. Dean, MPSE, Erick Ocampo, Kelly Oxford, MPSE // Technicolor

“Smallfoot”
Michael Babcock, David E. Fluhr, CAS, Jeff Sawyer, Chris Diebold, Harrison Meyle // Warner Bros.

“Roma”
Skip Lievsay, Sergio Diaz, Craig Henighan, Carlos Honc, Ruy Garcia, MPSE, Caleb Townsend

“Aquaman”
Tim LeBlanc // Warner Bros.
Peter Brown, Joe Dzuban, Stephen P. Robinson, MPSE, Eliot Connors, MPSE // Formosa Group


Outstanding Sound – Episodic or Non-theatrical Feature

WINNER: “The Haunting of Hill House – Two Storms”
Trevor Gates, MPSE, Jason Dotts, Jonathan Wales, Paul Knox, Walter Spencer // Formosa Group

“Chernobyl – 1:23:45”
Stefan Henrix, Stuart Hilliker, Joe Beal, Michael Maroussas, Harry Barnes // Boom Post

“Deadwood: The Movie”
John W. Cook II, Bill Freesh, Mandell Winter, MPSE, Daniel Colman, MPSE, Ben Cook, MPSE, Micha Liberman // NBC Universal

“Game of Thrones – The Bells”
Tim Kimmel, MPSE, Onnalee Blank, CAS, Mathew Waters, CAS, Paula Fairfield, David Klotz

“Homecoming – Protocol”
John W. Cook II, Bill Freesh, Kevin Buchholz, Jeff A. Pitts, Ben Zales, Polly McKinnon // NBC Universal


Outstanding Sound – Commercial 

WINNER: John Lewis & Partners – “Bohemian Rhapsody”
Mark Hills, Anthony Moore // Factory

Audi – “Life”
Doobie White // Therapy Studios

Leonard Cheshire Disability – “Together Unstoppable”
Mark Hills // Factory

New York Times – “The Truth Is Worth It: Fearlessness”
Aaron Reynolds // Wave Studios NY

John Lewis & Partners – “The Boy and the Piano”
Anthony Moore // Factory


Outstanding Visual Effects – Theatrical Feature

WINNER: “The Lion King”
Robert Legato
Andrew R. Jones
Adam Valdez, Elliot Newman, Audrey Ferrara // MPC Film
Tom Peitzman // T&C Productions

“Avengers: Endgame”
Matt Aitken, Marvyn Young, Sidney Kombo-Kintombo, Sean Walker, David Conley // Weta Digital

“Spider-Man: Far From Home”
Alexis Wajsbrot, Sylvain Degrotte, Nathan McConnel, Stephen Kennedy, Jonathan Opgenhaffen // Framestore

“Alita: Battle Angel”
Eric Saindon, Michael Cozens, Dejan Momcilovic, Mark Haenga, Kevin Sherwood // Weta Digital

“Pokemon Detective Pikachu”
Jonathan Fawkner, Carlos Monzon, Gavin Mckenzie, Fabio Zangla, Dale Newton // Framestore


Outstanding Visual Effects – Episodic (Under 13 Episodes) or Non-theatrical Feature

Game of Thrones

WINNER: “Game of Thrones – The Bells”
Steve Kullback, Joe Bauer, Ted Rae
Mohsen Mousavi // Scanline
Thomas Schelesny // Image Engine

“Game of Thrones – The Long Night”
Martin Hill, Nicky Muir, Mike Perry, Mark Richardson, Darren Christie // Weta Digital

“The Umbrella Academy – The White Violin”
Everett Burrell, Misato Shinohara, Chris White, Jeff Campbell, Sebastien Bergeron

“The Man in the High Castle – Jahr Null”
Lawson Deming, Cory Jamieson, Casi Blume, Nick Chamberlain, William Parker, Saber Jlassi, Chris Parks // Barnstorm VFX

“Chernobyl – 1:23:45”
Lindsay McFarlane
Max Dennison, Clare Cheetham, Steven Godfrey, Luke Letkey // DNEG


Outstanding Visual Effects – Episodic (Over 13 Episodes)

Team from The Orville – Outstanding VFX, Episodic, Over 13 Episodes (Photo by Ryan Miller/Capture Imaging)

WINNER: “The Orville – Identity: Part II”
Tommy Tran, Kevin Lingenfelser, Joseph Vincent Pike // FuseFX
Brandon Fayette, Brooke Noska // Twentieth Century FOX TV

“Hawaii Five-O – Ke iho mai nei ko luna”
Thomas Connors, Anthony Davis, Chad Schott, Gary Lopez, Adam Avitabile // Picture Shop

“9-1-1 – 7.1”
Jon Massey, Tony Pirzadeh, Brigitte Bourque, Gavin Whelan, Kwon Choi // FuseFX

“Star Trek: Discovery – Such Sweet Sorrow Part 2”
Jason Zimmerman, Ante Dekovic, Aleksandra Kochoska, Charles Collyer, Alexander Wood // CBS Television Studios

“The Flash – King Shark vs. Gorilla Grodd”
Armen V. Kevorkian, Joshua Spivack, Andranik Taranyan, Shirak Agresta, Jason Shulman // Encore VFX

The 2019 HPA Engineering Excellence Awards were presented to:

Adobe – Content-Aware Fill for Video in Adobe After Effects

Epic Games — Unreal Engine 4

Pixelworks — TrueCut Motion

Portrait Displays and LG Electronics — CalMan LUT based Auto-Calibration Integration with LG OLED TVs

Honorable Mentions were awarded to Ambidio for Ambidio Looking Glass; Grass Valley, for creative grading; and Netflix for Photon.

Creating With Cloud: A VFX producer’s perspective

By Chris Del Conte

The ‘90s was an explosive era for visual effects, with films like Jurassic Park, Independence Day, Titanic and The Matrix shattering box office records and inspiring a generation of artists and filmmakers, myself included. I got my start in VFX working on seaQuest DSV, an Amblin/NBC sci-fi series that was ground-breaking for its time, but looking at the VFX of modern films like Gemini Man, The Lion King and Ad Astra, it’s clear just how far the industry has come. A lot of that progress has been enabled by new technology and techniques, from the leap to fully digital filmmaking and emergence of advanced viewing formats like 3D, Ultra HD and HDR to the rebirth of VR and now the rise of cloud-based workflows.

In my nearly 25 years in VFX, I’ve worn a lot of hats, including VFX producer, head of production and business development manager. Each role involved overseeing many aspects of a production and, collectively, they’ve all shaped my perspective when it comes to how the cloud is transforming the entire creative process. Thanks to my role at AWS Thinkbox, I have a front-row seat to see why studios are looking at the cloud for content creation, how they are using the cloud, and how the cloud affects their work and client relationships.

Chris Del Conte on the set of the IMAX film Magnificent Desolation.

Why Cloud?
We’re in a climate of high content demand and massive industry flux. Studios are incentivized to find ways to take on more work, and that requires more resources — not just artists, but storage, workstations and render capacity. This need to scale often motivates studios to consider the cloud for production, or to strengthen the cloud’s role in their pipelines if it’s already in play. Cloud-enabled studios are much more agile than traditional shops. When opportunities arise, they can act quickly, spinning resources up and down at a moment’s notice. I realize that for some, the concept of the cloud is still a bit nebulous, which is why finding the right cloud partner is key. Every facility is different, and part of the benefit of cloud is resource customization. When studios use predominantly physical resources, they have to make decisions about storage and render capacity, electrical and cooling infrastructure, and staff accommodations up front (and pay for them). Using the cloud allows studios to adjust easily to accommodate whatever the current situation requires.

Artistic Impact
Advanced technology is great, but artists are by far a studio’s biggest asset; automated tools are helpful but won’t deliver those “wow moments” alone. Artists bring the creativity and talent to the table, then, in a perfect world, technology helps them realize their full potential. When artists are free of pipeline or workflow distractions, they can focus on creating. The positive effects spill over into nearly every aspect of production, which is especially true when cloud-based rendering is used. By scaling render resources via the cloud, artists aren’t limited by the capacity of their local machines. Since they don’t have to wait as long for shots to render, artists can iterate more fluidly. This boosts morale because the final results are closer to what artists envisioned, and it can improve work-life balance since artists don’t have to stick around late at night waiting for renders to finish. With faster render results, VFX supervisors also have more runway to make last-minute tweaks. Ultimately, cloud-based rendering enables a higher caliber of work and more satisfied artists.
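
Scaling render resources via the cloud is ultimately an API call. Below is a minimal sketch using boto3, AWS’s Python SDK, to request a burst of Spot-priced render nodes; the AMI ID, instance type and counts are placeholder assumptions, not an AWS Thinkbox recipe.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

# Ask for up to 50 Spot-priced render nodes for an overnight burst; the
# image ID is a placeholder for a studio's own render-node machine image.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="c5.24xlarge",
    MinCount=1,
    MaxCount=50,
    InstanceMarketOptions={"MarketType": "spot"},
)
print(f"Launched {len(response['Instances'])} render nodes")
```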

Budget Considerations
There are compelling arguments for shifting capital expenditures to operational expenditures with the cloud. New studios get the most value out of this model since they don’t have legacy infrastructure to accommodate. Cloud-based solutions level the playing field in this respect; it’s easier for small studios and freelancers to get started because there’s no significant up-front hardware investment. This is an area where we’ve seen rapid cloud adoption. Considering how fast technology changes, it seems ill-advised to limit a new studio’s capabilities to today’s hardware when the cloud provides constant access to the latest compute resources.

When a studio has been in business for decades and might have multiple locations with varying needs, its infrastructure is typically well established. Some studios may opt to wait until their existing hardware has fully depreciated before shifting resources to the cloud, while others dive in right away, with an eye on the bigger picture. Rendering is generally a budgetary item on project bids, but with local hardware, studios are working to recoup a sunk cost. Using the cloud, render compute can be part of a bid and becomes a negotiable item. Clients can determine the delivery timeline based on render budget, and the elasticity of cloud resources allows VFX studios to pick up more work. (Even the most meticulously planned productions can run into 911 issues ahead of delivery, and cloud-enabled studios have bandwidth to be the hero when clients are in dire straits.)

Looking Ahead
When I started in VFX, giant rooms filled with racks and racks of servers and hardware were the norm, and VFX studios were largely judged by the size of their infrastructure. I’ve heard from an industry colleague about how their VFX studio’s server room was so impressive that they used to give clients tours of the space, seemingly a visual reminder of the studio’s vast compute capabilities. Today, there wouldn’t be nearly as much to view. Modern technology is more powerful and compact but still requires space, and that space has to be properly equipped with the necessary electricity and cooling. With cloud, studios don’t need switchers and physical storage to be competitive off the bat, and they experience fewer infrastructure headaches, like losing freon in the AC.

The cloud also opens up the available artist talent pool. Studios can dedicate the majority of physical space to artists as opposed to machines and even hire artists in remote locations on a per-project or long-term basis. Facilities of all sizes are beginning to recognize that becoming cloud-enabled brings a significant competitive edge, allowing them to harness the power to render almost any client request. VFX producers will also start to view facility cloud-enablement as a risk management tool that allows control of any creative changes or artistic embellishments up until delivery, with the rendering output no longer a blocker or a limited resource.

Bottom line: Cloud transforms nearly every aspect of content creation into a near-infinite resource, whether storage capacity, render power or artistic talent.


Chris Del Conte is senior EC2 business development manager at AWS Thinkbox.

Motorola’s next-gen Razr gets a campaign for today

Many of us have fond memories of our Razr flip phone. At the time, it was the latest and greatest. Then new technology came along, and the smartphone era was born. Now Motorola is asking, “Why can’t you have both?”

Available as of November 13, the new Razr fits in a palm or pocket when shut and flips open to reveal an immersive, full-length touch screen. A smaller display, called Quick View, is used when the phone is closed, and the larger Flex View appears when it is open — and the two displays are made to work together. Whatever you see on Quick View moves to the larger Flex View display when you flip the phone open.

In order to help tell this story, Motorola called on creative shop Los York to help relaunch the Razr. Los York created the new smartphone campaign to tap into the Razr’s original DNA and launch it for today’s user.

Los York developed a 360 campaign that included films, social, digital, TV, print and billboards, with visuals in stores and on devices (wallpapers, ringtones, startup screens). Los York treated the Razr as a luxury item and a piece of art, letting the device reveal itself unencumbered by taglines and copy. The campaign showcases the Razr as a futuristic, high-end “fashion accessory” that speaks to new industry conversations, such as whether advancing tech leads toward a utopian or dystopian future.

The campaign features a mix of live action and CG. Los York shot on a Panavision DXL with Primo 70 lenses. CG was created using Maxon Cinema 4D with Redshift and composited in Adobe After Effects. The piece was edited in-house on Adobe Premiere.

We reached out to Los York CEO and founder Seth Epstein to find out more:

How much of this is live action versus CG?
The majority is CG but, originally, the piece was intended to be entirely CG. Early in the creative process, we defined the world in which the new Razr existed and who would belong there. As we worked on the project, we kept feeling the pull to bring our characters to life in live action and blend the two worlds. The proper live action was envisioned after the fact, which is somewhat unusual.

What were some of the most challenging aspects of this piece?
The most challenging part was that the project happened over a period of nine months. Wisely, the product release needed to be pushed, and we continued to evolve the project over time, which was a blessing and a curse.

How did it feel taking on a product with a lot of history and then rebranding it for the modern day?
We felt the key was to relaunch an iconic product like the Razr with an eye to the future. The trap of launching anything iconic is falling back on retro throwback references, which can come across as too obvious. We dove into the original product and campaigns to extract the brand DNA of 2004 using archetype exercises. We tapped into the attitude and voice of the Razr at that time — and used that attitude as a starting point. We also wanted to stand three years in the future and imagine what the tone and campaign would be then. All of this is to say that we wanted the new Razr to extract the power of the past but also speak to audiences in a totally fresh and new way.

Check out the campaign here.

Blur Studio uses new AMD Threadripper for Terminator: Dark Fate VFX

By Dayna McCallum

AMD has announced new additions to its high-end desktop processor family. Built for demanding desktop and content creation workloads, the 24-core AMD Ryzen Threadripper 3960X and the 32-core AMD Ryzen Threadripper 3970X processors will be available worldwide November 25.

Tim Miller on the set of Dark Fate.

AMD states that the powerful new processors provide up to 90 percent more performance and up to 2.5 times more available storage bandwidth than competitive offerings, per testing and specifications by AMD performance labs. The 3rd Gen AMD Ryzen Threadripper lineup features two new processors built on 7nm “Zen 2” core architecture, claiming up to 88 PCIe 4.0 lanes and 144MB cache with 66 percent better power efficiency.

Prior to the official product launch, AMD made the 3rd Gen Threadrippers available to LA’s Blur Studio for work on the recent Terminator: Dark Fate and continued a collaboration with the film’s director — and Blur Studio founder — Tim Miller.

Before the movie’s release, AMD hosted a private Q&A with Miller, moderated by AMD’s James Knight. Please note that we’ve edited the lively conversation for space and taken a liberty with some of Miller’s more “colorful” language. (Also watch this space to see if a wager is won that will result in Miller sporting a new AMD tattoo.) Here is the Knight/Miller conversation…

So when we dropped off the 3rd Gen Threadripper to you guys, how did your IT guys react?
Like little children left in a candy shop with no adult supervision. The nice thing about our atmosphere here at Blur is we have an open layout. So when (bleep) like these new AMD processors drops in, you know it runs through the studio like wildfire, and I sit out there like everybody else does. You hear the guys talking about it, you hear people giggling and laughing hysterically at times on the second floor where all the compositors are. That’s where these machines really kick ass — busting through these comps that would have had to go to the farm, but they can now do it on a desktop.

James Knight

As an artist, the speed is crucial. You know, if you have a machine that takes 15 minutes to render, you want to stop and do something else while you wait. It breaks your whole chain of thought. You get out of that fugue state that you produce the best art in. It breaks the chain between art and your brain. But if you have a machine that does it in 30 seconds, that chain doesn’t break.

But really, more speed means more iterations. It means you deal with heavier scenes, which means you can throw more detail at your models and your scenes. I don’t think we do the work faster, necessarily, but the work is much higher quality. And much more detailed. It’s like you create this vacuum, and then everybody rushes into it and you have this silly idea that it is really going to increase productivity, but what it really increases most is quality.

When your VFX supervisor showed you the difference between the way it was done with your existing ecosystem and then with the third-gen Threadripper, what were you thinking about?
There was the immediate thing — when we heard from the producers about the deadline, shots that weren’t going to get done for the trailer suddenly were, which was great. More importantly, you heard from the artists. What you started to see was that it allows for all different ways of working, instead of just the elaborate pipeline that we’ve built up — work on your local box, then submit to the farm and wait for that render to hit the queue of farm machines that can handle it, then get that render sent back to you.

It has a rhythm that is at times tiresome for the artists, and I know that because I hear it all the time. Now I say, “How’s that comp coming and when are we going to get it, tick tock?” And they say, “Well, it’s rendering in the background right now,” as I’m watching them work on another comp or another piece of that comp. That’s pretty amazing. And they’re doing it all locally, which saves so much time and frustration compared to sending it down the pipeline and then waiting for it to come back up.

I know you guys are here to talk about technology, but the difference for the artists is that instead of working here until 1:00am, they’re going home to put their children to bed. That’s really what this means at the end of the day. Technology is so wonderful when it enables that, not just the creativity of what we do, but the humanity… allowing artists to feel like they’re really on the cutting edge, but also have a life of some sort outside.

Endoskeleton — Terminator: Dark Fate

As you noted, certain shots and sequences wouldn’t have made it in time for the trailer. How important was it for you to get that Terminator splitting in the trailer?
 Marketing was pretty adamant that that shot had to be in there. There’s always this push and pull between marketing and VFX as you get closer. They want certain shots for the trailer, but they’re almost always those shots that are the hardest to do because they have the most spectacle in them. And that’s one of the shots. The sequence was one of the last to come together because we changed the plan quite a bit, and I kept changing shots on Dan (Akers, VFX supervisor). But you tell marketing people that they can’t have something, and they don’t really give a (bleep) about you and your schedule or the path of that artist and shot. (Laughing)

Anyway, we said no. They begged, they pleaded, and we said, “We’ll try.” Dan stepped up and said, “Yeah, I think I can make it.” And we just barely made it; for a while it looked like we couldn’t get it done fast enough. All of this was happening in like a two-day window. If you didn’t notice (in the trailer), that’s a Rev 7. Gabriel Luna is a Rev 9, which is the next gen. But the Rev 7s that you see in his future flashback are just pure killers. They’re still the same technology: liquid metal on the outside and a carbon endoskeleton that splits. So you have to run the simulation where the skeleton separates through the liquid that hangs off of it in strings; it’s a really hard simulation to do. That’s why we thought maybe it wasn’t going to get done, but running the simulation on the AMD boxes was lightning fast.

Todd Phillips talks directing Warner Bros.’ Joker

By Iain Blair

Filmmaker Todd Phillips began his career in comedy, most notably with the blockbuster franchise The Hangover, which racked up $1.4 billion at the box office globally. He then leveraged that clout and left his comedy comfort zone to make the genre-defying War Dogs.

Todd Phillips directing Joaquin Phoenix

Joker puts comedy even further in his rearview mirror. This bleak, intense, disturbing and chilling tragedy has earned over $1 billion worldwide since its release, making it the seventh-highest-grossing film of 2019 and the highest-grossing R-rated film of all time. Not surprisingly, Joker was celebrated by the Academy, earning a total of 11 Oscar nods, including two for Phillips.

Directed, co-written and produced by Phillips (nominated for Directing and Screenplay), Joker is the filmmaker’s original vision of the infamous DC villain — an origin story infused with the character’s more traditional mythologies. Phillips’ exploration of Arthur Fleck, who is portrayed — and fully inhabited — by three-time Oscar-nominee Joaquin Phoenix, is of a man struggling to find his way in Gotham’s fractured society. Longing for any light to shine on him, he tries his hand as a stand-up comic but finds the joke always seems to be on him. Caught in a cyclical existence between apathy, cruelty and, ultimately, betrayal, Arthur makes one bad decision after another that brings about a chain reaction of escalating events in this powerful, allegorical character study.

Phoenix is joined by Oscar-winner Robert De Niro, who plays TV host Murray Franklin, and a cast that includes Zazie Beetz, Frances Conroy, Brett Cullen, Marc Maron, Josh Pais and Leigh Gill.

Behind the scenes, Phillips was joined by a couple of frequent collaborators in DP Lawrence Sher, ASC, and editor Jeff Groth. Also on the journey were Oscar-nominated co-writer Scott Silver, production designer Mark Friedberg and Oscar-winning costume designer Mark Bridges. Hildur Guðnadóttir provided the music.

Joker was produced by Phillips and actor/director Bradley Cooper, under their Joint Effort banner, and Emma Tillinger Koskoff.

I recently talked to Phillips, whose credits include Borat (for which he earned an Oscar nod for Best Adapted Screenplay), Due Date, Road Trip and Old School, about making the film, his love of editing and post.

You co-wrote this very complex, timely portrait of a man and a city. Was that the appeal for you?
Absolutely, 100 percent. While it takes place in the late ‘70s and early ‘80s, and we wrote it in 2016, it was very much about making a movie that deals with issues happening right now. Movies are often mirrors of society, and I feel this is exactly that.

Do you think that’s why so many people have been offended by it?
I do. It’s really resonated with audiences. I know it’s also been somewhat divisive, and a lot of people were saying, “You can’t make a movie about a guy like this — it’s irresponsible.” But do we want to pretend that these people don’t exist? When you hold up a mirror to society, people don’t always like what they see.

Especially when we don’t look so good.
(Laughs) Exactly.

This is a million miles away from the usual comic-book character and cartoon violence. What sort of film did you set out to make?
We set out to make a tragedy, which isn’t your usual Hollywood approach these days, for sure.

It’s hard to picture any other actor pulling this off. What did Joaquin bring to the role?
When Scott and I wrote it, we had him in mind. I had a picture of him as my screensaver on my laptop — and it’s still there. And then when I pitched this, it was with him in mind. But I didn’t really know him personally, even though we created the character “in his voice.” Everything we wrote, I imagined him saying. So he was really in the DNA of the whole film as we wrote it, and he brought the vulnerability and intensity needed.

You’d assume that he’d jump at this role, but I heard it wasn’t so simple getting him.
You’re right. Getting him was a bit of a thing because it wasn’t something he was looking to do — to be in a movie set in the comic book world. But we spent a lot of time talking about it, what it would be, what it means and what it says about society today and the lack of empathy and compassion that we have now. He really connected with those themes.

Now, looking back, it seems like an obvious thing for him to do, but it’s hard for actors because the business has changed so much and there’s so many of these superhero movies and comic book films now. Doing them is a big thing for an actor, because then you’re in “that group,” and not every actor wants to be in that group because it follows you, so to speak. A lot of actors have done really well in superhero movies and have done other things too, but it’s a big step and commitment for an actor. And he’d never really been in this kind of film before.

What were the main technical challenges in pulling it all together?
I really wanted to shoot on location all around New York City, and that was a big challenge because it’s far harder than it sounds. But it was so important to the vibe and feel of the movie. So many superhero movies use lots of CGI, but I needed that gritty reality of the actual streets. And I think that’s why it’s so unsettling to people because it does feel so real. Luckily, we had Emma Tillinger Koskoff, who’s one of the great New York producers. She was key in getting locations.

Did you do a lot of previz?
I don’t usually do that much. We did it once for War Dogs and it worked well, but it’s a really slow and annoying process to some extent. As crazy as it sounds, we tried it once on the big Murray Franklin scene with De Niro at the end, which is not a scene you’d normally previz — it’s just two guys sitting on a couch. But it was a 12-page scene with so many camera angles, so we began to previz it and then just abandoned it half-way through. The DP and I were like, “This isn’t worth it. We’ll just do it like we always do and just figure it out as we go.” But previz is an amazing tool. It just needed more time and money than we had, and definitely more patience than I have.

Where did you post?
We started off at my house, where Jeff and I had an Avid setup. We also had a satellite office at 9000 Sunset, where all the assistants were. VFX and our VFX supervisor Edwin Rivera were also based out of there along with our music editor, and that’s where most of it was done. Our supervising sound editor was Alan Robert Murray, a two-time Oscar-winner for his work on American Sniper and Letters From Iwo Jima, and we did the Atmos sound mix on the lot at Warners with Tom Ozanich and Dean Zupancic.

Talk about editing with Jeff Groth. What were the big editing challenges?
There are a lot of delusions in Arthur’s head, so it was a big challenge to know when to hide them and when to reveal them. The scene order in the final film is pretty different from the scripted order, and that’s all about deciding when to reveal information. When you write the script, every scene seems important, and everything has to happen in this order, but when you edit, it’s like, “What were we thinking? This could move here, we can cut this, and so on.”

Todd Phillips on set with Robert De Niro

That’s what’s so fun about editing and why I love it and post so much. I see my editor as a co-writer. I think every director loves editing the most, because let’s face it — directors are all control freaks, and you have the most control in post and the editing room. So for me at least, I direct movies and go through all the stress of production and shooting just to get to the editing room. It’s all stuff I just have to deal with so I can then sit down and actually make the movie. So it’s the final draft of the script and I very much see it as a writing exercise.

Post is your last shot at getting the script right, and the most fun part of making a movie is the first 10 to 12 weeks of editing. The worst part is the final stretch of post, all that detail work and watching the movie 400 times. You get sick of it, and it’s so hard to be objective. This ended up taking 20 weeks before we had the first cut. Usually you get 10 for the director’s cut, but I asked Warners for more time and they were like, “OK.”

Visual effects play a big role in the film. How many were there?
More than you’d think, but they’re not flashy. I told Edwin early on, if you do your job right, no one will guess there are any VFX shots at all. He had a great team, and we used various VFX houses, including Scanline, Shade and Branch.

There’s a lot of blood, and I’m guessing that was all enhanced a lot?
In fact, there was no real blood — not a drop — used on set, and that amazes people when I tell them. That’s one of the great things about VFX now — you can do all the blood work in post. For instance, traditionally, when you film a guy being shot on the subway, you have all the blood spatters and for take two, you have to clean all that up and repaint the walls and reset, and it takes 45 minutes. This way, with VFX, you don’t have to deal with any of that. You just do a take, do it again until it’s right, and add all the blood in post. That’s so liberating.

L-R: Iain Blair and Todd Phillips

What was the most difficult VFX shot to do?
I’d say the scene with Randall at his apartment. All that blood tracking on the walls and on Arthur’s face and hands is pretty amazing, and we spent the most time on all that, getting it right.

Where did you do the DI, and how important is it to you?
At Company 3 with my regular colorist Jill Bogdanowicz, and it’s vital for the look. I only began doing DIs on the first Hangover, and the great thing about it is you can go in and surgically fix anything. And if you have a great DP like Larry Sher, who’s shot the last six movies for me, you don’t get lost in the maze of possibilities, and I trust him more than I trust myself sometimes.

We shot it digitally, though the original plan was to shoot 65mm large format and, when that fell through, 35mm. Then Larry and I did a lot of tests and decided we’d shoot digital and make it look like film. And thanks to the way he lit and all the work he and Jill did, it has this weird photochemical feel and look. It’s not quite film, but it’s definitely not digital. It’s somewhere in the middle, its own thing.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: Sarofsky EP Steven Anderson

This EP’s responsibilities run the gamut “from managing our production staff to treating clients to an amazing dinner.”

Company: Chicago’s Sarofsky

Can you describe your company?
We like to describe ourselves as a design-driven production company. I like to think of us as that but so much more. We can be a one-stop shop for everything from concept through finish, or we can partner with a variety of other companies and just be one piece of the puzzle. It’s like ordering from a Chinese menu — you get to pick what items you want.

What’s your job title, and what does the job entail?
I’m executive producer, and that means different things at different companies and in different industries. Here at Sarofsky, I am responsible for things that run the gamut from managing our production staff to treating clients to an amazing dinner.

Sarofsky

What would surprise people the most about what falls under that title?
I also run payroll, and I am damn good at it.

How has the VFX industry changed in the time you’ve been working?
It used to be that when you told someone, “This is going to take some time to execute,” that’s what it meant. But now, everyone wants everything two hours ago. On the flip side, the technology we now have access to has streamlined the production process and provided us with some terrific new tools.

Why do you like being on set for shoots? What are the benefits?
I always like being on set whenever I can because decisions are being made that are going to affect the rest of the production paradigm. It’s also a good opportunity to bond with clients and, sometimes, get some kick-ass homemade guacamole.

Did a particular film inspire you along this path in entertainment?
I have been around this business for quite a while, and one of the reasons I got into it was my love of film and filmmaking. I can’t say that one particular film inspired me to do this, but I remember being a young kid and my dad taking me to see The Towering Inferno in the movie theater. I was blown away.

What’s your favorite part of the job? What’s your least favorite?
Choosing a spectacular bottle of wine for a favorite client and watching their face when they taste it. My least favorite has to be chasing down clients for past due invoices. It gets old very quickly.

What is your most productive time of the day?
It’s 6:30am with my first cup of coffee sitting at my kitchen counter before the day comes at me. I get a lot of good thinking and writing done in those early morning hours.

Original Bomb Pop via agency VMLY&R

If you didn’t have this job, what would you be doing instead?
I would own a combo bookstore/wine shop where people could come and enjoy two of my favorite things.

Why did you choose this profession?
I would say this profession chose me. I studied to be an actor and made my living at it for several years, but due to some family issues, I ended up taking a break for a few years. When I came back, I went for a job interview at FCB and the rest is history. I made the move from agency producing to post executive producer five years ago and have not looked back since.

Can you briefly explain one or more ways Sarofsky is addressing the issue of workplace diversity in its business?
We are a smallish women-owned business, and I am a gay man; diversity is part of our DNA. We always look out for the best talent but also try to ensure we are providing opportunities for people who may not have access to them. For example, one of our amazing summer interns came to us through a program called Kaleidoscope 4 Kids, and we all benefited from the experience.

Name some recent projects you have worked on, which are you most proud of, and why?
My first week here as EP, we went to LA for the friends and family screening of Guardians of the Galaxy, and I thought, what an amazing company I work for! Marvel Studios is a terrific production partner, and I would say there is something special about so many of our clients because they keep coming back. I do have a soft spot for our main title for Animal Kingdom just because I am a big Ellen Barkin fan.

Original Bomb Pop via agency VMLY&R

Name three pieces of technology you can’t live without.
I’d be remiss if I didn’t say my MacBook and iPhone, but I also wouldn’t want to live without my cooking thermometer, as I’ve learned how to make sourdough bread this year, and it’s essential.

What social media channels do you follow?
I am a big fan of Instagram; it’s just visual eye candy and provides a nice break during the day. I don’t really partake in much else unless you count NPR. They occupy most of my day.

Do you listen to music while you work? Care to share your favorite music to work to?
I go in waves. Sometimes I do, but then I won’t listen to anything for weeks. I recently enjoyed listening to “Ladies and Gentlemen: The Best of George Michael.” It was great to listen to an entire album, a rare treat.

What do you do to de-stress from it all?
I get up early and either walk or do some type of exercise to set the tone for the day. It’s also so important to unplug; my partner and I love to travel, so we do that as often as we can. All that and a 2006 Chateau Margaux usually washes away the day in two delicious sips.

Filmmaker Hasraf “HaZ” Dulull talks masterclass on sci-fi filmmaking

By Randi Altman

Hasraf “HaZ” Dulull is a producer/director and a hands-on VFX and post pro. His most recent credits include the feature films 2036 Origin Unknown and The Beyond, the Disney TV series Fast Layne and the Disney Channel original movie Under the Sea — A Descendants Story, which takes place between Descendants 2 and 3. Recently, Dulull developed a masterclass on Sci-Fi Filmmaking, which can be bought or rented.

Why would this already very busy man decide to take on another project and one that is a little off his current path? Well, we reached out to find out.

Why, at this point in your career, did you think it was important to create this masterclass?
I have seen other masterclasses out there to do with filmmaking, and they were always academically based, which turned me off. The best ones were taught by actual filmmakers who had made commercial projects, films or TV shows… not just short films. So I knew that if I was to create and deliver a masterclass, I would do it after having made a couple of feature films that had been released out in the world. I wanted to lead by example and experience.

When I was in LA explaining to studio people, executives and other filmmakers how I made my feature films, they were impressed and fascinated with my process. They were amazed that I was able to pull off high-concept sci-fi films on tight budgets and schedules but still produce a film that looked expensive to make.

When I was researching existing masterclasses or online courses as references, I found that no one was actually going through the entire process. Instead they were offering specialized training in either cinematography or VFX, but there wasn’t anything about how to break down a script and put a budget and schedule together; how to work with locations to make your film work; how to use visual effects smartly in production; how to prepare for marketing and delivering your film for distribution. None of these things were covered as a part of a general masterclass, so I set out to fill that void with my masterclass series.

Clearly this genre holds a special place in your heart. Can you talk about why?
I think it’s because the genre allows for so much creative freedom; sci-fi relies on world-building and imagination. That freedom leads to some “out of this world” storytelling and visuals, but on the flip side it can tempt the filmmaker into being too ambitious on a tight budget. That can lead to cheap-looking films, born of the overambitious need to create amazing worlds. Not many filmmakers know how to do this in a fiscally sensible way, and they may try to make Star Wars on a shoestring budget. So this is why I decided to use the sci-fi genre in this masterclass to share my experience of smart filmmaking that achieves commercially successful results.

How did you decide on what topics to cover? What was your process?
I thought about the questions people and studio executives were asking me in those LA meetings, which pretty much boiled down to, “How did you put the movie together on that tight budget and schedule?” When answering that question, I ended up mapping out my process and the various stages and approaches I took in preproduction, production and post production, but also in the deliverables stage and the marketing and distribution stage. As an indie filmmaker, you really need a good grasp of that part to ensure your film can be released by distributors and received commercially.

I also wanted each class/episode to have a variety of timings and not go more than around 10 minutes (the longest one is around 12 minutes, and the shortest is three minutes). I went with a more bite-sized approach to make the experience snappy, fun yet in-depth to allow the viewers to really soak in the knowledge. It also allows for repeat viewing.

Why was it important to teach these classes yourself?
I wanted it to feel raw and personal when talking about my experience of putting two sci-fi feature films together. Plus I wanted to talk about the constant problem solving, which is what filmmaking is all about. Teaching the class myself allowed me to get this all out of my system in my voice and style to really connect with the audience intimately.

Can you talk about what the experience will be like for the student?
I want the students to be like flies on the wall throughout the classes — seeing how I put those sci-fi feature films together. By the end of the series, I want them to feel like they have been on an entire production, from receiving a script to the releasing of the movie. The aim was to inspire others to go out and make their film. Or to instill confidence in those who have fears of making their film, or for existing filmmakers to learn some new tips and tricks because in this industry we are always learning on each project.

Why the rental and purchase options? What have most people been choosing?
Before I released it, one of the big factors that kept me up at night was how to make this accessible and affordable for everyone. The rental option is for those who can’t afford to purchase it but would love to experience the course. They can do so at a cut-down price but can only view it within a 48-hour window. The purchase price is a little higher, but you get to access it as many times as you like. It’s pretty much the same model as iTunes when you rent or buy a movie.

So far I have found that people have been buying more than renting, which is great, as this means audiences want to do repeat viewings of the classes.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Terminator: Dark Fate director Tim Miller

By Iain Blair

He said he’d be back, and he meant it. Thirty-five years after he first arrived to menace the world in the 1984 classic The Terminator, Arnold Schwarzenegger has returned as the implacable killing machine in Terminator: Dark Fate, the latest installment of the long-running franchise.

And he’s not alone in his return. Terminator: Dark Fate also reunites the film’s producer and co-writer James Cameron with original franchise star Linda Hamilton for the first time in 28 years in a new sequel that picks up where Terminator 2: Judgment Day left off.

When the film begins, more than two decades have passed since Sarah Connor (Hamilton) prevented Judgment Day, changed the future and re-wrote the fate of the human race. Now, Dani Ramos (Natalia Reyes) is living a simple life in Mexico City with her brother (Diego Boneta) and father when a highly advanced and deadly new Terminator — a Rev-9 (Gabriel Luna) — travels back through time to hunt and kill her. Dani’s survival depends on her joining forces with two warriors: Grace (Mackenzie Davis), an enhanced super-soldier from the future, and a battle-hardened Sarah Connor. As the Rev-9 ruthlessly destroys everything and everyone in its path on the hunt for Dani, the three are led to a T-800 (Schwarzenegger) from Sarah’s past that might be their last best hope.

To helm all the on-screen mayhem, black humor and visual effects, Cameron handpicked Tim Miller, whose credits include the global blockbuster Deadpool, one of the highest grossing R-rated films of all time (it grossed close to $800 million). Miller then assembled a close-knit team of collaborators that included director of photography Ken Seng (Deadpool, Project X), editor Julian Clarke (Deadpool, District 9) and visual effects supervisor Eric Barba (The Curious Case of Benjamin Button, Oblivion).

Tim Miller on set

I recently talked to Miller about making the film, its cutting-edge VFX, the workflow and his love of editing and post.

How daunting was it when James Cameron picked you to direct this?
I think there’s something wrong with me because I don’t really feel fear as normal people do. It just manifests as a sense of responsibility, and with this I knew I’d never measure up to Jim’s movies but felt I could do a good job. Jim was never going to tell this story, and I wanted to see it, so it just became more about the weight of that sense of responsibility, but not in a debilitating way. I felt pretty confident I could carry this off. But later, the big anxiety was not to let down Linda Hamilton. Before I knew her, it wasn’t a thing, but later, once I got to know her I really felt I couldn’t mess it up (laughs).

This is still Cameron’s baby even though he handed over the directing to you. How hands-on was he?
He was busy with Avatar, but he was there for a lot of the early meetings and was very involved with the writing and ideas, which was very helpful thematically. But he wasn’t overbearing on all that. Then later when we shot, he wanted to write a few of the key scenes, which he did, and then in the edit he was in and out, but he never came into my edit room. He’d give notes and let us get on with it.

What sort of film did you set out to make?
A continuation of Sarah’s story. It was never John’s story to me. It was always about a mother’s love for her son, and I felt like there was a real opportunity here, because that story hadn’t been told — partly because the other sequels never had Linda. Once she wanted to come back, it was always the best possible story. No one else could be her or Arnold’s character.

Any surprises working with them?
Before we shot, people were telling me, “You got to be ready, we can’t mess around. When Arnold walks on set you’d better be rolling!” Sure enough, when he walked on he’d go, “And…” (Laughs) He really likes to joke around. With Linda — and the other actors — it was a love-fest. They’re both such nice, down-to-earth people, and I like a collegial atmosphere. I’m not a screamer. I’m very prepared, and I feel if you just show up on time, you’re already ahead of the game as a director.

What were the main technical challenges in pulling it all together?
They were all different for each big action set piece, and fitting it all into a schedule was tough, as we had a crazy amount of VFX. The C-5 plane sequence was far and away the biggest challenge, and [SFX supervisor] Neil Corbould and his team designed and constructed all the effects rigs for the movie. The C-5 set was incredible, with two revolving sets, one vertical and one horizontal. It was so big you could put a bus in it, and it was able to rotate 360 degrees and tilt in either direction at the same time.

You just can’t simulate that reality of zero gravity on the actors. And then after we got it all in camera, which took weeks, our VFX guy Eric Barba finished it off. The other big one was the whole underwater scene, where the Humvee falls over the top of a dam and goes underwater as it’s swept down a river. For that, we put the Humvee on a giant scissor lift that could take it all the way under, so the water rushes in and fills it up. It’s really safe to do, but it feels frighteningly realistic for the actors.

This is only my second movie, so I’m still learning, but the advantage is I’m really willing to listen to any advice from the smart people around me on set on how best to do all this stuff.

How early on did you start integrating post and all the VFX?
Right from the start. I use previz a lot, as I come from that environment and I’m very comfortable with it, and that becomes the template for all of production to work from. Sometimes it’s too much of a template and treated like a bible, but I’m like, “Please keep thinking. Is there a better idea?” But it’s great to get everyone on the same page, so very early on you see what’s VFX, what’s live-action only, what’s a combination, and you can really plan your shoot. We did over 45 minutes of previz, along with storyboards. We did tons of postviz. My director’s cut had no blue/green at all. It was all postviz for every shot.

Tim Miller and Linda Hamilton

DP Ken Seng, who did Deadpool with you, shot it. Talk about how you collaborated on the look.
We didn’t really have time to plan shot lists that much since we moved so much and packed so much into every day. A lot of it was just instinctive run-and-gun, as the shoot was pretty grueling. We shot in Madrid and [other parts of] Spain, which doubled for Mexico. Then we did studio work in Budapest. The script was in flux a lot, and Jim wrote a few scenes that came in late, and I was constantly re-writing and tweaking dialogue and adjusting to the locations because there’s the location you think you’ll get and then the one you actually get.

Where did you post?
All at Blur, my company where we did Deadpool. The edit bays weren’t big enough for this though, so we spilled over into another building next door. That became Terminator HQ with the main edit bay and several assistant bays, plus all the VFX and compositing post teams. Blur also helped out with postviz and previz.

Do you like the post process?
I love post! I was an animator and VFX guy first, so it’s very natural to me, and I had a lot of the same team from Deadpool, which was great.

Talk about editing with Julian Clarke who cut Deadpool. How did that work?
It was the same setup. He’d be back here in LA cutting while we shot. He’s so fast; he’d be just one day behind me — I’ve never met anyone who works as hard. Then after the shoot, we’d edit all day and then I’d deal with VFX reviews for hours.

Can you talk about how Adobe Creative Cloud helped the post and VFX teams achieve their creative and technical goals?
I’m a big fan, and that started back on Deadpool as David Fincher was working closely with Adobe to make Premiere something that could beat Avid. We’re good friends — we’re doing our animated Netflix show Love, Death & Robots together — and he was like, “Dude, you gotta use this tool,” so we used it on Deadpool. It was still a little rocky on that one, but overall it was a great experience, and we knew we’d use it on this one. Adobe really helped refine it and the workflow, and it was a huge leap.

What were the big editing challenges?
(Laughs) We just shot too much movie. We had many discussions about cutting one or more of the action scenes, but in the end, we just took out some of the action from all of them, instead of cutting a particular set piece. But it’s tricky cutting stuff and still making it seamless, especially in a very heavily choreographed sequence like the C-5.

VFX plays a big role. How many were there?
Over 2,500 — a huge amount. The VFX on this were so huge it became a bit of a problem, to be honest.

L-R: Writer Iain Blair and director Tim Miller

How did you work with VFX supervisor Eric Barba?
He did a great job and oversaw all the vendors, including ILM, who did most of them. We tried to have them do all the character-based stuff, to keep it in one place, but in the end, we also had Digital Domain, Method, Blur, UPP, Cantina, and some others. We also brought on Jeff White from ILM since it was more than Eric could handle.

Talk about the importance of sound and music.
Tom Holkenborg, who scored Deadpool, did another great job. We also reteamed with sound designer and mixer Craig Henighan, and we did the mix at Fox. They’re both crucial in a film like this, but I’m the first to admit music’s not my strength. Luckily, Julian Clarke is excellent with that and very focused. He worked hard at pulling it all together. I love sound design, and we talked about all the spotting; Julian managed a lot of that for me too because I was so busy with the VFX.

Where did you do the DI and how important is it to you?
It’s huge, and we did it at Company 3 with Tim Stipan, who did Deadpool. I like to do a lot of reframing, adding camera shake and so on. It has a subtle but important effect on the overall film.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.