
VFX supervisor Jay Worth talks Season 2 of Netflix’s Altered Carbon

By Barry Goch

Netflix’s Altered Carbon is now streaming Season 2, with a new lead in Anthony Mackie as Takeshi Kovacs in a new sleeve. He’s the only surviving soldier of a group of elite interstellar warriors, continuing his centuries-old quest to find his lost love, Quellcrist Falconer (Renée Elise Goldsberry). After decades of planet-hopping and searching the galaxy, Kovacs finds himself recruited back to his home planet of Harlan’s World with the promise of finding Quell. In the world of Altered Carbon, lives continue after death: a person’s consciousness, stored in a device called a stack, can be transferred into a new body, or sleeve.

Jay Worth — Credit: Rob Flate

As you can imagine, there are a ton of visual effects used to tell Takeshi’s story. To find out more, we reached out to Jay Worth, an Emmy Award-winning VFX supervisor with 15 years of experience working in visual effects. His credits include Fringe, Person of Interest and Westworld, for which he won the Emmy for Outstanding Special Visual Effects in 2017.

How did you get involved in Altered Carbon?
I have a relationship with showrunner Alison Schapker. We go way back to the good old days of Fringe and a few other things. I had worked with the head of visual effects and post for Skydance, Dieter Ismagil, and then I had just come off of working on a Netflix show. It worked out for all three of those parties to come together and have me join the team. It was a fun bit of a reunion for us to get back together.

At what point did you come on board for Season 2?
I came in after it was shot in order to usher things through post and the final creative push through the final delivery. VFX producer Tony Meagher and I were able to keep the ball rolling and push it through to the final. The VFX team at Double Negative and the other vendors that we had were really able to carry it through from the beginning to the end as well.

Tell us about your review process. Where were you based?
We were in Los Angeles — the showrunners, Tony Meagher and I — but the rest of the team was in Toronto: our VFX coordinator, VFX editor, post team and DI facility (Deluxe Toronto). The VFX vendors were spread across Canada. The interesting thing for us was how to set up the review process while being in Los Angeles. We relied completely on ClearView and that amazing technology. We were able to do editorial reviews and full-range color UHD review sessions for final VFX shots. It was a beautiful process. We could review many things in the edit and make a checklist of shots we needed to see in color, then go downstairs, flip a switch in our bay and have our beautifully calibrated setup. That afforded us the ability to work seamlessly even though we weren’t all in the same place.

This was the first time I had done a show that was so remote. I’ve done many shows where editorial is in one place and the VFX team is in another, but this was the first time I’d done something this ambitious. We did everything remotely, from editorial reviews to effects reviews to color and even the sound, and it was really an amazing, far more seamless process than I thought it would be when we started. The team at Skydance, the production team and the post team really had all the variables dialed in, and it was really painless considering we were spread out. The editorial team and the VFX team on the show side were just phenomenal in terms of how they were able to coordinate with everybody.

       
Before and After

This production predates the COVID-19 restrictions. Do you think they would have impacted the production?
It would have been a challenge, but not impossible. We would have probably ended up having more ClearView boxes for the team in order to work remotely. I’ve worked recently on other shows that have the colorists working from home, and they’re all tapping into the same box; it just happens to be a pipeline issue. It was doable before, but now there’s just a little bit more back and forth to set up the pipeline.

What was the hardest sequence on “Broken Angels,” the last episode of the season, and why?
One of the larger challenges in visual effects is how to convey something visually from a story perspective and still have it feel real and organic. It’s often a tougher hurdle to clear when the storytellers are trusting you to help convey these story points. That’s really where visual effects shine: When you are willing to take on that risk and that narrative responsibility, that’s really where the fun lies.

For the finale, it was telling the story of Angelfire. People kind of understand the overarching idea of satellites and weapons from space, but we had to help people understand the communication between them. We also needed them to understand how it connects to the older technology and what that’s going to mean for our characters. That was by far the biggest challenge for that episode and for the season.

Tell us about the look development of the Angelfire.
It was definitely a journey, but it started with the page and trying to visualize it. Alison Schapker and EP James Middleton had written up what these moments were going to be: a communication tower and a force field around a planet they didn’t quite understand. That was part of the mystery for the viewers and the characters as they were going through the season.

Our goal, from a visual effects standpoint, was to show this ancient-yet-modern communication and to figure out how to visually tell the story of how these things are communicating … that they’re all kind of like-minded and they’re protective. We key that up when Danica fires off the rocket with the rebels attached to it so we can see firsthand what these orbitals can do. Then we see Angelfire come down on the soldiers in the forest.

We’re starting to understand more and more what this thing does so that we can understand what the sacrifice really means … to figure out what the orbitals are and how they could look and feel organic and threatening as well as benign and ultimately destructive. I feel like we ended at a point where it makes sense and it all works together, but at the beginning, when you have a blank canvas, it’s a rather daunting task to figure out what it all should look like.

We had so many conversations about how to depict Angelfire. Should it be more like glass breaking? Should it be like lightning? Should it be like a wave? Should it just crackle? Should it splash in? We had so many iterations of things that just didn’t feel or look quite right. It didn’t convey what we wanted it to convey. “It looks too digital; it looks fake.” To end up with something that felt integrated into the environment and the sky was a testament not only to the team’s perseverance but to Alison’s and James’ patience, leadership and ability to explain creatively what they were going for. I’m really happy with where we finally landed.

How did you lock in the final look?
We wanted it to feel organic and real for the audience. We had a lot of different meetings to talk about what perspective we were going to take — how high up we need to be, how close we need to be to understand that they were communicating with each other and still firing — and whether those different perspectives should be down on the ground or up in the sky. We figured it out with editorial while we were locking episodes, which is a fairly normal process when you’re dealing with full CG shots mixed with pieces that we shot on the day.

We obviously had numerous versions of animatics, and we had to figure out how it was going to work in the edit before we could lock down animation and timing. Honestly, for the final moments when Kovacs sacrificed himself and Angelfire was going off, we were tweaking those with editorial, and our editorial team did a phenomenal job of helping us realize the moment.

Any people or companies that you want to give a shout-out to?
Bob Munroe (a production-side VFX supervisor) and Tony Meagher. All the work they did was groundwork for everything that ended up on the screen. And all the vendors, like Double Negative, Mavericks, Spin, Switch and Krow. Also our VFX coordinating team and everybody up in Toronto. They were the backbone of everything we did this season. And it was just so much fun to work with Alison and James and the team.

Any advice for people wanting to work in visual effects?
From my standpoint, there are not enough people on the show side of things, so if you have a passion for it, there’s a lot of opportunity to get in.

I would say try to find your lane. Is it on the artist side? Is it on the coordinating and producing side? There are so many resources out there now. And now that the technology is available to everybody, it’s an amazing opportunity for creatives to get together, collaborate and make things that are compelling.

When I’m on a show or in the office, I can tell which PA or assistant has a fascination with VFX, and I always encourage them to come along. I have hired from within many times. It’s about trying to educate yourself and figure out what your passion is, and realizing there’s space for almost any role when it comes to visual effects. That’s the exciting thing about it.


Barry Goch is senior finishing artist at The Foundation and an instructor in post production at UCLA Extension.

VFX supervisors talk Amazing Stories and Stargirl

By Iain Blair

Even if you don’t know who Crafty Apes are, you’ve definitely seen their visual effects work in movies such as Deadpool 2, La La Land, Captain America: Civil War and Little Women, and in episodics like Star Trek: Picard and Westworld. The full-service VFX company was founded by Chris LeDoux, Jason Sanford and Tim LeDoux and has locations in Atlanta, Los Angeles, Baton Rouge, Vancouver, Albuquerque and New York, and its roster of creative and production supervisors offers a full suite of services, including set supervision, VFX consultation, 2D compositing and CG animation, digital cosmetics, previsualization and look development.

Aldo Ruggiero

Recently, Crafty Apes worked on two high-profile projects — the reboot of Steven Spielberg’s classic TV series Amazing Stories for Apple TV+ and Disney+’s Stargirl.

Let’s take a closer look at their work on both. First up is Amazing Stories and Crafty Apes VFX supervisor Aldo Ruggiero.

How many VFX did you have to create for the show?
The first season has five episodes, and we created VFX for two episodes — “The Heat” and “Dynoman and the Volt!” I was on the set for the whole of those shoots, and we worked out all the challenges and problems we had to solve day by day. But it wasn’t like we got the plates and then figured out there was a problem. We were very well-prepared, and we were all based in Atlanta, where all the shooting took place, which was a big help. We worked very closely with Mark Stetson, who was the VFX supervisor for the whole show, and because they were shooting three shows at once, he couldn’t always be on set, so he wanted us there every day. Mark really inspired me just to take charge and to solve any problems and challenges.

What were the main challenges?
Of the two episodes, “Dynoman and the Volt!” was definitely the most challenging to do, as we had this entire rooftop sequence, and it was quite complicated, as half was done with bluescreen and half was done using a real roof. We had about 40 shots cutting back and forth between them, and we had to create this 360-degree environment that matched the real roof seamlessly. Doing scenes like that, with all the continuity involved and making it totally photo-real, is very challenging. To do a one-off shot is really easy compared with that, as it may take 20 man-days to do. But this took about 300 man-days to get it done — to match every detail exactly and all the color and so on. The work we did for the other episode, “The Heat,” was less challenging technically and more subtle. We did a lot of crowd replacement and a lot of clean-up, as Atlanta was doubling for other locations.

It’s been 35 years since the original Amazing Stories first aired. How involved was Spielberg, who also acts as EP on this?
He was more involved with the writing than the actual production, and I think the finale of “Dynoman and the Volt!” was completely his idea. He wasn’t on the set, but he gave us some notes, which were very specific, very concise and pointed. And of course, visual effects and all the technology have advanced so much since then.

Gabriel Sanchez

What tools did you use?
We used Foundry Nuke for compositing and Autodesk Maya for 3D animation, plus a ton more. We finished all the work months ago, so I was happy to finally just see the finished result on TV. It turned out really well, I think.

Stargirl
I spoke with VFX supervisor Gabriel Sanchez, a frequent collaborator with Wes Anderson. He talked about creating the VFX and the pipeline for Stargirl, the musical romantic drama about teenage angst and first love, based on the best-selling YA novel of the same name, and directed by Julia Hart (Fast Color).

How many VFX did you have to create for the film, and how closely did you work with Julia Hart?
While you usually meet the director in preproduction, I didn’t meet Julia until we got on set since I’d been so busy with other jobs. We did well over 200 shots at our offices in El Segundo, and we worked very closely together, especially in post. Originally, I was brought on board to be on the set to oversee all the crowd duplication for the football game, but once we got into post, it evolved into something much bigger and more complex.

Typically during bidding and even doing the script breakdown, we always know there’ll be invisible VFX, but you don’t know exactly what they’ll be until you get into post. So during preproduction on this, the big things we knew we’d have to do up front were the football and crowd scenes, maybe with some stunt work, and the CG pet rat.

What were the main challenges?
The football game was complex because they wanted not just the crowd duplication but also one long, seamless take for the halftime performance. So we blocked it and did it in sections, trying to create the 360 so we could go around the band and so on.

The big challenge was then doing all those cuts together in a seamless take, but there were issues, like where the crowd would maybe creep in during the 360, or we’d have a shadow or we’d see the crane or a light. So that kind of set the tone, and we’d know what we had to clean up in post.

Another issue was a shot where it was raining and we had raindrops bouncing off a barn door onto the camera, which created this really weird long streak on the lens, and we had to remove that. We also had to change the façade of the school a bit, and we had to do a lot of continuity fixes. So once we began doing all that stuff, which is fairly normal in a movie, then it all evolved in post into a lot more complex and creative work.

What did it entail?
Sometimes, in terms of performance, you might like a take of how an actress speaks her lines technically but prefer another take of how an actor replies or responds, so we had a lot of split screens to make the performance come together. We also had to readjust the timing of the actors’ lip movements sometimes to sync up with the audio, which they wanted to offset. And there were VFX shots we created in post where we had no coverage.

For instance, Julia needed a bike in front of a garage for a shot that was never filmed, so I had to scan through everything, find footage, then basically create a matte painting of the garage and find a bike from another take, but it still didn’t quite work. In the end, I had to take the bike frame from one take, the wheels from another and then assemble it all. When Julia saw it, she said, ‘Perfect!’ That’s when she realized what was feasible with VFX, depending on the time and budget we had.

How many people were on your team?
I had about 10 artists and two teams. One worked on the big long seamless 360 shot, and then another team worked on all the other shots. I did most of the finishing of the long halftime show sequence on Autodesk Flame, with assistance from three other artists on Nuke, and I parceled out various bits to them — “take out this shadow,” “remove this lens flare” and so on — and did the complete assembly to make it feel seamless on Flame. I also did all the timing of the crowd plates on Flame. Ultimately, the whole job took us about two months to complete, and it was demanding but a lot of fun to work on.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Tom Kendall

Picture Shop VFX and Ghost merge, Tom Kendall named president

Ghost artists at work in Copenhagen studio.

Streamland Media (formerly Picture Head Holdings) has consolidated its visual effects offerings under the Ghost VFX brand. Picture Shop’s visual effects division will merge with Ghost VFX to service feature film, television and interactive media clients. LA-based Picture Shop, as part of the Streamland Media Group, acquired Denmark’s Ghost VFX in January.

Tom Kendall, who headed Picture Shop VFX, will move into the role of president of Ghost VFX, based out of the Los Angeles facility. Jeppe Nygaard Christensen, Ghost co-founder and EVP, and Phillip Prahl, Ghost SVP, will continue to operate out of the Copenhagen studio.

“I’m very excited about combining both teams,” says Kendall. “It strengthens our award-winning VFX services worldwide, while concentrating our growing team of talent and expertise under one global brand. With strategic focus on the customer experience, we are confident that Ghost VFX will continue to be a partner of choice for leading storytellers around the world.”

Over the years, Ghost has contributed to more than 70 feature films and titles. Some of Ghost’s work includes Star Wars: The Rise of Skywalker, The Mandalorian, The Walking Dead, See, Black Panther and Star Trek: Discovery. Recent Picture Shop VFX credits include Hawaii Five-0, Magnum P.I., The Walking Dead and Fear the Walking Dead.

The Streamland Media Group includes Picture Shop, Formosa Group, Picture Head, Ghost VFX, The Farm and Finalé, with locations in the US, Denmark, Canada and the UK.

Arch platform launches for cloud-based visual effects

Arch Platform Technologies, a provider of cloud-based infrastructure for content creation, has made its secure, scalable, cloud-based visual effects platform available commercially. The Arch platform is designed for movie studios, productions and VFX companies and enables them to leverage a VFX infrastructure in the cloud from anywhere in the world.

An earlier iteration of the Arch platform was only available to companies already working with Hollywood-based Vitality VFX, where the technology was created by Guy Botham. Now, Arch is making the next-generation version of its “rent vs. own” cloud-based VFX platform broadly available to movie studios, productions and VFX companies. This version was well along in its development when COVID-19 arrived, making it a very timely offering.

By moving VFX to the cloud, the platform lets VFX teams scale up and down quickly from anywhere and build and manage capacity with cloud-based workstations, renderfarms, storage and workflow management – all in a secure environment.

“We engineered a robust Infrastructure as a Service (IaaS), which now enables a group of VFX artists to collaborate on the same infrastructure as if they were using an on-premises system,” says Botham. “Networked workstations can be added in minutes nearly anywhere in the world, including at an artist’s home, to create a small to large VFX studio environment running all the industry-standard software and plugins.”

Recently, Solstice Studios, a Hollywood distribution and production studio, used the Arch platform for the VFX work on the studio’s upcoming first movie, Unhinged. The platform has also been used by VFX companies Track VFX and FatBelly VFX and is now commercially available to the industry.

The Embassy opens in Culver City with EP Kenny Solomon leading charge

Vancouver-based visual effects and production studio The Embassy is opening an LA office in Culver City. EP Kenny Solomon will head up the operation. The move follows the studio’s growth in film, advertising and streaming, and a successful 2019. The LA office will allow The Embassy to have a direct connection and point of contact with its growing US client base and provide front-end project support and creative development, while Vancouver — offering pipeline and technology infrastructure — remains the heart of operations.

New studio head Solomon has worked in the film, TV and streaming industries for the past 20 years, launching and operating a number of companies, the most recent of which was Big Block Media Holdings — an Emmy-, Cannes-, Webby-, Promax- and Clio-winning integrated media company he founded nine years ago.

“We have a beautiful studio in Culver City with infrastructure to quickly staff up to 15 artists across 2D, 3D and design, plus a screening room, conference room, edit bay and wonderful outdoor space for a late-night ping-pong match and a local Golden Road beer or two,” says Solomon. “Obviously, everyone is WFH right now, but at a moment’s notice we are able to scale accordingly. And Vancouver will always be our heartbeat and main production hub.”

“We have happily been here in Vancouver for the past 17 years plus,” says The Embassy president Winston Helgason. “I’ve seen the global industry go through its ups and downs, and yet we continue to thrive. The last few months have been a difficult period of uncertainty and business interruption and, while we are operating successfully under the current WFH restrictions, I can’t wait to open up to our full potential once the world is a little more back to normal.”

In 2020, The Embassy reunited with Area 23/FCB and RSA director Robert Stromberg (Maleficent) to craft a series of fantastical VFX environments for Emgality’s new campaign. The team has also been in full production for the past 16 months on all VFX work for Warrior Nun, an upcoming 10-episode series for Netflix. The Embassy was responsible for providing everything from concept art to preproduction, on-set supervision and almost 700 VFX shots for the show. The team in Vancouver is working both remotely and in the studio to deliver the full 10 episodes.

Solomon is excited to get to work, saying that he always respected The Embassy’s work, even while he was competing with them when he was at CafeFX/The Syndicate and Big Block.

As part of the expansion, The Embassy has also added a number of new reps to the team — Sarah Gitersonke joins for Midwest representation, and Kelly Flint and Sarah Lange join for East Coast.

Dolores McGinley heads Goldcrest London’s VFX division

London’s Goldcrest Post, a picture and audio post studio, has launched a visual effects division at its Lexington Street location. It will be led by VFX vet Dolores McGinley, whose first task is to assemble a team of artists that will provide services for both new and existing clients.

During the COVID-19 crisis, all Goldcrest staff is working from home except the colorists, who are coming in as needed and working alone in the grading suites. McGinley and her team will move into the Goldcrest facility when lockdown has ended.

“Having been immersed in such a diverse range of projects over the past five years, we identified the need to expand into VFX some time ago,” explains Goldcrest MD Patrick Malone. “We know how essential an integrated VFX service is to our continued success as a leading supplier of creative post solutions to the film and broadcast community.

“As a successful VFX artist in her own right, Dolores is positioned to interpret the client’s brief and offer constructive creative input throughout the production process. She will also draw upon her considerable experience working with colorists to streamline the inclusion of VFX into the grade and guarantee we are able to meet the specific creative requirements of our clients.”

With over two decades of creative experience, McGinley joins Goldcrest having held various senior roles within the London VFX community. Recent examples of her work include The Crown, Giri/Haji and Good Omens.

Working From Home: VFX house The Molecule

By Randi Altman

With the COVID-19 crisis affecting all aspects of our industry, we’ve been talking to companies that have set up remote workflows to meet their clients’ needs. One of those studios is The Molecule, which is based in New York and has a location in LA as well. The Molecule has focused on creating visual effects for episodics and films since its inception in 2005.

Blaine Cone 

The Molecule artists are currently working on series such as Dickinson and Little Voice (Apple TV+), Billions (Showtime), Genius: Aretha (NatGeo), Schooled and For Life (ABC) and The Stranger (Quibi). On the feature side, there are Stillwater (Focus Features) and Bliss (Amazon). Other notable projects include The Plot Against America (HBO), Fosse/Verdon (FX) and The Sinner (USA).

In order to keep these high-profile projects flowing, head of production Blaine Cone and IT manager Kevin Hopper worked together to create the studio’s work-from-home setup.

Let’s find out more…

In the weeks leading up to the shutdown, what were you doing to prepare?
Blaine Cone: We had already been investigating and testing various remote workflows in an attempt to find a secure solution we could extend to artists who weren’t readily available to join us in house. Once we realized this would be a necessity for everyone in the company, we accelerated our plans. In the weeks before the lockdown, we had increasingly larger groups of artists work from home to gradually stress-test the system.

How difficult was it to get that set up?
Cone: We were fortunate to have a head start on our remote secure platform. Because we decided to tie into AWS, as well as into our own servers and farm (custom software running on a custom-built hypervisor server on Dell machines), it took a little while, but once we saw the need to fast-track it, we were able to refine our solution pretty quickly. We’re still optimizing and improving behind the scenes, but the artists have been able to work uninterrupted since the beginning.

Kevin Hopper

What was your process in choosing the right tools to make this work?
Kevin Hopper: We have been dedicated to nailing down TPN-compliant remote work practices for the better part of a year now. We knew that there was a larger market of artists available for us to tap into if we could get a remote work solution configured properly from a security standpoint. We looked through a few companies offering full remote working suites via Teradici PCOIP setups and ultimately decided to configure our own images and administer them to our users ourselves. This route gives us the most flexibility and allows us to accurately and effectively mirror our required security standards.

Did employees bring home their workstations/monitors? How is that working?
Cone: In the majority of cases, employees are using their home workstations and monitors to tap into their dedicated AWS instance. In fact, the home setup could be relatively modest because they were tapping into a very strong machine on the cloud. In a few cases, we sent home 4K monitors with individuals so they could better look at their work.

Can you describe your set up and what tools you are using?
Cone: We are using Teradici to give artists access to dedicated, powerful and secure AWS machines to work off of files on our server. This is set up for Nuke, Maya, Houdini, Mocha, Syntheyes, Krita, Resolve, Mari and Substance Painter. We spin up the AWS instances in the morning and then spin them down again after the workday is over. It allows us to scale as necessary, and it limits the amount of technical troubleshooting and support we might have to do otherwise. Our own internal tools are built into the workflow, just as they were when artists were at our office. It’s been relatively seamless.
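For readers curious what that morning spin-up and end-of-day spin-down might look like in practice, here is a minimal Python sketch using the AWS boto3 SDK. The Role=vfx-workstation tag and the region are hypothetical placeholders for illustration; this shows the general approach, not The Molecule’s actual tooling.

import boto3

# Illustrative only: assumes artist workstation instances carry a
# hypothetical tag Role=vfx-workstation in one region.
ec2 = boto3.client("ec2", region_name="us-east-1")

def workstation_ids():
    """Return the instance IDs of all tagged artist workstations."""
    ids = []
    paginator = ec2.get_paginator("describe_instances")
    pages = paginator.paginate(
        Filters=[{"Name": "tag:Role", "Values": ["vfx-workstation"]}]
    )
    for page in pages:
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                ids.append(instance["InstanceId"])
    return ids

def start_workday():
    """Spin the workstation instances up in the morning."""
    ec2.start_instances(InstanceIds=workstation_ids())

def end_workday():
    """Spin the workstation instances down after hours."""
    ec2.stop_instances(InstanceIds=workstation_ids())

In a real pipeline, calls like these would typically be triggered on a schedule (a cron job or an orchestration service) with error handling for instances that fail to start or stop.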

Fosse/Verdon

How are you dealing with the issues of security while artists are working remotely?
Cone: Teradici gives us the security we need to ensure that the data exists only on our servers. It also limits the artists’ exposure to general web traffic.

How is this allowing you to continue creating visual effects for shows?
Cone: It’s really not dissimilar to how we normally work. The most challenging change has been the lack of in-person interaction. Shotgun, which we use to manage our shots, still serves as our creative hub, but Slack has become an even more integral aspect of our communication workflow as we’ve gone remote. We’ve also set up regular team calls, video chats and more to make up for the lack of interpersonal interaction inherent in a remote scenario.

Can you talk about review and approval on shots?
Cone: Our supervisors are all set up with Teradici to review shots securely. They also have 4K monitors. In some cases, artists are using Region of Interest to review their work. We’ve continued our regular methods of delivery to our clients so that they can review and approve as necessary.

How many artists do you have working remotely right now?
Cone: Between supervisors, producers, artists and support staff in NY and LA, we have about 50 remote users working on a daily basis. Our Zoom chats are a lot of fun. In a strange way, this has brought us all closer together than ever before.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

 

VFX turn Minnesota into Alabama for indie film Tuscaloosa

By Randi Altman

Director Philip Harder’s Tuscaloosa is a 1970s coming-of-age story that follows recent college graduate Billy Mitchell as he falls in love with a psychiatric patient from his dad’s mental hospital. As you can imagine, the elder Mitchell is not okay with the relationship or the interest his son is taking in the racial tensions that are heating up in Alabama.

As a period piece, Tuscaloosa required a good amount of visual effects work, and Minneapolis-based Splice served as the picture’s main post and VFX house. Splice called on newly launched local boutique Nocturnal Robot to handle overspill and to help turn present-day Minnesota, where the film was shot, into 1970s Tuscaloosa, Alabama.

Jeremy Wanek

Nocturnal Robot owner, editor and VFX artist Jeremy Wanek and artist Conrad Flemming provided a variety of effects, from removing foliage to adding store signs and period cars to rebuilding a Tuscaloosa street. Let’s find out more.

How early did you get involved?
Nocturnal Robot got involved as the edit was nearing picture lock. Splice was bidding on the project’s VFX at the time and it became apparent that they were going to need some support due to the volume of complex shots and constrained budget.

Splice was the main VFX house on the film, and they provided editing as well?
Yes, Splice handled the edit and was the main hub for the VFX work. Clayton Condit edited the movie, along with Kyle Walczak as additional editor. The VFX team was led by Ben Watne. Splice handled around 50 shots, while my team handled around 20, and then Rude Cock Productions (led by the now LA-based Jon Julsrud) jumped in toward the end to finish up some miscellaneous shots, and finally, The Harbor Picture Company tackled some last-minute effects during finishing — so lots of support on the VFX front!

What direction were you given from the client?
Philip Harder and I met at Splice and went through the shots that concerned him most. Primarily, these discussions centered on details that would lend themselves well to the transformation of modern-day Minnesota, where the movie was shot, into 1970s Alabama, where the movie takes place.

Were you on set?
We were brought in well after the movie had been shot, which is usually the case on a lot of the indie films we work on.

      
Before and After: Period car addition

Can you talk about prepro? Did you do any, and if so, in what tool?
No prepro, just the discussion I had with the director before we started working. As far as tools, he loved using his laser pointer to point out details (laughs).

Speaking of tools, what did you use on the show, and can you talk about review and approvals?
Our team was very small for this project, since my company had just officially launched. It was just me as VFX supervisor and artist, along with VFX artist Conrad Flemming. We did our compositing in Adobe After Effects, sometimes using Red Giant tools as well. Digital cleanup was done in Adobe Photoshop, and planar tracking was done using Boris FX Mocha Pro. We did 3D work in Maxon Cinema 4D, as well as with Video Copilot’s Element 3D plugin for Adobe After Effects.

I would stop by Splice, where Kyle Walczak (who was doing some additional editing at the time) would transfer footage to a hard drive for me. Then it was a simple workflow between me and Conrad. I worked on my Mac Pro trash can, while Conrad worked on his PC. I sent him shots via my Google Drive. For review and final delivery, we used Splice’s FTP site.

   
Before and After: Foliage removal

A lot of the review process was sending emails back and forth with Phil. This worked out okay because we were able to get most shots approved quickly. Right after this project, we started using Frame.io, and I wish I had been using that on this one — it’s much cleaner and more efficient.

Can you talk about what types of VFX you provided, and did they pick Minnesota because the buildings more closely resembled Tuscaloosa of the ‘70s?
Phil picked Minnesota because it’s where he lives, and he realized that present-day Alabama doesn’t look much like it did in the ’70s. In Minnesota, he could pull from the resources he had access to and stretch his budget further. They shot at a lot of great timeless locations and brought in some period cars and lots of wardrobe to really sell it. They did an incredible job during production, so VFX-wise, it was mostly just enhancing here and there.

Can you talk about some of those more challenging scenes?
There were two shots in particular that were challenging, and we handled each case very differently. I had lengthy discussions with Phil about how to handle them. If you watch our VFX reel, they are the first and last shots shown.

In the first shot, we see our two lead characters pull up to a restaurant. Phil wanted to change the environment and add a parking lot of period-appropriate cars. He had some images that were photographed in the ’70s and he wanted to composite them into the live-action plate the crew had shot. It was really interesting trying to blend something that was nearly 50 years old into a high-quality Alexa shot. It took a lot of cleanup work on the old images since they were littered with artifacts, as well as cutting them up to fit into the shot more seamlessly. It also involved adding some CG period cars. It was a fun challenge, and once it was all put together it created a unique look.

       
Before and After: Bama Theater

In the second challenging shot, the live-action plate featured a modern-day Minnesota street with a few period vehicles driving down it. We needed to transform this, as you’d expect, into a ’70s Alabama street — this time featuring the Bama Theater. This involved a lot of detailed work. I had Conrad focus most of his attention on this shot because I knew he could pull it off in the limited time we had, and his attention to the period’s details would go a long way. There wasn’t much ’70s reference material shot on that particular street, so we did our best with other images we could find from the time and area.

Phil had a lot of notes and details to help us along. We had live-action plates shot on the Red camera to build upon — some buildings, period cars, the extras walking around and a handful of other small objects. But because so much had to be reconstructed, most of the shot had to be put together from scratch.

Some of the things we noticed in images from the ’70s and implemented were removing a lot of the foliage and trees, adding the fancy signs above the stores and adding the stoplights that hung on wires, among other details. It also involved adding lots of CG cars to the environment to fill out the street and add some movement to the foreground and background.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Director Vincent Lin discusses colorful Seagram’s Escapes spot

By Randi Altman

Valiant Pictures, a New York-based production house, recently produced a commercial spot featuring The Bachelor/Bachelorette host Chris Harrison promoting Seagram’s Escapes and its line of alcohol-based fruit drinks. A new addition to the product line is Tropical Rosé, which was co-developed by Harrison and contains natural passion fruit, dragon fruit and rosé flavors.

Valiant’s Vincent Lin directed the piece, which features Harrison in a tropical-looking room — brightened with sunny pinks and yellows thanks to NYC’s Nice Shoes — describing the rosé and signing off with the Seagram’s Escapes brand slogan, “Keep It Colorful!”

Here, director Lin and his DP, Alexander Chinnici, talk about the project’s conception, shoot and post.

How early did you get involved? Did Valiant act as the creative agency on this spot?
Valiant has a long-standing history with the Seagram’s Escapes brand team, and we were fortunate enough to have the opportunity to brainstorm a few ideas with them early on for their launch of Seagram’s Escapes Tropical Rosé with Chris Harrison. The creative concept was developed by Valiant’s in-house creative agency, headed by creative directors Nicole Zizila and Steven Zizila, and me. Seagram’s was very instrumental in the creative for the project, and we collaborated to make sure it felt fresh and new — like an elevated evolution of their “Keep It Colorful” campaign rather than a replacement.

Clearly, it’s meant to have a tropical vibe. Was it shot greenscreen?
We had considered doing this greenscreen, which would have opened up some interesting options but also posed some challenges. What was important for this campaign creatively was to seamlessly take Chris Harrison to the magical world of Seagram’s Escapes Tropical Rosé. A practical approach was chosen so it didn’t feel too “out of this world,” and the live action still felt real and relatable. We had considered putting Chris in a tropical location — either in greenscreen or on location — but we really wanted to play to Chris’ personality and strengths and have him lead us to this world, rather than throw him into it. Plus, they didn’t sign off on letting us film in the Maldives. I tried (smiles).

L-R: Vincent Lin and Alex Chinnici

What was the spot shot on?
I worked with the very talented DP Alex Chinnici, and he recommended shooting on the ARRI Alexa for many reasons. I’ll let Alex answer this one.

Alex Chinnici: Some DPs would likely answer with something sexier, like “I love the look!” But that is ignoring a lot of the technical realities available to us these days. A lot of these cameras are wonderful. I can manipulate the look, so I choose a camera based on other reasons. Without an on-set live, color-capable DIT, I had to rely on the default LUT seen on set and through post. The Alexa’s default LUT is my preference among the digital cameras. For lighting and everyone on the set, we start in a wonderful place right off the bat. Post houses also know it so well, along with colorists and VFX. Knowing our limitations and expecting not to be entirely involved, I prefer giving these departments the best image/file possible.

Inherently, the color, highlight retention and skin tone are wonderful right off the bat without having to bend over backward for anyone. With the Alexa, you end up being much closer to the end rather than having to jump through hoops to get there like you would with some other cameras. Lastly, the reliability is key. With the little time that we had, and a celebrity talent, I would never put a production through the risk of some new tech. Being in a studio, we had full control but still, I’d rather start in a place of success and only make it better from there.

What about the lenses?
Chinnici: I chose the Zeiss Master Primes for similar reasons. While sharp, they are not overbearing. With some mild filtration and very soft and controlled lighting, I can adjust that in other ways. Plus, I know that post will beautify anything that needs it; giving them a clean, sharp image (especially considering the seltzer can) is key.

I shot at a deeper stop to ensure that the lenses are even cleaner and sharper, although the Master Primes do hold up very well wide open. I also wanted the Seagram’s can to be in focus as much as possible and for us to be able to see the set behind Chris Harrison, as opposed to a very shallow depth of field. I also wanted to ensure little to no flares, solid contrast, sharpness across the field and no surprises.

Thanks Alex. Back to you Vincent. How did you work with Alex to get the right look?
There was a lot of back and forth between Alex and me, and we pulled references to discuss. Ultimately, we knew the two most important things were to highlight Chris Harrison and the product. We also knew we wanted the spot to feel like a progression from the brand’s previous work. We decided the best way to do this was to introduce some dimensionality by giving the set depth with lighting, while keeping a clean, polished and sophisticated aesthetic. We also introduced a bit of camera movement to further pull the audience in and composed the shots in a way that put all the focus on Chris Harrison to bring us into that vibrant CG world.

How did you work with Nice Shoes colorist Chris Ryan to make sure the look stayed on point? 
Nice Shoes is always one of our preferred partners, and Chris Ryan was perfect for the job. Our creatives, Nicole and Steven, had worked with him a number of times. As with all jobs, there are certain challenges and limitations, and we knew we had to work fast. Chris is not only detail oriented, creative and a wizard with color correction, but also able to work efficiently.

He worked on a FilmLight Baselight system off the Alexa raw files. The color grading really brought out the saturation to further reinforce the brand’s slogan, “Keep It Colorful,” but also to manage the highlights and whites so it felt inviting and bright throughout, but not at all sterile.

What about the VFX? Can you talk about how that was accomplished? 
Much like the camera work, we wanted to continue giving dimensionality to the spot by having depth in each of our CG shots. Not only depth in space but also in movement and choreography. We wanted the CG world to feel full of life and vibrant in order to highlight key elements of the beverage — the flavors, dragonfruit and passionfruit — and give it a sense of motion that draws you in while making you believe there’s a world outside of it. We wanted the hero to shine in the center and the animation to play out as if a kaleidoscope or tornado was pulling you in closer and closer.

We sought the help of creative production studio Taylor James to build the CG elements. We chose to work with a core of 3ds Max artists who could do a range of tasks using Autodesk 3ds Max and Chaos Group’s V-Ray (we also use Maya and Arnold). We used Foundry Nuke to composite all of the shots and integrate the CGI into the footage. The 3D asset creation, animation and lighting were constructed and rendered in Autodesk Maya, with compositing done in Adobe After Effects.

One of the biggest challenges was making sure the live action felt connected to the CG world, but with each still having its own personality. There is a modern and clean feel to these spots that we wanted to uphold while still making it feel fun and playful with colors and movement. There were definitely a few earlier versions that we went a bit crazy with and had to scale down a bit.

Does a lot of your work feature live action and visual effects combined?
I think of VFX like any film technique: It’s simply a tool for directors and creatives to use. The most essential thing is to understand the brand, if it’s a commercial, and to understand the story you are trying to tell. I’ve been fortunate to do a number of spots that involve live-action and VFX now, but truth be told, VFX almost always sneaks its way in these days.

Even if I do a practical effect, there are limitless possibilities in post production and VFX. Anything from simple cleanup to enhancing, compositing, set building and extending — it’s all possible. It’d be foolish not to consider it as a viable tool. Now, that’s not to say you should rely solely on VFX to fix problems, but if there’s a way it can improve your work, definitely use it. For this particular project, obviously, the CG was crucial to let us really be immersed in a magical world at the level of realism and proximity we desired.

Anything challenging about this spot that you’d like to share?
Chris Harrison was terrible to work with and refused to wear a shirt for some reason … I’m just kidding! Chris was one of the most professional, humble and kind celebrity talents that I’ve had the pleasure to work with. This wasn’t a simple endorsement for him; he actually did work closely with Seagram’s Escapes over several months to create and flavor-test the Tropical Rosé beverage.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Missing Link, The Lion King among VES Award winners

The Visual Effects Society (VES), the industry’s global professional honorary society, held its 18th Annual VES Awards, the yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues.

Comedian Patton Oswalt served as host for the ninth time, welcoming the more than 1,000 guests gathered at the Beverly Hilton to celebrate VFX talent in 25 award categories. The Lion King was named the photoreal feature winner, garnering three awards. Missing Link was named top animated film, winning two awards. The Mandalorian was named best photoreal episode and garnered two awards, with Game of Thrones and Stranger Things 3 also winning two awards each. Hennessy: The Seven Worlds topped the commercial field with two wins.

Andy Serkis presented the VES Award for Creative Excellence to visual effects supervisor Sheena Duggal. Joey King presented the VES Visionary Award to director-producer-screenwriter Roland Emmerich. VFX supervisor Pablo Helman presented the Lifetime Achievement Award to director/producer/screenwriter Martin Scorsese, who accepted via video from New York. Scorsese’s The Irishman also picked up two awards, including Outstanding Supporting Visual Effects in a Photoreal Feature.

Presenters also included directors J.J. Abrams, Jon Favreau, Rian Johnson and Josh Cooley.

Winners of the 18th Annual VES Awards are as follows:

Outstanding Visual Effects in a Photoreal Feature
THE LION KING – Robert Legato, Tom Peitzman, Adam Valdez, Andrew R. Jones

Outstanding Supporting Visual Effects in a Photoreal Feature
THE IRISHMAN – Pablo Helman, Mitchell Ferm, Jill Brooks, Leandro Estebecorena, Jeff Brink

Outstanding Visual Effects in an Animated Feature
MISSING LINK – Brad Schiff, Travis Knight, Steve Emerson, Benoit Dubuc

Outstanding Visual Effects in a Photoreal Episode
THE MANDALORIAN; The Child – Richard Bluff, Abbigail Keller, Jason Porter, Hayden Jones, Roy K. Cancino

Outstanding Supporting Visual Effects in a Photoreal Episode
CHERNOBYL; 1:23:45 – Max Dennison, Lindsay McFarlane, Clare Cheetham, Paul Jones, Claudius Christian Rauch

Outstanding Visual Effects in a Real-Time Project
Control – Janne Pulkkinen, Elmeri Raitanen, Matti Hämäläinen, James Tottman

Outstanding Visual Effects in a Commercial
Hennessy: The Seven Worlds – Carsten Keller, Selçuk Ergen, Kiril Mirkov, William Laban

Outstanding Visual Effects in a Special Venue Project
Star Wars: Rise of the Resistance – Jason Bayever, Patrick Kearney, Carol Norton, Bill George

Outstanding Animated Character in a Photoreal Feature
ALITA: BATTLE ANGEL; Alita – Michael Cozens, Mark Haenga, Olivier Lesaint, Dejan Momcilovic

Outstanding Animated Character in an Animated Feature
MISSING LINK; Susan – Rachelle Lambden, Brenda Baumgarten, Morgan Hay, Benoit Dubuc

Outstanding Animated Character in an Episode or Real-Time Project
STRANGER THINGS 3; Tom/Bruce Monster – Joseph Dubé-Arsenault, Antoine Barthod, Frederick Gagnon, Xavier Lafarge

Outstanding Animated Character in a Commercial
Cyberpunk 2077; Dex – Jonas Ekman, Jonas Skoog, Marek Madej, Grzegorz Chojnacki

Outstanding Created Environment in a Photoreal Feature
THE LION KING; The Pridelands – Marco Rolandi, Luca Bonatti, Jules Bodenstein, Filippo Preti

Outstanding Created Environment in an Animated Feature
TOY STORY 4; Antiques Mall – Hosuk Chang, Andrew Finley, Alison Leaf, Philip Shoebottom

Outstanding Created Environment in an Episode, Commercial or Real-Time Project
GAME OF THRONES; The Iron Throne; Red Keep Plaza – Carlos Patrick DeLeon, Alonso Bocanegra Martinez, Marcela Silva, Benjamin Ross

Outstanding Virtual Cinematography in a CG Project
THE LION KING – Robert Legato, Caleb Deschanel, Ben Grossmann, AJ Sciutto

Outstanding Model in a Photoreal or Animated Project
THE MANDALORIAN; The Sin; The Razorcrest – Doug Chiang, Jay Machado, John Goodson, Landis Fields IV

Outstanding Effects Simulations in a Photoreal Feature
STAR WARS: THE RISE OF SKYWALKER – Don Wong, Thibault Gauriau, Goncalo Cababca, François-Maxence Desplanques

Outstanding Effects Simulations in an Animated Feature
FROZEN 2 – Erin V. Ramos, Scott Townsend, Thomas Wickes, Rattanin Sirinaruemarn

Outstanding Effects Simulations in an Episode, Commercial or Real-Time Project
STRANGER THINGS 3; Melting Tom/Bruce – Nathan Arbuckle, Christian Gaumond, James Dong, Aleksandr Starkov

Outstanding Compositing in a Feature
THE IRISHMAN – Nelson Sepulveda, Vincent Papaix, Benjamin O’Brien, Christopher Doerhoff

Outstanding Compositing in an Episode
GAME OF THRONES; The Long Night; Dragon Ground Battle – Mark Richardson, Darren Christie, Nathan Abbott, Owen Longstaff

Outstanding Compositing in a Commercial
Hennessy: The Seven Worlds – Rod Norman, Guillaume Weiss, Alexander Kulikov, Alessandro Granella

Outstanding Special (Practical) Effects in a Photoreal or Animated Project
THE DARK CRYSTAL: THE AGE OF RESISTANCE; She Knows All the Secrets – Sean Mathiesen, Jon Savage, Toby Froud, Phil Harvey

Outstanding Visual Effects in a Student Project
THE BEAUTY – Marc Angele, Aleksandra Todorovic, Pascal Schelbli, Noel Winzen


VFX-heavy Skyworth OLED TV spot via The-Artery

The-Artery created a spot for Skyworth’s latest version of its W81|W81 Pro Wallpaper OLED TV, which debuted last month at the “See the Wonder” event at CES 2020.

Created using The-Artery’s newly opened Resolve-based color room and expanded design capabilities — spearheaded by colorist Stephen Picano and design director Lauren Indovina — the commercial features a couple swimming through space-like waters, children battling origami dragons while floating in a paper boat and a traveler trekking through snowy tundras while glowing jellyfish float overhead. Publicis, Skyworth’s agency, wanted the ad to reflect “the wonder” of the company’s newest television model.

“The campaign, helmed by director Eli Sverdlov, was very director-led in a way that I’ve never seen before,” explains The-Artery’s EP/MD, Deborah Sullivan. “Of course, there was still ongoing dialogue with the client and agency, but the level of creative control that was entrusted is almost unheard of. Everything was open from start to finish, including the ideation phase, color grading and design — to name a few. Our team had a lot of fun jumping straight into the edit to develop and launch what we consider a high-end conceptual throwback to the nineties.”

Sverdlov agrees: “Our flexible creative process happened on a condensed schedule and required a unique collaboration. We were practically creating the ideas and visuals while editing and sourcing footage.”

Due to the production’s long shooting schedule and tight deadlines, the visual effects were designed in Autodesk Flame in real time, all under one roof, while filming took place in Serbia. Additional footage was carefully curated, color graded and cut to fit the tone and flow of the rest of the piece. The creature imagery, such as the jellyfish, was created in CG.

In addition to Flame and Resolve, The-Artery called on SideFX Houdini, Autodesk Maya, Maxon’s Redshift, Otoy’s Octane, Autodesk’s Arnold, Adobe After Effects and Maxon’s Cinema 4D.

Rob Legato talks The Lion King‘s Oscar-nominated visual effects

By Karen Moltenbrey

There was a lot of buzz before — and after — this summer’s release of Disney’s remake of the animated classic The Lion King. And what’s not to love? From the animals to the African savannas, Disney brought the fabled world of Simba to life in what is essentially a “live-action” version of the beloved 1994 2D feature of the same name. Indeed, the filmmakers used tenets of live-action filmmaking to create The Lion King, and themselves call it a visual effects film. However, there are those who consider this remake, like the original, an animated movie, as 2019’s The Lion King used cutting-edge CGI for the photoreal beasts and environments.

Rob Legato

Whether you call it “live action” or “animation,” one thing’s for sure: This is no ordinary film. And it was made using no ordinary production process. Rather, it was filmed entirely in virtual reality. And it’s been nominated for a Best Visual Effects Oscar this year.

“Everything in it is a visual effect, created in the same way that we would make a visual effects-oriented film, where we augment or create the backgrounds or create computer-generated characters for a scene or sequence. But in this case, that spanned the entire movie,” says VFX supervisor Rob Legato. “We used a traditional visual effects pipeline and hired MPC, which is a visual effects studio, not an animation house.”

MPC, which created the animals and environments, crafted all the CG elements and handled the virtual production, working with Magnopus to develop the necessary tools that would take the filmmakers from previz through shooting and, eventually, into post production. Even the location scouting occurred within VR, with Legato, director Jon Favreau and others, including cinematographer Caleb Deschanel, simultaneously walking through the sets and action using HTC Vive headsets.

Caleb Deschanel (headset) and Rob Legato. Credit: Michael Legato

The Animations and Environments
MPC, known for its photorealistic animals and more, had worked with Disney and Favreau on the 2016 remake of The Jungle Book, which was shot within a total greenscreen environment and used realistic CG characters and sets with the exception of the boy Mowgli. (It also used VR, albeit for previsualization only.) The group’s innovative effort for that work won an Oscar for visual effects. Apparently that was just the tip of the spear, so to speak, as the team upped its game with The Lion King, making the whole production entirely CG and taking the total filmmaking process into virtual reality.

“It had to look as believable as possible. We didn’t want to exaggerate the performances or the facial features, which would make them less realistic,” says Legato of the animal characters in The Lion King.

The CG skeletons were built practically bone for bone to match their real-life counterparts, and the digital fur matched the hair variations of the various species found in nature. The animators, meanwhile, studied the motion of the real-life animals and moved the digital muscles accordingly.

“Your eye picks up when [the animal] is doing something that it can’t really do, like if it stretches its leg too far or doesn’t have the correct weight distribution that’s affecting the other muscles when it puts a paw down,” says Legato, contending that it is almost impossible to tell the CG version of the characters from the real thing in a non-talking shot or a still frame.

To craft the animals and environments, the MPC artists used Autodesk’s Maya as the main animation program, along with SideFX Houdini for water and fire simulations and Pixar’s RenderMan for rendering. MPC also used custom shaders and tools, particularly for the fur, mimicking that of the actual animal. “A lion has so many different types of hair — short hair around the body, the bushy mane, thick eyebrow hairs and whiskers. And every little nuance was recreated and faithfully reproduced,” Legato adds.

MPC artists brought to life dozens and dozens of animals for the film and then generated many more unique variations — from lions to mandrills to hyenas to zebras and more, even birds and bugs. And then the main cast and background animals were placed within a photoreal environment, where they were shot with virtual cameras that mimicked real cameras.

The world comprises expansive, open landscapes. “There were many, many miles of landscapes that were constructed,” says Legato. The filmmakers would film within pockets that were dressed and populated for different scenes, from Pride Rock to the interior of a cave to the savanna to the elephant graveyard — all built in CGI.

“Everything was simulated to be the real thing, so the sum total of the illusion is that it’s all real. And everything supports each other — the grounds, the characters, what they are physically doing. The sum total of that adds up to where your brain just says, ‘OK, this must be real. I’ll stop looking for flaws and will now just watch the story,’” says Legato. “That was the creative intent behind it.”

Virtual Production
All the virtual camera work was accomplished within Unity’s engine, so all the assets were ported in and out of that game engine. “Everyone would then know where our cameras were, what our camera moves were, how we were following the action, our lens choices, where the lights were placed … all those things,” says Legato.

Magnopus created the VR tools specific for the film, which ran on top of Unity to get the various work accomplished, such as the operation of the cameras. “We had a crane, dolly and other types of cameras encoded so that it basically drove its mate in the computer. For instance, we created a dolly and then had a physical dolly with encoders on it, so everything was hand operated, and we had a dolly grip and a camera assistant pulling focus. There was someone operating the cameras, and sometimes there was a crane operator. We did SteadiCam as well through an actual SteadiCam with a sensor on it to work with OptiTrack [motion capture that was used to track the camera],” explains Legato. “We built a little rig for the SteadiCam as well as one for a drone we’d fly around the stage, and we’d create the illusion that it was a helicopter shot while flying around Africa.”
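The pattern Legato describes, a physical rig whose encoders drive its virtual mate, can be sketched in a few lines of code. The following Python is purely illustrative and is not the Magnopus/Unity tooling; the encoder fields and the simple straight-track mapping are assumptions standing in for whatever the real rig reported each frame.

def read_encoders():
    """Stand-in for a hardware read: dolly travel (m), pan/tilt (deg), focus distance (m)."""
    return {"travel": 1.25, "pan": 32.0, "tilt": -4.5, "focus": 3.2}

def update_virtual_camera(camera, encoders, track_origin=(0.0, 1.5, 0.0)):
    """Apply the latest encoder readings to the virtual camera, once per frame."""
    x, y, z = track_origin
    camera["position"] = (x + encoders["travel"], y, z)            # slide along the dolly track
    camera["rotation"] = (encoders["tilt"], encoders["pan"], 0.0)  # tilt, pan, roll
    camera["focus_distance"] = encoders["focus"]
    return camera

if __name__ == "__main__":
    cam = {"position": (0.0, 0.0, 0.0), "rotation": (0.0, 0.0, 0.0), "focus_distance": 0.0}
    print(update_virtual_camera(cam, read_encoders()))

In a real setup this update would run every frame inside the engine, so a hand-operated dolly, crane or Steadicam move is reflected on the virtual camera as it happens.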

Because the area within VR was so vast, a menu system was created so the filmmakers could locate one another within the virtual environment, making location scouting much easier. They also could take snapshots of different areas and angles and share them with the group. “We were standing next to each other [on stage], but within the virtual environment, we could be miles apart and not see each other because we’re maybe behind trees or rocks.”

As Legato points out, the menu tool is pretty robust. “We basically built a game of film production. Everything was customizable,” he says. Using iPads, the group could play the animation. As the camera was in operation, they could stop the animation, wind it backward, speed it forward, shoot it in slow motion or faster motion. “These options were all accessible to us,” he adds.

Legato provides the following brief step-by-step overview of how the virtual production occurred. First, the art department created the sets — Africa with the trees, ponds, rivers, mountains, waterfalls and so forth. “Based on the script, you know somewhat where you need to be [in the set],” he says. Production designer James Chinlund would make a composite background, and then they — along with Favreau, Deschanel and animation supervisor Andrew Jones — would go into VR.

“We had built these full-size stationary chess pieces of the animals, and in VR, we’d have these tools that let us grab a lion, for instance, or a meerkat, and position them, and then we’d look through the lens and start from there,” says Legato. “We would either move them by hand or puppeteer a simple walk cycle to get the idea of the blocking.”

Jones and his team would animate that tableau and port it back into the game engine as an animation cycle. “We’d find camera angles and augment them. We’d change some of the animation or slow it down or move the animals in slightly different positions. And then we’d shoot it like it’s on a live-action stage,” explains Legato. “We’d put a dolly track down, cover the action with various types of lenses, create full-coverage film dailies… We could shoot the same scene in as many different angles as we’d wish. We could then play it out to a video deck and start editing it right away.” The shots they liked might get rendered with more light or motion blur, but a lot of the time, they’d go right off the video tap.

Meanwhile, MPC recorded everything the filmmakers did and moved — every leaf, rock, tree, animal. Then, in post, all of that information would be reconverted back into Maya sets and the animation fine-tuned.

“In a nutshell, the filmmakers were imparting a live-action quality to the process — by not faking it, but by actually doing it,” says Legato. “And we still have the flexibility of full CGI.”

The Same, But Different
According to Legato, it did not take the group long to get the hang of working in VR. And the advantages are many — chief among them, time savings when it comes to planning and creating the sequence editorially, and then instantly being able to reshoot or iterate the scene inexpensively. “There is literally no downside to exploring a bold choice or an alternate angle on the concept,” he points out.

Yes, virtual filmmaking is the future, contends Legato.

So, back to the original question: Is The Lion King a VFX film or an animated film? “It’s perhaps a hybrid,” says Legato. “But, if you didn’t know how we did it and if the animals didn’t talk, you’d think it was done in the traditional manner of a live-action film. Which it is, visually speaking. You wouldn’t necessarily describe it as looking like ‘an animated film’ because it doesn’t really look like an animated film, like a Pixar or DreamWorks movie. By labeling it as such, you’re putting it into a hole that it’s not. It’s truly just a movie. How we achieved it is immaterial, as it should be.”

Legato and his colleagues call it “live action,” which it truly is. But some, including the Golden Globes, categorized it as “animation.” (They also called 2015’s The Martian and 2010’s The Tourist “comedies.”)

Call it what you will; the bottom line is that the film is breathtaking and the storytelling is amazing. And the filmmaking is inventive and pushes traditional boundaries, making it difficult to perhaps fit into a traditional category. Therefore, “beautiful,” “riveting,” “creative” and “innovative” might be the only descriptions necessary.


Karen Moltenbrey is a veteran writer, covering visual effects and post production.


Picture Shop VFX acquires Denmark’s Ghost VFX

Burbank’s Picture Shop VFX has acquired Denmark’s Ghost VFX. The Copenhagen-based studio, founded in 1999, provides high-end visual effects work for film, television and several streaming platforms. The move helps Picture Shop “increase its services worldwide and broaden its talent and expertise,” according to Picture Shop VFX’s president Tom Kendall.

Over the years, Ghost has contributed to more than 70 feature films and titles. Some of Ghost’s work includes Star Wars: The Rise of Skywalker, The Mandalorian, The Walking Dead, See, Black Panther and Star Trek: Discovery.

“As we continue to expand our VFX footprint into the international market, I am extremely excited to have Ghost join Picture Shop VFX,” says Bill Romeo, president of Picture Head Holdings.

Ghost’s Christensen says the studio takes up three floors and 13,000 square feet in a “vintage and beautifully renovated office building” in Copenhagen. The studio’s main tools are Autodesk Maya, Foundry Nuke and SideFX Houdini.

“We are really looking forward to a tight-knit collaboration with all the VFX teams in the Picture Shop group,” says Christensen. “Right now Ghost will continue servicing current clients and projects, but we’re really looking forward to exploring the massive potential of being part of a larger and international family.”

Picture Shop VFX is a division of Picture Head Holdings. Picture Head Holdings has locations in Los Angeles, Vancouver, the United Kingdom, and Denmark.

Main Image: Ghost artists at work.

VES Awards: The Lion King and Alita earn five noms each

The Visual Effects Society (VES) has announced its nominees for the 18th Annual VES Awards, which recognize outstanding visual effects artistry and innovation in film, animation, television, commercials and video games, as well as the VFX supervisors, VFX producers and hands-on artists who bring this work to life. Alita: Battle Angel and The Lion King lead with five nominations each; Toy Story 4 is the top animated film contender with five nominations, and Game of Thrones and The Mandalorian tie to lead the broadcast field with six nominations each.

Nominees in 25 categories were selected by VES members via events hosted by 11 VES sections, including Australia, the Bay Area, Germany, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington.

The VES Awards will be held on January 29 at the Beverly Hilton Hotel. The VES Lifetime Achievement Award will be presented to Academy, DGA and Emmy Award-winning director-producer-screenwriter Martin Scorsese. The VES Visionary Award will be presented to director-producer-screenwriter Roland Emmerich. And the VES Award for Creative Excellence will be given to visual effects supervisor Sheena Duggal. Award-winning actor-comedian-author Patton Oswalt will once again host the event.

The nominees for the 18th Annual VES Awards in 25 categories are:

 

Outstanding Visual Effects in a Photoreal Feature

 

ALITA: BATTLE ANGEL

Richard Hollander

Kevin Sherwood

Eric Saindon

Richard Baneham

Bob Trevino

 

AVENGERS: ENDGAME

Daniel DeLeeuw

Jen Underdahl

Russell Earl

Matt Aitken

Daniel Sudick

 

GEMINI MAN

Bill Westenhofer

Karen Murphy-Mundell

Guy Williams

Sheldon Stopsack

Mark Hawker

 

STAR WARS: THE RISE OF SKYWALKER

Roger Guyett

Stacy Bissell

Patrick Tubach

Neal Scanlan

Dominic Tuohy

 

THE LION KING

Robert Legato

Tom Peitzman

Adam Valdez

Andrew R. Jones

 

Outstanding Supporting Visual Effects in a Photoreal Feature

 

1917

Guillaume Rocheron

Sona Pak

Greg Butler

Vijay Selvam

Dominic Tuohy

 

FORD V FERRARI

Olivier Dumont

Kathy Siegel

Dave Morley

Malte Sarnes

Mark Byers

 

JOKER

Edwin Rivera

Brice Parker

Mathew Giampa

Bryan Godwin

Jeff Brink

 

THE AERONAUTS

Louis Morin

Annie Godin

Christian Kaestner

Ara Khanikian

Mike Dawson

 

THE IRISHMAN

Pablo Helman

Mitch Ferm

Jill Brooks

Leandro Estebecorena

Jeff Brink

 

Outstanding Visual Effects in an Animated Feature

 

FROZEN 2

Steve Goldberg

Peter Del Vecho

Mark Hammel

Michael Giaimo

 

KLAUS

Sergio Pablos

Matthew Teevan

Marcin Jakubowski

Szymon Biernacki

 

MISSING LINK

Brad Schiff

Travis Knight

Steve Emerson

Benoit Dubuc

 

THE LEGO MOVIE 2

David Burgess

Tim Smith

Mark Theriault

John Rix

 

TOY STORY 4

Josh Cooley

Mark Nielsen

Bob Moyer

Gary Bruins

 

Outstanding Visual Effects in a Photoreal Episode

 

GAME OF THRONES; The Bells

Joe Bauer

Steve Kullback

Ted Rae

Mohsen Mousavi

Sam Conway

 

HIS DARK MATERIALS; The Fight to the Death

Russell Dodgson

James Whitlam

Shawn Hillier

Robert Harrington

 

LADY AND THE TRAMP

Robert Weaver

Christopher Raimo

Arslan Elver

Michael Cozens

Bruno Van Zeebroeck

 

LOST IN SPACE – Episode: Ninety-Seven

Jabbar Raisani

Terron Pratt

Niklas Jacobson

Juri Stanossek

Paul Benjamin

 

STRANGER THINGS – Chapter Six: E Pluribus Unum

Paul Graff

Tom Ford

Michael Maher Jr.

Martin Pelletier

Andy Sowers

 

THE MANDALORIAN; The Child

Richard Bluff

Abbigail Keller

Jason Porter

Hayden Jones

Roy Cancinon

 

Outstanding Supporting Visual Effects in a Photoreal Episode

 

CHERNOBYL; 1:23:45

Max Dennison

Lindsay McFarlane

Clare Cheetham

Paul Jones

Claudius Christian Rauch

 

LIVING WITH YOURSELF; Nice Knowing You

Jay Worth

Jacqueline VandenBussche

Chris Wright

Tristan Zerafa

 

SEE; Godflame

Adrian de Wet

Eve Fizzinoglia

Matthew Welford

Pedro Sabrosa

Tom Blacklock

 

THE CROWN; Aberfan

Ben Turner

Reece Ewing

David Fleet

Jonathan Wood

 

VIKINGS; What Happens in the Cave

Dominic Remane

Mike Borrett

Ovidiu Cinazan

Tom Morrison

Paul Byrne

 

Outstanding Visual Effects in a Real-Time Project

 

Call of Duty Modern Warfare

Charles Chabert

Chris Parise

Attila Zalanyi

Patrick Hagar

 

Control

Janne Pulkkinen

Elmeri Raitanen

Matti Hämäläinen

James Tottman

 

Gears 5

Aryan Hanbeck

Laura Kippax

Greg Mitchell

Stu Maxwell

 

Myth: A Frozen Tale

Jeff Gipson

Nicholas Russell

Brittney Lee

Jose Luis Gomez Diaz

 

Vader Immortal: Episode I

Ben Snow

Mike Doran

Aaron McBride

Steve Henricks

 

Outstanding Visual Effects in a Commercial

 

Anthem Conviction

Viktor Muller

Lenka Likarova

Chris Harvey

Petr Marek

 

BMW Legend

Michael Gregory

Christian Downes

Tim Kafka

Toya Drechsler

 

Hennessy: The Seven Worlds

Carsten Keller

Selcuk Ergen

Kiril Mirkov

William Laban

 

PlayStation: Feel The Power of Pro

Sam Driscoll

Clare Melia

Gary Driver

Stefan Susemihl

 

Purdey’s: Hummingbird

Jules Janaud

Emma Cook

Matthew Thomas

Philip Child

 

Outstanding Visual Effects in a Special Venue Project

 

Avengers: Damage Control

Michael Koperwas

Shereif Fattouh

Ian Bowie

Kishore Vijay

Curtis Hickman

 

Jurassic World: The Ride

Hayden Landis

Friend Wells

Heath Kraynak

Ellen Coss

 

Millennium Falcon: Smugglers Run

Asa Kalama

Rob Huebner

Khatsho Orfali

Susan Greenhow

 

Star Wars: Rise of the Resistance

Jason Bayever

Patrick Kearney

Carol Norton

Bill George

 

Universal Sphere

James Healy

Morgan MacCuish

Ben West

Charlie Bayliss

 

Outstanding Animated Character in a Photoreal Feature

 

ALITA: BATTLE ANGEL; Alita

Michael Cozens

Mark Haenga

Olivier Lesaint

Dejan Momcilovic

 

AVENGERS: ENDGAME; Smart Hulk

Kevin Martel

Ebrahim Jahromi

Sven Jensen

Robert Allman

 

GEMINI MAN; Junior

Paul Story

Stuart Adcock

Emiliano Padovani

Marco Revelant

 

THE LION KING; Scar

Gabriel Arnold

James Hood

Julia Friedl

Daniel Fortheringham

 


Outstanding Animated Character in an Animated Feature

 

FROZEN 2; The Water Nøkk

Svetla Radivoeva

Marc Bryant

Richard E. Lehmann

Cameron Black

 

KLAUS; Jesper

Yoshimishi Tamura

Alfredo Cassano

Maxime Delalande

Jason Schwartzman

 

MISSING LINK; Susan

Rachelle Lambden

Brenda Baumgarten

Morgan Hay

Benoit Dubuc

 

TOY STORY 4; Bo Peep

Radford Hurn

Tanja Krampfert

George Nguyen

Becki Rocha Tower

 

Outstanding Animated Character in an Episode or Real-Time Project

 

LADY AND THE TRAMP; Tramp

Thiago Martins

Arslan Elver

Stanislas Paillereau

Martine Chartrand

 

STRANGER THINGS 3; Tom/Bruce Monster

Joseph Dubé-Arsenault

Antoine Barthod

Frederick Gagnon

Xavier Lafarge

 

THE MANDALORIAN; The Child; Mudhorn

Terry Bannon

Rudy Massar

Hugo Leygnac

 

THE UMBRELLA ACADEMY; Pilot; Pogo

Aidan Martin

Craig Young

Olivier Beierlein

Laurent Herveic

 

Outstanding Animated Character in a Commercial

 

Apex Legends; Meltdown; Mirage

Chris Bayol

John Fielding

Derrick Sesson

Nole Murphy

 

Churchill; Churchie

Martino Madeddu

Philippe Moine

Clement Granjon

Jon Wood

 

Cyberpunk 2077; Dex

Jonas Ekman

Jonas Skoog

Marek Madej

Grzegorz Chojnacki

 

John Lewis; Excitable Edgar; Edgar

Tim van Hussen

Diarmid Harrison-Murray

Amir Bazzazi

Michael Diprose

 

 

Outstanding Created Environment in a Photoreal Feature

 

ALADDIN; Agrabah

Daniel Schmid

Falk Boje

Stanislaw Marek

Kevin George

 

ALITA: BATTLE ANGEL; Iron City

John Stevenson-Galvin

Ryan Arcus

Mathias Larserud

Mark Tait

 

MOTHERLESS BROOKLYN; Penn Station

John Bair

Vance Miller

Sebastian Romero

Steve Sullivan

 

STAR WARS: THE RISE OF SKYWALKER; Pasaana Desert

Daniele Bigi

Steve Hardy

John Seru

Steven Denyer

 

THE LION KING; The Pridelands

Marco Rolandi

Luca Bonatti

Jules Bodenstein

Filippo Preti

 

 

Outstanding Created Environment in an Animated Feature

 

FROZEN 2; Giants’ Gorge

Samy Segura

Jay V. Jackson

Justin Cram

Scott Townsend

 

HOW TO TRAIN YOUR DRAGON: THE HIDDEN WORLD; The Hidden World

Chris Grun

Ronnie Cleland

Ariel Chisholm

Philippe Brochu

 

MISSING LINK; Passage to India Jungle

Oliver Jones

Phil Brotherton

Nick Mariana

Ralph Procida

 

TOY STORY 4; Antiques Mall

Hosuk Chang

Andrew Finley

Alison Leaf

Philip Shoebottom

 

 

Outstanding Created Environment in an Episode, Commercial, or Real-Time Project

 

GAME OF THRONES; The Iron Throne; Red Keep Plaza

Carlos Patrick DeLeon

Alonso Bocanegra Martinez

Marcela Silva

Benjamin Ross

 

LOST IN SPACE; Precipice; The Trench

Philip Engström

Benjamin Bernon

Martin Bergquist

Xuan Prada

 

THE DARK CRYSTAL: AGE OF RESISTANCE; The Endless Forest

Sulé Bryan

Charles Chorein

Christian Waite

Martyn Hawkins

 

THE MANDALORIAN; Nevarro Town

Alex Murtaza

Yanick Gaudreau

Marco Tremblay

Maryse Bouchard

 

Outstanding Virtual Cinematography in a CG Project

 

ALITA: BATTLE ANGEL

Emile Ghorayeb

Simon Jung

Nick Epstein

Mike Perry

 

THE LION KING

Robert Legato

Caleb Deschanel

Ben Grossmann

AJ Sciutto

 

THE MANDALORIAN; The Prisoner; The Roost

Richard Bluff

Jason Porter

Landis Fields IV

Baz Idione

 

 

TOY STORY 4

Jean-Claude Kalache

Patrick Lin

 

Outstanding Model in a Photoreal or Animated Project

 

LOST IN SPACE; The Resolute

Xuan Prada

Jason Martin

Jonathan Vårdstedt

Eric Andersson

 

MISSING LINK; The Manchuria

Todd Alan Harvey

Dan Casey

Katy Hughes

 

THE MAN IN THE HIGH CASTLE; Rocket Train

Neil Taylor

Casi Blume

Ben McDougal

Chris Kuhn

 

THE MANDALORIAN; The Sin; The Razorcrest

Doug Chiang

Jay Machado

John Goodson

Landis Fields IV

 

Outstanding Effects Simulations in a Photoreal Feature

 

DUMBO; Bubble Elephants

Sam Hancock

Victor Glushchenko

Andrew Savchenko

Arthur Moody

 

SPIDER-MAN: FAR FROM HOME; Molten Man

Adam Gailey

Jacob Santamaria

Jacob Clark

Stephanie Molk

 


STAR WARS: THE RISE OF SKYWALKER

Don Wong

Thibault Gauriau

Goncalo Cababca

Francois-Maxence Desplanques

 

THE LION KING

David Schneider

Samantha Hiscock

Andy Feery

Kostas Strevlos

 

Outstanding Effects Simulations in an Animated Feature

 

ABOMINABLE

Alex Timchenko

Domin Lee

Michael Losure

Eric Warren

 

FROZEN 2

Erin V. Ramos

Scott Townsend

Thomas Wickes

Rattanin Sirinaruemarn

 

HOW TO TRAIN YOUR DRAGON: THE HIDDEN WORLD; Water and Waterfalls

Derek Cheung

Baptiste Van Opstal

Youxi Woo

Jason Mayer

 

TOY STORY 4

Alexis Angelidis

Amit Baadkar

Lyon Liew

Michael Lorenzen

 

Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project

 

GAME OF THRONES; The Bells

Marcel Kern

Paul Fuller

Ryo Sakaguchi

Thomas Hartmann

 

Hennessy: The Seven Worlds

Selcuk Ergen

Radu Ciubotariu

Andreu Lucio

Vincent Ullmann

 

LOST IN SPACE; Precipice; Water Planet

Juri Bryan

Hugo Medda

Kristian Olsson

John Perrigo

 

STRANGER THINGS 3; Melting Tom/Bruce

Nathan Arbuckle

Christian Gaumond

James Dong

Aleksandr Starkov

 

THE MANDALORIAN; The Child; Mudhorn

Xavier Martin Ramirez

Ian Baxter

Fabio Siino

Andrea Rosa

 

Outstanding Compositing in a Feature

 

ALITA: BATTLE ANGEL

Adam Bradley

Carlo Scaduto

Hirofumi Takeda

Ben Roberts

 

AVENGERS: ENDGAME

Tim Walker

Blake Winder

Tobias Wiesner

Joerg Bruemmer

 

CAPTAIN MARVEL; Young Nick Fury

Trent Claus

David Moreno Hernandez

Jeremiah Sweeney

Yuki Uehara

 

STAR WARS: THE RISE OF SKYWALKER

Jeff Sutherland

John Galloway

Sam Bassett

Charles Lai

 

THE IRISHMAN

Nelson Sepulveda

Vincent Papaix

Benjamin O’Brien

Christopher Doerhoff

 

 

Outstanding Compositing in an Episode

 

GAME OF THRONES; The Bells

Sean Heuston

Scott Joseph

James Elster

Corinne Teo

 

GAME OF THRONES; The Long Night; Dragon Ground Battle

Mark Richardson

Darren Christie

Nathan Abbott

Owen Longstaff

 

STRANGER THINGS 3; Starcourt Mall Battle

Simon Lehembre

Andrew Kowbell

Karim El-Masry

Miklos Mesterhazy

 

WATCHMEN; Pilot; Looking Glass

Nathaniel Larouche

Iyi Tubi

Perunika Yorgova

Mitchell Beaton

 

Outstanding Compositing in a Commercial

 

BMW Legend

Toya Drechsler

Vivek Tekale

Guillaume Weiss

Alexander Kulikov

 

Feeding America; I Am Hunger in America

Dan Giraldo

Marcelo Pasqualino

Alexander Koester

 

Hennessy; The Seven Worlds

Rod Norman

Guillaume Weiss

Alexander Kulikov

Alessandro Granella

 

PlayStation: Feel the Power of Pro

Gary Driver

Stefan Susemihl

Greg Spencer

Theajo Dharan

 

Outstanding Special (Practical) Effects in a Photoreal or Animated Project

 

ALADDIN; Magic Carpet

Mark Holt

Jay Mallet

Will Wyatt

Dickon Mitchell

 

GAME OF THRONES; The Bells

Sam Conway

Terry Palmer

Laurence Harvey

Alastair Vardy

 

TERMINATOR: DARK FATE

Neil Corbould

David Brighton

Ray Ferguson

Keith Dawson

 

THE DARK CRYSTAL: THE AGE OF RESISTANCE; She Knows All the Secrets

Sean Mathiesen

Jon Savage

Toby Froud

Phil Harvey

 

Outstanding Visual Effects in a Student Project

 

DOWNFALL

Matias Heker

Stephen Moroz

Bradley Cocksedge

 

LOVE AND FIFTY MEGATONS

Denis Krez

Josephine Roß

Paulo Scatena

Lukas Löffler

 

OEIL POUR OEIL

Alan Guimont

Thomas Boileau

Malcom Hunt

Robin Courtoise

 

THE BEAUTY

Marc Angele

Aleksandra Todorovic

Pascal Schelbli

Noel Winzen

 

 

Recreating the Vatican and Sistine Chapel for Netflix’s The Two Popes

The Two Popes, directed by Fernando Meirelles, stars Anthony Hopkins as Pope Benedict XVI and Jonathan Pryce as current pontiff Pope Francis in a story about one of the most dramatic transitions of power in the Catholic Church’s history. The film follows a frustrated Cardinal Bergoglio (the future Pope Francis) who in 2012 requests permission from Pope Benedict to retire because of his issues with the direction of the church. Instead, facing scandal and self-doubt, the introspective Benedict summons his harshest critic and future successor to Rome to reveal a secret that would shake the foundations of the Catholic Church.

London’s Union was approached in May 2017 and supervised visual effects on location in Argentina and Italy over several months. A large proportion of the film takes place within the walls of Vatican City. The Vatican was not involved in the production and the team had very limited or no access to some of the key locations.

Under the direction of production designer Mark Tildesley, the production replicated parts of the Vatican at Rome’s Cinecittà Studios, including a life-size, open-ceiling Sistine Chapel, which took two months to build.

The team LIDAR-scanned everything available and set about amassing as much reference material as possible — photographing from a permitted distance, scanning the set builds and buying every photographic book they could lay their hands on.

From this material, the team set about building 3D models — created in Autodesk Maya — of St. Peter’s Square, the Basilica and the Sistine Chapel. The environments team was tasked with texturing all of these well-known locations using digital matte painting techniques, including recreating Michelangelo’s masterpiece on the ceiling of the Sistine Chapel.

The story centers on two key changes of pope, in 2005 and 2013. Those events attracted huge attention, filling St. Peter’s Square with people eager to discover the identity of the new pope and celebrate his ascension. News crews from around the world also camped out to provide coverage for the more than a billion Catholics worldwide.

To recreate these scenes, the crew shot at a school in Rome (Ponte Mammolo) that has the same pattern on its floor. A cast of 300 extras was shot in blocks in different positions at different times of day, with costume tweaks including the addition of umbrellas to build a library that would provide enough flexibility during post to recreate these moments at different times of day and in different weather conditions.

Union also called on Clear Angle Studios to individually scan 50 extras to provide additional options for the VFX team. This was an ambitious crowd project: the team couldn’t shoot in the actual location, and the end result had to stand up at 4K in very close proximity to the camera. Union designed a Houdini-based system to handle the number of assets and clothing variations in such a way that the studio could easily art-direct the crowd members as individuals, allow the director to choreograph them and deliver a believable result.
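To picture the assignment step such a crowd system needs before any simulation runs, here is a minimal, hypothetical Python sketch: each agent is dealt a scan, a costume variant and a motion cycle from the captured libraries, driven by a seed so the result is repeatable, with per-agent overrides left open for art direction. The library contents and variant names are invented for illustration and are not Union’s actual assets.

import random

SCANS = ["extra_%02d" % i for i in range(1, 51)]              # 50 scanned performers
COSTUMES = ["cassock", "raincoat", "dark_suit", "grey_suit"]  # assumed costume variants
CYCLES = ["idle", "cheer", "wave_flag", "take_photo"]         # assumed mocap cycles

def build_crowd(num_agents, seed=7, overrides=None):
    """Assign each crowd agent a scan, costume and motion cycle, repeatably."""
    rng = random.Random(seed)
    overrides = overrides or {}
    agents = []
    for i in range(num_agents):
        agent = {
            "id": i,
            "scan": rng.choice(SCANS),
            "costume": rng.choice(COSTUMES),
            "cycle": rng.choice(CYCLES),
        }
        agent.update(overrides.get(i, {}))  # per-agent art direction wins over the random pick
        agents.append(agent)
    return agents

if __name__ == "__main__":
    crowd = build_crowd(300, overrides={42: {"cycle": "cheer"}})
    print(crowd[42])

Keeping the seed fixed means a note on one individual survives a rebuild of the whole crowd.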

Union conducted several motion capture shoots in-house to provide specific animation cycles that married with the occasions being recreated. This provided even more authentic-looking crowds for the post team.

Union worked on a total of 288 VFX shots, including greenscreens, set extensions, window reflections, muzzle flashes, fog and rain, and a storm that included a lightning strike on the Basilica.

In addition, the team did a significant amount of de-aging work to accommodate the film’s eight-year main narrative timeline as well as a long period in Pope Francis’ younger years.

Storage for Visual Effects

By Karen Moltenbrey

When creating visual effects for a live-action film or television project, the artist digs right in. But not before the source files are received and backed up. Of course, during the process, storage again comes into play, as the artist’s work is saved and composited into the live-action file and then saved (and stored) yet again. At mid-sized Artifex Studios and the larger Jellyfish Pictures, two visual effects studios, storage might not be the sexiest part of the work they do, but it is vital to a successful outcome nonetheless.

Artifex Studios
An independent studio in Vancouver, BC, Artifex Studios is a small- to mid-sized visual effects facility producing film and television projects for networks, film studios and streaming services. Founded in 1997 by VFX supervisor Adam Stern, the studio has grown over the years from a one- to two-person operation to one staffed by 35 to 45 artists. During that time it has built up a lengthy and impressive resume, from Charmed, Descendants 3 and The Crossing to Mission to Mars, The Company You Keep and Apollo 18.

To handle its storage needs, Artifex uses the Qumulo QC24 four-node storage cluster for its main storage system, along with G-Tech and LaCie portable RAIDs and Angelbird Technologies and Samsung portable SSD drives. “We’ve been running [Qumulo] for several years now. It was a significant investment for us because we’re not a huge company, but it has been tremendously successful for us,” says Stern.

“The most important things for us when it comes to storage are speed, data security and minimal downtime. They’re pretty obvious things, but Qumulo offered us a system that eliminated one of the problems we had been having with the [previous] system bogging down as concurrent users were moving the files around quickly between compositors and 3D artists,” says Stern. “We have 40-plus people hitting this thing, pulling in 4K, 6K, 8K footage from it, rendering and [creating] 3D, and it just ticks along. That was huge for us.”

Of course, speed is of utmost importance, but so is maintaining the data’s safety. To this end, the new system self-monitors, taking its own snapshots to maintain its own health and making sure there are constantly rotating levels of backups. Having the ability to monitor everything about the system is a big plus for the studio as well.

Because data safety and security are non-negotiable, Artifex also backs up incrementally to Google Cloud every night, alongside the Qumulo system. “So while Qumulo is doing its own snapshots incrementally, we have another hard-drive system from Synology, which is more of a prosumer NAS system, whose only job is to do a local current backup,” Stern explains. “So in-house, we have two local backups between Qumulo and Synology, and then we have a third backup going to the cloud every night that’s off-site. When a project is complete, we archive it onto two sets of local hard drives, and one leaves the premises and the other is stored here.” At this point, the material is taken off the Qumulo system, and seven days later, the last of the so-called snapshots is removed.
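As a rough illustration of that nightly routine, the sketch below mirrors primary storage to a local NAS and then pushes an incremental copy off-site. The paths, bucket name and choice of rsync and gsutil are assumptions made for the example, not Artifex’s actual configuration.

import subprocess
from datetime import date

PRIMARY = "/mnt/qumulo/projects"                # placeholder: main production storage
LOCAL_NAS = "/mnt/synology/backup/projects"     # placeholder: in-house mirror
BUCKET = "gs://studio-offsite-backup/projects"  # placeholder: off-site bucket

def nightly_backup():
    """Run the two incremental copies described above: local mirror, then off-site."""
    # rsync only transfers files that changed since the previous run.
    subprocess.run(["rsync", "-a", "--delete", PRIMARY + "/", LOCAL_NAS], check=True)
    # gsutil rsync likewise uploads only new or modified objects to the bucket.
    subprocess.run(["gsutil", "-m", "rsync", "-r", PRIMARY, BUCKET], check=True)
    print("%s: nightly backup complete" % date.today())

if __name__ == "__main__":
    nightly_backup()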

As soon as data comes into Artifex — either via Aspera, Signiant’s Media Shuttle or hard disks — the material is immediately transferred to the Qumulo system, and then it is cataloged and placed into the studio’s ftrack database, which the studio uses for shot tracking. Then, as Stern says, the floodgates open, and all the artists, compositors, 3D team members and admin coordination team members access the material that resides on the Qumulo system.

Desktops at the studio have local storage, generally an SSD built into the machine, but as Stern points out, that is a temporary solution used by the artists while working on a specific shot, not to hold studio data.

Artifex generally works on a handful of projects simultaneously, including the Nickelodeon horror anthology Are You Afraid of the Dark? “Everything we do here requires storage, and we’re always dealing with high-resolution footage, and that project was no exception,” says Stern. For instance, the series required Artifex to simulate 10,000 CG cockroaches spilling out of every possible hole in a room — work that required a lot of high-speed caching.

“FX artists need to access temporary storage very quickly to produce those simulations. In terms of the Qumulo system, we need it to retrieve files at the speed our effects artists can simulate and cache, and make sure they are able to manage what can be thousands and thousands of files generated just within a few hours.”

Similarly, for Netflix’s Wu Assassins, the studio generated multiple simulations of CG smoke and fog within SideFX Houdini and again had to generate thousands and thousands of cache files for all the particles and volume information. Just as it did with the caching for the CG cockroaches, the current system handled caching for the smoke and fog quite efficiently.
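One common shape for that kind of workflow is to let a simulation write its per-frame caches to fast local scratch space and then publish the finished pass to shared storage for the rest of the crew. The sketch below is hypothetical; the paths and the .bgeo.sc naming are stand-ins rather than Artifex’s pipeline.

import glob
import os
import shutil

SCRATCH = "/scratch/fx_caches/roaches_v012"         # placeholder: fast local SSD scratch
SHARED = "/mnt/qumulo/show/ep105/geo/roaches_v012"  # placeholder: shared project storage

def publish_cache(scratch_dir, shared_dir):
    """Copy per-frame simulation caches from local scratch to shared storage."""
    os.makedirs(shared_dir, exist_ok=True)
    frames = sorted(glob.glob(os.path.join(scratch_dir, "*.bgeo.sc")))
    for path in frames:
        shutil.copy2(path, shared_dir)  # copy2 preserves timestamps for downstream checks
    return len(frames)

if __name__ == "__main__":
    print("published %d cache files" % publish_cache(SCRATCH, SHARED))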

At this point, Stern says the vendor is doing some interesting things that his company has not yet taken advantage of. For instance, today one of the big pushes is working in the cloud and integrating that with infrastructures and workflows. “I know they are working on that, and we’re looking into that,” he adds. There are also some new equipment features, “bleeding-edge stuff” Artifex has not explored yet. “It’s OK to be cutting-edge, but bleeding-edge is a little scary for us,” Stern notes. “I know they are always playing with new features, but just having the important foundation of speed and security is right where we are at the moment.”

Jellyfish Pictures
When it comes to big projects with big storage needs, Jellyfish Pictures is no fish out of water. The studio works on myriad projects, from Hollywood blockbusters like Star Wars to high-end TV series like Watchmen to episodic animation like Floogals and Dennis & Gnasher: Unleashed! Recently, it has embarked on an animated feature for DreamWorks and has a dedicated art department that works on visual development for substantial VFX projects and children’s animated TV content.

To handle all this work, Jellyfish has five studios across the UK: four in London and one in Sheffield, in the north of England. What’s more, in early December, Jellyfish expanded further with a brand-new virtual studio in London seating over 150 artists — increasing its capacity to over 300 people. In line with this expansion, Jellyfish is removing all on-site infrastructure from its existing locales and moving everything to a co-location. This means that all five present locations will be wholly virtual as well, making Jellyfish the largest VFX and animation studio in the world operating this way, contends CTO Jeremy Smith.

“We are dealing with shows that have very large datasets, which, therefore, require high-performance computing. It goes without saying, then, that we need some pretty heavy-duty storage,” says Smith.

Not only must the storage solution be able to handle Jellyfish’s data needs, it must also fit into its operational model. “Even though we work across multiple sites, we don’t want our artists to feel that. We need a storage system that can bring together all locations into one centralized hub,” Smith explains. “As a studio, we do not rely on one storage hardware vendor; therefore, we need to work with a company that is hardware-agnostic in addition to being able to operate in the cloud.”

Also, Jellyfish is a TPN-assessed studio and thus has to work with vendors that are TPN-compliant — another serious, and vital, consideration when choosing its storage solution. TPN, the Trusted Partner Network, is an initiative between the Motion Picture Association of America (MPAA) and the Content Delivery and Security Association (CDSA) that provides a set of requirements and best practices around preventing leaks, breaches and hacks of pre-released, high-value media content.

With all those factors in mind, Jellyfish uses PixStor from Pixit Media for its storage solution. PixStor is a software-defined storage solution that allows the studio to use various hardware storage from other vendors under the hood. With PixStor, data moves seamlessly through many tiers of storage — from fast flash and disk tiers to cost-effective, high-capacity object storage to the cloud. In addition, the studio uses NetApp storage within a different part of the same workflow on Dell R740 hardware and alternates between SSD and spinning disks, depending on the purpose of the data and the file size.

“We’ve future-proofed our studio with the Mellanox SN2100 switch for the heavy lifting, and for connecting our virtual workstations to the storage, we are using several servers from the Dell N3000 series,” says Smith.

As a wholly virtual studio, Jellyfish has no storage housed locally; it all sits in a co-location, which is accessed through remote workstations powered by Teradici’s PCoIP technology.

According to Smith, becoming a completely virtual studio is a new development for Jellyfish. Nevertheless, the facility has been working with Pixit Media since 2014 and launched its first virtual studio in 2017, “so the building blocks have been in place for a while,” he says.

Prior to moving all the infrastructure off-site, Jellyfish ran its storage system out of its Brixton and Soho studios locally. Its own private cloud from Brixton powered Jellyfish’s Soho and Sheffield studios. Both PixStor storage solutions in Brixton and Soho were linked with the solution’s PixCache. The switches and servers were still from Dell and Mellanox but were an older generation.

“Way back when, before we adopted this virtual world we are living in, we still worked with on-premises and inflexible storage solutions. It limited us in terms of the work we could take on and where we could operate,” says Smith. “With this new solution, we can scale up to meet our requirements.”

Now, however, using Mellanox SN2100, which has 100GbE, Jellyfish can deal with obscene amounts of data, Smith contends. “The way the industry is moving with 4K and 8K, even 16K being thrown around, we need to be ready,” he says.

Before the co-location, the different sites were connected through PixCache; now the co-location and public cloud are linked via Ngenea, which pre-caches files locally to the render node before the render starts. Furthermore, the studio is able to unlock true multi-tenancy with a single storage namespace, rapidly deploying logical TPN-accredited data separation and isolation and scaling up services as needed. “Probably two of the most important facets for us in running a successful studio: security and flexibility,” says Smith.

Artists access the storage via their Teradici Zero Clients, which, through the Dell switches, connect users to the standard Samba SMB network. Users who are working on realtime clients or in high resolution are connected to the Pixit storage through the Mellanox switch, where PixStor Native Client is used.

“Storage is a fundamental part of any VFX and animation studio’s workflow. Implementing the correct solution is critical to the seamless running of a project, as well as the security and flexibility of the business,” Smith concludes. “Any good storage system is invisible to the user. Only the people who build it will ever know the precision it takes to get it up and running — and that is the sign you’ve got the perfect solution.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

Framestore VFX will open in Mumbai in 2020

Oscar-winning creative studio Framestore will open a full-service visual effects studio in Mumbai in 2020 to target India’s booming creative industry. The studio will be located in the Nesco IT Park in Goregaon, in the center of Mumbai’s technology district. The news underscores Framestore’s continued interest in India, following its major investment in Jesh Krishna Murthy’s VFX studio, Anibrain, in 2017.

“Mumbai represents a rolling of wheels that were set in motion over two years ago,” says Framestore founder/CEO William Sargent. “Our investment in Anibrain has grown considerably, and we continue in our partnership with Jesh Krishna Murthy to develop and grow that business. Indeed, they will become a valued production partner to our Mumbai offering.”

Framestore plans to make a significant number of hires in the coming months, aiming to build an initial 500-strong team that combines existing Framestore talent with the best of local Indian expertise. Mumbai will work alongside the studio’s global network, including London and Montreal, to create a cohesive virtual team delivering high-quality international work.

“Mumbai has become a center of excellence in digital filmmaking. There’s a depth of talent that can deliver to the scale of Hollywood with the color and flair of Bollywood,” Sargent continues. “It’s an incredibly vibrant city and its presence on the international scene is holding us all to a higher standard. In terms of visual effects, we will set the standard here as we did in Montreal almost eight years ago.”

 

London’s Freefolk beefs up VFX team

Soho-based visual effects studio Freefolk has seen growth in its commercials and longform work and has expanded its staff to meet the demand. As part of the uptick in work, Freefolk promoted Cheryl Payne from senior producer to head of commercial production. Additionally, Laura Rickets has joined as senior producer, and 2D artist Bradley Cocksedge has been added to the commercials VFX team.

Payne, who has been with Freefolk since the early days, has worked on some of the studio’s biggest commercials, including Warburtons for Engine, Peloton for Dark Horses and Cadburys for VCCP.

Rickets comes to Freefolk with over 18 years of production experience at some of the biggest VFX houses in London, including Framestore, The Mill and Smoke & Mirrors, as well as on the agency side for McCann. Since joining the team, Rickets has VFX-produced work on the I’m A Celebrity IDs, a set of seven technically challenging and CG-heavy spots for the new series of the show, as well as ads for the Rugby World Cup and Who Wants to Be a Millionaire?

Cocksedge is a recent graduate who joins from Framestore, where he was working as an intern on Fantastic Beasts: The Crimes of Grindelwald. While in school at the University of Hertfordshire, he interned at Freefolk and is happy to be back in a full-time position.

“We’ve had an exciting year and have worked on some really stand-out commercials, like TransPennine for Engine and the beautiful spot for The Guardian we completed with Uncommon, so we felt it was time to add to the Freefolk family,” says Fi Kilroe, Freefolk’s co-managing director/executive producer.

Main Image: (L-R) Cheryl Payne, Laura Rickets and Bradley Cocksedge

Alkemy X adds Albert Mason as head of production

Albert Mason has joined VFX house Alkemy X as head of production. He comes to Alkemy X with over two decades of experience in visual effects and post production. He has worked on projects directed by such industry icons as Peter Jackson on the Lord of the Rings trilogy, Tim Burton on Alice in Wonderland and Robert Zemeckis on The Polar Express. In his new role at Alkemy X, he will use his experience in feature films to target the growing episodic space.

A large part of Alkemy X’s work has been for episodic visual effects, with credits that include Amazon Prime’s Emmy-winning original series, The Marvelous Mrs. Maisel, USA’s Mr. Robot, AMC’s Fear the Walking Dead, Netflix’s Maniac, NBC’s Blindspot and Starz’s Power.

Mason began his career at MTV’s on-air promos department, sharpening his production skills on top series promo campaigns and as a part of its newly launched MTV Animation Department. He took an opportunity to transition into VFX, stepping into a production role for Weta Digital and spending three years working globally on the Lord of the Rings trilogy. He then joined Sony Pictures Imageworks, where he contributed to features including Spider-Man 3 and Ghost Rider. He has also produced work for such top industry shops as Logan, Rising Sun Pictures and Greymatter VFX.

“[Albert’s] expertise in constructing advanced pipelines that embrace emerging technologies will be invaluable to our team as we continue to bolster our slate of VFX work,” says Alkemy X president/CEO Justin Wineburgh.

Creating With Cloud: A VFX producer’s perspective

By Chris Del Conte

The ‘90s was an explosive era for visual effects, with films like Jurassic Park, Independence Day, Titanic and The Matrix shattering box office records and inspiring a generation of artists and filmmakers, myself included. I got my start in VFX working on seaQuest DSV, an Amblin/NBC sci-fi series that was ground-breaking for its time, but looking at the VFX of modern films like Gemini Man, The Lion King and Ad Astra, it’s clear just how far the industry has come. A lot of that progress has been enabled by new technology and techniques, from the leap to fully digital filmmaking and emergence of advanced viewing formats like 3D, Ultra HD and HDR to the rebirth of VR and now the rise of cloud-based workflows.

In my nearly 25 years in VFX, I’ve worn a lot of hats, including VFX producer, head of production and business development manager. Each role involved overseeing many aspects of a production and, collectively, they’ve all shaped my perspective when it comes to how the cloud is transforming the entire creative process. Thanks to my role at AWS Thinkbox, I have a front-row seat to see why studios are looking at the cloud for content creation, how they are using the cloud, and how the cloud affects their work and client relationships.

Chris Del Conte on the set of the IMAX film Magnificent Desolation.

Why Cloud?
We’re in a climate of high content demand and massive industry flux. Studios are incentivized to find ways to take on more work, and that requires more resources — not just artists, but storage, workstations and render capacity. This need to scale often motivates studios to consider the cloud for production, or to strengthen their use of cloud in their pipelines if it’s already in play. Cloud-enabled studios are much more agile than traditional shops. When opportunities arise, they can act quickly, spinning resources up and down at a moment’s notice. I realize that for some, the concept of the cloud is still a bit nebulous, which is why finding the right cloud partner is key. Every facility is different, and part of the benefit of cloud is resource customization. When studios use predominantly physical resources, they have to make decisions about storage and render capacity, electrical and cooling infrastructure, and staff accommodations up front (and pay for them). Using the cloud allows studios to adjust easily to better accommodate whatever the current situation requires.

Artistic Impact
Advanced technology is great, but artists are by far a studio’s biggest asset; automated tools are helpful but won’t deliver those “wow moments” alone. Artists bring the creativity and talent to the table, then, in a perfect world, technology helps them realize their full potential. When artists are free of pipeline or workflow distractions, they can focus on creating. The positive effects spill over into nearly every aspect of production, which is especially true when cloud-based rendering is used. By scaling render resources via the cloud, artists aren’t limited by the capacity of their local machines. Since they don’t have to wait as long for shots to render, artists can iterate more fluidly. This boosts morale because the final results are closer to what artists envisioned, and it can improve work-life balance since artists don’t have to stick around late at night waiting for renders to finish. With faster render results, VFX supervisors also have more runway to make last-minute tweaks. Ultimately, cloud-based rendering enables a higher caliber of work and more satisfied artists.
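To make that elasticity concrete, here is a rough, hypothetical sketch of the kind of studio-side script that grows and shrinks a pool of cloud render nodes, written directly against boto3 rather than any particular render manager; the AMI ID, instance type and region are placeholders, not a recommended configuration.

import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")  # placeholder region

def scale_up_render_nodes(count):
    """Launch extra render nodes from a pre-built worker image when the queue grows."""
    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder render-node image
        InstanceType="c5.9xlarge",        # placeholder instance type
        MinCount=count,
        MaxCount=count,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "render-node"}],
        }],
    )
    return [inst["InstanceId"] for inst in resp["Instances"]]

def scale_down_render_nodes(instance_ids):
    """Terminate idle render nodes once the queue drains."""
    if instance_ids:
        ec2.terminate_instances(InstanceIds=instance_ids)

if __name__ == "__main__":
    node_ids = scale_up_render_nodes(10)
    print("launched:", node_ids)

In practice a render manager’s queue depth, rather than a hard-coded count, would decide how many nodes to request, and spot or preemptible instances are often used to keep the per-frame cost down.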

Budget Considerations
There are compelling arguments for shifting capital expenditures to operational expenditures with the cloud. New studios get the most value out of this model since they don’t have legacy infrastructure to accommodate. Cloud-based solutions level the playing field in this respect; it’s easier for small studios and freelancers to get started because there’s no significant up-front hardware investment. This is an area where we’ve seen rapid cloud adoption. Considering how fast technology changes, it seems ill-advised to limit a new studio’s capabilities to today’s hardware when the cloud provides constant access to the latest compute resources.

When a studio has been in business for decades and might have multiple locations with varying needs, its infrastructure is typically well established. Some studios may opt to wait until their existing hardware has fully depreciated before shifting resources to the cloud, while others dive in right away, with an eye on the bigger picture. Rendering is generally a budgetary item on project bids, but with local hardware, studios are working to recoup a sunk cost. Using the cloud, render compute can be part of a bid and becomes a negotiable item. Clients can determine the delivery timeline based on render budget, and the elasticity of cloud resources allows VFX studios to pick up more work. (Even the most meticulously planned productions can run into 911 issues ahead of delivery, and cloud-enabled studios have bandwidth to be the hero when clients are in dire straits.)

Looking Ahead
When I started in VFX, giant rooms filled with racks and racks of servers and hardware were the norm, and VFX studios were largely judged by the size of their infrastructure. I’ve heard from an industry colleague about how their VFX studio’s server room was so impressive that they used to give clients tours of the space, seemingly a visual reminder of the studio’s vast compute capabilities. Today, there wouldn’t be nearly as much to view. Modern technology is more powerful and compact but still requires space, and that space has to be properly equipped with the necessary electricity and cooling. With cloud, studios don’t need switchers and physical storage to be competitive off the bat, and they experience fewer infrastructure headaches, like losing freon in the AC.

The cloud also opens up the available artist talent pool. Studios can dedicate the majority of physical space to artists as opposed to machines and even hire artists in remote locations on a per-project or long-term basis. Facilities of all sizes are beginning to recognize that becoming cloud-enabled brings a significant competitive edge, allowing them to harness the power to render almost any client request. VFX producers will also start to view facility cloud-enablement as a risk management tool that allows control of any creative changes or artistic embellishments up until delivery, with the rendering output no longer a blocker or a limited resource.

Bottom line: Cloud transforms nearly every aspect of content creation into a near-infinite resource, whether storage capacity, render power or artistic talent.


Chris Del Conte is senior EC2 business development manager at AWS Thinkbox.

Blur Studio uses new AMD Threadripper for Terminator: Dark Fate VFX

By Dayna McCallum

AMD has announced new additions to its high-end desktop processor family. Built for demanding desktop and content creation workloads, the 24-core AMD Ryzen Threadripper 3960X and the 32-core AMD Ryzen Threadripper 3970X processors will be available worldwide November 25.

Tim Miller on the set of Dark Fate.

AMD states that the powerful new processors provide up to 90 percent more performance and up to 2.5 times more available storage bandwidth than competitive offerings, per testing and specifications by AMD performance labs. The 3rd Gen AMD Ryzen Threadripper lineup features two new processors built on 7nm “Zen 2” core architecture, offering up to 88 PCIe 4.0 lanes, 144MB of cache and 66 percent better power efficiency.

Prior to the official product launch, AMD made the 3rd Gen Threadrippers available to LA’s Blur Studio for work on the recent Terminator: Dark Fate and continued a collaboration with the film’s director — and Blur Studio founder — Tim Miller.

Before the movie’s release, AMD hosted a private Q&A with Miller, moderated by AMD’s James Knight. Please note that we’ve edited the lively conversation for space and taken a liberty with some of Miller’s more “colorful” language. (Also watch this space to see if a wager is won that will result in Miller sporting a new AMD tattoo.) Here is the Knight/Miller conversation…

So when we dropped off the 3rd Gen Threadripper to you guys, how did your IT guys react?
Like little children left in a candy shop with no adult supervision. The nice thing about our atmosphere here at Blur is we have an open layout. So when (bleep) like these new AMD processors drops in, you know it runs through the studio like wildfire, and I sit out there like everybody else does. You hear the guys talking about it, you hear people giggling and laughing hysterically at times on the second floor where all the compositors are. That’s where these machines really kick ass — busting through these comps that would have had to go to the farm, but they can now do it on a desktop.

James Knight

As an artist, the speed is crucial. You know, if you have a machine that takes 15 minutes to render, you want to stop and do something else while you wait for a render. It breaks your whole chain of thought. You get out of that fugue state that you produce the best art in. It breaks the chain between art and your brain. But if you have a machine that does it in 30 seconds, that’s not going to stop it.

But really, more speed means more iterations. It means you deal with heavier scenes, which means you can throw more detail at your models and your scenes. I don’t think we do the work faster, necessarily, but the work is much higher quality. And much more detailed. It’s like you create this vacuum, and then everybody rushes into it and you have this silly idea that it is really going to increase productivity, but what it really increases most is quality.

When your VFX supervisor showed you the difference between the way it was done with your existing ecosystem and then with the third-gen Threadripper, what were you thinking about?
There was the immediate thing — when we heard from the producers about the deadline, shots that weren’t going to get done for the trailer, suddenly were, which was great. More importantly, you heard from the artists. What you started to see was that it allows for all different ways of working, instead of just the elaborate pipeline that we’ve built up — to work on your local box and then submit it to the farm and wait for that render to hit the queue of farm machines that can handle it, then send that render back to you.

It has a rhythm that is at times tiresome for the artists, and I know that because I hear it all the time. Now I say, “How’s that comp coming and when are we going to get it, tick tock?” And they say, “Well, it’s rendering in the background right now, as I’m watching them work on another comp or another piece of that comp.” That’s pretty amazing. And they’re doing it all locally, which saves so much time and frustration compared to sending it down the pipeline and then waiting for it to come back up.

I know you guys are here to talk about technology, but the difference for the artists is that instead of working here until 1:00am, they’re going home to put their children to bed. That’s really what this means at the end of the day. Technology is so wonderful when it enables that, not just the creativity of what we do, but the humanity… allowing artists to feel like they’re really on the cutting edge, but also have a life of some sort outside.

Endoskeleton — Terminator: Dark Fate

As you noted, certain shots and sequences wouldn’t have made it in time for the trailer. How important was it for you to get that Terminator splitting in the trailer?
 Marketing was pretty adamant that that shot had to be in there. There’s always this push and pull between marketing and VFX as you get closer. They want certain shots for the trailer, but they’re almost always those shots that are the hardest to do because they have the most spectacle in them. And that’s one of the shots. The sequence was one of the last to come together because we changed the plan quite a bit, and I kept changing shots on Dan (Akers, VFX supervisor). But you tell marketing people that they can’t have something, and they don’t really give a (bleep) about you and your schedule or the path of that artist and shot. (Laughing)

Anyway, we said no. They begged, they pleaded, and we said, “We’ll try.” Dan stepped up and said, “Yeah, I think I can make it.” And we just made it, but that sounds like we were in danger because we couldn’t get it done fast enough. All of this was happening in like a two-day window. If you didn’t notice (in the trailer), that’s a Rev 7. Gabriel Luna is a Rev 9, which is the next gen. But the Rev 7s that you see in his future flashback are just pure killers. They’re still the same technology, which is looking like metal on the outside and a carbon endoskeleton that splits. So you have to run the simulation where the skeleton separates through the liquid that hangs off of it in strings; it’s a really hard simulation to do. That’s why we thought maybe it wasn’t going to get done, but running the simulation on the AMD boxes was lightning fast.

 

 

 

Carbon New York grows with three industry vets

Carbon in New York has grown with two senior hires — executive producer Nick Haynes and head of CG Frank Grecco — and the relocation of existing ECD Liam Chapple, who joins from the Chicago office.

Chapple joined Carbon in 2016, moving from Mainframe in London to open Carbon’s Chicago facility. He brought in clients such as Porsche, Lululemon, Jeep, McDonald’s and Facebook. “I’ve always looked to the studios, designers and directors in New York as the high bar, and now I welcome the opportunity to pitch against them,” says Chapple. “There is an amazing pool of talent in New York, and the city’s energy is a magnet for artists and creatives of all ilk. I can’t wait to dive into this and look forward to expanding upon our amazing team of artists and really making an impression in such a competitive and creative market.”

Chapple recently wrapped direction and VFX on films for Teflon and American Express (Ogilvy) and multiple live-action projects for Lululemon. The most recent shoot, conceived and directed by Chapple, was a series of eight live-action films focusing on Lululemon’s brand ambassadors and its new flagship store in Chicago.

Haynes joins Carbon from his former role as EP of MPC, bringing over 20 years of experience earned at The Mill, MPC and Absolute. Haynes recently wrapped the launch film for the Google Pixel phone and the Chromebook, as well as an epic Middle Earth: Shadow of War Monolith Games trailer combining photo-real CGI elements with live-action shot on the frozen Black Sea in Ukraine.  “We want to be there at the inception of the creative and help steer it — ideally, lead it — and be there the whole way through the process, from concept and shoot to delivery. Over the years, whether working for the world’s most creative agencies or directly with prestigious clients like Google, Guinness and IBM, I aim to be as close to the project as possible from the outset, allowing my team to add genuine value that will garner the best result for everyone involved.”

Grecco joins Carbon from Method Studios, where he most recently led projects for Google, Target, Microsoft, Netflix and Marvel’s Deadpool 2.  With a wide range of experience from Emmy-nominated television title sequences to feature films and Super Bowl commercials, Grecco looks forward to helping Carbon continue to push its visuals beyond the high bar that has already been set.

In addition to New York and Chicago, Carbon has a studio in Los Angeles.

Main Image: (L-R) Frank Grecco, Liam Chapple, Nick Haynes

Sheena Duggal to get VES Award for Creative Excellence

The Visual Effects Society (VES) named acclaimed visual effects supervisor Sheena Duggal as the forthcoming recipient of the VES Award for Creative Excellence in recognition of her valuable contributions to filmed entertainment. The award will be presented at the 18th Annual VES Awards on January 29, 2020, at the Beverly Hilton Hotel.

The VES Award for Creative Excellence, bestowed by the VES Board of Directors, recognizes individuals who have made significant and lasting contributions to the art and science of the visual effects industry by uniquely and consistently creating compelling and creative imagery in service to story. The VES will honor Duggal for breaking new ground in compelling storytelling through the use of stunning visual effects. Duggal has been at the forefront of embracing emerging technology to enhance the moviegoing experience, and her creative vision and inventive techniques have paved the way for future generations of filmmakers.

Duggal is an acclaimed visual effects supervisor and artist whose work has shaped numerous studio tentpole and Academy Award-nominated productions. She is known for her design skills, creative direction and visual effects work on blockbuster films such as Venom, The Hunger Games, Mission: Impossible, Men in Black II, Spider-Man 3 and Contact. She has worked extensively with Marvel Studios as VFX supervisor on projects including Doctor Strange, Thor: The Dark World, Iron Man 3, Marvel One-Shot: Agent Carter and the Agent Carter TV series. She also contributed to Sci-Tech Academy Award wins for visual effects and compositing software Flame and Inferno. Since 2012, Duggal has been consulting with Codex (and now Codex and Pix), providing guidance on various new technologies for the VFX community. Duggal is currently visual effects supervisor for Venom 2 and recently completed design and prep for Ghostbusters 2020.

In 2007, Duggal made her debut as a director on an award-winning short film to showcase the Chicago Spire, simultaneously designing all of the visual effects. Her career in movies began when she moved to Los Angeles to work as a Flame artist on Super Mario Bros. for Roland Joffe and Jake Eberts’ Lightmotive Fatman. She had previously been based in London, where she created high-resolution digital composites for Europe’s top advertising and design agencies. Her work included album covers for Elton John and Traveling Wilburys.

Already an accomplished compositor (she began in 1985 working on early generation paint software), in 1992 Duggal worked as a Flame artist on the world’s first Flame feature production. Soon after, she was hired by Industrial Light & Magic as a supervising lead Flame artist on a number of high-profile projects (Mission: Impossible, Congo and The Indian in the Cupboard). In 1996, Duggal left ILM to join Sony Pictures Imageworks as creative director of high-speed compositing and soon began to take on the additional responsibilities of visual effects supervisor. She was production-side VFX supervisor for multiple directors during this time, including Jane Anderson (The Prize Winner of Defiance, Ohio), Peter Segal (50 First Dates and Anger Management) and Ridley Scott (Body of Lies and Matchstick Men).
In addition to feature films, Duggal has also worked on a number of design projects. In 2013 she designed the logo and the main-on-ends for Agent Carter. She was production designer for SIGGRAPH Electronic Theatre 2001, and she created the title design for the groundbreaking Technology Entertainment and Design conference (TED) in 2004.

Duggal is also a published photographer and traveled to Zimbabwe and Malawi on her last assignment on behalf of UK water charity Pump Aid, where she was photo-documenting how access to clean water has transformed the lives of thousands of people in rural areas.
Duggal is a member of the Academy of Motion Pictures Arts and Sciences and serves on the executive committee for the VFX branch.

Whiskytree experiences growth, upgrades tools

Visual effects and content creation company Whiskytree has gone through a growth spurt that included a substantial increase in staff, a new physical space and new infrastructure.

Providing content for films, television, the Web, apps, games and VR/AR, Whiskytree’s team of artists, designers and technicians use applications such as Autodesk Maya, Side Effects Houdini, Autodesk Arnold, Gaffer and Foundry Nuke on Linux — along with custom tools — to create computer graphics and visual effects.

To help manage its growth and the increase in data that came with it, Whiskytree recently installed Panasas ActiveStor. The platform is used to store and manage Whiskytree’s computer graphics and visual effects workflows, including data-intensive rendering and realtime collaboration using extremely large data sets for movies, commercials and advertising; work for realtime render engines and games; and augmented reality and virtual reality applications.

“We recently tripled our employee count in a single month while simultaneously finalizing the build-out of our new facility and network infrastructure, all while working on a 700-shot feature film project [The Captain],” says Jonathan Harb, chief executive officer and owner of Whiskytree. “Panasas not only delivered the scalable performance that we required during this critical period, but also delivered a high level of support and expertise. This allowed us to add artists at the rapid pace we needed with an easy-to-work-with solution that didn’t require fine-tuning to maintain and improve our workflow and capacity in an uninterrupted fashion. We literally moved from our old location on a Friday, then began work in our new facility the following Monday morning, with no production downtime. The company’s ‘set it and forget it’ appliance resulted in overall smooth operations, even under the trying circumstances.”

In the past, Whiskytree operated a multi-vendor storage solution that was complex and time consuming to administer, modify and troubleshoot. With the office relocation and rapid team expansion, Whiskytree didn’t have time to build a new custom solution or spend a lot of time tuning. It also needed storage that would grow as project and facility needs change.

Projects from the studio include Thor: Ragnarok, Monster Hunt 2, Bolden, Mother, Star Wars: The Last Jedi, Downsizing, Warcraft and Rogue One: A Star Wars Story.

Game of Thrones’ Emmy-nominated visual effects

By Iain Blair

Once upon a time, only glamorous movies could afford the time and money it took to create truly imaginative and spectacular visual effects. Meanwhile, television shows either tried to avoid them altogether or had to rely on hand-me-downs. But the digital revolution changed all that, with technological advances and new tools quickly leveling the playing field. Today, television is giving the movies a run for their money when it comes to sophisticated visual effects, as evidenced by HBO’s blockbuster series, Game of Thrones.

Mohsen Mousavi

This fantasy series was recently Emmy-nominated a record-busting 32 times for its eighth and final season — including one for its visually ambitious VFX in the penultimate episode, “The Bells.”

The epic mass destruction presented Scanline’s VFX supervisor, Mohsen Mousavi, and his team with many challenges. But his expertise in high-end visual effects and his reputation for constant innovation in advanced methodology made him a perfect fit to oversee Scanline’s VFX for the crucial last three episodes of the final season of Game of Thrones.

Mousavi started his VFX career in the field of artificial intelligence and advanced physics-based simulations. He spearheaded the design and development of many proprietary toolsets and pipelines for crowd, fluid and rigid-body simulation, including FluidIT, BehaveIT and CardIT, a node-based crowd choreography toolset.

Prior to joining Scanline VFX Vancouver, Mousavi rose through the ranks of top visual effects houses, working in jobs that ranged from lead effects technical director to CG supervisor and, ultimately, VFX supervisor. He’s been involved in such high-profile projects as Hugo, The Amazing Spider-Man and Sucker Punch.

In 2012, he began working with Scanline, acting as digital effects supervisor on 300: Rise of an Empire, for which Scanline handled almost 700 water-based sea battle shots. He then served as VFX supervisor on San Andreas, helping develop the company’s proprietary city-generation software. That software and pipeline were further developed and enhanced for scenes of destruction in director Roland Emmerich’s Independence Day: Resurgence. In 2017, he served as the lead VFX supervisor for Scanline on the Warner Bros. shark thriller, The Meg.

I spoke with Mousavi about creating the VFX and their pipeline.

Congratulations on being Emmy-nominated for “The Bells,” which showcased so many impressive VFX. How did all your work on Season 4 prepare you for the big finale?
We were heavily involved in the finale of Season 4; however, the scope was far smaller. What we learned was the collaboration and the nature of the show, and what the expectations were in terms of the quality of the work and what HBO wanted.

You were brought onto the project by lead VFX supervisor Joe Bauer, correct?
Right. Joe was the “client VFX supervisor” on the HBO side and was involved since Season 3. Together with my producer, Marcus Goodwin, we also worked closely with HBO’s lead visual effects producer, Steve Kullback, who I’d worked with before on a different show and in a different capacity. We all had daily sessions and conversations, a lot of back and forth, and Joe would review the entire work, give us feedback and manage everything between us and other vendors, like Weta, Image Engine and Pixomondo. This was done both technically and creatively, so no one stepped on each other’s toes if we were sharing a shot and assets. But it was so well-planned that there wasn’t much overlap.

[Editor’s Note: Here is the full list of those nominated for their VFX work on Game of Thrones — Joe Bauer, lead visual effects supervisor; Steve Kullback, lead visual effects producer; Adam Chazen, visual effects associate producer; Sam Conway, special effects supervisor; Mohsen Mousavi, visual effects supervisor; Martin Hill, visual effects supervisor; Ted Rae, visual effects plate supervisor; Patrick Tiberius Gehlen, previz lead; and Thomas Schelesny, visual effects and animation supervisor.]

What were you tasked with doing on Season 8?
We were involved as one of the lead vendors on the last three episodes and covered a variety of sequences. In episode four, “The Last of the Starks,” we worked on the confrontation between Daenerys and Cersei in front of the King’s Landing gate, which included a full CG environment of the city gate and the landscape around it, as well as Missandei’s death sequence, which featured a full CG Missandei. We also did the animated Drogon outside the gate while the negotiations took place.

Then for “The Bells” we were responsible for most of the Battle of King’s Landing, which included the full digital city, Daenerys’ army campsite outside the walls of King’s Landing, the gathering of soldiers in front of the King’s Landing walls, Dany’s attack on the scorpions, the city gate, streets and the Red Keep, which had some very close-up set extensions, close-up fire and destruction simulations and a full CG crowd of various factions — armies and civilians. We also did the iconic Cleganebowl fight between The Hound and The Mountain and Jaime Lannister’s fight with Euron at the beach underneath the Red Keep. In episode five, we received raw animation caches of the dragon from Image Engine and did the full look-dev, lighting and rendering of the final dragon in our composites.

For the final episode, “The Iron Throne,” we were responsible for the entire Daenerys speech sequence, which included a full 360-degree digital environment of the city aftermath and the Red Keep plaza filled with digital Unsullied and Dothraki and CG horses, leading into the majestic confrontation between Jon and Drogon, where the dragon reveals itself from underneath a huge pile of snow outside the Red Keep. We were also responsible for the iconic throne melt sequence, which included some advanced simulation of highly viscous fluid and destruction of the area around the throne, finishing the dramatic sequence with Drogon carrying Dany out of the throne room and away from King’s Landing into the unknown.

Where was all this work done?
The majority of the work was done here in Vancouver, which is the biggest Scanline office. Additionally we had teams working in our Munich, Montreal and LA offices. We’re a 100% connected company, all working under the same infrastructure in the same pipeline. So if I work with the team in Munich, it’s like they’re sitting in the next room. That allows us to set up and attack the project with a larger crew and get the benefit of the 24/7 scenario; as we go home, they can continue working, and it makes us far more productive.

How many VFX did you have to create for the final season?
We worked on over 600 shots across the final three episodes, which gave us over an hour of screen time of high-end, consistent visual effects.

Isn’t that much screen time unusual for 600 shots?
Yes, but we had a number of shots that were really long, including some ground coverage shots of Arya in the streets of King’s Landing that were over four or five minutes long. So we had the complexity along with the long duration.

How many people were on your team?
At the height, we had about 350 artists on the project, and we began in March 2018 and didn’t wrap till nearly the end of April 2019 — so it took us over a year of very intense work.

Tell us about the pipeline specific to Game of Thrones.
Scanline has an industry-wide reputation for delivering very complex, full CG environments combined with complex simulation scenarios of all sorts of fluid dynamics and destruction, based on our simulation framework, “Flowline.” We had a high-end digital character and hero creature pipeline that gave the final three episodes a boost up front. What was new were the additions to our procedural city generation pipeline for the recreation of King’s Landing, making sure it could deliver both in wide-angle shots and in some extreme close-up set extensions.

How did you do that?
We used a framework we developed for Independence Day: Resurgence, which is a module-based procedural city-generation system leveraging some incredible scans of the historical city of Dubrovnik as a blueprint and foundation for King’s Landing. Instead of doing the modeling conventionally, you model a lot of small modules, kind of like Lego blocks. You create various windows, stones, doors, shingles and so on, and once they’re encoded in the system, you can semi-automatically generate variations of buildings on the fly. That also goes for texturing. We had procedurally generated layers of façade textures, which gave us a lot of flexibility in texturing the entire city, with full control over the level of aging and damage. We could easily decide to make a block look older without going back to square one. That’s how we could create King’s Landing with its hundreds of thousands of unique buildings.
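
For readers curious how a module-based approach can work in practice, here is a minimal, hypothetical sketch in Python. It is not Scanline’s pipeline; the module names, the build_facade function and the age parameter are all invented, purely to illustrate how a small library of reusable blocks plus a couple of dials can spin up endless building variations.

# Minimal, hypothetical sketch of module-based building assembly.
# This is NOT Scanline's pipeline; it only illustrates the "Lego block" idea
# described above: small reusable modules (windows, doors, shingles) are
# combined procedurally to generate building variations on the fly.
import random

# A tiny module library; in production each entry would reference geometry
# and layered facade textures rather than plain strings.
MODULES = {
    "ground": ["door_arched", "door_plain", "stone_arch"],
    "upper":  ["window_small", "window_shuttered", "balcony"],
    "roof":   ["shingle_flat", "shingle_peaked"],
}

def build_facade(num_floors, bays, age=0.0, seed=None):
    """Return a grid of module names plus a per-module damage value.

    `age` (0..1) biases how weathered the procedural texture layers look,
    mirroring the idea of dialing a block's aging without remodeling it.
    """
    rng = random.Random(seed)
    facade = []
    for floor in range(num_floors):
        if floor == 0:
            pool = MODULES["ground"]
        elif floor == num_floors - 1:
            pool = MODULES["roof"]
        else:
            pool = MODULES["upper"]
        row = []
        for _ in range(bays):
            module = rng.choice(pool)
            damage = min(1.0, age * rng.uniform(0.5, 1.5))
            row.append((module, round(damage, 2)))
        facade.append(row)
    return facade

if __name__ == "__main__":
    # Two variations of the "same" building, one older than the other.
    for row in build_facade(num_floors=4, bays=3, age=0.2, seed=7):
        print(row)

Running the same function with a higher age value returns the same block layout with heavier weathering, which is the kind of flexibility described above.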

The same technology was applied to the aftermath of the city in Episode 6. We took the intact King’s Landing and ran a number of procedural collapsing simulations on the buildings to get the correct weight based on references from the bombed city of Dresden during WWII, and then we added procedurally created CG snow on the entire city.

It didn’t look like the usual matte paintings were used at all.
You’re right, and there were a lot of shots that normally would be done that way, but to Joe’s credit, he wanted to make sure the environments weren’t cheated in any way. That was a big challenge, to keep everything consistent and accurate. Even if we used traditional painting methods, it was all done on top of an accurate 3D representation with correct lighting and composition.

What other tools did you use?
We use Autodesk Maya for all our front-end departments, including modeling, layout, animation, rigging and creature effects, and we bridge the results to Autodesk 3ds Max, which encapsulates our look-dev/FX and rendering departments, powered by Flowline and Chaos Group’s V-Ray as our primary render engine, followed by Foundry’s Nuke as our main compositing package.

At the heart of our crowd pipeline we use Massive, and our creature department is driven with Ziva muscles, a collaboration we started with Ziva Dynamics for the creation of the hero Megalodon in The Meg.

Fair to say that your work on Game of Thrones was truly cutting-edge?
Game of Thrones has pushed the limit above and beyond and has effectively erased the TV/feature line. In terms of environment and effects and the creature work, this is what you’d do for a high-end blockbuster for the big screen. No difference at all.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: MPC Senior Compositor Ruairi Twohig

After studying hand-drawn animation, this artist found his way to visual effects.

NAME: NYC-based Ruairi Twohig

COMPANY: Moving Picture Company (MPC)

CAN YOU DESCRIBE YOUR COMPANY?
MPC is a global creative and visual effects studio with locations in London, Los Angeles, New York, Shanghai, Paris, Bangalore and Amsterdam. We work with clients and brands across a range of different industries, handling everything from original ideas through to finished production.

WHAT’S YOUR JOB TITLE?
I work as a 2D lead/senior compositor.

Cadillac

WHAT DOES THAT ENTAIL?
The tasks and responsibilities can vary depending on the project. My involvement with a project can begin before there’s even a script or storyboard, and we need to estimate how much VFX will be involved and how long it will take. As the project develops and the direction becomes clearer, with scripts and storyboards and concept art, we refine this estimate and schedule and work with our clients to plan the shoot and make sure we have all the information and assets we need.

Once the commercial is shot and we have an edit, the bulk of the post work begins. This can involve anything from compositing fully CG environments, dragons or spaceships to beauty and product/pack-shot touch-ups or rig removal. So my role involves a combination of overall project management and planning, but I also get into the detailed shot work and ultimately delivering the final picture. The majority of the work I do can require a large team of people with different specializations, and those are usually the projects I find the most fun and rewarding due to the collaborative nature of the work.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I think the variety of the work would surprise most people unfamiliar with the industry. In a single day, I could be working on two or three completely different commercials with completely different challenges while also bidding future projects or reviewing prep work in the early stages of a current project.

HOW LONG HAVE YOU BEEN WORKING IN VFX?
I’ve been working in the industry for over 10 years.

HOW HAS THE VFX INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING?
The VFX industry is always changing. I find it exciting to see how quickly the technology is advancing and becoming more widely accessible, cost-effective and faster.

I still find it hard to comprehend the idea of using optical printers for VFX back in the day … before my time. Some of the most interesting areas for me at the moment are the developments in realtime rendering from engines such as Unreal and Unity, and the implementation of AI/machine learning tools that might be able to automate some of the more time-consuming tasks in the future.

DID A PARTICULAR FILM INSPIRE YOU ALONG THIS PATH IN ENTERTAINMENT?
I remember when I was 13, my older brother — who was studying architecture at the time — introduced me to 3ds Max, and I started playing around with some very simple modeling and rendering.

I would buy these monthly magazines like 3D World, which came with demo discs for different software and some CG animation compilations. One of the issues included the short CG film Fallen Art by Tomek Baginski. At the time I was mostly familiar with Pixar’s feature animation work like Toy Story and A Bug’s Life, so watching this short film created using similar techniques but with such a dark, mature tone and story really blew me away. It was this film that inspired me to pursue animation and, ultimately, visual effects.

DID YOU GO TO FILM SCHOOL?
I studied traditional hand-drawn animation at the Dun Laoghaire Institute of Art, Design and Technology in Dublin. This was a really fun course in which we spent the first two years focusing on the craft of animation and the fundamental principles of art and design, followed by another two years in which we had a lot of freedom to make our own films. It was during these final two years of experimentation that I started to move away from traditional animation and focus more on learning CG and VFX.

I really owe a lot to my tutors, who were really supportive during that time. I also had the opportunity to learn from visiting animation masters such as Andreas Deja, Eric Goldberg and John Canemaker. Although on the surface the work I do as a compositor is very different to animation, understanding those fundamental principles has really helped my compositing work; any additional disciplines or skills you develop in your career that require an eye for detail and aesthetics will always make you a better overall artist.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Even after 10 years in the industry, I still get satisfaction from the problem-solving aspect of the job, even on the smaller tasks. I love getting involved on the more creative projects, where I have the freedom to develop the “look” of the commercial/film. But, day to day, it’s really the team-based nature of the work that keeps me going. Working with other artists, producers, directors and clients to make a project look great is what I find really enjoyable.

WHAT’S YOUR LEAST FAVORITE?
Sometimes even if everything is planned and scheduled accordingly, a little hiccup along the way can easily impact a project, especially on jobs where you might only have a limited amount of time to get the work done. So it’s always important to work in such a way that allows you to adapt to sudden changes.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I used to draw all day, every day as a kid. I still sketch occasionally, but maybe I would have pursued a more traditional fine art or illustration career if I hadn’t found VFX.

Tiffany & Co.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Over the past year, I’ve worked on projects for clients such as Facebook, Adidas, Samsung and Verizon. I also worked on the Tiffany & Co. campaign “Believe in Dreams” directed by Francis Lawrence, as well as the company’s holiday campaign directed by Mark Romanek.

I also worked on Cadillac’s “Rise Above” campaign for the 2019 Oscars, which was challenging since we had to deliver four spots within a short timeframe. But it was a fun project. There was also the Michelob Ultra Robots Super Bowl spot earlier this year. That was an interesting project, as the work was completed between our LA, New York and London studios.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Last year, I had the chance to work with my friend and director Sofia Astrom on the music video for the song “Bone Dry” by Eels. It was an interesting project since I’d never done visual effects for a stop-motion animation before. This had its own challenges, and the style of the piece was very different compared to what I’m used to working on day to day. It had a much more handmade feel to it, and the visual effects design had to reflect that, which was such a change to the work I usually do in commercials, which generally leans more toward photorealistic visual effects work.

WHAT TOOLS DO YOU USE DAY TO DAY?
I mostly work with Foundry Nuke for shot compositing. When leading a job that requires a broad overview of the project and timeline management/editorial tasks, I use Nuke Studio or Autodesk Flame, depending on the requirements of the project. I also use ftrack daily for project management.

WHERE DO YOU FIND INSPIRATION NOW?
I follow a lot of incredibly talented concept artists and photographers/filmmakers on Instagram. Viewing these images/videos on a tiny phone doesn’t always do justice to the work, but the platform is so active that it’s a great resource for inspiration and finding new artists.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I like to run and cycle around the city when I can. During the week it can be easy to get stuck in a routine of sitting in front of a screen, so getting out and about is a much-needed break for me.

Beecham House‘s VFX take viewers back in time

Cambridge, UK-based Vine FX was the sole visual effects vendor on Gurinder Chadha’s Beecham House, a new Sunday night drama airing on ITV in the UK. Set in the India of 1795, Beecham House is the story of John Beecham (Tom Bateman), an Englishman who resigned from military service to set up as an honorable trader of the East India Company.

The series was shot at Ealing Studios and at some locations in India, with the visual effects work focusing on the Port of Delhi, the emperor’s palace and Beecham’s house. Vine FX founder Michael Illingworth assisted during development of the series and supervised his team of artists, creating intricate set extensions, matte paintings and period assets.

To make the shots believable and true to the era, the Vine FX team consulted closely with the show’s production designer and researched the period thoroughly. All modern elements — wires, telegraph poles, cars and lamp posts — had to be removed from the shoot footage, but the biggest challenge for the team was the Port of Delhi itself, a key location in the series.

Vine FX created a digital matte painting to extend the port and added numerous 3D boats and 3D people working on the docks to create a busy working port of 1795 — a complex task achieved by the expert eye of the Vine FX team.

“The success of this type of VFX is in its subtlety. We had to create a Delhi of 1795 that the audience believed, and this involved a great deal of research into how it would have looked, which was essential to making it realistic,” says Illingworth. “Hopefully, we managed to do this. I’m particularly happy with the finished port sequences, as originally there were just three boats.”

“I worked very closely with on-set supervisor Oliver Milburn while he was on set in India so was very much part of the production process in terms of VFX,” he continues. “Oliver would send me reference material from the shoot; this is always fundamental to the outcome of the VFX, as it allows you to plan ahead and work out any potential upcoming challenges. I was working on the VFX in Cambridge while Oliver was on set in Delhi — perfect!”

Vine FX used Photoshop and Nuke as its main tools. The artists modeled assets with Maya and ZBrush and painted assets using Substance Painter. They rendered with Arnold.

Vine FX is currently working on War of the Worlds for Fox Networks and Canal+, due for release next year.

Brittany Howard music video sets mood with color and VFX

The latest collaboration between Framestore and director Kim Gehrig is for Brittany Howard’s debut solo music video for Stay High, which features a color grade and subtle VFX by the studio. A tribute to the Alabama Shakes’ lead singer’s late father, the stylized music video stars actor Terry Crews (Brooklyn Nine-Nine, The Expendables) as a man finishing a day’s work and returning home to his family.

Produced by production company Somesuch, the aim of Stay High is to present a natural and emotionally driven story that honors the singer’s father, K.J. Howard. Shot in her hometown of Nashville, the music video features Howard’s family and friends while the singer pops up in several scenes throughout the video as different characters.

The video begins with Howard’s father getting off of work at his factory job. The camera follows him on his drive home, all the while he’s singing “Stay High.” As he drives home, we see images of people and locations where Howard grew up. The video ends when her dad pulls into his driveway and is met by his daughters and wife.

“Kim wanted to really highlight the innocence of the video’s story, something I kept in mind while grading the film,” says Simon Bourne, Framestore’s head of creative color, who’s graded several films for the director. “The focus needed to always be on Terry, with nothing in his surroundings distracting from that, and the grade needed to reflect that idea.”

Framestore’s creative director Ben Cronin, who was also a compositor on the project along with Nuke compositor Christian Baker, adds, “From a VFX point of view, our job was all about invisible effects that highlighted the beautiful job that Ryley Brown, the film’s DP, did and to complement Kim’s unique vision.”

“We’ve worked with Kim on several commercials and music video projects, and we love collaborating because her films are always visually-interesting and she knows we’ll always help achieve the ground-breaking and effortlessly cool work that she does.”


Glassbox’s virtual camera toolset for Unreal, Unity, Maya

Virtual production software company Glassbox Technologies has released its virtual camera plugin DragonFly from private beta for public use. DragonFly offers professional virtual cinematography tools to filmmakers and content creators, allowing users to view character performances and scenes within computer-generated virtual environments in realtime, through the camera’s viewfinder, an external monitor or iPad.

Available for Unreal Engine, Unity 3D and Autodesk Maya, DragonFly delivers an inclusive virtual cinematography workflow that allows filmmakers and content creators to make and test creative decisions faster and earlier in the process, whittling down production cost on projects of all scopes and sizes.

This off-the-shelf toolkit allows users to create previz to postviz without the need for large teams of operators, costly hardware or proprietary tools. It is platform-agnostic and fits seamlessly into any workflow out of the box. Users can visualize and explore a CG virtual environment, then record, bookmark, create snapshots and replicate real camera movement as seamlessly as conducting a live-action shoot.

“Virtual production poses great potential for creators, but there were no off-the-shelf filming solutions available that worked out of the box,” notes co-founder/CPO Mariana Acuña. “In response, we made DragonFly: a virtual window that allows users to visualize complex sets, environments and performances through a viewfinder. Without the need for a big stage or mocap crew, it brings greater flexibility to the production and post pipeline for films, animation, immersive content, games and realtime VFX.”

The product was developed in collaboration with top Hollywood visualization and production studios, including The Third Floor for best-in-class results.

“Prior to DragonFly, each studio created its own bespoke virtual production workflow, which is costly and time-consuming per project. DragonFly makes realtime virtual production usable for all creators,” says Evelyn Cover, global R&D manager for The Third Floor. “We’re excited to collaborate with the Glassbox team to develop and test  DragonFly in all kinds of production scenarios from previz to post, with astounding success.”

Glassbox’s second in-beta virtual production software solution, BeeHive — the multi-platform, multi-user collaborative virtual scene syncing, editing and review solution — is slated to launch later this summer.

DragonFly is now available for purchase or can be downloaded for free as a 15-day trial from the Glassbox website. Pricing and licensing includes a permanent license option costing $750 USD (including $250 for the first year of support and updates) and an annual rental option costing $420 a year.

Technicolor opens prepro studio in LA

Technicolor is opening a new studio in Los Angeles dedicated to creating a seamless pipeline for feature projects — from concept art and visualization through virtual production, production and into final VFX.

As new distribution models increase the demand for content, Technicolor Pre-Production will provide the tools, the talent and the space for creatives to collaborate from day one of their project – from helping set the vision at the start of a job to ensuring that the vision carries through to production and VFX. The result is a more efficient filmmaking process.

Technicolor Pre-Production studio is headed by Kerry Shea, an industry veteran with over 20 years of experience. She is no stranger to this work, having held executive positions at Method Studios, The Third Floor, Digital Domain, The Jim Henson Company, DreamWorks Animation and Sony Pictures Imageworks.

Kerry Shea

Credited on more than 60 feature films including The Jungle Book, Pirates of the Caribbean: Dead Men Tell No Tales and Guardians of the Galaxy Vol. 2, Shea has an extensive background in VFX and post production, as well as live action, animatronics and creature effects.

While the Pre-Production studio stands apart from Technicolor’s visual effects studios — MPC Film, Mill Film, MR. X and Technicolor VFX — it can work seamlessly in conjunction with one or any combination of them.

The Technicolor Pre-Production Studio will comprise key departments:
– The Business Development Department will work with clients, from project budgeting to consulting on VFX workflows, to help plan and prepare projects for a smooth transition into VFX.
– The VFX Supervisors Department will offer creative supervision across all aspects of VFX on client projects, whether delivered by Technicolor’s studios or third-party vendors.
– The Art Department will work with clients to understand their vision – including characters, props, technologies, and environments – creating artwork that delivers on that vision and sets the tone for the rest of the project.
– The Virtual Production Department will partner with filmmakers to bridge the gap between them and VFX through the production pipeline. Working on the ground and on location, the department will deliver a fully integrated pipeline and shooting services with the flexibility of a small, manageable team — allowing critical players in the filmmaking process to collaborate, view and manipulate media assets and scenes across multiple locations as the production process unfolds.
– The Visualization Department will deliver visualizations that will assist in achieving on screen exactly what clients envisioned.

“With the advancements of tools and technologies, such as virtual production, filmmaking has reached an inflection point, one in which storytellers can redefine what is possible on-set and beyond,” says Shea. “I am passionate about the increasing role and influence that the tools and craft of visual effects can have on the production pipeline and the even more important role in creating more streamlined and efficient workflows that create memorable stories.”

SIGGRAPH making-of sessions: Toy Story 4, GoT, more

The SIGGRAPH 2019 Production Sessions program offers attendees a behind-the-scenes look at the making of some of the year’s most impressive VFX films, shows, games and VR projects. The 11 production sessions will be held throughout the conference week of July 28 through August 1 at the Los Angeles Convention Center.

With 11 total sessions, attendees will hear from creators who worked on projects such as Toy Story 4, Game of Thrones, The Lion King and First Man.

Other highlights include:

Swing Into Another Dimension: The Making of Spider-Man: Into the Spider-Verse
This production session will explore the art and innovation behind the creation of the Academy Award-winning Spider-Man: Into the Spider-Verse. The filmmaking team behind the first-ever animated Spider-Man feature film took significant risks to develop an all-new visual style inspired by the graphic look of comic books.

Creating the Immersive World of BioWare’s Anthem
The savage world of Anthem is volatile, lush, expansive and full of unexpected characters. Bringing these aspects to life in a realtime, interactive environment presented a wealth of problems for BioWare’s technical artists and rendering engineers. This retrospective panel will highlight the team’s work, alongside reflections on innovation and the successes and challenges of creating a new IP.

The VFX of Netflix Series
From the tragic tales of orphans to a joint force of super siblings to sinister forces threatening 1980s Indiana, the VFX teams on Netflix series have delivered some of the year’s best visuals. Creatives behind A Series of Unfortunate Events, The Umbrella Academy and Stranger Things will present the work and techniques that brought these worlds and characters into being.

The Making of Marvel Studios’ Avengers: Endgame
The fourth installment in the Avengers saga is the culmination of 22 interconnected films and has drawn audiences to witness the turning point of this epic journey. SIGGRAPH 2019 keynote speaker Victoria Alonso will join Marvel Studios, Digital Domain, ILM and Weta Digital as they discuss how the diverse collection of heroes, environments, and visual effects were assembled into this ultimate, climactic final chapter.

Space Explorers — Filming VR in Microgravity
Felix & Paul Studios, along with collaborators from NASA and the ISS National Lab, share insights from one of the most ambitious VR projects ever undertaken. In this session, the team will discuss the background of how this partnership came to be before diving into the technical challenges of capturing cinematic virtual reality on the ISS.

Production Sessions are open to conference participants with Select Conference, Full Conference or Full Conference Platinum registrations. The Production Gallery can be accessed with an Experiences badge and above.

Fox Sports promotes US women’s World Cup team with VFX-heavy spots

Santa Monica creative studio Jamm worked with Wieden+Kennedy New York on the Fox Sports campaign “All Eyes on US.” Directed by Joseph Kahn out of Supply & Demand, the four spots celebrate the US Women’s soccer team as it gears up for the 2019 FIFA Women’s World Cup in June.

The newest 60-second spot, All Eyes on US, features tens of thousands of screaming fans thanks to Jamm’s CG crowd work. On set, Jamm brainstormed with Kahn on how to achieve the immersive effect he was looking for. Much of the on-the-ground footage was shot using wide-angle lenses, which posed a unique set of challenges by revealing the entire environment as well as the close-up action. With pacing, Jamm achieved the sense of the game occurring in realtime, as the tempo of the camera keeps in step with the team moving the ball downfield.

The 30-second spot Goliath features the first CG crowd shot by the Jamm team, who successfully filled the soccer stadium with a roaring crowd. In Goliath, the entire US women’s soccer team runs toward the camera in slow motion. Captured locked off but digitally manipulated via a 3D camera to create a dolly zoom technique replicating real-life parallax, the altered perspective translates the unsettling feeling of being an opponent as the team literally runs straight into the camera.
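
For context, the relationship behind a dolly zoom is simple: the subject’s size in frame is roughly proportional to focal length divided by distance, so holding that ratio constant while the (virtual) camera moves keeps the team framed the same as the perspective warps. The short Python sketch below only illustrates that math; it is not Jamm’s actual camera setup, and the function name and values are invented.

# Hypothetical sketch of the dolly-zoom relationship used to keep the team
# the same size in frame while the perspective (parallax) shifts around them.
# Not Jamm's actual setup; it just shows the math a 3D camera rig can follow:
# image size of the subject ~ focal_length / distance, so keeping
# focal_length proportional to distance holds the subject's framing constant.

def dolly_zoom_focal(f0_mm, d0, d_current):
    """Focal length that keeps a subject at distance d_current framed the
    same as it was at distance d0 with focal length f0_mm."""
    return f0_mm * (d_current / d0)

if __name__ == "__main__":
    f0, d0 = 35.0, 10.0   # starting lens (mm) and subject distance (m)
    for d in [10.0, 8.0, 6.0, 4.0]:  # camera (virtually) dollying toward the team
        print(f"distance {d:>4.1f} m -> focal {dolly_zoom_focal(f0, d0, d):5.1f} mm")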

On set, Jamm got an initial Lidar scan of the stadium as a base. From there, they used that scan along with reference photos taken on set to build a CG stadium that included accurate seating. They extended the stadium where there were gaps as well to make it a full 360 stadium. The stadium seating tools tie in with Jamm’s in-house crowd system (based on Side Effects Houdini) and allowed them to easily direct the performance of the crowd in every shot.
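
As a purely illustrative sketch (not Jamm’s in-house Houdini toolset), the Python below shows the kind of logic a seating tool can use: step outward and upward row by row, place a seat at each angle around the bowl, and tag every seat with a simple behavior a crowd system can pick up. In production the positions would come from the Lidar-derived geometry rather than a clean analytic bowl, and every name here is invented.

# Illustrative sketch only (not Jamm's in-house Houdini toolset): generate
# seat positions for a circular stadium bowl and tag each one with a simple
# crowd state. A real seating tool would scatter agents on geometry derived
# from the Lidar scan; here the bowl is approximated analytically.
import math
import random

def stadium_seats(rows=40, seats_per_row=600, base_radius=60.0,
                  row_depth=0.9, row_rise=0.45, excitement=0.8, seed=1):
    """Yield (position, state) tuples for every seat in the bowl."""
    rng = random.Random(seed)
    for row in range(rows):
        radius = base_radius + row * row_depth      # each row steps outward
        height = row * row_rise                     # and upward
        for s in range(seats_per_row):
            angle = 2.0 * math.pi * s / seats_per_row
            pos = (radius * math.cos(angle), height, radius * math.sin(angle))
            # Simple per-seat behavior tag the crowd system could key off.
            state = "cheering" if rng.random() < excitement else "seated"
            yield pos, state

if __name__ == "__main__":
    seats = list(stadium_seats(rows=3, seats_per_row=8))
    print(len(seats), "seats, first:", seats[0])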

The Warrior focuses on Megan Rapinoe standing on the field in the rain, with a roaring crowd behind her. Whereas CG crowd simulation is typically captured with fast-moving cameras, the stadium crowd remains locked in the background of this sequence. Jamm implemented motion work and elements like confetti to make the large group of characters appear lively without detracting from Rapinoe in the foreground. Because the live-action scenes were shot in the rain, Jamm used water graphing to seamlessly blend the real-world footage and the CG crowd work.

The Finisher centers on Alex Morgan, who earned the nickname because “she’s the last thing they’ll see before it’s too late.” The team ran down the field at a slow-motion pace, while the cameraman, rigged with a Steadicam, sprinted backwards through the goal. Then the footage was sped up by 600%, providing a realtime quality, as Morgan kicks a perfect strike to the back of the net.

Jamm used Autodesk Flame for compositing the crowds and CG ball, camera projections to rebuild and clean up certain parts of the environment, refining the skies and adding in stadium branding. They also used Foundry Nuke and Houdini for 3D.

The edit was via FinalCut and editor Spencer Campbell. The color grade was by Technicolor’s Tom Poole.

Veteran VFX supervisor Lindy De Quattro joins MPC Film

Long-time visual effects supervisor Lindy De Quattro has joined MPC Film in Los Angeles. Over the last two and a half decades, which included 21 years at ILM, De Quattro has worked with directors such as Guillermo Del Toro, Alexander Payne and Brad Bird. She also currently serves on the Executive Committee for the VFX branch of the Academy of Motion Picture Arts and Sciences.

De Quattro’s VFX credits include Iron Man 2, Mission Impossible: Ghost Protocol, Downsizing and Pacific Rim, for which she won a VES Award for Outstanding Visual Effects. In addition to supervising visual effects teams, she has also provided on-set supervision.

De Quattro says she was attracted to MPC because of “their long history of exceptional high-quality visual effects, but I made the decision to come on board because of their global commitment to inclusion and diversity in the VFX industry. I want to be an active part of the change that I see beginning to happen all around me, and MPC is giving me the opportunity to do just that. They say, ‘If you can see it, you can be it.’ Girls need role models, and women and other underrepresented groups in the industry need mentors. In my new role at MPC I will strive to be both while contributing to MPC’s legacy of outstanding visual effects.”

The studio’s other VFX supervisors include Richard Stammers (Dumbo, The Martian, X-Men: Days of Future Past), Erik Nash (Avengers Assemble, Titanic), Nick Davis (The Dark Knight, Edge of Tomorrow) and Adam Valdez (The Lion King, Maleficent, The Jungle Book).

MPC Film is currently working on The Lion King, Godzilla: King of the Monsters, Detective Pikachu, Call of the Wild and The New Mutants.

Review: Red Giant’s Trapcode Suite 15

By Brady Betzel

We are now comfortably into 2019 and enjoying the Chinese Year of the Pig — or at least I am! So readers, you might remember that with each new year comes a Red Giant Trapcode Suite update. And Red Giant didn’t disappoint with Trapcode Suite 15.

Every year Red Giant adds more amazing features to its already amazing particle generator and emitter toolset, Trapcode Suite, and this year is no different. Trapcode Suite 15 is keeping tools like 3D Stroke, Shine, Starglow, Sound Keys, Lux, Tao, Echospace and Horizon while significantly updating Particular, Form and Mir.

I won’t be covering each plugin in this review, but you can check out what each individual plugin does on Red Giant’s website.

Particular 4
The bread and butter of the Trapcode Suite has always been Particular, and Version 4 continues to be a powerhouse. The biggest differences between a true 3D app like Maxon’s Cinema 4D or Autodesk Maya and Adobe After Effects (which is only pseudo-3D) are features like true raytraced rendering and particle systems that interact with fluid dynamics. As I alluded to, After Effects isn’t technically a 3D app, but with plugins like Particular you can create pseudo-3D particle systems that can affect and be affected by different particle emitters in your scenes. Trapcode Suite 15 and, in particular (all the pun intended), Particular 4, have evolved to another level with the latest update to include Dynamic Fluids. Dynamic Fluids essentially allows particle systems that have the fluid-physics engine enabled to interact with one another, as well as create mind-blowing liquid-like simulations inside of After Effects.

What’s even more impressive is that with the Particular Designer and over 335 presets, you don’t  need a master’s degree to make impressive motion graphics. While I love to work in After Effects, I don’t always have eight hours to make a fluidly dynamic particle system bounce off 3D text, or have two systems interact with each other for a text reveal. This is where Particular 4 really pays for itself. With a little research and tutorial watching, you will be up and rendering within 30 minutes.

When I was using Particular 4, I simply wanted to recreate the Dynamic Fluid interaction I had seen in one of their promos: basically, two emitters crashing into each other in a viscous fluid, then interacting. While it isn’t necessarily easy, if you have a slightly above-beginner amount of After Effects knowledge you will be able to do this. Apply the Particular plugin to a new solid and open the Particular Designer in Effect Controls. From there you can designate emitter type, motion, particle type, particle shadowing, particle color and dispersion types, as well as add multiple instances of emitters, adjust physics and much more.

The presets for all of these options can be accessed by clicking the “>” symbol in the upper left of the Designer interface. You can access all of the detailed settings and building “Blocks” of each of these categories by clicking the “<” in the same area. With a few hours spent watching tutorials on YouTube, you can be up and running with particle emitters and fluid dynamics. The preset emitters are pretty amazing, including my favorite, the two-emitter fluid dynamic systems that interact with one another.

Form 4
The second plugin in the Trapcode Suite 15 that has been updated is Trapcode Form 4. Form is a plugin that literally creates forms using particles that live forever in a unified 3D space, allowing for interaction. Form 4 adds the updated Designer, which makes particle grids a little more accessible and easier to construct for non-experts. Form 4 also includes the latest Fluid Dynamics update that Particular gained. The Fluid Dynamics engine really adds another level of beauty to Form projects, allowing you to create fluid-like particle grids from the 150 included presets or even your own .obj files.

My favorite settings to tinker with are Swirl and Viscosity. Using both settings in tandem can help create an ooey-gooey liquid particle grid that can interact with other Form systems to build pretty incredible scenes. To test out how .obj models worked within Form, I clicked over to www.sketchfab.com and downloaded an .obj 3D model. If you search for downloadable models that do not cost anything, you can use them in your projects under Creative Commons licensing, as long as you credit the creator. When in doubt, always read the licensing, but in this case you can use them as great practice models.

Anyway, Form 4 allows us to import .obj files, including animated .obj sequences, as well as their textures. I found a Day of the Dead-type skull created by JMUHIST, pointed Form to the .obj as well as its included texture, added a couple of After Effects lights and a camera, and I was in business. Form has a great replicator feature (much like Element 3D). There are a ton of options, including fog distance under visibility, animation properties and even the ability to quickly add a null object linked to your model for quick alignment of other elements in the scene.

Mir 3
Up last is Trapcode Mir 3. Mir 3 is used to create 3D terrains, objects and wireframes in After Effects. In this latest update, Mir has added the ability to import .obj models and textures. Using fractal displacement mapping, you can quickly create some amazing terrains. From mountain-like peaks to alien terrains, Mir is a great supplement when using plugins like Video Copilot Element 3D to add endless tunnels or terrains to your 3D scenes quickly and easily.

And if you don’t have or own Element 3D, you will really enjoy the particle replication system: use one 3D object and duplicate it, then twist, distort and animate multiple instances of it quickly. The best part about all of these Trapcode Suite tools is that they interact with the cameras and lighting native to After Effects, making it a unified animating experience (instead of animating separate camera and lighting rigs like in the old days). Two of my favorite features from the last update are the ability to texture your surfaces with quad- or triangle-based polygons, which can quickly give an 8-bit or low-poly feel, and a second-pass wireframe that adds a grid-like surface to your terrain.
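
To give a feel for what fractal displacement means under the hood, here is a small, self-contained Python sketch of value-noise fBm driving a height field. It is not Red Giant’s implementation; the functions and parameters are invented, and it only demonstrates the general idea of stacking octaves of noise to displace a grid into terrain.

# Small, self-contained sketch of fractal displacement on a height grid,
# the general idea behind noise-displaced terrains like the ones Mir builds.
# This is plain value-noise fBm in Python, not Red Giant's implementation.
import math
import random

def value_noise(x, y, seed=0):
    """Deterministic pseudo-random value in [0, 1) for integer lattice points."""
    rng = random.Random((int(x) * 73856093) ^ (int(y) * 19349663) ^ seed)
    return rng.random()

def smooth_noise(x, y, seed=0):
    """Bilinear interpolation of lattice values for smooth variation."""
    x0, y0 = math.floor(x), math.floor(y)
    tx, ty = x - x0, y - y0
    v00 = value_noise(x0, y0, seed)
    v10 = value_noise(x0 + 1, y0, seed)
    v01 = value_noise(x0, y0 + 1, seed)
    v11 = value_noise(x0 + 1, y0 + 1, seed)
    top = v00 + (v10 - v00) * tx
    bot = v01 + (v11 - v01) * tx
    return top + (bot - top) * ty

def fractal_height(x, y, octaves=5, lacunarity=2.0, gain=0.5, seed=0):
    """Sum several octaves of noise: each octave is finer and weaker."""
    amp, freq, h = 1.0, 1.0, 0.0
    for _ in range(octaves):
        h += amp * smooth_noise(x * freq, y * freq, seed)
        freq *= lacunarity
        amp *= gain
    return h

if __name__ == "__main__":
    # Displace a tiny grid and print it as rough ASCII contours.
    for j in range(8):
        row = "".join(" .:-=+*#"[min(7, int(fractal_height(i * 0.4, j * 0.4) * 4))]
                      for i in range(24))
        print(row)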

Summing Up
Red Giant’s Trapcode Suite 15 is amazing. If you have a previous version of the Trapcode Suite, you’re in luck: the upgrade is “only” $199. If you need to purchase the full suite, it will cost you $999. Students get a bit of a break at $499.

If you are on the fence about it, go watch Daniel Hashimoto’s Cheap Tricks: Aquaman Underwater Effects tutorial (Part 1 and Part 2). He explains how you can use all of the Red Giant Trapcode Suite effects with other plugins like Video CoPilot’s Element 3D and Red Giant’s Universe and offers up some pro tips when using www.sketchfab.com to find 3D models.

I think I even saw him using Video CoPilot’s FX Console, which is a free After Effects plugin that makes accessing plugins much faster in After Effects. You may have seen his work as @ActionMovieKid on Twitter or @TheActionMovieKid on Instagram. He does some amazing VFX with his kids — he’s a must follow. Red Giant made a power move to get him to make tutorials for them! Anyway, his Aquaman Underwater Effects tutorial takes you step by step through how to use each part of the Trapcode Suite 15 in an amazing way. He makes it look a little too easy, but I guess that is a combination of his VFX skills and the Trapcode Suite toolset.

If you are excited about 3D objects, particle systems and fluid dynamics you must check out Trapcode Suite 15 and its latest updates to Particular, Mir and Form.

After I finished the Trapcode Suite 15 review, Red Giant released the Trapcode Suite 15.1 update. The 15.1 update includes Text and Mask Emitters for Form and Particular 4.1, updated Designer, Shadowlet particle type matching, shadowlet softness and 21 additional presets.

This is a free update that can be downloaded from the Red Giant website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

 

Avengers: Infinity War leads VES Awards with six noms

The Visual Effects Society (VES) has announced the nominees for the 17th Annual VES Awards, which recognize outstanding visual effects artistry and innovation in film, animation, television, commercials and video games as well as the VFX supervisors, VFX producers and hands-on artists who bring this work to life.

Avengers: Infinity War garners the most feature film nominations with six. Incredibles 2 is the top animated film contender with five nominations, and Lost in Space leads the broadcast field with six nominations.

Nominees in 24 categories were selected by VES members via events hosted by 11 of the organization’s Sections: Australia, the Bay Area, Germany, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington.

The VES Awards will be held on February 5th at the Beverly Hilton Hotel. As previously announced, the VES Visionary Award will be presented to writer/director/producer and co-creator of Westworld Jonathan Nolan. The VES Award for Creative Excellence will be given to award-winning creators/executive producers/writers/directors David Benioff and D.B. Weiss of Game of Thrones fame. Actor-comedian-author Patton Oswalt will once again host the VES Awards.

Here are the nominees:

Outstanding Visual Effects in a Photoreal Feature

Avengers: Infinity War

Daniel DeLeeuw

Jen Underdahl

Kelly Port

Matt Aitken

Daniel Sudick

 

Christopher Robin

Christopher Robin

Chris Lawrence

Steve Gaub

Michael Eames

Glenn Melenhorst

Chris Corbould

 

Ready Player One

Roger Guyett

Jennifer Meislohn

David Shirk

Matthew Butler

Neil Corbould

 

Solo: A Star Wars Story

Rob Bredow

Erin Dusseault

Matt Shumway

Patrick Tubach

Dominic Tuohy

 

Welcome to Marwen

Kevin Baillie

Sandra Scott

Seth Hill

Marc Chu

James Paradis

 

Outstanding Supporting Visual Effects in a Photoreal Feature 

12 Strong

Roger Nall

Robert Weaver

Mike Meinardus

 

Bird Box

Marcus Taormina

David Robinson

Mark Bakowski

Sophie Dawes

Mike Meinardus

 

Bohemian Rhapsody

Paul Norris

Tim Field

May Leung

Andrew Simmonds

 

First Man

Paul Lambert

Kevin Elam

Tristan Myles

Ian Hunter

JD Schwalm

 

Outlaw King

Alex Bicknell

Dan Bethell

Greg O’Connor

Stefano Pepin

 

Outstanding Visual Effects in an Animated Feature

Dr. Seuss’ The Grinch

Pierre Leduc

Janet Healy

Bruno Chauffard

Milo Riccarand

 

Incredibles 2

Brad Bird

John Walker

Rick Sayre

Bill Watral

 

Isle of Dogs

Mark Waring

Jeremy Dawson

Tim Ledbury

Lev Kolobov

 

Ralph Breaks the Internet

Scott Kersavage

Bradford Simonsen

Ernest J. Petti

Cory Loftis

 

Spider-Man: Into the Spider-Verse

Joshua Beveridge

Christian Hejnal

Danny Dimian

Bret St. Clair

 

Outstanding Visual Effects in a Photoreal Episode

Altered Carbon; Out of the Past

Everett Burrell

Tony Meagher

Steve Moncur

Christine Lemon

Joel Whist

 

Krypton; The Phantom Zone

Ian Markiewicz

Jennifer Wessner

Niklas Jacobson

Martin Pelletier

 

LOST IN SPACE

Lost in Space; Danger, Will Robinson

Jabbar Raisani

Terron Pratt

Niklas Jacobson

Joao Sita

 

The Terror; Go For Broke

Frank Petzold

Lenka Líkařová

Viktor Muller

Pedro Sabrosa

 

Westworld; The Passenger

Jay Worth

Elizabeth Castro

Bruce Branit

Joe Wehmeyer

Michael Lantieri

 

Outstanding Supporting Visual Effects in a Photoreal Episode

Tom Clancy’s Jack Ryan; Pilot

Erik Henry

Matt Robken

Bobo Skipper

Deak Ferrand

Pau Costa

 

The Alienist; The Boy on the Bridge

Kent Houston

Wendy Garfinkle

Steve Murgatroyd

Drew Jones

Paul Stephenson

 

The Deuce; We’re All Beasts

Jim Rider

Steven Weigle

John Bair

Aaron Raff

 

The First; Near and Far

Karen Goulekas

Eddie Bonin

Roland Langschwert

Bryan Godwin

Matthew James Kutcher

 

The Handmaid’s Tale; June

Brendan Taylor

Stephen Lebed

Winston Lee

Leo Bovell

 

Outstanding Visual Effects in a Realtime Project

Age of Sail

John Kahrs

Kevin Dart

Cassidy Curtis

Theresa Latzko

 

Cycles

Jeff Gipson

Nicholas Russell

Lauren Nicole Brown

Jorge E. Ruiz Cano

 

Dr Grordbort’s Invaders

Greg Broadmore

Mhairead Connor

Steve Lambert

Simon Baker

 

God of War

Maximilian Vaughn Ancar

Corey Teblum

Kevin Huynh

Paolo Surricchio

 

Marvel’s Spider-Man

Grant Hollis

Daniel Wang

Seth Faske

Abdul Bezrati

 

Outstanding Visual Effects in a Commercial 

Beyond Good & Evil 2

Maxime Luere

Leon Berelle

Remi Kozyra

Dominique Boidin

 

John Lewis; The Boy and the Piano

Kamen Markov

Philip Whalley

Anthony Bloor

Andy Steele

 

McDonald’s; #ReindeerReady

Ben Cronin

Josh King

Gez Wright

Suzanne Jandu

 

U.S. Marine Corps; A Nation’s Call

Steve Drew

Nick Fraser

Murray Butler

Greg White

Dave Peterson

 

Volkswagen; Born Confident

Carsten Keller

Anandi Peiris

Dan Sanders

Fabian Frank

 

Outstanding Visual Effects in a Special Venue Project

Beautiful Hunan; Flight of the Phoenix

R. Rajeev

Suhit Saha

Arish Fyzee

Unmesh Nimbalkar

 

Childish Gambino’s Pharos

Keith Miller

Alejandro Crawford

Thelvin Cabezas

Jeremy Thompson

 

DreamWorks Theatre Presents Kung Fu Panda

Marc Scott

Doug Cooper

Michael Losure

Alex Timchenko

 

Osheaga Music and Arts Festival

Andre Montambeault

Marie-Josee Paradis

Alyson Lamontagne

David Bishop Noriega

 

Pearl Quest

Eugénie von Tunzelmann

Liz Oliver

Ian Spendloff

Ross Burgess

 

Outstanding Animated Character in a Photoreal Feature

Avengers: Infinity War; Thanos

Jan Philip Cramer

Darren Hendler

Paul Story

Sidney Kombo-Kintombo

 

Christopher Robin; Tigger

Arslan Elver

Kayn Garcia

Laurent Laban

Mariano Mendiburu

 

Jurassic World: Fallen Kingdom; Indoraptor

Jance Rubinchik

Ted Lister

Yannick Gillain

Keith Ribbons

 

Ready Player One; Art3mis

David Shirk

Brian Cantwell

Jung-Seung Hong

Kim Ooi

 

Outstanding Animated Character in an Animated Feature

Dr. Seuss’ The Grinch; The Grinch

David Galante

Francois Boudaille

Olivier Luffin

Yarrow Cheney

 

Incredibles 2; Helen Parr

Michal Makarewicz

Ben Porter

Edgar Rodriguez

Kevin Singleton

 

Ralph Breaks the Internet; Ralphzilla

Dong Joo Byun

Dave K. Komorowski

Justin Sklar

Le Joyce Tong

 

Spider-Man: Into the Spider-Verse; Miles Morales

Marcos Kang

Chad Belteau

Humberto Rosa

Julie Bernier Gosselin

 

Outstanding Animated Character in an Episode or Realtime Project

Cycles; Rae

Jose Luis Gomez Diaz

Edward Everett Robbins III

Jorge E. Ruiz Cano

Jose Luis -Weecho- Velasquez

 

Lost in Space; Humanoid

Chad Shattuck

Paul Zeke

Julia Flanagan

Andrew McCartney

 

Nightflyers; All That We Have Found; Eris

Peter Giliberti

James Chretien

Ryan Cromie

Cesar Dacol Jr.

 

Spider-Man; Doc Ock

Brian Wyser

Henrique Naspolini

Sophie Brennan

William Salyers

 

Outstanding Animated Character in a Commercial

McDonald’s; Bobbi the Reindeer

Gabriela Ruch Salmeron

Joe Henson

Andrew Butler

Joel Best

 

Overkill’s The Walking Dead; Maya

Jonas Ekman

Goran Milic

Jonas Skoog

Henrik Eklundh

 

Peta; Best Friend; Lucky

Bernd Nalbach

Emanuel Fuchs

Sebastian Plank

Christian Leitner

 

Volkswagen; Born Confident; Bam

David Bryan

Chris Welsby

Fabian Frank

Chloe Dawe

 

Outstanding Created Environment in a Photoreal Feature

Ant-Man and the Wasp; Journey to the Quantum Realm

Florian Witzel

Harsh Mistri

Yuri Serizawa

Can Yuksel

 

Aquaman; Atlantis

Quentin Marmier

Aaron Barr

Jeffrey De Guzman

Ziad Shureih

 

Ready Player One; The Shining, Overlook Hotel

Mert Yamak

Stanley Wong

Joana Garrido

Daniel Gagiu

 

Solo: A Star Wars Story; Vandor Planet

Julian Foddy

Christoph Ammann

Clement Gerard

Pontus Albrecht

 

Outstanding Created Environment in an Animated Feature

Dr. Seuss’ The Grinch; Whoville

Loic Rastout

Ludovic Ramiere

Henri Deruer

Nicolas Brack

 

Incredibles 2; Parr House

Christopher M. Burrows

Philip Metschan

Michael Rutter

Joshua West

 

Ralph Breaks the Internet; Social Media District

Benjamin Min Huang

Jon Kim Krummel II

Gina Warr Lawes

Matthias Lechner

 

Spider-Man: Into the Spider-Verse; Graphic New York City

Terry Park

Bret St. Clair

Kimberly Liptrap

Dave Morehead

 

Outstanding Created Environment in an Episode, Commercial, or Realtime Project

Cycles; The House

Michael R.W. Anderson

Jeff Gipson

Jose Luis Gomez Diaz

Edward Everett Robbins III

 

Lost in Space; Pilot; Impact Area

Philip Engström

Kenny Vähäkari

Jason Martin

Martin Bergquist

 

The Deuce; 42nd St

John Bair

Vance Miller

Jose Marin

Steve Sullivan

 

The Handmaid’s Tale; June; Fenway Park

Patrick Zentis

Kevin McGeagh

Leo Bovell

Zachary Dembinski

 

The Man in the High Castle; Reichsmarschall Ceremony

Casi Blume

Michael Eng

Ben McDougal

Sean Myers

 

Outstanding Virtual Cinematography in a Photoreal Project

Aquaman; Third Act Battle

Claus Pedersen

Mohammad Rastkar

Cedric Lo

Ryan McCoy

 

Echo; Time Displacement

Victor Perez

Tomas Tjernberg

Tomas Wall

Marcus Dineen

 

Jurassic World: Fallen Kingdom; Gyrosphere Escape

Pawl Fulker

Matt Perrin

Oscar Faura

David Vickery

 

Ready Player One; New York Race

Daniele Bigi

Edmund Kolloen

Mathieu Vig

Jean-Baptiste Noyau

 

Welcome to Marwen; Town of Marwen

Kim Miles

Matthew Ward

Ryan Beagan

Marc Chu

 

Outstanding Model in a Photoreal or Animated Project 

Avengers: Infinity War; Nidavellir Forge Megastructure

Chad Roen

Ryan Rogers

Jeff Tetzlaff

Ming Pan

 

Incredibles 2; Underminer Vehicle

Neil Blevins

Philip Metschan

Kevin Singleton

 

Mortal Engines; London

Matthew Sandoval

James Ogle

Nick Keller

Sam Tack

 

Ready Player One; DeLorean DMC-12

Giuseppe Laterza

Kim Lindqvist

Mauro Giacomazzo

William Gallyot

 

Solo: A Star Wars Story; Millennium Falcon

Masa Narita

Steve Walton

David Meny

James Clyne

 

Outstanding Effects Simulations in a Photoreal Feature

Avengers: Infinity War; Titan

Gerardo Aguilera

Ashraf Ghoniem

Vasilis Pazionis

Hartwell Durfor

 

Avengers: Infinity War; Wakanda

Florian Witzel

Adam Lee

Miguel Perez Senent

Francisco Rodriguez

 

Fantastic Beasts: The Crimes of Grindelwald

Dominik Kirouac

Chloe Ostiguy

Christian Gaumond

 

Venom

Aharon Bourland

Jordan Walsh

Aleksandar Chalyovski

Federico Frassinelli

 

Outstanding Effects Simulations in an Animated Feature

Dr. Seuss’ The Grinch; Snow, Clouds and Smoke

Eric Carme

Nicolas Brice

Milo Riccarand

 

Incredibles 2

Paul Kanyuk

Tiffany Erickson Klohn

Vincent Serritella

Matthew Kiyoshi Wong

 

Ralph Breaks the Internet; Virus Infection & Destruction

Paul Carman

Henrik Fält

Christopher Hendryx

David Hutchins

 

Smallfoot

Henrik Karlsson

Theo Vandernoot

Martin Furness

Dmitriy Kolesnik

 

Spider-Man: Into the Spider-Verse

Ian Farnsworth

Pav Grochola

Simon Corbaux

Brian D. Casper

 

Outstanding Effects Simulations in an Episode, Commercial, or Realtime Project

Altered Carbon

Philipp Kratzer

Daniel Fernandez

Xavier Lestourneaud

Andrea Rosa

 

Lost in Space; Jupiter is Falling

Denys Shchukin

Heribert Raab

Michael Billette

Jaclyn Stauber

 

Lost in Space; The Get Away

Juri Bryan

Will Elsdale

Hugo Medda

Maxime Marline

 

The Man in the High Castle; Statue of Liberty Destruction

Saber Jlassi

Igor Zanic

Nick Chamberlain

Chris Parks

 

Outstanding Compositing in a Photoreal Feature

Avengers: Infinity War; Titan

Sabine Laimer

Tim Walker

Tobias Wiesner

Massimo Pasquetti

 

First Man

Joel Delle-Vergin

Peter Farkas

Miles Lauridsen

Francesco Dell’Anna

 

Jurassic World: Fallen Kingdom

John Galloway

Enrik Pavdeja

David Nolan

Juan Espigares Enriquez

 

Welcome to Marwen

Woei Lee

Saul Galbiati

Max Besner

Thai-Son Doan

 

Outstanding Compositing in a Photoreal Episode

Altered Carbon

Jean-François Leroux

Reece Sanders

Stephen Bennett

Laraib Atta

 

The Handmaid's Tale; June

Winston Lee

Gwen Zhang

Xi Luo

Kevin Quatman

 

Lost in Space; Impact; Crash Site Rescue

David Wahlberg

Douglas Roshamn

Sofie Ljunggren

Fredrik Lönn

 

Silicon Valley; Artificial Emotional Intelligence; Fiona

Tim Carras

Michael Eng

Shiying Li

Bill Parker

 

Outstanding Compositing in a Photoreal Commercial

Apple; Unlock

Morten Vinther

Michael Gregory

Gustavo Bellon

Rodrigo Jimenez

 

Apple; Welcome Home

Michael Ralla

Steve Drew

Alejandro Villabon

Peter Timberlake

 

Genesis; G90 Facelift

Neil Alford

Jose Caballero

Joseph Dymond

Greg Spencer

 

John Lewis; The Boy and the Piano

Kamen Markov

Pratyush Paruchuri

Kalle Kohlstrom

Daniel Benjamin

 

Outstanding Visual Effects in a Student Project

Chocolate Man

David Bellenbaum

Aleksandra Todorovic

Jörg Schmidt

Martin Boué

 

Proxima-b

Denis Krez

Tina Vest

Elias Kremer

Lukas Löffler

 

Ratatoskr

Meike Müller

Lena-Carolin Lohfink

Anno Schachner

Lisa Schachner

 

Terra Nova

Thomas Battistetti

Mélanie Geley

Mickael Le Mezo

Guillaume Hoarau


VFX studio Electric Theatre Collective adds three to London team

London visual effects studio Electric Theatre Collective has added three to its production team: Elle Lockhart, Polly Durrance and Antonia Vlasto.

Lockhart brings with her extensive CG experience, joining from Touch Surgery where she ran the Johnson & Johnson account. Prior to that she worked at Analog as a VFX producer where she delivered three global campaigns for Nike. At Electric, she will serve as producer on Martini and Toyota.

Vlasto joins Electric, working on clients such as Mercedes, Tourism Ireland and Tui. She comes from 750MPH where, over a four-year period, she served as producer on Nike, Great Western Railway, VW and Amazon, to name but a few.

At Electric, Polly Durrance will serve as producer on H&M, TK Maxx and Carphone Warehouse. She joins from Unit, where she helped launch its in-house Design Collective and worked with clients such as Lush, Pepsi and Thatchers Cider. Prior to Unit, Durrance was at Big Buoy, where she produced work for Jaguar Land Rover, giffgaff and Red Bull.

Recent projects at the studio, which also has an office in Santa Monica, California, include Tourism Ireland Capture Your Heart and Honda Palindrome.

Main Image: (L-R) Elle Lockhart, Antonia Vlasto and Polly Durrance.

Milk VFX provides 926 shots for YouTube’s Origin series

London’s Milk VFX, known for its visual effects work on Adrift, Annihilation and Altered Carbon, has just completed production on YouTube Premium’s new sci-fi thriller original series, Origin.

Milk created all of the 926 VFX shots for Origin in 4K, encompassing a wide range of VFX work, in a four-month timeframe. Milk executed rendering entirely in the cloud (via the AWS Cloud Platform), allowing the team to scale its current roster of projects, which includes Amazon's Good Omens and the feature film Four Kids and It.

VFX supervisor and Milk co-founder Nicolas Hernandez supervised the entire roster of VFX work on Origin. Milk also supervised the VFX shoot on location in South Africa.

“As we created all the VFX for the 10-episode series it was even more important for us to be on set,” says Hernandez. “As such, our VFX supervisor Murray Barber and onset production manager David Jones supervised the Origin VFX shoot, which meant being based at the South Africa shoot location for several months.”

The series is from Left Bank Pictures, Sony Pictures Television and Midnight Radio in association with China International Television Corporation (CiTVC). Created by Mika Watkins, Origin stars Tom Felton and Natalia Tena and will premiere on 14 November on YouTube Premium.

“The intense challenge of delivering and supervising a show on the scale of Origin — 900 4K shots in four months — was not only helped by our recent expansion and the use of the cloud for rendering, but was largely due to the passion and expertise of the Milk Origin team in collaboration with Left Bank Pictures,” says Milk CEO Will Cohen.

In terms of tools, Milk used Autodesk Maya, Side Effects Houdini, Foundry's Nuke and Mari, Shotgun, Photoshop, Deadline for renderfarm management and Arnold for rendering, plus a variety of in-house tools. Hardware includes HP Z-series workstations and Nvidia graphics. Storage used was Pixitmedia's PixStor.

The series, from director Paul W.S. Anderson and the producers of The Crown and Lost, follows a group of outsiders who find themselves abandoned on a ship bound for a distant land. Now they must work together for survival, but quickly realize that one of them is far from who they claim to be.

 

Sony Imageworks provides big effects, animation for Warner’s Smallfoot

By Randi Altman

The legend of Bigfoot: a giant, hairy two-legged creature roaming the forests and giving humans just enough of a glimpse to freak them out. Sightings have been happening for centuries with no sign of slowing down — seriously, Google it.

But what if that story was turned around, and it was Bigfoot who was freaked out by a Smallfoot (human)? Well, that is exactly the premise of the new Warner Bros. film Smallfoot, directed by Karey Kirkpatrick. It’s based on the book “Yeti Tracks” by Sergio Pablos.

Karl Herbst

Instead of a human catching a glimpse of the mysterious giant, a yeti named Migo (Channing Tatum) sees a human (James Corden) and tells his entire snow-filled village about the existence of Smallfoot. Of course, no one believes him so he goes on a trek to find this mythical creature and bring him home as proof.

Sony Pictures Imageworks was tasked with all of the animation and visual effects work on the film, while Warner Animation Group did all of the front-end work — such as adapting the script, creating the production design, editing, directing, producing and more. We reached out to Imageworks VFX supervisor Karl Herbst (Hotel Transylvania 2) to find out more about creating the animation and effects for Smallfoot.

The film has a Looney Tunes-type feel with squash and stretch. Did this provide more freedom or less?
In general, it provided more freedom since it allowed the animation team to really have fun with gags. It also gave them a ton of reference material to pull from and come up with new twists on older ideas. Once out of animation, depending on how far the performance was pushed, other departments — like the character effects team — would have additional work due to all of the exaggerated movements. But all of the extra work was worth it because everyone really loved seeing the characters pushed.

We also found that as the story evolved, Migo's journey became more emotionally driven; we needed to find a style that also let the audience truly connect with what he was going through. We brought in a lot more subtlety and a more truthful physicality to the animation when needed. As a result, we have these incredibly heartfelt performances and moments that would feel right at home in an old Road Runner short. Yet it all still feels like part of the same world with these truly believable characters at the center of it.

Was scale between such large and small characters a challenge?
It was one of the first areas we wanted to tackle, since the look of the yeti's fur next to a human was really important to the filmmakers. In the end, we found that the thickness and fidelity of the yeti hair had to be very high so you could see each hair next to the hairs of the humans.

It also meant allowing the rigs for the humans and yetis to be flexible enough to scale them as needed, so we could have moments where they are very close together without feeling disproportionate to each other. Everything in our character pipeline, from animation down to lighting, had to be flexible in dealing with these scale changes. Even things like subsurface scattering in the skin had dials to deal with when Percy, or any human character, was scaled up or down in a shot.

How did you tackle the hair?
We updated a couple of key areas in our hair pipeline starting with how we would build our hair. In the past, we would make curves that look more like small groups of hairs in a clump. In this case, we made each curve its own strand of a single hair. To shade this hair in a way that allowed artists to have better control over the look, our development team created a new hair shader that used true multiple-scattering within the hair.

We then extended that hair shading model to add control over the distribution around the hair fiber to model the effect of animal hair, which tends to scatter differently than human hair. This gave artists the ability to create lots of different hair looks, which were not based on human hair, as was the case with our older models.

Was rendering so many furry characters on screen at a time an issue?
Yes. In the past, this would have been hard to shade all at once, mostly due to our reliance on opacity to create the soft shadows needed for fur. With the new shading model, we were no longer using opacity at all, so the number of rays needed to resolve the hair was lower than in the past. But we now needed to resolve the aliasing caused by the sheer number of fine hairs (9 million for LeBron James' Gwangi).

We developed a few other new tools within our version of the Arnold renderer to help with aliasing and render time in general. The first was adaptive sampling, which allowed us to raise the anti-aliasing samples drastically. This meant some pixels would only use a few samples while others would use very high sampling, whereas in the past all pixels would get the same number. This focused our render times where we needed them, helping to reduce overall rendering. Our development team also added the ability for us to pick a render up from its previous point. This meant that at a lower quality level we could do all of our lighting work, get creative approval from the filmmakers and then pick up the renders to bring them to full quality without losing the time already spent.
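
To make the adaptive sampling idea concrete, here is a minimal Python sketch of the general approach. It is not Imageworks' Arnold code; the shade() stand-in, thresholds and sample counts are purely illustrative. Each pixel keeps taking samples only until its own estimate has converged, so noisy pixels (hair edges, soft shadows) get many samples while flat areas stop early.

```python
import random
import statistics

def shade(x, y):
    # Stand-in for a real ray-traced radiance estimate; returns a noisy value.
    return 0.5 + random.uniform(-0.25, 0.25)

def render_pixel(x, y, min_spp=4, max_spp=256, noise_threshold=0.002):
    # Take a minimum number of samples, then keep sampling only while the
    # pixel's estimate is still noisy.
    samples = [shade(x, y) for _ in range(min_spp)]
    while len(samples) < max_spp:
        # Standard error of the mean as a simple convergence measure.
        stderr = statistics.stdev(samples) / len(samples) ** 0.5
        if stderr < noise_threshold:
            break
        samples.append(shade(x, y))
    return sum(samples) / len(samples), len(samples)

value, spp = render_pixel(10, 20)
print(f"pixel converged after {spp} samples, value {value:.3f}")
```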

What tools were used for the hair simulations specifically, and what tools did you call on in general?
We used Maya and the Nucleus solvers for all of the hair simulations, but developed tools over them to deal with so much hair per character and so many characters on screen at once. The simulation for each character was driven by their design and motion requirements.

The Looney Tunes-inspired design and motion created a challenge around how to keep hair simulations from breaking with all of the quick and stretched motion while still allowing light wind for the subtle emotional moments. We solved all of those requirements by using a high number of control hairs and constraints. Meechee (Zendaya) used 6,000 simulation curves with over 200 constraints, while Migo needed 3,200 curves with around 30 constraints.

Stonekeeper (Common) was the most complex of the characters, with long braided hair on his head, a beard, shaggy arms and a cloak made of stones. He required a cloth simulation pass for the cloak, a rigid-body simulation for the stones and hair simulated on top of the stones. Our in-house tool, Kami, builds all of the hair at render time and also allows us to add procedurals to the hair at that point. We relied on those procedurals to create the many varied hair looks for all of the generics needed to fill the village full of yetis.
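
As a rough illustration of what building hair at render time from a handful of simulated guides can look like, here is a small Python sketch. It is not the in-house Kami tool; interpolate_strand(), the frizz parameter and the guide data are invented for the example. A dense set of render strands is interpolated between guide curves, with procedural detail layered on at generation time rather than simulated or stored.

```python
import random

def lerp(a, b, t):
    # Linear interpolation of two 3D points.
    return [a_i + (b_i - a_i) * t for a_i, b_i in zip(a, b)]

def interpolate_strand(guide_a, guide_b, t, frizz=0.02):
    # guide_a / guide_b: lists of [x, y, z] points with equal point counts.
    strand = [lerp(pa, pb, t) for pa, pb in zip(guide_a, guide_b)]
    # Procedural detail added at generation time, never simulated or stored.
    return [[c + random.uniform(-frizz, frizz) for c in p] for p in strand]

guide_a = [[0.0, y * 0.1, 0.0] for y in range(10)]
guide_b = [[1.0, y * 0.1, 0.2] for y in range(10)]
render_hairs = [interpolate_strand(guide_a, guide_b, random.random())
                for _ in range(200)]  # 200 render strands from two guides
print(len(render_hairs), "render strands generated")
```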

How many different types of snow did you have?
We created three different snow systems for environmental effects. The first was a particle simulation of flakes for near-ground detail. The second was volumetric effects to create lots of atmosphere in the backgrounds that had texture and movement. We used this on each of the large sets and then stored those so lighters could pick which parts they wanted in each shot. To also help with artistically driving the look of each shot, our third system was a library of 2D elements that the effects team rendered and could be added during compositing to add details late in shot production.

For ground snow, we had different systems based on the needs in each shot. For shallow footsteps, we used displacement of the ground surface with additional little pieces of geometry to add crumble detail around the prints. This could be used in foreground or background.

For heavy interactions, like tunneling or sliding in the snow, we developed a new tool we called Katyusha. This new system combined rigid body destruction with fluid simulations to achieve all of the different states snow can take in any given interaction. We then rendered these simulations as volumetrics to give the complex lighting look the filmmakers were looking for. The snow, being in essence a cloud, allowed light transport through all of the different layers of geometry and volume that could be present at any given point in a scene. This made it easier for the lighters to give the snow its light look in any given lighting situation.
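
The last step Herbst describes, rendering the simulations as volumetrics, essentially means converting simulation points into a density field that light can scatter through. Below is a minimal sketch of that conversion; points_to_density() and all of its grid settings are invented for illustration and say nothing about how the Katyusha tool actually works.

```python
import random

def points_to_density(points, grid_res=32, bounds=1.0):
    # Bin simulation points (rigid chunks or fluid particles alike) into a
    # sparse voxel grid; the per-voxel counts act as a snow density field.
    voxel = (2.0 * bounds) / grid_res
    grid = {}
    for x, y, z in points:
        i = int((x + bounds) / voxel)
        j = int((y + bounds) / voxel)
        k = int((z + bounds) / voxel)
        if 0 <= i < grid_res and 0 <= j < grid_res and 0 <= k < grid_res:
            grid[(i, j, k)] = grid.get((i, j, k), 0.0) + 1.0
    return grid

particles = [(random.uniform(-1, 1), random.uniform(-1, 0), random.uniform(-1, 1))
             for _ in range(5000)]
density = points_to_density(particles)
print(len(density), "occupied voxels")
```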

Was there a particular scene or effect that was extra challenging? If so, what was it and how did you overcome it?
The biggest challenge to the film as a whole was the environments. The story was very fluid, so design and build of the environments came very late in the process. Coupling that with a creative team that liked to find their shots — versus design and build them — meant we needed to be very flexible on how to create sets and do them quickly.

To achieve this, we began by breaking the environments into a subset of source shapes that could be combined in any fashion to build Yeti Mountain, Yeti Village and the surrounding environments. Surfacing artists then created materials that could be applied to any set piece, allowing for quick creative decisions about what was rock, snow and ice, and creating many different looks. All of these materials were created using PatternCreate networks as part of our OSL shaders. With them, we could heavily leverage portable procedural texturing between assets, making location construction quicker, more flexible and easier to dial.
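
The production networks were built in OSL, but the underlying idea, a single portable procedural mask deciding per point whether a surface reads as rock, snow or ice, can be sketched in a few lines of Python. The pseudo_noise() function, the snow_line height and the ice_amount threshold here are invented stand-ins, not the PatternCreate networks themselves.

```python
import math

def pseudo_noise(x, y, z):
    # Cheap, deterministic stand-in for a real fractal noise pattern, in [0, 1).
    return (math.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 43758.5453) % 1.0

def material_at(point, snow_line=2.0, ice_amount=0.2):
    # One dialable mask drives the whole rock/snow/ice decision, so the same
    # setup can be applied to any set piece.
    x, y, z = point
    n = pseudo_noise(x * 0.5, y * 0.5, z * 0.5)
    if y + n > snow_line:  # height plus noise breaks up the snow line
        return "snow"
    return "ice" if n < ice_amount else "rock"

for p in [(0.0, 0.5, 0.0), (1.0, 2.5, 0.3), (2.0, 1.0, 1.0)]:
    print(p, material_at(p))
```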

To get the right snow look for all levels of detail needed, we used a combination of textured snow, modeled snow and a simulation of geometric snowfall, which all needed to shade the same. For the simulated snowfall, we created a padding system that could be run at any time on an environment, giving it a fresh coating of snow. We did this so that filmmakers could modify sets freely in layout and not have to worry about broken snow lines. Doing all of that with modeled snow would have been too time-consuming and costly. This padding system worked not only in organic environments, like Yeti Village, but also in the Human City at the end of the film. The snow you see in the Human City is a combination of this padding system in the foreground and textures in the background.
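
A padding pass of the kind described above could be as simple as pushing surface points outward along their normals, scaled by how upward-facing each point is, so snow only accumulates where it would plausibly settle. The sketch below is an assumption-laden simplification; pad_with_snow() and its max_thickness parameter are invented for the example and are not the studio's actual tool.

```python
def pad_with_snow(points, normals, max_thickness=0.05):
    # Deposit a fresh coating of snow: up-facing surfaces (normal.y > 0)
    # catch snow, and the thickness scales with how flat the surface is.
    padded = []
    for p, n in zip(points, normals):
        catch = max(n[1], 0.0)
        offset = max_thickness * catch
        padded.append([p[0] + n[0] * offset,
                       p[1] + n[1] * offset,
                       p[2] + n[2] * offset])
    return padded

points = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]
normals = [[0, 1, 0], [0.7, 0.7, 0], [1, 0, 0]]
print(pad_with_snow(points, normals))
```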

Tom Cruise in Mission: Impossible — Fallout, directed by Chris McQuarrie.

Mission: Impossible — Fallout writer/director Christopher McQuarrie

By Iain Blair

It’s hard to believe, but it’s been 22 years since Tom Cruise first launched the Mission: Impossible franchise. Since then, it’s become a global cultural phenomenon that’s grossed more than $2.8 billion, making it one of the most successful series in movie history.

With Mission: Impossible — Fallout, Cruise reprises his role of Impossible Missions Force (IMF) team leader Ethan Hunt for the sixth time. And writer/director/producer Christopher McQuarrie, who directed the series’ previous film Mission: Impossible — Rogue Nation, also returns. That makes him the first filmmaker ever to return to direct a second film in a franchise where one of its signature elements is that there’s been a different director for every movie.


Christopher McQuarrie

In the latest twisty adventure, Hunt and his IMF team (Alec Baldwin, Simon Pegg, Ving Rhames), along with some familiar allies (Rebecca Ferguson, Michelle Monaghan), find themselves in a race against time to stop a nuclear bomb disaster after a mission gone wrong. The film, which also stars Henry Cavill, Angela Bassett, Sean Harris and Vanessa Kirby, features a stellar team behind the camera as well, including director of photography Rob Hardy, production designer Peter Wenham, editor Eddie Hamilton, visual effects supervisor Jody Johnson and composer Lorne Balfe.

In 1995, McQuarrie got his start writing the script for The Usual Suspects, which won him the Best Original Screenplay Oscar. In 2000, he made his directorial debut with The Way of the Gun. Then in 2008 he reteamed with Usual Suspects director Bryan Singer, co-writing the WWII film Valkyrie, starring Tom Cruise. He followed that up with his 2010 script for The Tourist, then two years later, he worked with Cruise again on Jack Reacher, which he wrote and directed.

I recently talked with the director about making the film, dealing with all the visual effects and the workflow.

How did you feel when Tom asked for you to come back and do another MI film?
I thought, “Oh no!” In fact, when he asked me to do Rogue Nation, I was very hesitant because I’d been on the set of Ghost Protocol, and I saw just how complicated and challenging these films are. I was terrified. So after I’d finished Rogue, I said to myself, “I feel really sorry for the poor son-of-a-bitch who does the next one.” After five movies, I didn’t think there was anything left to do, but the joke turned out to be on me!

What's the secret of its continuing appeal?
First off, Tom himself. He’s always pushing himself and entertaining the audience with stuff they’ve never seen before. Then it’s all about character and story. The emphasis is always on that and the humanity of these characters. On every film, and with the last two we’ve done together, he’s learned how much deeper you can go with that and refined the process. You’re always learning from the audience as well. What they want.

How do you top yourself and make this different from the last one?
To make it different, I replaced my core crew — new DP, new composer and so on — and went for a different visual language. My intention on both films was not to even try to top the previous one. So when we started this I told Tom, “I just want to place somewhere in the Top 6 of Mission: Impossible films. I’m not trying to make the greatest action film ever.”

You say that, but it’s stuffed full of nail-biting car chases and really ambitious action sequences.
(Laughs) Well, at the same time you’re always trying to do something different from the other films in the franchise, so in Rogue I had this idea for a female counterpart for Tom — Ilsa (Rebecca Ferguson) was a more dynamic love interest. I looked at the other five films and realized that the biggest action scene of any of those films had not come in the third act. So it was a chance to create the biggest and most climactic third act — a huge team sequence that involved everyone. That was the big goal. But we didn’t set out to make this giant movie, and it wasn’t till we began editing that we realized just how much action there is.

Women seem to have far larger roles this time out.
That was very intentional from the start. In my earliest talks with Tom, we discussed the need to resolve the Julia (Michelle Monaghan) character and find closure to that story. So we had her and Rebecca, and then Angela Bassett came on board to replace Alec Baldwin’s character at the CIA after he moves to IMF, and it grew from there. I had an idea for the White Widow (Vanessa Kirby) character, and we just stayed open to all possibilities and the idea that these strong women, who own all the scenes they’re in, throw Ethan off balance all the time.

How early did you integrate post into the shoot?
Right at the start, since we had so many visual effects. We also had a major post challenge as Tom broke his ankle doing a rooftop chase stunt in London. So we had to shut down totally for six weeks and re-arrange the whole schedule to accommodate his recovery, and even when he got back on the movie his ankle wasn’t really healed enough.

We then had to shoot a lot of stuff piecemeal, and I knew that in order to make the release date we had to start cutting right away once we shut down for those six weeks. But that also gave me a chance to re-evaluate it all, since you don't really know the film you've shot until you get in the edit room, and that let me do course corrections I couldn't have done otherwise. So I essentially ended up doing re-shoots while still shooting the film. I was able to rewrite the second act, and it also meant that we had a finished cut done just six days after we wrapped. And we were able to test that movie four times and keep fine-tuning it.

Where did you do the post?
All in London, around Soho, and we did the sound at De Lane Lea.

Like Rogue, this was edited by Eddie Hamilton. Was he on the set?
Yes, and he’s invaluable because he’s got a very good eye, is a great storyteller and has a great sense of the continuity. He can also course-correct very quickly and let me know when we need to grab another shot. On Rogue Nation, he also did a lot of 2nd unit stuff, and he has great skills with the crew. We didn’t really have a 2nd unit on this one, which I think is better because it can get really chaotic with one. Basically, I love the edit, and I love being in the editing room and working hand in hand with my editor, shot for shot, and communicating all the time during production. It was a great collaboration.

There’s obviously a huge number of visual effects shots in the film. How many are there?
I’d say well over 3,000, and our VFX supervisor Jody Johnson at Double Negative did an amazing job. DNeg, Lola, One of Us, Bluebolt and Cheap Shot all worked on them. There was a lot of rig removal and cleanup along with the big set pieces.


What was the most difficult VFX sequence/shot to do and why?
The big “High Altitude Low Opening,” or HALO sequence, where Tom jumps out of a Boeing Globemaster at 25,000 feet was far and away the most difficult one. We shot part of it at an RAF base in England, but then with Tom’s broken ankle and the changed schedule, we ended up shooting some of it in Abu Dhabi. Then we had to add in the Paris backdrop and the lightning for the storm, and to maintain the reality we had to keep the horizon in the shot. As the actors were falling at 160 MPH toward the Paris skyline, all of those shots had to be tracked by hand. No computer could do it, and that alone took hundreds of people working on it for three months to complete. It was exhausting.

Can you talk about the importance of music and sound to you as a filmmaker?
It’s so vital, and for me it’s always a three-pronged approach — music, sound and silence, and then the combination of all three elements. It’s very important to maintain the franchise aesthetic, but I wanted to have a fresh approach, so I brought in composer Lorne Balfe, and he did a great job.

The DI must have been vital. How did that process help?
We did it at Molinare in London with colorist Asa Shoul, who is just so good. I’m fairly hands on, especially as the DP was off on another project by the time we did the DI, although he worked on it with Asa as well. We had a big job dealing with all the stuff we shot in New Zealand, bringing it up to the other footage. I actually try to get the film as close as possible to what I want on the day, and then use the DI as a way of enhancing and shaping that, but I don’t actually like to manipulate things too much, although we gave all the Paris stuff this sort of hazy, sweaty look and feel which I love.

What’s next?
A very long nap.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Mark Thorley joins Mill Film Australia as MD

Mill Film in Australia, a Technicolor VFX studio, has named Mark Thorley as managing director. His appointment comes in the wake of the February launch of Mill Film in Adelaide, Australia.

Thorley brings with him more than 15 years of executive experience, working at such studios as Lucasfilm Singapore, where he oversaw studio operations and production strategies. Prior to that, Thorley spent nine years at Animal Logic, at both its Los Angeles and Sydney locations, as head of production. He also held senior positions at Screen Queensland and Omnicom.

Throughout his career, Thorley has received credits on numerous blockbuster feature films, including Kong: Skull Island, Rogue One, Jurassic World and Avengers: Age of Ultron. Thorley will drive all aspects of VFX production, client relations and business development for Australia, reporting to the global head of Mill Film, Lauren McCallum.

Milk provides VFX for Adrift, adds new head of production Kat Mann

As it celebrates its fifth anniversary, Oscar-, Emmy- and BAFTA-winning VFX studio Milk has taken an additional floor at its London location on Clipstone Street. This visual effects house has worked on projects such as Annihilation, Altered Carbon and Fantastic Beasts and Where to Find Them.

Milk’s expansion increases its artist capacity to 250, and includes two 4K FilmLight Baselight screening rooms and a dedicated client area. The studio has upgraded its pipeline, with all its rendering requirements (along with additional storage and workstation capacity) now entirely in the cloud, allowing full scalability for its roster of film and TV projects.

Annihilation

Milk has just completed production as the main vendor on STXfilms’ new feature film Adrift, the Baltasar Kormákur-directed true story of survival at sea, starring Shailene Woodley and Sam Claflin. The Milk team created all the major water and storm sequences for the feature, which were rendered entirely in the cloud.

Milk has just begun work on new projects, including Dan Films/Kindle Entertainment's upcoming feature film Four Kids and It, based on Jacqueline Wilson's modern-day variation on the 1902 E. Nesbit classic Five Children and It; the Milk team will create the protagonist CG sand fairy character. Milk is also in production as sole VFX vendor on Neil Gaiman and Terry Pratchett's six-part TV adaptation of Good Omens for Amazon/BBC.

In other news, Milk has brought on VFX producer Kat Mann as head of production. She will oversee all aspects of the studio's production at its London premises and its Cardiff location. Mann has held senior production roles at ILM and Weta Digital, with credits including Jurassic World: Fallen Kingdom, Thor: The Dark World and Avatar. Milk's former head of production, Clare Norman, has been promoted to business development director.

Milk was founded by a small team of VFX supervisors and producers in June 2013.

Framestore London adds joint heads of CG

Framestore has named Grant Walker and Ahmed Gharraph as joint heads of CG at its London studio. The two will lead the company’s advertising, television and immersive work alongside head of animation Ross Burgess.

Gharraph has returned to Framestore after a two-year stint at ILM, where he was lead FX artist on Star Wars: The Last Jedi, receiving a VES nomination for Outstanding Effects Simulations in a Photoreal Feature. His credits on the advertising side as CG supervisor include Mog's Christmas Calamity, Sainsbury's 2015 festive campaign, and Shell V-Power Shapeshifter, directed by Carl Erik Rinsch.

Walker joined Framestore in 2009, and in his time at the company he has worked across film, advertising and television, building a portfolio as a CG artist with campaigns including Freesat's VES-nominated Sheldon. He was also instrumental in Framestore's digital recreation of Audrey Hepburn in Galaxy's 2013 campaign Chauffeur for AMV BBDO. Most recently, he was BAFTA-nominated for his creature work on the Black Mirror episode "Playtest."

Lindsay Seguin upped to EP at NYC’s FuseFX

Visual effects studio FuseFX has promoted Lindsay Seguin to executive producer in the studio’s New York City office. Seguin is now responsible for overseeing all client relationships at the FuseFX New York office, acting as a strategic collaborator for current and future productions spanning television, commercial and film categories. The company also has an office in LA.

Seguin, who first joined FuseFX in 2014, was previously managing producer. During her time with the company, she has worked with a number of high-profile client productions, including The Blacklist, Luke Cage, The Punisher, Iron Fist, Mr. Robot, The Get Down and the feature film American Made.

“Lindsay has played a key role in the growth and success of our New York office, and we’re excited for her to continue to forge partnerships with some of our biggest clients in her new role,” says Joseph Bell, chief operating officer and executive VP of production at FuseFX.

“We have a really close-knit team that enjoys working together on exciting projects,” Seguin added about her experience working at FuseFX. “Our crew is very savvy and hardworking, and they manage to maintain a great work/life balance, even as the studio delivers VFX for some of the most popular shows on television. Our goal is to have a healthy work environment and produce awesome visual effects.”

Seguin is a member of the Visual Effects Society and the Post New York Alliance. Prior to making the transition to television and feature work, her experience was primarily in national broadcast and commercial projects, which included campaigns for Wendy’s, Garnier, and Optimum. She is a graduate of Penn State University with a degree in telecommunications. Born in Toronto, Seguin is a dual citizen of Canada and the United States.

Review: HP’s lower-cost DreamColor Z24x display

By Dariush Derakhshani

So, we all know how important a color-accurate monitor is in making professional-level graphics, right? Right?!? Even at the most basic level, when you’re stalking online for the perfect watch band for your holiday present of a smart watch, you want the orange band you see in the online ad to be what you get when it arrives a few days later. Even if your wife thinks orange doesn’t suit you, and makes you look like “you’re trying too hard.”

Especially as a content developer, you want to know that what you're looking at is an accurate representation of the image. Ever walk into a Best Buy and see multiple screens showing the same content but with wildly varying color? You can't have that discrepancy working as a pro, especially in collaboration; you need color accuracy. In my own experience, that position has been filled by HP's 10-bit DreamColor displays for many years now, but not everyone is awash in bitcoins, and a price tag of over $1,200 is sometimes hard to justify, even for a studio professional.

Enter HP's DreamColor Z24x display at half the price, coming in around $550 online. Yes, DreamColor for half the cost. That's pretty significant. For the record, I haven't used a 24-inch monitor since the dark ages, when Lost was the hot TV show. I've been fortunate enough to be running at 27 inches and higher, so there was a little shock when I started using the Z24x HP sent me for review. But this is something I quickly got used to.

With my regular 32-inch 4K display still my primary — so I can fit loads of windows all over the place — I used this DreamColor screen as my secondary display, primarily to check output for my Adobe After Effects comps, Adobe Premiere Pro edits and to hold my render view window as I develop shaders and lighting in Autodesk Maya. I felt comfortable knowing the images I shared with my colleagues across town would be seen as I intended them, evening the playing field when working collaboratively (as long as everyone is on the same LUT and color space). Speaking of color spaces, the Z24x hits 100% of sRGB, 99% of AdobeRGB and 96% of DCI P3, which is just slightly under HP’s Z27x DreamColor. It is, however, slightly faster with a 6ms response rate.

The Z24x has a 24-inch IPS panel from LG that exhibits color in 10-bit, like its bigger 27-inch Z27x sibling. This gives you over a billion colors, which I have personally verified by counting them all — that was one long weekend, I can tell you. Unlike the highest-end DreamColor screens, though, the Z24x dithers up from 8-bit to 10-bit (called 8-bit+FRC). This means it's better than an 8-bit color display, for sure, but not quite up to true 10-bit, making it color accurate but not color critical. HP's implementation of dithering is quite good when subjectively compared to my full 10-bit main display. Frankly, a lot of screens that claim 10-bit may actually be 8-bit+FRC anyway!
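
For the curious, here is a tiny Python sketch of what FRC (frame rate control) dithering means in principle. Real panels use carefully designed spatio-temporal patterns rather than this naive scheme, so the frc_frames() function below is purely illustrative: an 8-bit panel alternates between two adjacent codes so that the time-averaged level lands on the intended 10-bit value.

```python
def frc_frames(level_10bit, num_frames=4):
    # Split a 10-bit level into the nearest lower 8-bit code plus the two
    # bits lost in truncation, then show the higher code on that fraction
    # of frames so the temporal average approximates the 10-bit level.
    low = level_10bit >> 2
    frac = level_10bit & 0b11
    return [low + (1 if f < frac else 0) for f in range(num_frames)]

frames = frc_frames(513)  # a 10-bit level between 8-bit codes 128 and 129
print(frames, sum(frames) / len(frames))  # averages to 128.25, i.e. 513/4
```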

While the Z27x gives you 2560×1440, as you'd expect of most 27-inch displays if not full-on 4K, the Z24x is at a comfortable 1920×1200, just enough for a full 1080p image and a little room for a slider or info bar. Being the res snob that I am, I had wondered if that was just too low, but at 24 inches I don't think you would want a higher resolution, even if you're sitting only 14 inches away from it. And this is a sentiment echoed by the folks at HP, who consulted with many of their professional clients to build this display. That gives a pixel density of about 94 PPI, a bit lower than the 109 PPI of the Z27x. This density is about the same as a 1080p HD display at 27 inches, so it's still crisp and clean.
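
Those pixel-density figures are easy to sanity-check: pixels along the diagonal divided by the diagonal size in inches, as in this quick, purely illustrative snippet (the ppi() helper is mine, not HP's spec).

```python
from math import hypot

def ppi(width_px, height_px, diagonal_inches):
    # Pixel count along the diagonal divided by the diagonal's length.
    return hypot(width_px, height_px) / diagonal_inches

print(round(ppi(1920, 1200, 24)))  # ~94 PPI for the Z24x
print(round(ppi(2560, 1440, 27)))  # ~109 PPI for the Z27x
```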

Viewing angles are good at about 178 degrees, and the screen is matte, with an anti-glare coating, making it easier to stare at without blinking for 10 hours at a clip, as digital artists usually do. Compared to my primary display, this HP's coating was more matte and still gave me a richer black, which I liked to see.

Connection options are fairly standard, with two DisplayPorts, one HDMI and one dual-link DVI for anyone still living in the past. You also get four USB ports and an analog 3.5mm audio jack if you want to drive some speakers, since you can't from your phone anymore (Apple, I'm looking at you).

Summing Up
So while 24 inches is a bit small for my taste in a display, I am seriously impressed by the street price of the Z24x, which allows a lot more pros and semi-pros to get the DreamColor accuracy HP offers at half the price. While I wouldn't recommend color grading a show on the Z24x, this DreamColor does a nice job of bringing a higher level of color confidence at an attractive price. As a secondary display, the Z24x is a nice addition to the workflow of an artist with budget in mind — or one with a mean, orange-watch-band-hating spouse.


Dariush Derakhshani is a VFX supervisor and educator in Southern California. You can follow his random tweets at @koosh3d.

Young pros with autism contribute to Oscar-nominated VFX films

Exceptional Minds Studio, the LA-based visual effects and animation studio made up of young people on the autism spectrum, earned screen credit on three of the five films nominated for Oscars in the visual effects category — Star Wars: The Last Jedi, Guardians of the Galaxy Vol. 2 and War for the Planet of the Apes.

L-R: Lloyd Hackl, Kenneth Au, Mason Taylor and Patrick Brady.

For Star Wars: The Last Jedi, the artists at Exceptional Minds Studio were contracted to do visual effects cleanup work that involved roto and paint for several shots. “We were awarded 20 shots for this film that included very involved rotoscoping and paint work,” explains Exceptional Minds Studio executive producer Susan Zwerman.

The studio was also hired to create the end titles for Star Wars: The Last Jedi, which involved compositing the text into a star-field background.

For Guardians of the Galaxy Vol. 2, Exceptional Minds provided the typesetting for the end credit crawl. For War for the Planet of the Apes, the studio provided visual effects cleanup on 10 shots — this involved tracker marker removal using roto and paint.

Exceptional Minds used Foundry’s Nuke for much of their work, in addition to Silhouette and Mocha for After Effects.

Star Wars: The Last Jedi. Courtesy of ILM

Since opening its doors almost four years ago, this small studio has worked on visual effects for more than 50 major motion pictures and television series, including The Good Doctor, Game of Thrones and Doctor Strange.

“The VFX teams we worked with on each of these movies were beyond professional, and we are so thankful that they gave our artists the opportunity to work with them,” says Zwerman, adding that “many of our artists never even dreamed they would be working in this industry.”

An estimated 90 percent of the autism population is underemployed or unemployed, and few training programs exist to prepare young adults with autism for meaningful careers, which is what makes this program so important.

“I couldn’t imagine doing this when I was young,” agreed Patrick Brady, an Exceptional Minds VFX artist.

VFX supervisor Lesley Robson-Foster on Amazon’s Mrs. Maisel

By Randi Altman

If you are one of the many who tend to binge-watch streaming shows, you’ve likely already enjoyed Amazon’s The Marvelous Mrs. Maisel. This new comedy focuses on a young wife and mother living in New York City in 1958, when men worked and women tended to, well, not work.

After her husband leaves her, Mrs. Maisel chooses stand-up comedy over therapy — or you could say stand-up comedy chooses her. The show takes place in a few New York neighborhoods, including the tony Upper West Side, the Garment District and the Village. The storyline brings real-life characters into this fictional world — Midge Maisel studies by listening to Redd Foxx comedy albums, and she also befriends comic Lenny Bruce, who appears in a number of episodes.

Lesley Robson-Foster on set.

The show, created by Amy Sherman-Palladino and Dan Palladino, is colorful and bright and features a significant amount of visual effects — approximately 80 per episode.

We reached out to the show’s VFX supervisor, Lesley Robson-Foster, to find out more.

How early did you get involved in Mrs. Maisel?
The producer Dhana Gilbert brought my producer Parker Chehak and me in early to discuss feasibility issues, as this is a period piece, and to see if Amy and Dan liked us! We've been on since the pilot.

What did the creators/showrunners say they needed?
They needed 1958 New York City, weather changes and some very fancy single-shot blending. Also, some fantasy and magic realism.

As you mentioned, this is a period piece, so I’m assuming a lot of your work is based on that.
The big period shots in Season 1 are the Garment District reconstruction. We shot on 19th Street between 5th and 6th — the brilliant production designer Bill Groom built a third of the street practically, and VFX took care of the rest, such as crowd duplication, CG cars and CG crowds. Then we shot on Park Avenue and had to remove the MetLife Building down near Grand Central, and knock out anything post-1958.

We also did a major gag with the driving footage. We shot driving plates around the Upper West Side and had a flotilla of period-correct cars with us, but could not get rid of all the parked cars. My genius design partner on the show Douglas Purver created a wall of parked period CG cars and put them over the modern ones. Phosphene then did the compositing.

What other types of effects did you provide?
Amy and Dan — the creators and showrunners — haven’t done many VFX shows, but they are very, very experienced. They write and ask for amazing things that allow me to have great fun. For example, I was asked to make a shot where our heroine is standing inside a subway car, and then the camera comes hurtling backwards through the end of the carriage and then sees the train going away down the tunnel. All we had was a third of a carriage with two and a half walls on set. Douglas Purver made a matte painting of the tunnel, created a CG train and put it all together.

Can you talk about the importance of being on set?
For me being on set is everything. I talk directors out of VFX shots and fixes all day long. If you can get it practically you should get it practically. It’s the best advice you’ll ever give as a VFX supervisor. A trust is built that you will give your best advice, and if you really need to shoot plates and interrupt the flow of the day, then they know it’s important for the finished shot.

Having a good relationship with every department is crucial.

Can you give an example of how being on set might have saved a shot or made a shot stronger?
This is a character-driven show. The directors really like Steadicam and long, long shots following the action. Even though a lot of the effects we want to do really demand motion control, I know I just can’t have it. It would kill the performances and take up too much time and room.

I run around with string and tennis balls to line things up. I watch the monitors carefully and use QTake to make sure things line up within acceptable parameters.

In my experience you have to have the production’s best interests at heart. Dhana Gilbert knows that a VFX supervisor on the crew and as part of the team smooths out the season. They really don’t want a supervisor who is intermittent and doesn’t have the whole picture. I’ve done several shows with Dhana; she knows my idea of how to service a show with an in-house team.

You shot b-roll for this? What camera did you use, and why?
We used a Blackmagic Ursa Mini Pro. We rented one on The OA for Netflix last year and found it to be really easy to use. We liked that it's self-contained and that we can use the Canon glass from our DSLR kits. It's got a built-in monitor, and it can shoot 4.6K RAW. It cut in just fine with the Alexa Mini for establishing shots and plates. It fits into a single backpack, so we could get a shot at a moment's notice. The user interface on the camera is so intuitive that anyone on the VFX team could pick it up and learn how to get the shot in 30 minutes.

What VFX houses did you employ, and how do you like to work with them?
We keep as much as we can in New York City, of course. Phosphene is our main vendor, and we like Shade and Alkemy X. I like RVX in Iceland, El Ranchito in Spain and Rodeo in Montreal. I also have a host of secret-weapon individuals dotted around the world. For Parker and me, it's always horses for courses. Whom we send the work to depends on the shot.

For each show we build a small in-house team — we do the temps and figure out the design, and shoot plates and elements before shots leave us to go to the vendor.

You’ve worked on many critically acclaimed television series. Television is famous for quick turnarounds. How do you and your team prepare for those tight deadlines?
Television schedules can be relentless. Prep, shoot and post all at the same time. I like it very much as it keeps the wheels of the machine oiled. We work on features in between the series and enjoy that slower process too. It’s all the same skill set and workflow — just different paces.

If you have to offer a production a tip or two about how to make the process go more smoothly, what would it be?
I would say be involved with EVERYTHING. Keep your nose close to the ground. Really familiarize yourself with the scripts — head trouble off at the pass by discussing upcoming events with the relevant person. Be fluid and flexible and engaged!

Jogger moves CD Andy Brown from London to LA

Creative director Andy Brown has moved from Jogger’s London office to its Los Angeles studio. Brown led the development of boutique VFX house Jogger London, including credits for the ADOT PSA Homeless Lights via Ogilvy & Mather, as well as projects for Adidas, Cadbury, Valentino, Glenmorangie, Northwestern Mutual, La-Z-Boy and more. He’s also been involved in post and VFX for short films such as Foot in Mouth, Containment and Daisy as well as movie title sequences (via The Morrison Studio), including Jupiter Ascending, Collide, The Ones Below and Ronaldo.

Brown got his start in the industry at MPC, where he worked for six years, eventually assuming the role of digital online editor. He then went on to work in senior VFX roles at some of London's post houses before becoming head of VFX at One Post. Following One Post's merger with Rushes, Brown founded his own company, Four Walls, establishing the company's reputation for creative visual effects and finishing.

Brown oversaw Four Walls' merger with LA's Jogger Studios in 2016. He has since helped form interconnections among Jogger's teams in London, New York, Los Angeles, San Francisco and Austin, with high-end VFX, motion graphics and color grading carried out on projects globally.

VFX house Jogger is a sister company of editing house Cut + Run.

Creating CG wildlife for Jumanji: Welcome to the Jungle

If you are familiar with the original Jumanji film from 1995 — about a board game that brings its game jungle, complete with animals and the boy it trapped decades earlier, into the present day — you know how important creatures are to the story. In this new version of the film, the game traps four teens inside its video game jungle, where they struggle to survive among the many creatures, while trying to beat the game.

For Columbia Pictures’ current sequel, Jumanji: Welcome to the Jungle, Montreal-based visual effects house Rodeo FX was called on to create 96 shots, including some of the film’s wildlife. The film stars Dwayne Johnson, Jack Black and Kevin Hart.

“Director Jake Kasdan wanted the creatures to feel cursed, so our team held back from making them too realistic,” explains Rodeo FX VFX supervisor Alexandre Lafortune. “The hippo is a great example of a creature that would have appeared scary if we had made it look real, so we made it bigger and faster and changed the pink flesh in its mouth to black. These changes make the hippo fit in with the comedy.”

The studio’s shots for the film feature a range of creatures, as well as matte paintings and environments. Rodeo FX worked alongside the film’s VFX supervisor, Jerome Chen, to deliver the director’s vision for the star-studded film.

“It was a pleasure to collaborate with Rodeo FX on this film,” says Chen. “I relied on Alexandre Lafortune and his team to help us with sequences requiring full conceptualization and execution from start to finish.”

Chen entrusted Rodeo FX with the hippo and other key creatures, including the black mamba snake that engages Bethany, played by Jack Black, in a staring contest. The snake was created by Rodeo FX based on a puppet used on set by the actors. Rodeo FX used a 3D scan of the prop and brought it to life in CG, making key adjustments to its appearance, including coloring and mouth shape. The VFX studio also delivered shots of a scorpion, crocodile, a tarantula and a centipede that complement the tone of the film’s villain.

In terms of tools, "We used Maya and Houdini — mainly for water effects — as 3D tools, ZBrush for modeling and Nuke for compositing," reports Lafortune. "The Arnold renderer was used for the 3D renders, including lighting and shading."

Additional Rodeo FX creature work can be seen in IT, The Legend of Tarzan and Paddington 2.

VES names award nominees

The Visual Effects Society (VES) has announced the nominees for its 16th Annual VES Awards, which recognize visual effects artistry and innovation in film, animation, television, commercials and video games and the VFX supervisors, VFX producers and hands-on artists who bring this work to life.

Blade Runner 2049 and War for the Planet of the Apes have tied for the most feature film nominations with seven each. Despicable Me 3 is the top animated film contender with five nominations, and Game of Thrones leads the broadcast field and scores the most nominations overall with 11.

Nominees in 24 categories were selected by VES members via events hosted by 10 of its sections, including Australia, the Bay Area, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington. The VES Awards will be held on February 13 at the Beverly Hilton Hotel. The VES Georges Méliès Award will be presented to Academy Award-winning visual effects supervisor Joe Letteri, VES. The VES Lifetime Achievement Award will be presented to producer/writer/director Jon Favreau. Comedian Patton Oswalt will once again host.

Here are the nominees:

Outstanding Visual Effects in a Photoreal Feature

Blade Runner 2049
John Nelson, Karen Murphy Mundell, Paul Lambert, Richard Hoover, Gerd Nefzer

Guardians of the Galaxy Vol. 2
Christopher Townsend, Damien Carr, Guy Williams, Jonathan Fawkner, Dan Sudick

Kong: Skull Island
Jeff White, Tom Peitzman, Stephen Rosenbaum, Scott Benza, Michael Meinardus

Star Wars: The Last Jedi
Ben Morris, Tim Keene, Eddie Pasquarello, Daniel Seddon, Chris Corbould

War for the Planet of the Apes
Joe Letteri, Ryan Stafford, Daniel Barrett, Dan Lemmon, Joel Whist

Outstanding Supporting Visual Effects in a Photoreal Feature

Darkest Hour
Stephane Naze, Warwick Hewitt, Guillaume Terrien, Benjamin Magana

Downsizing
James E. Price, Susan MacLeod, Lindy De Quattro, Stéphane Nazé

Dunkirk
Andrew Jackson, Mike Chambers, Andrew Lockley, Alison Wortman, Scott Fisher

Mother!
Dan Schrecker, Colleen Bachman, Ben Snow, Wayne Billheimer, Peter Chesney

Only the Brave
Eric Barba, Dione Wood, Matthew Lane, Georg Kaltenbrunner, Michael Meinardus

Outstanding Visual Effects in an Animated Feature

Captain Underpants
David Soren, Mark Swift, Mirielle Soria, David Dulac

Cars 3
Brian Fee, Kevin Reher, Michael Fong, Jon Reisch

Coco
Lee Unkrich, Darla K. Anderson, David Ryu, Michael K. O’Brien

Despicable Me 3
Pierre Coffin, Chris Meledandri, Kyle Balda, Eric Guillon

The Lego Batman Movie
Rob Coleman, Amber Naismith, Grant Freckelton, Damien Gray

The Lego Ninjago Movie
Gregory Jowle, Fiona Chilton, Miles Green, Kim Taylor

Outstanding Visual Effects in a Photoreal Episode

Agents of S.H.I.E.L.D.: Orientation Part 1
Mark Kolpack, Sabrina Arnold, David Rey, Kevin Yuille, Gary D’Amico

Game of Thrones: Beyond the Wall
Joe Bauer, Steve Kullback, Chris Baird, David Ramos, Sam Conway

Legion: Chapter 1
John Ross, Eddie Bonin, Sebastien Bergeron, Lionel Lim, Paul Benjamin

Star Trek: Discovery: The Vulcan Hello
Jason Michael Zimmerman, Aleksandra Kochoska, Ante Dekovic, Mahmoud Rahnama

Stranger Things 2: The Gate
Paul Graff, Christina Graff, Seth Hill, Joel Sevilla, Caius the Man

Outstanding Supporting Visual Effects in a Photoreal Episode

Black Sails: XXIX
Erik Henry, Terron Pratt, Yafei Wu, David Wahlberg, Paul Dimmer

Fear the Walking Dead: Sleigh Ride
Peter Crosman, Denise Gayle, Philip Nussbaumer, Martin Pelletier, Frank Ludica

Mr. Robot: eps3.4_runtime-err0r.r00
Ariel Altman, Lauren Montuori, John Miller, Luciano DiGeronimo

Outlander: Eye of the Storm
Richard Briscoe, Elicia Bessette, Aladino Debert, Filip Orrby, Doug Hardy

Taboo: Pilot
Henry Badgett, Tracy McCreary, Nic Birmingham, Simon Rowe, Colin Gorry

Vikings: On the Eve
Dominic Remane, Mike Borrett, Ovidiu Cinazan, Paul Wishart, Paul Byrne

Outstanding Visual Effects in a Real-Time Project

Assassin’s Creed Origins
Raphael Lacoste, Patrick Limoges, Jean-Sebastien Guay, Ulrich Haar

Call of Duty: WWII
Joe Salud, Atsushi Seo, Danny Chan, Jeremy Kendall

Fortnite: A Hard Day’s Night
Michael Clausen, Gavin Moran, Brian Brecht, Andrew Harris

Sonaria
Scot Stafford, Camille Cellucci, Kevin Dart, Theresa Latzko

Uncharted: The Lost Legacy
Shaun Escayg, Tate Mosesian, Eben Cook

Outstanding Visual Effects in a Commercial

Beyond Good and Evil 2
Leon Berelle, Maxime Luère, Dominique Boidin, Remi Kozyra

Kia Niro: Hero’s Journey
Robert Sethi, Anastasia von Rahl, Tom Graham, Chris Knight, Dave Peterson

Mercedes Benz: King of the Jungle
Simon French, Josh King, Alexia Paterson, Leonardo Costa

Monster: Opportunity Roars
Ruben Vandebroek, Clairellen Wallin, Kevin Ives, Kyle Cody

Samsung: Do What You Can’t, Ostrich
Diarmid Harrison-Murray, Tomek Zietkiewicz, Amir Bazazi, Martino Madeddu

Outstanding Visual Effects in a Special Venue Project

Avatar: Flight of Passage
Richard Baneham, Amy Jupiter, David Lester, Thrain Shadbolt

Corona: Paraiso Secreto
Adam Grint, Jarrad Vladich, Roberto Costas Fernández, Ed Thomas, Felipe Linares

Guardians of the Galaxy: Mission: Breakout!
Jason Bayever, Amy Jupiter, Mike Bain, Alexander Thomas

National Geographic Encounter: Ocean Odyssey
Thilo Ewers, John Owens, Gioele Cresce, Mariusz Wesierski

Nemo and Friends SeaRider
Anthony Apodaca, Kathy Janus, Brandon Benepe, Nick Lucas, Rick Rothschild

Star Wars: Secrets of the Empire
Ben Snow, Judah Graham, Ian Bowie, Curtis Hickman, David Layne

Outstanding Animated Character in a Photoreal Feature

Blade Runner 2049: Rachael
Axel Akkeson, Stefano Carta, Wesley Chandler, Ian Cooke-Grimes

Kong: Skull Island: Kong
Jakub Pistecky, Chris Havreberg, Karin Cooper, Kris Costa

War for the Planet of the Apes: Bad Ape
Eteuati Tema, Aidan Martin, Florian Fernandez, Mathias Larserud

War for the Planet of the Apes: Caesar
Dennis Yoo, Ludovic Chailloleau, Douglas McHale, Tim Forbes

Outstanding Animated Character in an Animated Feature

Coco: Héctor
Emron Grover, Jonathan Hoffman, Michael Honsel, Guilherme Sauerbronn Jacinto

Despicable Me 3: Bratt
Eric Guillon, Bruno Dequier, Julien Soret, Benjamin Fournet

The Lego Ninjago Movie: Garma Mecha Man
Arthur Terzis, Wei He, Jean-Marc Ariu, Gibson Radsavanh

The Boss Baby: Boss Baby
Alec Baldwin, Carlos Puertolas, Rani Naamani, Joe Moshier

The Lego Ninjago Movie: Garmadon
Matthew Everitt, Christian So, Loic Miermont, Fiona Darwin

Outstanding Animated Character in an Episode or Real-Time Project

Black Mirror: Metalhead
Steven Godfrey, Stafford Lawrence, Andrew Robertson, Lestyn Roberts

Game of Thrones: Beyond the Wall – Zombie Polar Bear
Paul Story, Todd Labonte, Matthew Muntean, Nicholas Wilson

Game of Thrones: Eastwatch – Drogon Meets Jon
Jonathan Symmonds, Thomas Kutschera, Philipp Winterstein, Andreas Krieg

Game of Thrones: The Spoils of War – Drogon Loot Train Attack
Murray Stevenson, Jason Snyman, Jenn Taylor, Florian Friedmann

Outstanding Animated Character in a Commercial

Beyond Good and Evil 2: Zhou Yuzhu
Dominique Boidin, Maxime Luère, Leon Berelle, Remi Kozyra

Mercedes Benz: King of the Jungle
Steve Townrow, Joseph Kane, Greg Martin, Gabriela Ruch Salmeron

Netto: The Easter Surprise – Bunny
Alberto Lara, Jorge Montiel, Antoine Mariez, Jon Wood

Samsung: Do What You Can’t – Ostrich
David Bryan, Maximilian Mallmann, Tim Van Hussen, Brendan Fagan

Outstanding Created Environment in a Photoreal Feature

Blade Runner 2049: Los Angeles
Chris McLaughlin, Rhys Salcombe, Seungjin Woo, Francesco Dell’Anna

Blade Runner 2049: Trash Mesa
Didier Muanza, Thomas Gillet, Guillaume Mainville, Sylvain Lorgeau

Blade Runner 2049: Vegas
Eric Noel, Arnaud Saibron, Adam Goldstein, Pascal Clement

War for the Planet of the Apes: Hidden Fortress
Greg Notzelman, James Shaw, Jay Renner, Gak Gyu Choi

War for the Planet of the Apes: Prison Camp
Phillip Leonhardt, Paul Harris, Jeremy Fort, Thomas Lo

Outstanding Created Environment in an Animated Feature

Cars 3: Abandoned Racetrack
Marlena Fecho, Thidaratana Annee Jonjai, Jose L. Ramos Serrano, Frank Tai

Coco: City of the Dead
Michael Frederickson, Jamie Hecker, Jonathan Pytko, Dave Strick

Despicable Me 3: Hollywood Destruction
Axelle De Cooman, Pierre Lopes, Milo Riccarand, Nicolas Brack

The Lego Ninjago Movie: Ninjago City
Kim Taylor, Angela Ensele, Felicity Coonan, Jean Pascal leBlanc

Outstanding Created Environment in an Episode, Commercial or Real-Time Project

Assassin’s Creed Origins: City of Memphis
Patrick Limoges, Jean-Sebastien Guay, Mikael Guaveia, Vincent Lombardo

Game of Thrones: Beyond the Wall – Frozen Lake
Daniel Villalba, Antonio Lado, José Luis Barreiro, Isaac de la Pompa

Game of Thrones: Eastwatch
Patrice Poissant, Deak Ferrand, Dominic Daigle, Gabriel Morin

Still Star-Crossed: City
Rafael Solórzano, Isaac de la Pompa, José Luis Barreiro, Óscar Perea

Stranger Things 2: The Gate
Saul Galbiati, Michael Maher, Seth Cobb, Kate McFadden

Outstanding Virtual Cinematography in a Photoreal Project

Beauty and the Beast: Be Our Guest
Shannon Justison, Casey Schatz, Neil Weatherley, Claire Michaud

Guardians of the Galaxy Vol. 2: Groot Dance/Opening Fight
James Baker, Steven Lo, Alvise Avati, Robert Stipp

Star Wars: The Last Jedi – Crait Surface Battle
Cameron Nielsen, Albert Cheng, John Levin, Johanes Kurnia

Thor: Ragnarok – Valkyrie’s Flashback
Hubert Maston, Arthur Moody, Adam Paschke, Casey Schatz

Outstanding Model in a Photoreal or Animated Project

Blade Runner 2049: LAPD Headquarters
Alex Funke, Steven Saunders, Joaquin Loyzaga, Chris Menges

Despicable Me 3: Dru’s Car
Eric Guillon, François-Xavier Lepeintre, Guillaume Boudeville, Pierre Lopes

Life: The ISS
Tom Edwards, Chaitanya Kshirsagar, Satish Kuttan, Paresh Dodia

US Marines: Anthem – Monument
Tom Bardwell, Paul Liaw, Adam Dewhirst

Outstanding Effects Simulations in a Photoreal Feature

Kong: Skull Island
Florent Andorra, Alexis Hall, Raul Essig, Branko Grujcic

Only the Brave: Fire & Smoke
Georg Kaltenbrunner, Thomas Bevan, Philipp Zaufel, Himanshu Joshi

Star Wars: The Last Jedi – Bombing Run
Peter Kyme, Miguel Perez Senent, Ahmed Gharraph, Billy Copley

Star Wars: The Last Jedi – Mega Destroyer Destruction
Mihai Cioroba, Ryoji Fujita, Jiyong Shin, Dan Finnegan

War for the Planet of the Apes
David Caeiro Cebrián, Johnathan Nixon, Chet Leavai, Gary Boyle

Outstanding Effects Simulations in an Animated Feature

Cars 3
Greg Gladstone, Stephen Marshall, Leon JeongWook Park, Tim Speltz

Coco
Kristopher Campbell, Stephen Gustafson, Dave Hale, Keith Klohn

Despicable Me 3
Bruno Chauffard, Frank Baradat, Milo Riccarand, Nicolas Brack

Ferdinand
Yaron Canetti, Allan Kadkoy, Danny Speck, Mark Adams

The Boss Baby
Mitul Patel, Gaurav Mathur, Venkatesh Kongathi

Outstanding Effects Simulations in an Episode, Commercial or Real-Time Project

Game of Thrones: Beyond the Wall – Frozen Lake
Manuel Ramírez, Óscar Márquez, Pablo Hernández, David Gacituaga

Game of Thrones: The Dragon and the Wolf – Wall Destruction
Thomas Hullin, Dominik Kirouac, Sylvain Nouveau, Nathan Arbuckle

Heineken: The Trailblazers
Christian Bohm, Andreu Lucio Archs, Carsten Keller, Steve Oakley

Outlander: Eye of the Storm – Stormy Seas
Jason Mortimer, Navin Pinto, Greg Teegarden, Steve Ong

Outstanding Compositing in a Photoreal Feature

Blade Runner 2049: LAPD Approach and Joy Holograms
Tristan Myles, Miles Lauridsen, Joel Delle-Vergin, Farhad Mohasseb

Kong: Skull Island
Nelson Sepulveda, Aaron Brown, Paolo Acri, Shawn Mason

Thor: Ragnarok: Bridge Battle
Gavin McKenzie, David Simpson, Owen Carroll, Mark Gostlow

War for the Planet of the Apes
Christoph Salzmann, Robin Hollander, Ben Morgan, Ben Warner

Outstanding Compositing in a Photoreal Episode

Game of Thrones: Beyond the Wall – Frozen Lake
Óscar Perea, Santiago Martos, David Esteve, Michael Crane

Game of Thrones: Eastwatch
Thomas Montminy Brodeur, Xavier Fourmond, Reuben Barkataki, Sébastien Raets

Game of Thrones: The Spoils of War – Loot Train Attack
Dom Hellier, Thijs Noij, Edwin Holdsworth, Giacomo Matteucci

Star Trek: Discovery
Phil Prates, Rex Alerta, John Dinh, Karen Cheng

Outstanding Compositing in a Photoreal Commercial

Destiny 2: New Legends Will Rise
Alex Unruh, Michael Ralla, Helgi Laxdal, Timothy Gutierrez

Nespresso: Comin’ Home
Matt Pascuzzi, Steve Drew, Martin Lazaro, Karch Koon

Samsung: Do What You Can’t – Ostrich
Michael Gregory, Andrew Roberts, Gustavo Bellon, Rashabh Ramesh Butani

Virgin Media: Delivering Awesome
Jonathan Westley, John Thornton, Milo Paterson, George Cressey

Outstanding Visual Effects in a Student Project

Creature Pinup
Christian Leitner, Juliane Walther, Kiril Mirkov, Lisa Ecker

Hybrids
Florian Brauch, Romain Thirion, Matthieu Pujol, Kim Tailhades

Les Pionniers de l’Univers
Clementine Courbin, Matthieu Guevel, Jérôme Van Beneden, Anthony Rege

The Endless
Nicolas Lourme, Corentin Gravend, Edouard Calemard, Romaric Vivier

VFX house Kevin adds three industry veterans

Venice, California-based visual effects house Kevin, founded by Tim Davies, Sue Troyan and Darcy Parsons, has beefed up its team even further with the hiring of head of CG Mike Dalzell, VFX supervisor Theo Maniatis and head of technology Carl Loeffler. This three-month-old studio has already worked on spots for Jaguar, Land Rover, Target and Old Spice, and is currently working on a series of commercials for the Super Bowl.

Dalzell brings years of experience as a CG supervisor and lead artist — he started as a 3D generalist before focusing on look development and lighting — at top creative studios, including Digital Domain, MPC, Psyop, The Mill, Sony Imageworks and Method. He was instrumental in look development on the VFX Gold Clio- and British Arrow-winning Call of Duty Seize Glory, as well as GE’s Childlike Imagination. He has also worked on commercials for Nissan, BMW, Lexus, Visa, Cars.com, the Air Force and others. Early on, Dalzell honed his skills on music videos in Toronto, and then on feature films such as Iron Man 3 and The Matrix movies, as well as The Curious Case of Benjamin Button.

Maniatis, a Flame artist and on-set VFX supervisor, brings a breadth of experience from the US, London and his native Sydney. “Tim [Davies] and I used to work together back in Australia, so reconnecting with him and moving to LA has been a blast,” he says.

Maniatis’s work includes spots for Apple Watch 3 + Apple Music’s Roll (directed by Sam Brown), TAG Heuer’s To Jack (directed by and featuring Patrick Dempsey), Destiny 2’s Rally the Troops and Titanfall 2’s Become One (via Blur Studios), and PlayStation VR’s Batman Arkham and Axe’s Office Love, both directed by Filip Engstrom. Prior to joining Kevin, Maniatis worked with Blur Studios, Psyop, The Mill, Art Jail and Framestore.

Loeffler is building out the studio’s production infrastructure using the latest Autodesk Flame systems, high-end 3D workstations and render nodes, and is putting new networking and storage systems into place. Kevin’s new Culver City studio will open its doors in Q1 2018, and Loeffler will guide the current growth in both hardware and software, plan for the future and make sure Kevin’s studio is optimized for the needs of production. He has over two decades of experience building out and expanding the technologies for facilities including MPC and Technicolor.

Image: (L-R) Mike Dalzell, Carl Loeffler and Theo Maniatis.